2026-03-10T08:46:23.509 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-10T08:46:23.517 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T08:46:23.540 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.1} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '969'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.1 '
name: kyr-2026-03-10_01:00:38-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 8043
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
targets:
  vm05.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCtp4Oy7pV+c70ZjGv78OhMfFlFkb3Yd5/ILHbg/7/DAnjxQihdZ+y8bl5ls/boMZvrKLCQsnkpYGWy5Y8GsQoI=
  vm08.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHfZmUEsRiG9hHM8QzrjPHHk1z2drq7BtgA2br4GDNZVKL25gsNivYslo7v8AGJD40Qz3L+YGFKPu5Y4v5mzF90=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.1
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.1
    roleless: true
- print: '**** done end installing v18.2.1 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 1
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay false
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data false
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped:
  - /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/suites/orch/cephadm/mds_upgrade_sequence/tasks/3-upgrade-mgr-staggered.yaml
  meta: {}
  postmerge:
  - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n local\
    \ branch = py_attrgetter(kernel).get('branch')\n if branch and not kernel.branch:find\
    \ \"-all$\" then\n log.debug(\"removing default kernel specification: %s\"\
    , kernel)\n py_attrgetter(kernel).pop('branch', nil)\n py_attrgetter(kernel).pop('deb',\
    \ nil)\n py_attrgetter(kernel).pop('flavor', nil)\n py_attrgetter(kernel).pop('kdb',\
    \ nil)\n py_attrgetter(kernel).pop('koji', nil)\n py_attrgetter(kernel).pop('koji_task',\
    \ nil)\n py_attrgetter(kernel).pop('rpm', nil)\n py_attrgetter(kernel).pop('sha1',\
    \ nil)\n py_attrgetter(kernel).pop('tag', nil)\n end\nend\n"
  variables:
    fail_fs: false
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-10_01:00:38
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs false || true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-10T08:46:23.540 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa; will attempt to use it
2026-03-10T08:46:23.541 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks
2026-03-10T08:46:23.541 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-10T08:46:23.541 INFO:teuthology.task.internal:Checking packages...
2026-03-10T08:46:23.541 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-10T08:46:23.541 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-10T08:46:23.541 INFO:teuthology.packaging:ref: None
2026-03-10T08:46:23.541 INFO:teuthology.packaging:tag: None
2026-03-10T08:46:23.541 INFO:teuthology.packaging:branch: squid
2026-03-10T08:46:23.541 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T08:46:23.541 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-10T08:46:24.341 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-10T08:46:24.342 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-10T08:46:24.343 INFO:teuthology.task.internal:no buildpackages task found
2026-03-10T08:46:24.343 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-10T08:46:24.343 INFO:teuthology.task.internal:Saving configuration
2026-03-10T08:46:24.351 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-10T08:46:24.352 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-10T08:46:24.358 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm05.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 08:45:21.737568', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:05', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCtp4Oy7pV+c70ZjGv78OhMfFlFkb3Yd5/ILHbg/7/DAnjxQihdZ+y8bl5ls/boMZvrKLCQsnkpYGWy5Y8GsQoI='}
2026-03-10T08:46:24.363 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm08.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 08:45:21.737997', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:08', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHfZmUEsRiG9hHM8QzrjPHHk1z2drq7BtgA2br4GDNZVKL25gsNivYslo7v8AGJD40Qz3L+YGFKPu5Y4v5mzF90='}
2026-03-10T08:46:24.363 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-10T08:46:24.364 INFO:teuthology.task.internal:roles: ubuntu@vm05.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-10T08:46:24.364 INFO:teuthology.task.internal:roles: ubuntu@vm08.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-10T08:46:24.364 INFO:teuthology.run_tasks:Running task console_log...
2026-03-10T08:46:24.370 DEBUG:teuthology.task.console_log:vm05 does not support IPMI; excluding
2026-03-10T08:46:24.376 DEBUG:teuthology.task.console_log:vm08 does not support IPMI; excluding
2026-03-10T08:46:24.376 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7ff772b72170>, signals=[15])
2026-03-10T08:46:24.376 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-10T08:46:24.377 INFO:teuthology.task.internal:Opening connections...
2026-03-10T08:46:24.377 DEBUG:teuthology.task.internal:connecting to ubuntu@vm05.local
2026-03-10T08:46:24.378 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T08:46:24.441 DEBUG:teuthology.task.internal:connecting to ubuntu@vm08.local
2026-03-10T08:46:24.441 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm08.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T08:46:24.503 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-10T08:46:24.504 DEBUG:teuthology.orchestra.run.vm05:> uname -m
2026-03-10T08:46:24.557 INFO:teuthology.orchestra.run.vm05.stdout:x86_64
2026-03-10T08:46:24.557 DEBUG:teuthology.orchestra.run.vm05:> cat /etc/os-release
2026-03-10T08:46:24.611 INFO:teuthology.orchestra.run.vm05.stdout:NAME="CentOS Stream"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:VERSION="9"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:ID="centos"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:ID_LIKE="rhel fedora"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:VERSION_ID="9"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:PLATFORM_ID="platform:el9"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:ANSI_COLOR="0;31"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:LOGO="fedora-logo-icon"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:HOME_URL="https://centos.org/"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T08:46:24.612 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T08:46:24.612 INFO:teuthology.lock.ops:Updating vm05.local on lock server
2026-03-10T08:46:24.617 DEBUG:teuthology.orchestra.run.vm08:> uname -m
2026-03-10T08:46:24.633 INFO:teuthology.orchestra.run.vm08.stdout:x86_64
2026-03-10T08:46:24.633 DEBUG:teuthology.orchestra.run.vm08:> cat /etc/os-release
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:NAME="CentOS Stream"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:VERSION="9"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:ID="centos"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:ID_LIKE="rhel fedora"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:VERSION_ID="9"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:PLATFORM_ID="platform:el9"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:ANSI_COLOR="0;31"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:LOGO="fedora-logo-icon"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:HOME_URL="https://centos.org/"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T08:46:24.695 INFO:teuthology.orchestra.run.vm08.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T08:46:24.695 INFO:teuthology.lock.ops:Updating vm08.local on lock server
2026-03-10T08:46:24.700 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-10T08:46:24.702 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-10T08:46:24.703 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-10T08:46:24.703 DEBUG:teuthology.orchestra.run.vm05:> test '!' -e /home/ubuntu/cephtest
2026-03-10T08:46:24.705 DEBUG:teuthology.orchestra.run.vm08:> test '!' -e /home/ubuntu/cephtest
2026-03-10T08:46:24.753 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-10T08:46:24.754 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-10T08:46:24.755 DEBUG:teuthology.orchestra.run.vm05:> test -z $(ls -A /var/lib/ceph)
2026-03-10T08:46:24.762 DEBUG:teuthology.orchestra.run.vm08:> test -z $(ls -A /var/lib/ceph)
2026-03-10T08:46:24.777 INFO:teuthology.orchestra.run.vm05.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T08:46:24.812 INFO:teuthology.orchestra.run.vm08.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T08:46:24.812 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-10T08:46:24.820 DEBUG:teuthology.orchestra.run.vm05:> test -e /ceph-qa-ready
2026-03-10T08:46:24.835 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T08:46:25.035 DEBUG:teuthology.orchestra.run.vm08:> test -e /ceph-qa-ready
2026-03-10T08:46:25.052 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T08:46:25.287 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-10T08:46:25.289 INFO:teuthology.task.internal:Creating test directory...
2026-03-10T08:46:25.289 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T08:46:25.292 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T08:46:25.310 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-10T08:46:25.311 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-10T08:46:25.312 INFO:teuthology.task.internal:Creating archive directory...
2026-03-10T08:46:25.312 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T08:46:25.348 DEBUG:teuthology.orchestra.run.vm08:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T08:46:25.372 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-10T08:46:25.373 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-10T08:46:25.373 DEBUG:teuthology.orchestra.run.vm05:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T08:46:25.418 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T08:46:25.418 DEBUG:teuthology.orchestra.run.vm08:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T08:46:25.433 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T08:46:25.433 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T08:46:25.460 DEBUG:teuthology.orchestra.run.vm08:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T08:46:25.483 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T08:46:25.491 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T08:46:25.499 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T08:46:25.508 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T08:46:25.509 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-10T08:46:25.510 INFO:teuthology.task.internal:Configuring sudo...
2026-03-10T08:46:25.511 DEBUG:teuthology.orchestra.run.vm05:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T08:46:25.534 DEBUG:teuthology.orchestra.run.vm08:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T08:46:25.574 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-10T08:46:25.577 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-10T08:46:25.577 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T08:46:25.598 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T08:46:25.629 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T08:46:25.679 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T08:46:25.739 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:46:25.739 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T08:46:25.801 DEBUG:teuthology.orchestra.run.vm08:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T08:46:25.825 DEBUG:teuthology.orchestra.run.vm08:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T08:46:25.881 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:46:25.881 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T08:46:25.940 DEBUG:teuthology.orchestra.run.vm05:> sudo service rsyslog restart
2026-03-10T08:46:25.942 DEBUG:teuthology.orchestra.run.vm08:> sudo service rsyslog restart
2026-03-10T08:46:25.971 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T08:46:26.008 INFO:teuthology.orchestra.run.vm08.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T08:46:26.418 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-10T08:46:26.420 INFO:teuthology.task.internal:Starting timer...
2026-03-10T08:46:26.420 INFO:teuthology.run_tasks:Running task pcp...
2026-03-10T08:46:26.423 INFO:teuthology.run_tasks:Running task selinux...
2026-03-10T08:46:26.425 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-10T08:46:26.425 INFO:teuthology.task.selinux:Excluding vm05: VMs are not yet supported
2026-03-10T08:46:26.425 INFO:teuthology.task.selinux:Excluding vm08: VMs are not yet supported
2026-03-10T08:46:26.425 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-10T08:46:26.425 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-10T08:46:26.425 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-10T08:46:26.425 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-10T08:46:26.426 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-10T08:46:26.427 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-10T08:46:26.428 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-10T08:46:27.045 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-10T08:46:27.051 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-10T08:46:27.051 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventory82i116sa --limit vm05.local,vm08.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-10T08:48:27.706 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm05.local'), Remote(name='ubuntu@vm08.local')]
2026-03-10T08:48:27.707 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm05.local'
2026-03-10T08:48:27.707 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T08:48:27.770 DEBUG:teuthology.orchestra.run.vm05:> true
2026-03-10T08:48:27.851 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm05.local'
2026-03-10T08:48:27.851 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm08.local'
2026-03-10T08:48:27.851 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm08.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T08:48:27.916 DEBUG:teuthology.orchestra.run.vm08:> true
2026-03-10T08:48:27.998 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm08.local'
2026-03-10T08:48:27.998 INFO:teuthology.run_tasks:Running task clock...
2026-03-10T08:48:28.001 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-10T08:48:28.001 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T08:48:28.001 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T08:48:28.004 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T08:48:28.004 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T08:48:28.033 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T08:48:28.049 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T08:48:28.073 INFO:teuthology.orchestra.run.vm08.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T08:48:28.084 INFO:teuthology.orchestra.run.vm05.stderr:sudo: ntpd: command not found
2026-03-10T08:48:28.090 INFO:teuthology.orchestra.run.vm08.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T08:48:28.100 INFO:teuthology.orchestra.run.vm05.stdout:506 Cannot talk to daemon
2026-03-10T08:48:28.119 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T08:48:28.125 INFO:teuthology.orchestra.run.vm08.stderr:sudo: ntpd: command not found
2026-03-10T08:48:28.140 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T08:48:28.142 INFO:teuthology.orchestra.run.vm08.stdout:506 Cannot talk to daemon
2026-03-10T08:48:28.160 INFO:teuthology.orchestra.run.vm08.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T08:48:28.179 INFO:teuthology.orchestra.run.vm08.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T08:48:28.197 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found
2026-03-10T08:48:28.202 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T08:48:28.202 INFO:teuthology.orchestra.run.vm05.stdout:===============================================================================
2026-03-10T08:48:28.202 INFO:teuthology.orchestra.run.vm05.stdout:^? server1a.meinberg.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.202 INFO:teuthology.orchestra.run.vm05.stdout:^? 47.ip-51-75-67.eu 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.202 INFO:teuthology.orchestra.run.vm05.stdout:^? vps-ber1.orleans.ddnss.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.202 INFO:teuthology.orchestra.run.vm05.stdout:^? nc-root-nue.nicesrv.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.230 INFO:teuthology.orchestra.run.vm08.stderr:bash: line 1: ntpq: command not found
2026-03-10T08:48:28.235 INFO:teuthology.orchestra.run.vm08.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T08:48:28.235 INFO:teuthology.orchestra.run.vm08.stdout:===============================================================================
2026-03-10T08:48:28.235 INFO:teuthology.orchestra.run.vm08.stdout:^? vps-ber1.orleans.ddnss.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.235 INFO:teuthology.orchestra.run.vm08.stdout:^? nc-root-nue.nicesrv.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.235 INFO:teuthology.orchestra.run.vm08.stdout:^? server1a.meinberg.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.235 INFO:teuthology.orchestra.run.vm08.stdout:^? 47.ip-51-75-67.eu 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T08:48:28.235 INFO:teuthology.run_tasks:Running task install...
2026-03-10T08:48:28.237 DEBUG:teuthology.task.install:project ceph
2026-03-10T08:48:28.237 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T08:48:28.237 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.1', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T08:48:28.237 INFO:teuthology.task.install:Using flavor: default
2026-03-10T08:48:28.240 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-10T08:48:28.240 INFO:teuthology.task.install:extra packages: []
2026-03-10T08:48:28.240 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.1', 'wait_for_package': False}
2026-03-10T08:48:28.240 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T08:48:28.240 INFO:teuthology.packaging:ref: None
2026-03-10T08:48:28.240 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T08:48:28.240 INFO:teuthology.packaging:branch: None
2026-03-10T08:48:28.240 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T08:48:29.115 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.1^{} -> 7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T08:48:29.115 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T08:48:29.116 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.1', 'wait_for_package': False}
2026-03-10T08:48:29.116 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T08:48:29.116 INFO:teuthology.packaging:ref: None
2026-03-10T08:48:29.116 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T08:48:29.116 INFO:teuthology.packaging:branch: None
2026-03-10T08:48:29.116 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T08:48:29.116 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T08:48:29.773 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/
2026-03-10T08:48:29.773 INFO:teuthology.task.install.rpm:Package version is 18.2.1-0
2026-03-10T08:48:29.834 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/
2026-03-10T08:48:29.834 INFO:teuthology.task.install.rpm:Package version is 18.2.1-0
2026-03-10T08:48:30.148 INFO:teuthology.packaging:Writing yum repo: [ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T08:48:30.148 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:48:30.148 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T08:48:30.181 INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T08:48:30.182 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T08:48:30.182 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T08:48:30.185 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T08:48:30.185 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T08:48:30.185 
INFO:teuthology.packaging:ref: None 2026-03-10T08:48:30.185 INFO:teuthology.packaging:tag: v18.2.1 2026-03-10T08:48:30.185 INFO:teuthology.packaging:branch: None 2026-03-10T08:48:30.185 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:48:30.185 DEBUG:teuthology.orchestra.run.vm05:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.1/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T08:48:30.219 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T08:48:30.219 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T08:48:30.219 INFO:teuthology.packaging:ref: None 2026-03-10T08:48:30.219 INFO:teuthology.packaging:tag: v18.2.1 2026-03-10T08:48:30.219 INFO:teuthology.packaging:branch: None 2026-03-10T08:48:30.219 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:48:30.219 DEBUG:teuthology.orchestra.run.vm08:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.1/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T08:48:30.255 DEBUG:teuthology.orchestra.run.vm05:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T08:48:30.293 DEBUG:teuthology.orchestra.run.vm08:> sudo 
touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T08:48:30.344 DEBUG:teuthology.orchestra.run.vm05:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T08:48:30.376 INFO:teuthology.orchestra.run.vm05.stdout:check_obsoletes = 1 2026-03-10T08:48:30.377 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all 2026-03-10T08:48:30.387 DEBUG:teuthology.orchestra.run.vm08:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T08:48:30.421 INFO:teuthology.orchestra.run.vm08.stdout:check_obsoletes = 1 2026-03-10T08:48:30.425 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean all 2026-03-10T08:48:30.601 INFO:teuthology.orchestra.run.vm05.stdout:41 files removed 2026-03-10T08:48:30.635 INFO:teuthology.orchestra.run.vm08.stdout:41 files removed 2026-03-10T08:48:30.642 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-10T08:48:30.671 DEBUG:teuthology.orchestra.run.vm08:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm 
ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-10T08:48:31.726 INFO:teuthology.orchestra.run.vm05.stdout:ceph packages for x86_64 88 kB/s | 76 kB 00:00 2026-03-10T08:48:31.728 INFO:teuthology.orchestra.run.vm08.stdout:ceph packages for x86_64 90 kB/s | 76 kB 00:00 2026-03-10T08:48:32.369 INFO:teuthology.orchestra.run.vm05.stdout:ceph noarch packages 15 kB/s | 9.4 kB 00:00 2026-03-10T08:48:32.380 INFO:teuthology.orchestra.run.vm08.stdout:ceph noarch packages 15 kB/s | 9.4 kB 00:00 2026-03-10T08:48:33.011 INFO:teuthology.orchestra.run.vm05.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00 2026-03-10T08:48:33.032 INFO:teuthology.orchestra.run.vm08.stdout:ceph source packages 3.4 kB/s | 2.2 kB 00:00 2026-03-10T08:48:33.551 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - BaseOS 18 MB/s | 8.9 MB 00:00 2026-03-10T08:48:34.966 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - BaseOS 4.6 MB/s | 8.9 MB 00:01 2026-03-10T08:48:35.811 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - AppStream 19 MB/s | 27 MB 00:01 2026-03-10T08:48:37.206 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - AppStream 17 MB/s | 27 MB 00:01 2026-03-10T08:48:40.020 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - CRB 5.7 MB/s | 8.0 MB 00:01 2026-03-10T08:48:40.943 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - CRB 8.5 MB/s | 8.0 MB 00:00 2026-03-10T08:48:41.270 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - Extras packages 54 kB/s | 20 kB 00:00 2026-03-10T08:48:42.023 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - Extras packages 89 kB/s | 20 kB 00:00 2026-03-10T08:48:42.930 INFO:teuthology.orchestra.run.vm08.stdout:Extra Packages for Enterprise Linux 13 MB/s | 20 MB 00:01 2026-03-10T08:48:43.244 
INFO:teuthology.orchestra.run.vm05.stdout:Extra Packages for Enterprise Linux 18 MB/s | 20 MB 00:01 2026-03-10T08:48:47.571 INFO:teuthology.orchestra.run.vm08.stdout:lab-extras 64 kB/s | 50 kB 00:00 2026-03-10T08:48:47.876 INFO:teuthology.orchestra.run.vm05.stdout:lab-extras 63 kB/s | 50 kB 00:00 2026-03-10T08:48:48.944 INFO:teuthology.orchestra.run.vm08.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T08:48:48.944 INFO:teuthology.orchestra.run.vm08.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T08:48:48.950 INFO:teuthology.orchestra.run.vm08.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-10T08:48:48.950 INFO:teuthology.orchestra.run.vm08.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-10T08:48:48.977 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout:Installing: 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: ceph x86_64 2:18.2.1-0.el9 ceph 6.4 k 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base x86_64 2:18.2.1-0.el9 ceph 5.2 M 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 ceph 839 k 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 ceph 142 k 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 ceph 1.4 M 2026-03-10T08:48:48.981 
INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 ceph-noarch 132 k 2026-03-10T08:48:48.981 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 ceph-noarch 1.8 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 ceph-noarch 7.4 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 ceph-noarch 50 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 ceph 7.7 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test x86_64 2:18.2.1-0.el9 ceph 40 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: cephadm noarch 2:18.2.1-0.el9 ceph-noarch 221 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 ceph 31 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 ceph 658 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel x86_64 2:18.2.1-0.el9 ceph 127 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 ceph 161 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados x86_64 2:18.2.1-0.el9 ceph 321 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd x86_64 2:18.2.1-0.el9 ceph 297 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw x86_64 2:18.2.1-0.el9 ceph 99 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 ceph 86 k 2026-03-10T08:48:48.982 
INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 ceph 3.0 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 ceph 171 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout:Upgrading: 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: librados2 x86_64 2:18.2.1-0.el9 ceph 3.3 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: librbd1 x86_64 2:18.2.1-0.el9 ceph 3.0 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout:Installing dependencies: 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common x86_64 2:18.2.1-0.el9 ceph 18 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 ceph-noarch 23 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds x86_64 2:18.2.1-0.el9 ceph 2.1 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 ceph-noarch 242 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon x86_64 2:18.2.1-0.el9 ceph 4.4 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd x86_64 2:18.2.1-0.el9 ceph 18 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 ceph-noarch 15 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 ceph 24 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 
k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 ceph 165 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 ceph 474 k 2026-03-10T08:48:48.982 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: librgw2 x86_64 2:18.2.1-0.el9 ceph 4.5 M 2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 ceph 45 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 ceph 124 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-10T08:48:48.983 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:Installing weak dependencies:
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:Install 117 Packages
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:Upgrade 2 Packages
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:Total download size: 182 M
2026-03-10T08:48:48.984 INFO:teuthology.orchestra.run.vm08.stdout:Downloading Packages:
2026-03-10T08:48:49.256 INFO:teuthology.orchestra.run.vm05.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T08:48:49.257 INFO:teuthology.orchestra.run.vm05.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T08:48:49.261 INFO:teuthology.orchestra.run.vm05.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-10T08:48:49.261 INFO:teuthology.orchestra.run.vm05.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-10T08:48:49.296 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout:Installing:
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:18.2.1-0.el9 ceph 6.4 k
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:18.2.1-0.el9 ceph 5.2 M
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 ceph 839 k
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 ceph 142 k
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 ceph 1.4 M
2026-03-10T08:48:49.301 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 ceph-noarch 132 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 ceph-noarch 1.8 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 ceph-noarch 7.4 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 ceph-noarch 50 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 ceph 7.7 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:18.2.1-0.el9 ceph 40 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:18.2.1-0.el9 ceph-noarch 221 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 ceph 31 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 ceph 658 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:18.2.1-0.el9 ceph 127 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 ceph 161 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:18.2.1-0.el9 ceph 321 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:18.2.1-0.el9 ceph 297 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:18.2.1-0.el9 ceph 99 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 ceph 86 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 ceph 171 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout:Upgrading:
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:18.2.1-0.el9 ceph 3.3 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout:Installing dependencies:
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 ceph-noarch 23 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:18.2.1-0.el9 ceph 2.1 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 ceph-noarch 242 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:18.2.1-0.el9 ceph 4.4 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 ceph-noarch 15 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 ceph 24 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 ceph 165 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 ceph 474 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:18.2.1-0.el9 ceph 4.5 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T08:48:49.302 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 ceph 45 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 ceph 124 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:Installing weak dependencies:
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:Install 117 Packages
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:Upgrade 2 Packages
2026-03-10T08:48:49.303 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:48:49.304 INFO:teuthology.orchestra.run.vm05.stdout:Total download size: 182 M
2026-03-10T08:48:49.304 INFO:teuthology.orchestra.run.vm05.stdout:Downloading Packages:
2026-03-10T08:48:50.290 INFO:teuthology.orchestra.run.vm05.stdout:(1/119): ceph-18.2.1-0.el9.x86_64.rpm 20 kB/s | 6.4 kB 00:00
2026-03-10T08:48:50.828 INFO:teuthology.orchestra.run.vm08.stdout:(1/119): ceph-18.2.1-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00
2026-03-10T08:48:50.906 INFO:teuthology.orchestra.run.vm05.stdout:(2/119): ceph-fuse-18.2.1-0.el9.x86_64.rpm 1.3 MB/s | 839 kB 00:00
2026-03-10T08:48:51.010 INFO:teuthology.orchestra.run.vm05.stdout:(3/119): ceph-immutable-object-cache-18.2.1-0.e 1.3 MB/s | 142 kB 00:00
2026-03-10T08:48:51.354 INFO:teuthology.orchestra.run.vm05.stdout:(4/119): ceph-mds-18.2.1-0.el9.x86_64.rpm 6.1 MB/s | 2.1 MB 00:00
2026-03-10T08:48:51.433 INFO:teuthology.orchestra.run.vm05.stdout:(5/119): ceph-base-18.2.1-0.el9.x86_64.rpm 3.6 MB/s | 5.2 MB 00:01
2026-03-10T08:48:51.566 INFO:teuthology.orchestra.run.vm05.stdout:(6/119): ceph-mgr-18.2.1-0.el9.x86_64.rpm 6.8 MB/s | 1.4 MB 00:00
2026-03-10T08:48:51.627 INFO:teuthology.orchestra.run.vm08.stdout:(2/119): ceph-fuse-18.2.1-0.el9.x86_64.rpm 1.0 MB/s | 839 kB 00:00
2026-03-10T08:48:51.728 INFO:teuthology.orchestra.run.vm08.stdout:(3/119): ceph-immutable-object-cache-18.2.1-0.e 1.4 MB/s | 142 kB 00:00
2026-03-10T08:48:51.967 INFO:teuthology.orchestra.run.vm05.stdout:(7/119): ceph-mon-18.2.1-0.el9.x86_64.rpm 8.3 MB/s | 4.4 MB 00:00
2026-03-10T08:48:52.920 INFO:teuthology.orchestra.run.vm08.stdout:(4/119): ceph-mds-18.2.1-0.el9.x86_64.rpm 1.8 MB/s | 2.1 MB 00:01
2026-03-10T08:48:53.431 INFO:teuthology.orchestra.run.vm05.stdout:(8/119): ceph-radosgw-18.2.1-0.el9.x86_64.rpm 5.3 MB/s | 7.7 MB 00:01
2026-03-10T08:48:53.539 INFO:teuthology.orchestra.run.vm05.stdout:(9/119): ceph-selinux-18.2.1-0.el9.x86_64.rpm 224 kB/s | 24 kB 00:00
2026-03-10T08:48:53.714 INFO:teuthology.orchestra.run.vm08.stdout:(5/119): ceph-mgr-18.2.1-0.el9.x86_64.rpm 1.8 MB/s | 1.4 MB 00:00
2026-03-10T08:48:53.999 INFO:teuthology.orchestra.run.vm05.stdout:(10/119): ceph-common-18.2.1-0.el9.x86_64.rpm 4.6 MB/s | 18 MB 00:04
2026-03-10T08:48:54.098 INFO:teuthology.orchestra.run.vm05.stdout:(11/119): libcephfs-devel-18.2.1-0.el9.x86_64.r 313 kB/s | 31 kB 00:00
2026-03-10T08:48:54.303 INFO:teuthology.orchestra.run.vm05.stdout:(12/119): libcephfs2-18.2.1-0.el9.x86_64.rpm 3.1 MB/s | 658 kB 00:00
2026-03-10T08:48:54.404 INFO:teuthology.orchestra.run.vm05.stdout:(13/119): libcephsqlite-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 165 kB 00:00
2026-03-10T08:48:54.504 INFO:teuthology.orchestra.run.vm05.stdout:(14/119): librados-devel-18.2.1-0.el9.x86_64.rp 1.2 MB/s | 127 kB 00:00
2026-03-10T08:48:54.707 INFO:teuthology.orchestra.run.vm05.stdout:(15/119): libradosstriper1-18.2.1-0.el9.x86_64. 2.3 MB/s | 474 kB 00:00
2026-03-10T08:48:55.025 INFO:teuthology.orchestra.run.vm05.stdout:(16/119): ceph-osd-18.2.1-0.el9.x86_64.rpm 5.1 MB/s | 18 MB 00:03
2026-03-10T08:48:55.134 INFO:teuthology.orchestra.run.vm05.stdout:(17/119): python3-ceph-argparse-18.2.1-0.el9.x8 416 kB/s | 45 kB 00:00
2026-03-10T08:48:55.237 INFO:teuthology.orchestra.run.vm05.stdout:(18/119): python3-ceph-common-18.2.1-0.el9.x86_ 1.2 MB/s | 124 kB 00:00
2026-03-10T08:48:55.713 INFO:teuthology.orchestra.run.vm08.stdout:(6/119): ceph-mon-18.2.1-0.el9.x86_64.rpm 2.2 MB/s | 4.4 MB 00:01
2026-03-10T08:48:56.106 INFO:teuthology.orchestra.run.vm05.stdout:(19/119): python3-cephfs-18.2.1-0.el9.x86_64.rp 186 kB/s | 161 kB 00:00
2026-03-10T08:48:56.212 INFO:teuthology.orchestra.run.vm05.stdout:(20/119): python3-rados-18.2.1-0.el9.x86_64.rpm 2.9 MB/s | 321 kB 00:00
2026-03-10T08:48:56.867 INFO:teuthology.orchestra.run.vm05.stdout:(21/119): librgw2-18.2.1-0.el9.x86_64.rpm 2.1 MB/s | 4.5 MB 00:02
2026-03-10T08:48:56.876 INFO:teuthology.orchestra.run.vm05.stdout:(22/119): python3-rbd-18.2.1-0.el9.x86_64.rpm 449 kB/s | 297 kB 00:00
2026-03-10T08:48:56.968 INFO:teuthology.orchestra.run.vm05.stdout:(23/119): python3-rgw-18.2.1-0.el9.x86_64.rpm 989 kB/s | 99 kB 00:00
2026-03-10T08:48:56.979 INFO:teuthology.orchestra.run.vm05.stdout:(24/119): rbd-fuse-18.2.1-0.el9.x86_64.rpm 837 kB/s | 86 kB 00:00
2026-03-10T08:48:57.083 INFO:teuthology.orchestra.run.vm05.stdout:(25/119): rbd-nbd-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 171 kB 00:00
2026-03-10T08:48:57.186 INFO:teuthology.orchestra.run.vm05.stdout:(26/119): ceph-grafana-dashboards-18.2.1-0.el9. 226 kB/s | 23 kB 00:00
2026-03-10T08:48:57.291 INFO:teuthology.orchestra.run.vm05.stdout:(27/119): ceph-mgr-cephadm-18.2.1-0.el9.noarch. 1.2 MB/s | 132 kB 00:00
2026-03-10T08:48:57.439 INFO:teuthology.orchestra.run.vm08.stdout:(7/119): ceph-common-18.2.1-0.el9.x86_64.rpm 2.7 MB/s | 18 MB 00:06
2026-03-10T08:48:57.908 INFO:teuthology.orchestra.run.vm05.stdout:(28/119): ceph-mgr-dashboard-18.2.1-0.el9.noarc 2.8 MB/s | 1.8 MB 00:00
2026-03-10T08:48:59.032 INFO:teuthology.orchestra.run.vm05.stdout:(29/119): rbd-mirror-18.2.1-0.el9.x86_64.rpm 1.4 MB/s | 3.0 MB 00:02
2026-03-10T08:48:59.135 INFO:teuthology.orchestra.run.vm08.stdout:(8/119): ceph-base-18.2.1-0.el9.x86_64.rpm 617 kB/s | 5.2 MB 00:08
2026-03-10T08:48:59.232 INFO:teuthology.orchestra.run.vm05.stdout:(30/119): ceph-mgr-modules-core-18.2.1-0.el9.no 1.2 MB/s | 242 kB 00:00
2026-03-10T08:48:59.239 INFO:teuthology.orchestra.run.vm08.stdout:(9/119): ceph-selinux-18.2.1-0.el9.x86_64.rpm 232 kB/s | 24 kB 00:00
2026-03-10T08:48:59.331 INFO:teuthology.orchestra.run.vm05.stdout:(31/119): ceph-mgr-rook-18.2.1-0.el9.noarch.rpm 505 kB/s | 50 kB 00:00
2026-03-10T08:48:59.430 INFO:teuthology.orchestra.run.vm05.stdout:(32/119): ceph-prometheus-alerts-18.2.1-0.el9.n 148 kB/s | 15 kB 00:00
2026-03-10T08:48:59.628 INFO:teuthology.orchestra.run.vm05.stdout:(33/119): cephadm-18.2.1-0.el9.noarch.rpm 1.1 MB/s | 221 kB 00:00
2026-03-10T08:49:00.180 INFO:teuthology.orchestra.run.vm05.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 73 kB/s | 40 kB 00:00
2026-03-10T08:49:00.271 INFO:teuthology.orchestra.run.vm05.stdout:(35/119): ceph-mgr-diskprediction-local-18.2.1- 3.1 MB/s | 7.4 MB 00:02
2026-03-10T08:49:00.330 INFO:teuthology.orchestra.run.vm05.stdout:(36/119): libconfig-1.7.2-9.el9.x86_64.rpm 480 kB/s | 72 kB 00:00
2026-03-10T08:49:00.480 INFO:teuthology.orchestra.run.vm05.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 1.2 MB/s | 184 kB 00:00
2026-03-10T08:49:00.561 INFO:teuthology.orchestra.run.vm05.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 409 kB/s | 33 kB 00:00
2026-03-10T08:49:00.738 INFO:teuthology.orchestra.run.vm05.stdout:(39/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 1.4 MB/s | 253 kB 00:00
2026-03-10T08:49:00.867 INFO:teuthology.orchestra.run.vm05.stdout:(40/119): libgfortran-11.5.0-14.el9.x86_64.rpm 1.3 MB/s | 794 kB 00:00
2026-03-10T08:49:00.907 INFO:teuthology.orchestra.run.vm05.stdout:(41/119): python3-cryptography-36.0.1-5.el9.x86 7.4 MB/s | 1.2 MB 00:00
2026-03-10T08:49:00.959 INFO:teuthology.orchestra.run.vm05.stdout:(42/119): python3-ply-3.11-14.el9.noarch.rpm 1.1 MB/s | 106 kB 00:00
2026-03-10T08:49:00.989 INFO:teuthology.orchestra.run.vm05.stdout:(43/119): python3-pycparser-2.20-6.el9.noarch.r 1.6 MB/s | 135 kB 00:00
2026-03-10T08:49:01.055 INFO:teuthology.orchestra.run.vm05.stdout:(44/119): python3-requests-2.25.1-10.el9.noarch 1.3 MB/s | 126 kB 00:00
2026-03-10T08:49:01.073 INFO:teuthology.orchestra.run.vm05.stdout:(45/119): python3-urllib3-1.26.5-7.el9.noarch.r 2.5 MB/s | 218 kB 00:00
2026-03-10T08:49:01.181 INFO:teuthology.orchestra.run.vm05.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 275 kB/s | 30 kB 00:00
2026-03-10T08:49:01.206 INFO:teuthology.orchestra.run.vm05.stdout:(47/119): boost-program-options-1.75.0-13.el9.x 690 kB/s | 104 kB 00:00
2026-03-10T08:49:01.279 INFO:teuthology.orchestra.run.vm05.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 204 kB/s | 15 kB 00:00
2026-03-10T08:49:01.338 INFO:teuthology.orchestra.run.vm05.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 2.7 MB/s | 160 kB 00:00
2026-03-10T08:49:01.350 INFO:teuthology.orchestra.run.vm05.stdout:(50/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 18 MB/s | 3.0 MB 00:00
2026-03-10T08:49:01.392 INFO:teuthology.orchestra.run.vm05.stdout:(51/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 840 kB/s | 45 kB 00:00
2026-03-10T08:49:01.405 INFO:teuthology.orchestra.run.vm05.stdout:(52/119): librdkafka-1.6.1-102.el9.x86_64.rpm 12 MB/s | 662 kB 00:00
2026-03-10T08:49:01.443 INFO:teuthology.orchestra.run.vm05.stdout:(53/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 4.8 MB/s | 246 kB 00:00
2026-03-10T08:49:01.455 INFO:teuthology.orchestra.run.vm05.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 4.6 MB/s | 233 kB 00:00
2026-03-10T08:49:01.495 INFO:teuthology.orchestra.run.vm05.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 5.6 MB/s | 292 kB 00:00
2026-03-10T08:49:01.503 INFO:teuthology.orchestra.run.vm05.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 873 kB/s | 42 kB 00:00
2026-03-10T08:49:01.758 INFO:teuthology.orchestra.run.vm05.stdout:(57/119): openblas-openmp-0.3.29-1.el9.x86_64.r 20 MB/s | 5.3 MB 00:00
2026-03-10T08:49:01.839 INFO:teuthology.orchestra.run.vm05.stdout:(58/119): python3-babel-2.9.1-2.el9.noarch.rpm 18 MB/s | 6.0 MB 00:00
2026-03-10T08:49:01.840 INFO:teuthology.orchestra.run.vm05.stdout:(59/119): python3-devel-3.9.25-3.el9.x86_64.rpm 2.9 MB/s | 244 kB 00:00
2026-03-10T08:49:01.893 INFO:teuthology.orchestra.run.vm05.stdout:(60/119): python3-jinja2-2.11.3-8.el9.noarch.rp 4.7 MB/s | 249 kB 00:00
2026-03-10T08:49:01.894 INFO:teuthology.orchestra.run.vm05.stdout:(61/119): python3-jmespath-1.0.1-1.el9.noarch.r 904 kB/s | 48 kB 00:00
2026-03-10T08:49:01.948 INFO:teuthology.orchestra.run.vm05.stdout:(62/119): python3-mako-1.1.4-6.el9.noarch.rpm 3.1 MB/s | 172 kB 00:00
2026-03-10T08:49:01.952 INFO:teuthology.orchestra.run.vm05.stdout:(63/119): python3-libstoragemgmt-1.10.1-1.el9.x 2.9 MB/s | 177 kB 00:00
2026-03-10T08:49:02.051 INFO:teuthology.orchestra.run.vm08.stdout:(10/119): ceph-radosgw-18.2.1-0.el9.x86_64.rpm 1.7 MB/s | 7.7 MB 00:04
2026-03-10T08:49:02.052 INFO:teuthology.orchestra.run.vm05.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 334 kB/s | 35 kB 00:00
2026-03-10T08:49:02.152 INFO:teuthology.orchestra.run.vm08.stdout:(11/119): libcephfs-devel-18.2.1-0.el9.x86_64.r 308 kB/s | 31 kB 00:00
2026-03-10T08:49:02.159 INFO:teuthology.orchestra.run.vm05.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 4.1 MB/s | 442 kB 00:00
2026-03-10T08:49:02.238 INFO:teuthology.orchestra.run.vm05.stdout:(66/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 21 MB/s | 6.1 MB 00:00
2026-03-10T08:49:02.239 INFO:teuthology.orchestra.run.vm05.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.9 MB/s | 157 kB 00:00
2026-03-10T08:49:02.295 INFO:teuthology.orchestra.run.vm05.stdout:(68/119): python3-requests-oauthlib-1.3.0-12.el 963 kB/s | 54 kB 00:00
2026-03-10T08:49:02.303 INFO:teuthology.orchestra.run.vm05.stdout:(69/119): python3-pyasn1-modules-0.4.8-7.el9.no 4.2 MB/s | 277 kB 00:00
2026-03-10T08:49:02.431 INFO:teuthology.orchestra.run.vm05.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 326 kB/s | 42 kB 00:00
2026-03-10T08:49:02.551 INFO:teuthology.orchestra.run.vm05.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 2.5 MB/s | 303 kB 00:00
2026-03-10T08:49:02.668 INFO:teuthology.orchestra.run.vm05.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 549 kB/s | 64 kB 00:00
2026-03-10T08:49:03.024 INFO:teuthology.orchestra.run.vm05.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 312 kB/s | 111 kB 00:00
2026-03-10T08:49:03.071 INFO:teuthology.orchestra.run.vm08.stdout:(12/119): libcephfs2-18.2.1-0.el9.x86_64.rpm 716 kB/s | 658 kB 00:00
2026-03-10T08:49:03.139 INFO:teuthology.orchestra.run.vm05.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 2.6 MB/s | 308 kB 00:00
2026-03-10T08:49:03.226 INFO:teuthology.orchestra.run.vm05.stdout:(75/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 21 MB/s | 19 MB 00:00
2026-03-10T08:49:03.370 INFO:teuthology.orchestra.run.vm08.stdout:(13/119): libcephsqlite-18.2.1-0.el9.x86_64.rpm 551 kB/s | 165 kB 00:00
2026-03-10T08:49:03.433 INFO:teuthology.orchestra.run.vm05.stdout:(76/119): libarrow-9.0.0-15.el9.x86_64.rpm 15 MB/s | 4.4 MB 00:00
2026-03-10T08:49:03.443 INFO:teuthology.orchestra.run.vm05.stdout:(77/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 114 kB/s | 25 kB 00:00
2026-03-10T08:49:03.490 INFO:teuthology.orchestra.run.vm05.stdout:(78/119): liboath-2.6.12-1.el9.x86_64.rpm 858 kB/s | 49 kB 00:00
2026-03-10T08:49:03.557 INFO:teuthology.orchestra.run.vm05.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 12 MB/s | 838 kB 00:00
2026-03-10T08:49:03.557 INFO:teuthology.orchestra.run.vm05.stdout:(80/119): libunwind-1.6.2-1.el9.x86_64.rpm 591 kB/s | 67 kB 00:00
2026-03-10T08:49:03.569 INFO:teuthology.orchestra.run.vm08.stdout:(14/119): librados-devel-18.2.1-0.el9.x86_64.rp 638 kB/s | 127 kB 00:00
2026-03-10T08:49:03.629 INFO:teuthology.orchestra.run.vm05.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 411 kB/s | 29 kB 00:00
2026-03-10T08:49:03.636 INFO:teuthology.orchestra.run.vm05.stdout:(82/119): python3-asyncssh-2.13.2-5.el9.noarch. 6.7 MB/s | 548 kB 00:00
2026-03-10T08:49:03.688 INFO:teuthology.orchestra.run.vm05.stdout:(83/119): python3-backports-tarfile-1.2.0-1.el9 1.0 MB/s | 60 kB 00:00
2026-03-10T08:49:03.696 INFO:teuthology.orchestra.run.vm05.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 735 kB/s | 43 kB 00:00
2026-03-10T08:49:03.743 INFO:teuthology.orchestra.run.vm05.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 594 kB/s | 32 kB 00:00
2026-03-10T08:49:03.767 INFO:teuthology.orchestra.run.vm05.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 200 kB/s | 14 kB 00:00
2026-03-10T08:49:03.858 INFO:teuthology.orchestra.run.vm05.stdout:(87/119): python3-cherrypy-18.6.1-2.el9.noarch. 3.9 MB/s | 358 kB 00:00
2026-03-10T08:49:03.887 INFO:teuthology.orchestra.run.vm05.stdout:(88/119): python3-cheroot-10.0.1-4.el9.noarch.r 1.2 MB/s | 173 kB 00:00
2026-03-10T08:49:03.918 INFO:teuthology.orchestra.run.vm05.stdout:(89/119): python3-google-auth-2.45.0-1.el9.noar 4.2 MB/s | 254 kB 00:00
2026-03-10T08:49:03.941 INFO:teuthology.orchestra.run.vm05.stdout:(90/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 198 kB/s | 11 kB 00:00
2026-03-10T08:49:03.975 INFO:teuthology.orchestra.run.vm05.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 314 kB/s | 18 kB 00:00
2026-03-10T08:49:03.997 INFO:teuthology.orchestra.run.vm05.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 417 kB/s | 23 kB 00:00
2026-03-10T08:49:04.032 INFO:teuthology.orchestra.run.vm05.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 348 kB/s | 20 kB 00:00
2026-03-10T08:49:04.051 INFO:teuthology.orchestra.run.vm05.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 361 kB/s | 19 kB 00:00
2026-03-10T08:49:04.095 INFO:teuthology.orchestra.run.vm05.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 417 kB/s | 26 kB 00:00
2026-03-10T08:49:04.105 INFO:teuthology.orchestra.run.vm05.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 169 kB/s | 9.0 kB 00:00
2026-03-10T08:49:04.152 INFO:teuthology.orchestra.run.vm05.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 720 kB/s | 41 kB 00:00
2026-03-10T08:49:04.209 INFO:teuthology.orchestra.run.vm05.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 816 kB/s | 46 kB 00:00
2026-03-10T08:49:04.259 INFO:teuthology.orchestra.run.vm08.stdout:(15/119): libradosstriper1-18.2.1-0.el9.x86_64. 687 kB/s | 474 kB 00:00
2026-03-10T08:49:04.271 INFO:teuthology.orchestra.run.vm05.stdout:(99/119): python3-kubernetes-26.1.0-3.el9.noarc 6.1 MB/s | 1.0 MB 00:00
2026-03-10T08:49:04.272 INFO:teuthology.orchestra.run.vm05.stdout:(100/119): python3-more-itertools-8.12.0-2.el9. 1.2 MB/s | 79 kB 00:00
2026-03-10T08:49:04.372 INFO:teuthology.orchestra.run.vm05.stdout:(101/119): python3-natsort-7.1.1-5.el9.noarch.r 576 kB/s | 58 kB 00:00
2026-03-10T08:49:04.390 INFO:teuthology.orchestra.run.vm05.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 2.3 MB/s | 272 kB 00:00
2026-03-10T08:49:04.425 INFO:teuthology.orchestra.run.vm05.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 306 kB/s | 16 kB 00:00
2026-03-10T08:49:04.452 INFO:teuthology.orchestra.run.vm05.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.4 MB/s | 90 kB 00:00
2026-03-10T08:49:04.479 INFO:teuthology.orchestra.run.vm05.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 574 kB/s | 31 kB 00:00
2026-03-10T08:49:04.510 INFO:teuthology.orchestra.run.vm05.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 3.1 MB/s | 188 kB 00:00
2026-03-10T08:49:04.533 INFO:teuthology.orchestra.run.vm05.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 1.1 MB/s | 59 kB 00:00
2026-03-10T08:49:04.567 INFO:teuthology.orchestra.run.vm05.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 634 kB/s | 36 kB 00:00
2026-03-10T08:49:04.588 INFO:teuthology.orchestra.run.vm05.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 1.6 MB/s | 86 kB 00:00
2026-03-10T08:49:04.626 INFO:teuthology.orchestra.run.vm05.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 3.8 MB/s | 230 kB 00:00
2026-03-10T08:49:04.642 INFO:teuthology.orchestra.run.vm05.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 1.6 MB/s | 90 kB 00:00
2026-03-10T08:49:04.688 INFO:teuthology.orchestra.run.vm05.stdout:(112/119): python3-werkzeug-2.0.3-3.el9.1.noarc 6.8 MB/s | 427 kB 00:00
2026-03-10T08:49:04.696 INFO:teuthology.orchestra.run.vm05.stdout:(113/119): python3-xmltodict-0.12.0-15.el9.noar 415 kB/s | 22 kB 00:00
2026-03-10T08:49:04.744 INFO:teuthology.orchestra.run.vm05.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 355 kB/s | 20 kB 00:00
2026-03-10T08:49:04.752 INFO:teuthology.orchestra.run.vm05.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 3.3 MB/s | 191 kB 00:00
2026-03-10T08:49:05.505 INFO:teuthology.orchestra.run.vm05.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 2.1 MB/s | 1.6 MB 00:00
2026-03-10T08:49:06.136 INFO:teuthology.orchestra.run.vm05.stdout:(117/119): librados2-18.2.1-0.el9.x86_64.rpm 2.4 MB/s | 3.3 MB 00:01
2026-03-10T08:49:07.711 INFO:teuthology.orchestra.run.vm05.stdout:(118/119): librbd1-18.2.1-0.el9.x86_64.rpm 1.4 MB/s | 3.0 MB 00:02
2026-03-10T08:49:09.303 INFO:teuthology.orchestra.run.vm05.stdout:(119/119): ceph-test-18.2.1-0.el9.x86_64.rpm 2.5 MB/s | 40 MB 00:15
2026-03-10T08:49:09.310 INFO:teuthology.orchestra.run.vm05.stdout:--------------------------------------------------------------------------------
2026-03-10T08:49:09.310 INFO:teuthology.orchestra.run.vm05.stdout:Total 9.1 MB/s | 182 MB 00:20
2026-03-10T08:49:09.534 INFO:teuthology.orchestra.run.vm08.stdout:(16/119): librgw2-18.2.1-0.el9.x86_64.rpm 865 kB/s | 4.5 MB 00:05
2026-03-10T08:49:09.673 INFO:teuthology.orchestra.run.vm08.stdout:(17/119): python3-ceph-argparse-18.2.1-0.el9.x8 324 kB/s | 45 kB 00:00
2026-03-10T08:49:09.789 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:49:09.842 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:49:09.842 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:49:09.917 INFO:teuthology.orchestra.run.vm08.stdout:(18/119): python3-ceph-common-18.2.1-0.el9.x86_ 512 kB/s | 124 kB 00:00
2026-03-10T08:49:10.119 INFO:teuthology.orchestra.run.vm08.stdout:(19/119): python3-cephfs-18.2.1-0.el9.x86_64.rp 799 kB/s | 161 kB 00:00
2026-03-10T08:49:10.514 INFO:teuthology.orchestra.run.vm08.stdout:(20/119): python3-rados-18.2.1-0.el9.x86_64.rpm 812 kB/s | 321 kB 00:00
2026-03-10T08:49:10.594 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:49:10.595 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:49:10.909 INFO:teuthology.orchestra.run.vm08.stdout:(21/119): python3-rbd-18.2.1-0.el9.x86_64.rpm 754 kB/s | 297 kB 00:00
2026-03-10T08:49:11.163 INFO:teuthology.orchestra.run.vm08.stdout:(22/119): python3-rgw-18.2.1-0.el9.x86_64.rpm 390 kB/s | 99 kB 00:00
2026-03-10T08:49:11.264 INFO:teuthology.orchestra.run.vm08.stdout:(23/119): rbd-fuse-18.2.1-0.el9.x86_64.rpm 858 kB/s | 86 kB 00:00
2026-03-10T08:49:11.448 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:49:11.458 INFO:teuthology.orchestra.run.vm05.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-10T08:49:11.471 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-10T08:49:11.635 INFO:teuthology.orchestra.run.vm05.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-10T08:49:11.637 INFO:teuthology.orchestra.run.vm05.stdout: Upgrading : librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T08:49:11.683 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T08:49:11.686 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-10T08:49:11.716 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-10T08:49:11.726 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rados-2:18.2.1-0.el9.x86_64 6/121
2026-03-10T08:49:11.731 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-10T08:49:11.735 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-10T08:49:11.745 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-10T08:49:11.747 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T08:49:11.854 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T08:49:11.856 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T08:49:11.907 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T08:49:11.914 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-10T08:49:11.944 INFO:teuthology.orchestra.run.vm05.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-10T08:49:11.954 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-10T08:49:11.959 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-10T08:49:11.990 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-10T08:49:12.012 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-10T08:49:12.018 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-10T08:49:12.026 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-10T08:49:12.029 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-10T08:49:12.034 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-10T08:49:12.045 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 22/121
2026-03-10T08:49:12.060 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cephfs-2:18.2.1-0.el9.x86_64 23/121
2026-03-10T08:49:12.094 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-10T08:49:12.158 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-10T08:49:12.176 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-10T08:49:12.184 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-10T08:49:12.195 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-10T08:49:12.200 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librados-devel-2:18.2.1-0.el9.x86_64 29/121
2026-03-10T08:49:12.233 INFO:teuthology.orchestra.run.vm08.stdout:(24/119): ceph-osd-18.2.1-0.el9.x86_64.rpm 1.1 MB/s | 18 MB 00:16
2026-03-10T08:49:12.238 INFO:teuthology.orchestra.run.vm05.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-10T08:49:12.245 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-10T08:49:12.263 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-10T08:49:12.289 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-10T08:49:12.297 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-10T08:49:12.304 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-10T08:49:12.318 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-10T08:49:12.331 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-10T08:49:12.334 INFO:teuthology.orchestra.run.vm08.stdout:(25/119): rbd-nbd-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 171 kB 00:00
2026-03-10T08:49:12.343 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-10T08:49:12.405 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-10T08:49:12.414 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-10T08:49:12.425 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-10T08:49:12.434 INFO:teuthology.orchestra.run.vm08.stdout:(26/119): ceph-grafana-dashboards-18.2.1-0.el9. 232 kB/s | 23 kB 00:00
2026-03-10T08:49:12.473 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-10T08:49:12.535 INFO:teuthology.orchestra.run.vm08.stdout:(27/119): ceph-mgr-cephadm-18.2.1-0.el9.noarch. 1.3 MB/s | 132 kB 00:00
2026-03-10T08:49:12.847 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-10T08:49:12.863 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-10T08:49:12.869 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-10T08:49:12.877 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-10T08:49:12.881 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-10T08:49:12.889 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-10T08:49:12.893 INFO:teuthology.orchestra.run.vm05.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-10T08:49:12.896 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-10T08:49:12.906 INFO:teuthology.orchestra.run.vm05.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-10T08:49:12.915 INFO:teuthology.orchestra.run.vm05.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-10T08:49:12.920 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-10T08:49:12.928 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-10T08:49:12.934 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-10T08:49:12.943 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-10T08:49:12.949 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-10T08:49:12.989 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-10T08:49:13.236 INFO:teuthology.orchestra.run.vm08.stdout:(28/119): ceph-mgr-dashboard-18.2.1-0.el9.noarc 2.5 MB/s | 1.8 MB 00:00
2026-03-10T08:49:13.262 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-10T08:49:13.292 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-10T08:49:13.299 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T08:49:13.363 INFO:teuthology.orchestra.run.vm05.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-10T08:49:13.366 INFO:teuthology.orchestra.run.vm05.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-10T08:49:13.390 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-10T08:49:13.777 INFO:teuthology.orchestra.run.vm05.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-10T08:49:13.864 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T08:49:14.227 INFO:teuthology.orchestra.run.vm08.stdout:(29/119):
rbd-mirror-18.2.1-0.el9.x86_64.rpm 1.0 MB/s | 3.0 MB 00:02 2026-03-10T08:49:14.528 INFO:teuthology.orchestra.run.vm08.stdout:(30/119): ceph-mgr-modules-core-18.2.1-0.el9.no 805 kB/s | 242 kB 00:00 2026-03-10T08:49:14.628 INFO:teuthology.orchestra.run.vm08.stdout:(31/119): ceph-mgr-rook-18.2.1-0.el9.noarch.rpm 501 kB/s | 50 kB 00:00 2026-03-10T08:49:14.639 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T08:49:14.667 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121 2026-03-10T08:49:14.673 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121 2026-03-10T08:49:14.677 INFO:teuthology.orchestra.run.vm05.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121 2026-03-10T08:49:14.728 INFO:teuthology.orchestra.run.vm08.stdout:(32/119): ceph-prometheus-alerts-18.2.1-0.el9.n 147 kB/s | 15 kB 00:00 2026-03-10T08:49:14.828 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121 2026-03-10T08:49:14.830 INFO:teuthology.orchestra.run.vm05.stdout: Upgrading : librbd1-2:18.2.1-0.el9.x86_64 72/121 2026-03-10T08:49:14.863 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 72/121 2026-03-10T08:49:14.866 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rbd-2:18.2.1-0.el9.x86_64 73/121 2026-03-10T08:49:14.874 INFO:teuthology.orchestra.run.vm05.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121 2026-03-10T08:49:14.927 INFO:teuthology.orchestra.run.vm08.stdout:(33/119): cephadm-18.2.1-0.el9.noarch.rpm 1.1 MB/s | 221 kB 00:00 2026-03-10T08:49:15.088 INFO:teuthology.orchestra.run.vm05.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121 2026-03-10T08:49:15.090 INFO:teuthology.orchestra.run.vm05.stdout: Installing : librgw2-2:18.2.1-0.el9.x86_64 76/121 2026-03-10T08:49:15.109 INFO:teuthology.orchestra.run.vm05.stdout: 
Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 76/121 2026-03-10T08:49:15.118 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-rgw-2:18.2.1-0.el9.x86_64 77/121 2026-03-10T08:49:15.134 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121 2026-03-10T08:49:15.154 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121 2026-03-10T08:49:15.238 INFO:teuthology.orchestra.run.vm08.stdout:(34/119): ceph-mgr-diskprediction-local-18.2.1- 3.7 MB/s | 7.4 MB 00:02 2026-03-10T08:49:15.243 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121 2026-03-10T08:49:15.256 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121 2026-03-10T08:49:15.267 INFO:teuthology.orchestra.run.vm08.stdout:(35/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 119 kB/s | 40 kB 00:00 2026-03-10T08:49:15.284 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121 2026-03-10T08:49:15.324 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121 2026-03-10T08:49:15.384 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121 2026-03-10T08:49:15.397 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121 2026-03-10T08:49:15.399 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121 2026-03-10T08:49:15.405 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121 2026-03-10T08:49:15.410 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121 2026-03-10T08:49:15.414 INFO:teuthology.orchestra.run.vm05.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121 2026-03-10T08:49:15.417 INFO:teuthology.orchestra.run.vm05.stdout: Installing : 
libconfig-1.7.2-9.el9.x86_64 90/121 2026-03-10T08:49:15.437 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T08:49:15.437 INFO:teuthology.orchestra.run.vm05.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-10T08:49:15.437 INFO:teuthology.orchestra.run.vm05.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 2026-03-10T08:49:15.437 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:15.448 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T08:49:15.475 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T08:49:15.475 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-10T08:49:15.475 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:15.492 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121 2026-03-10T08:49:15.542 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 93/121 2026-03-10T08:49:15.545 INFO:teuthology.orchestra.run.vm05.stdout: Installing : cephadm-2:18.2.1-0.el9.noarch 93/121 2026-03-10T08:49:15.550 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 94/121 2026-03-10T08:49:15.570 INFO:teuthology.orchestra.run.vm08.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 2.6 MB/s | 794 kB 00:00 2026-03-10T08:49:15.576 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 95/121 2026-03-10T08:49:15.580 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-ceph-common-2:18.2.1-0.el9.x86_64 96/121 2026-03-10T08:49:15.683 INFO:teuthology.orchestra.run.vm08.stdout:(37/119): 
libquadmath-11.5.0-14.el9.x86_64.rpm 1.6 MB/s | 184 kB 00:00 2026-03-10T08:49:15.684 INFO:teuthology.orchestra.run.vm08.stdout:(38/119): libconfig-1.7.2-9.el9.x86_64.rpm 161 kB/s | 72 kB 00:00 2026-03-10T08:49:15.766 INFO:teuthology.orchestra.run.vm08.stdout:(39/119): mailcap-2.1.49-5.el9.noarch.rpm 402 kB/s | 33 kB 00:00 2026-03-10T08:49:15.933 INFO:teuthology.orchestra.run.vm08.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 7.5 MB/s | 1.2 MB 00:00 2026-03-10T08:49:16.017 INFO:teuthology.orchestra.run.vm08.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 1.2 MB/s | 106 kB 00:00 2026-03-10T08:49:16.102 INFO:teuthology.orchestra.run.vm08.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 1.6 MB/s | 135 kB 00:00 2026-03-10T08:49:16.186 INFO:teuthology.orchestra.run.vm08.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 1.5 MB/s | 126 kB 00:00 2026-03-10T08:49:16.273 INFO:teuthology.orchestra.run.vm08.stdout:(44/119): python3-urllib3-1.26.5-7.el9.noarch.r 2.5 MB/s | 218 kB 00:00 2026-03-10T08:49:16.391 INFO:teuthology.orchestra.run.vm08.stdout:(45/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 359 kB/s | 253 kB 00:00 2026-03-10T08:49:16.497 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121 2026-03-10T08:49:16.503 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-common-2:18.2.1-0.el9.x86_64 97/121 2026-03-10T08:49:16.594 INFO:teuthology.orchestra.run.vm08.stdout:(46/119): boost-program-options-1.75.0-13.el9.x 324 kB/s | 104 kB 00:00 2026-03-10T08:49:16.633 INFO:teuthology.orchestra.run.vm08.stdout:(47/119): flexiblas-3.0.4-9.el9.x86_64.rpm 123 kB/s | 30 kB 00:00 2026-03-10T08:49:16.778 INFO:teuthology.orchestra.run.vm08.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 103 kB/s | 15 kB 00:00 2026-03-10T08:49:16.811 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121 2026-03-10T08:49:16.817 INFO:teuthology.orchestra.run.vm05.stdout: 
Installing : ceph-base-2:18.2.1-0.el9.x86_64 98/121 2026-03-10T08:49:16.861 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 98/121 2026-03-10T08:49:16.862 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-10T08:49:16.862 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 2026-03-10T08:49:16.862 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:16.866 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-selinux-2:18.2.1-0.el9.x86_64 99/121 2026-03-10T08:49:17.054 INFO:teuthology.orchestra.run.vm08.stdout:(49/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 6.5 MB/s | 3.0 MB 00:00 2026-03-10T08:49:17.088 INFO:teuthology.orchestra.run.vm08.stdout:(50/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 516 kB/s | 160 kB 00:00 2026-03-10T08:49:17.099 INFO:teuthology.orchestra.run.vm08.stdout:(51/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.0 MB/s | 45 kB 00:00 2026-03-10T08:49:17.189 INFO:teuthology.orchestra.run.vm08.stdout:(52/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 2.7 MB/s | 246 kB 00:00 2026-03-10T08:49:17.233 INFO:teuthology.orchestra.run.vm08.stdout:(53/119): librdkafka-1.6.1-102.el9.x86_64.rpm 4.5 MB/s | 662 kB 00:00 2026-03-10T08:49:17.239 INFO:teuthology.orchestra.run.vm08.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 4.6 MB/s | 233 kB 00:00 2026-03-10T08:49:17.283 INFO:teuthology.orchestra.run.vm08.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 5.7 MB/s | 292 kB 00:00 2026-03-10T08:49:17.286 INFO:teuthology.orchestra.run.vm08.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 898 kB/s | 42 kB 00:00 2026-03-10T08:49:17.858 INFO:teuthology.orchestra.run.vm08.stdout:(57/119): ceph-test-18.2.1-0.el9.x86_64.rpm 2.1 MB/s | 40 MB 00:18 2026-03-10T08:49:18.213 
INFO:teuthology.orchestra.run.vm08.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 688 kB/s | 244 kB 00:00 2026-03-10T08:49:18.520 INFO:teuthology.orchestra.run.vm08.stdout:(59/119): python3-jinja2-2.11.3-8.el9.noarch.rp 811 kB/s | 249 kB 00:00 2026-03-10T08:49:18.580 INFO:teuthology.orchestra.run.vm08.stdout:(60/119): openblas-openmp-0.3.29-1.el9.x86_64.r 4.1 MB/s | 5.3 MB 00:01 2026-03-10T08:49:18.708 INFO:teuthology.orchestra.run.vm08.stdout:(61/119): python3-babel-2.9.1-2.el9.noarch.rpm 4.2 MB/s | 6.0 MB 00:01 2026-03-10T08:49:18.709 INFO:teuthology.orchestra.run.vm08.stdout:(62/119): python3-jmespath-1.0.1-1.el9.noarch.r 251 kB/s | 48 kB 00:00 2026-03-10T08:49:18.710 INFO:teuthology.orchestra.run.vm08.stdout:(63/119): python3-libstoragemgmt-1.10.1-1.el9.x 1.3 MB/s | 177 kB 00:00 2026-03-10T08:49:18.756 INFO:teuthology.orchestra.run.vm08.stdout:(64/119): python3-mako-1.1.4-6.el9.noarch.rpm 3.6 MB/s | 172 kB 00:00 2026-03-10T08:49:18.756 INFO:teuthology.orchestra.run.vm08.stdout:(65/119): python3-markupsafe-1.1.1-12.el9.x86_6 751 kB/s | 35 kB 00:00 2026-03-10T08:49:18.859 INFO:teuthology.orchestra.run.vm08.stdout:(66/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 4.2 MB/s | 442 kB 00:00 2026-03-10T08:49:18.860 INFO:teuthology.orchestra.run.vm08.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.5 MB/s | 157 kB 00:00 2026-03-10T08:49:19.169 INFO:teuthology.orchestra.run.vm08.stdout:(68/119): python3-pyasn1-modules-0.4.8-7.el9.no 896 kB/s | 277 kB 00:00 2026-03-10T08:49:19.169 INFO:teuthology.orchestra.run.vm08.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 173 kB/s | 54 kB 00:00 2026-03-10T08:49:19.345 INFO:teuthology.orchestra.run.vm08.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 237 kB/s | 42 kB 00:00 2026-03-10T08:49:19.489 INFO:teuthology.orchestra.run.vm08.stdout:(71/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 7.9 MB/s | 6.1 MB 00:00 2026-03-10T08:49:19.634 INFO:teuthology.orchestra.run.vm08.stdout:(72/119): 
socat-1.7.4.1-8.el9.x86_64.rpm 1.0 MB/s | 303 kB 00:00 2026-03-10T08:49:19.765 INFO:teuthology.orchestra.run.vm08.stdout:(73/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 231 kB/s | 64 kB 00:00 2026-03-10T08:49:20.036 INFO:teuthology.orchestra.run.vm08.stdout:(74/119): fmt-8.1.1-5.el9.x86_64.rpm 276 kB/s | 111 kB 00:00 2026-03-10T08:49:20.108 INFO:teuthology.orchestra.run.vm08.stdout:(75/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 897 kB/s | 308 kB 00:00 2026-03-10T08:49:20.166 INFO:teuthology.orchestra.run.vm08.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 427 kB/s | 25 kB 00:00 2026-03-10T08:49:20.227 INFO:teuthology.orchestra.run.vm08.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 817 kB/s | 49 kB 00:00 2026-03-10T08:49:20.292 INFO:teuthology.orchestra.run.vm08.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 1.0 MB/s | 67 kB 00:00 2026-03-10T08:49:20.408 INFO:teuthology.orchestra.run.vm08.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 7.1 MB/s | 838 kB 00:00 2026-03-10T08:49:20.483 INFO:teuthology.orchestra.run.vm08.stdout:(80/119): libarrow-9.0.0-15.el9.x86_64.rpm 9.9 MB/s | 4.4 MB 00:00 2026-03-10T08:49:20.507 INFO:teuthology.orchestra.run.vm08.stdout:(81/119): python3-asyncssh-2.13.2-5.el9.noarch. 
5.4 MB/s | 548 kB 00:00 2026-03-10T08:49:20.544 INFO:teuthology.orchestra.run.vm08.stdout:(82/119): python3-autocommand-2.2.2-8.el9.noarc 481 kB/s | 29 kB 00:00 2026-03-10T08:49:20.579 INFO:teuthology.orchestra.run.vm08.stdout:(83/119): python3-backports-tarfile-1.2.0-1.el9 840 kB/s | 60 kB 00:00 2026-03-10T08:49:20.606 INFO:teuthology.orchestra.run.vm08.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 706 kB/s | 43 kB 00:00 2026-03-10T08:49:20.642 INFO:teuthology.orchestra.run.vm08.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 512 kB/s | 32 kB 00:00 2026-03-10T08:49:20.669 INFO:teuthology.orchestra.run.vm08.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 225 kB/s | 14 kB 00:00 2026-03-10T08:49:20.700 INFO:teuthology.orchestra.run.vm08.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 2.9 MB/s | 173 kB 00:00 2026-03-10T08:49:20.726 INFO:teuthology.orchestra.run.vm08.stdout:(88/119): python3-cherrypy-18.6.1-2.el9.noarch. 6.1 MB/s | 358 kB 00:00 2026-03-10T08:49:20.760 INFO:teuthology.orchestra.run.vm08.stdout:(89/119): python3-google-auth-2.45.0-1.el9.noar 4.2 MB/s | 254 kB 00:00 2026-03-10T08:49:20.795 INFO:teuthology.orchestra.run.vm08.stdout:(90/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 156 kB/s | 11 kB 00:00 2026-03-10T08:49:20.822 INFO:teuthology.orchestra.run.vm08.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 288 kB/s | 18 kB 00:00 2026-03-10T08:49:20.862 INFO:teuthology.orchestra.run.vm08.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 346 kB/s | 23 kB 00:00 2026-03-10T08:49:20.875 INFO:teuthology.orchestra.run.vm08.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 370 kB/s | 20 kB 00:00 2026-03-10T08:49:20.915 INFO:teuthology.orchestra.run.vm08.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 
368 kB/s | 19 kB 00:00 2026-03-10T08:49:20.928 INFO:teuthology.orchestra.run.vm08.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 500 kB/s | 26 kB 00:00 2026-03-10T08:49:20.978 INFO:teuthology.orchestra.run.vm08.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 144 kB/s | 9.0 kB 00:00 2026-03-10T08:49:20.981 INFO:teuthology.orchestra.run.vm08.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 769 kB/s | 41 kB 00:00 2026-03-10T08:49:21.051 INFO:teuthology.orchestra.run.vm08.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 667 kB/s | 46 kB 00:00 2026-03-10T08:49:21.093 INFO:teuthology.orchestra.run.vm08.stdout:(99/119): python3-kubernetes-26.1.0-3.el9.noarc 8.9 MB/s | 1.0 MB 00:00 2026-03-10T08:49:21.139 INFO:teuthology.orchestra.run.vm08.stdout:(100/119): python3-more-itertools-8.12.0-2.el9. 896 kB/s | 79 kB 00:00 2026-03-10T08:49:21.155 INFO:teuthology.orchestra.run.vm08.stdout:(101/119): python3-natsort-7.1.1-5.el9.noarch.r 932 kB/s | 58 kB 00:00 2026-03-10T08:49:21.208 INFO:teuthology.orchestra.run.vm08.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 3.9 MB/s | 272 kB 00:00 2026-03-10T08:49:21.211 INFO:teuthology.orchestra.run.vm08.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 293 kB/s | 16 kB 00:00 2026-03-10T08:49:21.278 INFO:teuthology.orchestra.run.vm08.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.3 MB/s | 90 kB 00:00 2026-03-10T08:49:21.340 INFO:teuthology.orchestra.run.vm08.stdout:(105/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 8.9 MB/s | 19 MB 00:02 2026-03-10T08:49:21.341 INFO:teuthology.orchestra.run.vm08.stdout:(106/119): python3-repoze-lru-0.7-16.el9.noarch 238 kB/s | 31 kB 00:00 2026-03-10T08:49:21.342 INFO:teuthology.orchestra.run.vm08.stdout:(107/119): python3-routes-2.5.1-5.el9.noarch.rp 2.9 MB/s | 188 kB 00:00 2026-03-10T08:49:21.402 INFO:teuthology.orchestra.run.vm08.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 583 kB/s | 36 kB 00:00 2026-03-10T08:49:21.411 
INFO:teuthology.orchestra.run.vm08.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 1.2 MB/s | 86 kB 00:00 2026-03-10T08:49:21.468 INFO:teuthology.orchestra.run.vm08.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 3.4 MB/s | 230 kB 00:00 2026-03-10T08:49:21.469 INFO:teuthology.orchestra.run.vm08.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 1.5 MB/s | 90 kB 00:00 2026-03-10T08:49:21.527 INFO:teuthology.orchestra.run.vm08.stdout:(112/119): python3-werkzeug-2.0.3-3.el9.1.noarc 7.1 MB/s | 427 kB 00:00 2026-03-10T08:49:21.528 INFO:teuthology.orchestra.run.vm08.stdout:(113/119): python3-xmltodict-0.12.0-15.el9.noar 377 kB/s | 22 kB 00:00 2026-03-10T08:49:21.545 INFO:teuthology.orchestra.run.vm08.stdout:(114/119): python3-rsa-4.9-2.el9.noarch.rpm 288 kB/s | 59 kB 00:00 2026-03-10T08:49:21.582 INFO:teuthology.orchestra.run.vm08.stdout:(115/119): python3-zc-lockfile-2.0-10.el9.noarc 368 kB/s | 20 kB 00:00 2026-03-10T08:49:21.597 INFO:teuthology.orchestra.run.vm08.stdout:(116/119): re2-20211101-20.el9.x86_64.rpm 2.7 MB/s | 191 kB 00:00 2026-03-10T08:49:21.959 INFO:teuthology.orchestra.run.vm08.stdout:(117/119): thrift-0.15.0-4.el9.x86_64.rpm 3.8 MB/s | 1.6 MB 00:00 2026-03-10T08:49:22.567 INFO:teuthology.orchestra.run.vm08.stdout:(118/119): librbd1-18.2.1-0.el9.x86_64.rpm 3.1 MB/s | 3.0 MB 00:00 2026-03-10T08:49:22.723 INFO:teuthology.orchestra.run.vm08.stdout:(119/119): librados2-18.2.1-0.el9.x86_64.rpm 2.9 MB/s | 3.3 MB 00:01 2026-03-10T08:49:22.727 INFO:teuthology.orchestra.run.vm08.stdout:-------------------------------------------------------------------------------- 2026-03-10T08:49:22.728 INFO:teuthology.orchestra.run.vm08.stdout:Total 5.4 MB/s | 182 MB 00:33 2026-03-10T08:49:23.140 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-10T08:49:23.183 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 
2026-03-10T08:49:23.183 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 99/121 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp 2026-03-10T08:49:23.199 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:23.229 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121 2026-03-10T08:49:23.352 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121 2026-03-10T08:49:23.357 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121 2026-03-10T08:49:23.861 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121 2026-03-10T08:49:23.864 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121 2026-03-10T08:49:23.894 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 
2026-03-10T08:49:23.894 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-10T08:49:23.923 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121 2026-03-10T08:49:23.996 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 103/121 2026-03-10T08:49:24.000 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-2:18.2.1-0.el9.x86_64 104/121 2026-03-10T08:49:24.023 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 104/121 2026-03-10T08:49:24.023 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T08:49:24.023 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-10T08:49:24.023 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T08:49:24.023 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T08:49:24.023 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:24.037 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121 2026-03-10T08:49:24.144 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121 2026-03-10T08:49:24.147 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mds-2:18.2.1-0.el9.x86_64 106/121 2026-03-10T08:49:24.168 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 106/121 2026-03-10T08:49:24.168 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T08:49:24.168 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T08:49:24.168 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T08:49:24.168 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T08:49:24.168 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:24.387 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mon-2:18.2.1-0.el9.x86_64 107/121 2026-03-10T08:49:24.409 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 107/121 2026-03-10T08:49:24.409 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T08:49:24.409 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T08:49:24.409 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T08:49:24.409 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 
2026-03-10T08:49:24.409 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:24.678 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-10T08:49:24.685 INFO:teuthology.orchestra.run.vm08.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121 2026-03-10T08:49:24.697 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121 2026-03-10T08:49:24.857 INFO:teuthology.orchestra.run.vm08.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121 2026-03-10T08:49:24.859 INFO:teuthology.orchestra.run.vm08.stdout: Upgrading : librados2-2:18.2.1-0.el9.x86_64 4/121 2026-03-10T08:49:24.903 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 4/121 2026-03-10T08:49:24.905 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephfs2-2:18.2.1-0.el9.x86_64 5/121 2026-03-10T08:49:24.936 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 5/121 2026-03-10T08:49:24.945 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rados-2:18.2.1-0.el9.x86_64 6/121 2026-03-10T08:49:24.949 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121 2026-03-10T08:49:24.951 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121 2026-03-10T08:49:24.960 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121 2026-03-10T08:49:24.962 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephsqlite-2:18.2.1-0.el9.x86_64 10/121 2026-03-10T08:49:24.996 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 10/121 2026-03-10T08:49:24.998 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libradosstriper1-2:18.2.1-0.el9.x86_64 11/121 2026-03-10T08:49:25.046 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 11/121 
2026-03-10T08:49:25.052 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121 2026-03-10T08:49:25.077 INFO:teuthology.orchestra.run.vm08.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121 2026-03-10T08:49:25.087 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121 2026-03-10T08:49:25.091 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121 2026-03-10T08:49:25.117 INFO:teuthology.orchestra.run.vm08.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121 2026-03-10T08:49:25.142 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121 2026-03-10T08:49:25.147 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121 2026-03-10T08:49:25.155 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121 2026-03-10T08:49:25.157 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121 2026-03-10T08:49:25.162 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121 2026-03-10T08:49:25.172 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 22/121 2026-03-10T08:49:25.186 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cephfs-2:18.2.1-0.el9.x86_64 23/121 2026-03-10T08:49:25.216 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121 2026-03-10T08:49:25.231 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-osd-2:18.2.1-0.el9.x86_64 108/121 2026-03-10T08:49:25.257 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 108/121 2026-03-10T08:49:25.257 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T08:49:25.257 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T08:49:25.257 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T08:49:25.257 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T08:49:25.257 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:25.278 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121 2026-03-10T08:49:25.294 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121 2026-03-10T08:49:25.302 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121 2026-03-10T08:49:25.311 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121 2026-03-10T08:49:25.316 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librados-devel-2:18.2.1-0.el9.x86_64 29/121 2026-03-10T08:49:25.351 INFO:teuthology.orchestra.run.vm08.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121 2026-03-10T08:49:25.357 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121 2026-03-10T08:49:25.374 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121 2026-03-10T08:49:25.400 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121 2026-03-10T08:49:25.409 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121 2026-03-10T08:49:25.415 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121 2026-03-10T08:49:25.429 INFO:teuthology.orchestra.run.vm08.stdout: 
Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121 2026-03-10T08:49:25.441 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121 2026-03-10T08:49:25.454 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121 2026-03-10T08:49:25.518 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121 2026-03-10T08:49:25.527 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121 2026-03-10T08:49:25.537 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121 2026-03-10T08:49:25.587 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121 2026-03-10T08:49:25.637 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-2:18.2.1-0.el9.x86_64 109/121 2026-03-10T08:49:25.641 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121 2026-03-10T08:49:25.664 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121 2026-03-10T08:49:25.664 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T08:49:25.664 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T08:49:25.664 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T08:49:25.664 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 
2026-03-10T08:49:25.664 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:25.676 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121 2026-03-10T08:49:25.699 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121 2026-03-10T08:49:25.699 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T08:49:25.700 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T08:49:25.700 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:25.854 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-mirror-2:18.2.1-0.el9.x86_64 112/121 2026-03-10T08:49:25.876 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 112/121 2026-03-10T08:49:25.876 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T08:49:25.876 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T08:49:25.876 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T08:49:25.876 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-10T08:49:25.876 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:25.977 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121 2026-03-10T08:49:25.994 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121 2026-03-10T08:49:25.999 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121 2026-03-10T08:49:26.007 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121 2026-03-10T08:49:26.012 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121 2026-03-10T08:49:26.023 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121 2026-03-10T08:49:26.029 INFO:teuthology.orchestra.run.vm08.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121 2026-03-10T08:49:26.033 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121 2026-03-10T08:49:26.053 INFO:teuthology.orchestra.run.vm08.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121 2026-03-10T08:49:26.061 INFO:teuthology.orchestra.run.vm08.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121 2026-03-10T08:49:26.066 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121 2026-03-10T08:49:26.074 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121 2026-03-10T08:49:26.079 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121 2026-03-10T08:49:26.088 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121 2026-03-10T08:49:26.099 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121 2026-03-10T08:49:26.140 INFO:teuthology.orchestra.run.vm08.stdout: 
Installing : python3-portend-3.1.0-2.el9.noarch 58/121 2026-03-10T08:49:26.408 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121 2026-03-10T08:49:26.439 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121 2026-03-10T08:49:26.445 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T08:49:26.508 INFO:teuthology.orchestra.run.vm08.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121 2026-03-10T08:49:26.511 INFO:teuthology.orchestra.run.vm08.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121 2026-03-10T08:49:26.535 INFO:teuthology.orchestra.run.vm08.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121 2026-03-10T08:49:26.927 INFO:teuthology.orchestra.run.vm08.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121 2026-03-10T08:49:27.015 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T08:49:27.826 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T08:49:27.855 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121 2026-03-10T08:49:27.862 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121 2026-03-10T08:49:27.867 INFO:teuthology.orchestra.run.vm08.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121 2026-03-10T08:49:27.921 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-test-2:18.2.1-0.el9.x86_64 113/121 2026-03-10T08:49:27.933 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-fuse-2:18.2.1-0.el9.x86_64 114/121 2026-03-10T08:49:27.939 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-nbd-2:18.2.1-0.el9.x86_64 115/121 2026-03-10T08:49:27.980 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs-devel-2:18.2.1-0.el9.x86_64 116/121 
2026-03-10T08:49:27.986 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-fuse-2:18.2.1-0.el9.x86_64 117/121 2026-03-10T08:49:27.995 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-10T08:49:28.000 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-10T08:49:28.000 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T08:49:28.017 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T08:49:28.017 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T08:49:28.022 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121 2026-03-10T08:49:28.024 INFO:teuthology.orchestra.run.vm08.stdout: Upgrading : librbd1-2:18.2.1-0.el9.x86_64 72/121 2026-03-10T08:49:28.056 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 72/121 2026-03-10T08:49:28.060 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rbd-2:18.2.1-0.el9.x86_64 73/121 2026-03-10T08:49:28.068 INFO:teuthology.orchestra.run.vm08.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121 2026-03-10T08:49:28.285 INFO:teuthology.orchestra.run.vm08.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121 2026-03-10T08:49:28.286 INFO:teuthology.orchestra.run.vm08.stdout: Installing : librgw2-2:18.2.1-0.el9.x86_64 76/121 2026-03-10T08:49:28.307 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 76/121 2026-03-10T08:49:28.316 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-rgw-2:18.2.1-0.el9.x86_64 77/121 2026-03-10T08:49:28.332 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121 2026-03-10T08:49:28.352 
INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121 2026-03-10T08:49:28.450 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121 2026-03-10T08:49:28.465 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121 2026-03-10T08:49:28.541 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121 2026-03-10T08:49:28.666 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121 2026-03-10T08:49:28.765 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121 2026-03-10T08:49:28.778 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121 2026-03-10T08:49:28.781 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121 2026-03-10T08:49:28.788 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121 2026-03-10T08:49:28.793 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121 2026-03-10T08:49:28.876 INFO:teuthology.orchestra.run.vm08.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121 2026-03-10T08:49:28.893 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121 2026-03-10T08:49:28.914 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T08:49:28.914 INFO:teuthology.orchestra.run.vm08.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-10T08:49:28.914 INFO:teuthology.orchestra.run.vm08.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 
2026-03-10T08:49:28.914 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:49:29.037 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T08:49:29.107 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T08:49:29.107 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-10T08:49:29.107 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:49:29.137 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 2/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 3/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 4/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 5/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 6/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 7/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 8/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 9/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
ceph-radosgw-2:18.2.1-0.el9.x86_64 10/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 11/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 12/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 13/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 14/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 15/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 16/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 17/121 2026-03-10T08:49:29.180 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 18/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 19/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 20/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 21/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 22/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 23/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 24/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 25/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 26/121 
2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 27/121 2026-03-10T08:49:29.181 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 28/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 29/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 30/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 31/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 32/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 33/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 34/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 35/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-10T08:49:29.182 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T08:49:29.182 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121 
2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121 2026-03-10T08:49:29.183 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121 2026-03-10T08:49:29.183 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121 2026-03-10T08:49:29.184 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 118/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121 2026-03-10T08:49:29.184 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 120/121 2026-03-10T08:49:29.186 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 93/121 2026-03-10T08:49:29.189 INFO:teuthology.orchestra.run.vm08.stdout: Installing : cephadm-2:18.2.1-0.el9.noarch 93/121 2026-03-10T08:49:29.194 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 94/121 2026-03-10T08:49:29.223 INFO:teuthology.orchestra.run.vm08.stdout: 
Installing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 95/121 2026-03-10T08:49:29.227 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ceph-common-2:18.2.1-0.el9.x86_64 96/121 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout:Upgraded: 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:18.2.1-0.el9.x86_64 librbd1-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout:Installed: 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.288 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: 
ceph-mgr-modules-core-2:18.2.1-0.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-10T08:49:29.289 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:49:29.290 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:49:29.387 DEBUG:teuthology.parallel:result is None
2026-03-10T08:49:30.207 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-10T08:49:30.213 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-10T08:49:30.530 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-10T08:49:30.537 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-10T08:49:30.580 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-10T08:49:30.580 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-10T08:49:30.580 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-10T08:49:30.580 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:30.586 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /sys
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /proc
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /mnt
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /var/tmp
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /home
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /root
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /tmp
2026-03-10T08:49:37.134 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:37.166 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121
2026-03-10T08:49:37.300 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121
2026-03-10T08:49:37.306 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121
2026-03-10T08:49:37.821 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121
2026-03-10T08:49:37.823 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121
2026-03-10T08:49:37.882 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121
2026-03-10T08:49:37.957 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 103/121
2026-03-10T08:49:37.961 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-2:18.2.1-0.el9.x86_64 104/121
2026-03-10T08:49:37.984 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 104/121
2026-03-10T08:49:37.984 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:37.984 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T08:49:37.984 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T08:49:37.984 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T08:49:37.984 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:38.000 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121
2026-03-10T08:49:38.106 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121
2026-03-10T08:49:38.109 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mds-2:18.2.1-0.el9.x86_64 106/121
2026-03-10T08:49:38.130 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 106/121
2026-03-10T08:49:38.130 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:38.130 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T08:49:38.130 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T08:49:38.130 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T08:49:38.130 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:38.356 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mon-2:18.2.1-0.el9.x86_64 107/121
2026-03-10T08:49:38.377 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 107/121
2026-03-10T08:49:38.377 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:38.377 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T08:49:38.377 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T08:49:38.377 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T08:49:38.377 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:39.179 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-osd-2:18.2.1-0.el9.x86_64 108/121
2026-03-10T08:49:39.204 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 108/121
2026-03-10T08:49:39.204 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:39.204 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T08:49:39.204 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T08:49:39.204 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T08:49:39.204 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:39.586 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-2:18.2.1-0.el9.x86_64 109/121
2026-03-10T08:49:39.590 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121
2026-03-10T08:49:39.616 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121
2026-03-10T08:49:39.616 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:39.616 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T08:49:39.616 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T08:49:39.616 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T08:49:39.616 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:39.629 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121
2026-03-10T08:49:39.650 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121
2026-03-10T08:49:39.650 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:39.650 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T08:49:39.650 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:39.800 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-mirror-2:18.2.1-0.el9.x86_64 112/121
2026-03-10T08:49:39.824 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 112/121
2026-03-10T08:49:39.824 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:49:39.824 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T08:49:39.824 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-10T08:49:39.824 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-10T08:49:39.824 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:41.882 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-test-2:18.2.1-0.el9.x86_64 113/121
2026-03-10T08:49:41.894 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-fuse-2:18.2.1-0.el9.x86_64 114/121
2026-03-10T08:49:41.901 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-nbd-2:18.2.1-0.el9.x86_64 115/121
2026-03-10T08:49:41.946 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephfs-devel-2:18.2.1-0.el9.x86_64 116/121
2026-03-10T08:49:41.953 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-fuse-2:18.2.1-0.el9.x86_64 117/121
2026-03-10T08:49:41.963 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121
2026-03-10T08:49:41.968 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121
2026-03-10T08:49:41.968 INFO:teuthology.orchestra.run.vm08.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-10T08:49:41.985 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-10T08:49:41.985 INFO:teuthology.orchestra.run.vm08.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 2/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 3/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 5/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 6/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 7/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 8/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 9/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 12/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 13/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 14/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 15/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 16/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 17/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 18/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 19/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 20/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 21/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 22/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 23/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 24/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 25/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 26/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 27/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 28/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 29/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 30/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 31/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 32/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 33/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 34/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 35/121
2026-03-10T08:49:43.145 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121
2026-03-10T08:49:43.147 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-10T08:49:43.148 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 118/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-10T08:49:43.149 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 120/121
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout:Upgraded:
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: librados2-2:18.2.1-0.el9.x86_64 librbd1-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout:Installed:
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.243 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T08:49:43.244
INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T08:49:43.244 
INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T08:49:43.244 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T08:49:43.245 
INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:49:43.245 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 
2026-03-10T08:49:43.334 DEBUG:teuthology.parallel:result is None
2026-03-10T08:49:43.334 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T08:49:43.334 INFO:teuthology.packaging:ref: None
2026-03-10T08:49:43.334 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T08:49:43.334 INFO:teuthology.packaging:branch: None
2026-03-10T08:49:43.334 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T08:49:43.334 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T08:49:43.939 DEBUG:teuthology.orchestra.run.vm05:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T08:49:43.963 INFO:teuthology.orchestra.run.vm05.stdout:18.2.1-0.el9
2026-03-10T08:49:43.963 INFO:teuthology.packaging:The installed version of ceph is 18.2.1-0.el9
2026-03-10T08:49:43.963 INFO:teuthology.task.install:The correct ceph version 18.2.1-0 is installed.
2026-03-10T08:49:43.964 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T08:49:43.964 INFO:teuthology.packaging:ref: None
2026-03-10T08:49:43.964 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T08:49:43.964 INFO:teuthology.packaging:branch: None
2026-03-10T08:49:43.964 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T08:49:43.964 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T08:49:44.604 DEBUG:teuthology.orchestra.run.vm08:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T08:49:44.626 INFO:teuthology.orchestra.run.vm08.stdout:18.2.1-0.el9
2026-03-10T08:49:44.626 INFO:teuthology.packaging:The installed version of ceph is 18.2.1-0.el9
2026-03-10T08:49:44.626 INFO:teuthology.task.install:The correct ceph version 18.2.1-0 is installed.
2026-03-10T08:49:44.627 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-10T08:49:44.627 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:49:44.627 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T08:49:44.657 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:49:44.657 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T08:49:44.696 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-10T08:49:44.696 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:49:44.696 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T08:49:44.721 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T08:49:44.785 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:49:44.785 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T08:49:44.808 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T08:49:44.870 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-10T08:49:44.870 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:49:44.870 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T08:49:44.896 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T08:49:44.958 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:49:44.959 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T08:49:44.982 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T08:49:45.044 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-10T08:49:45.045 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:49:45.045 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T08:49:45.066 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T08:49:45.128 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:49:45.128 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T08:49:45.151 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T08:49:45.215 INFO:teuthology.run_tasks:Running task print...
2026-03-10T08:49:45.217 INFO:teuthology.task.print:**** done install task...
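The install task's version check above boils down to stripping the distro suffix from the rpm-reported `VERSION-RELEASE` and comparing it to the expected build version, which is why `18.2.1-0.el9` is accepted as matching `18.2.1-0`. A minimal sketch of that comparison (illustrative only, not teuthology's actual code):

```shell
# Compare an rpm-reported version ("18.2.1-0.el9") with the expected
# package version ("18.2.1-0") by dropping the ".el9" distro suffix.
installed="18.2.1-0.el9"   # from: rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
expected="18.2.1-0"

base="${installed%.el*}"   # shortest-suffix strip: removes ".el9"
if [ "$base" = "$expected" ]; then
    echo "The correct ceph version $expected is installed."
fi
```

The `%.el*` parameter expansion assumes an Enterprise-Linux-style release tag; Debian-based builds would need a different strip rule.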
2026-03-10T08:49:45.217 INFO:teuthology.run_tasks:Running task cephadm... 2026-03-10T08:49:45.259 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.1', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 
'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'} 2026-03-10T08:49:45.259 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.1 2026-03-10T08:49:45.259 INFO:tasks.cephadm:Cluster fsid is 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:49:45.259 INFO:tasks.cephadm:Choosing monitor IPs and ports... 2026-03-10T08:49:45.259 INFO:tasks.cephadm:No mon roles; fabricating mons 2026-03-10T08:49:45.259 INFO:tasks.cephadm:Monitor IPs: {'mon.vm05': '192.168.123.105', 'mon.vm08': '192.168.123.108'} 2026-03-10T08:49:45.259 INFO:tasks.cephadm:Normalizing hostnames... 
2026-03-10T08:49:45.259 DEBUG:teuthology.orchestra.run.vm05:> sudo hostname $(hostname -s)
2026-03-10T08:49:45.282 DEBUG:teuthology.orchestra.run.vm08:> sudo hostname $(hostname -s)
2026-03-10T08:49:45.309 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef
2026-03-10T08:49:45.309 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T08:49:45.925 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-10T08:49:46.667 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-10T08:49:46.668 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T08:49:46.668 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T08:49:46.668 DEBUG:teuthology.orchestra.run.vm05:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T08:49:47.940 INFO:teuthology.orchestra.run.vm05.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 08:49 /home/ubuntu/cephtest/cephadm
2026-03-10T08:49:47.940 DEBUG:teuthology.orchestra.run.vm08:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T08:49:49.220 INFO:teuthology.orchestra.run.vm08.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 08:49 /home/ubuntu/cephtest/cephadm
2026-03-10T08:49:49.220 DEBUG:teuthology.orchestra.run.vm05:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T08:49:49.240 DEBUG:teuthology.orchestra.run.vm08:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T08:49:49.261 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.1 on all hosts...
2026-03-10T08:49:49.261 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 pull
2026-03-10T08:49:49.282 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 pull
2026-03-10T08:49:49.811 INFO:teuthology.orchestra.run.vm05.stderr:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-10T08:49:49.862 INFO:teuthology.orchestra.run.vm08.stderr:Pulling container image quay.io/ceph/ceph:v18.2.1...
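The cephadm fetch above follows a download-then-sanity-check pattern: write the `curl` body to a file, refuse to proceed unless the file is non-empty and larger than a plausibility threshold (so an HTML error page or truncated download is rejected), and only then mark it executable. The same guard applied to a local stand-in file (no network; the temp path is illustrative, not from the log):

```shell
# check_download FILE: succeed only if FILE is non-empty and larger than
# 1000 bytes, then mark it executable -- mirroring the guard in the log:
#   test -s f && test $(stat -c%s f) -gt 1000 && chmod +x f
check_download() {
    test -s "$1" && test "$(stat -c%s "$1")" -gt 1000 && chmod +x "$1"
}

f=$(mktemp)
head -c 2048 /dev/zero > "$f"   # stand-in for the curl download (2048 bytes)
check_download "$f" && echo "accepted: $f"
```

An empty or tiny file fails the `test -s` / size check, so `chmod +x` never runs and the `&&` chain reports failure.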
2026-03-10T08:50:07.436 INFO:teuthology.orchestra.run.vm08.stdout:{
2026-03-10T08:50:07.436 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
2026-03-10T08:50:07.436 INFO:teuthology.orchestra.run.vm08.stdout: "image_id": "5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf",
2026-03-10T08:50:07.437 INFO:teuthology.orchestra.run.vm08.stdout: "repo_digests": [
2026-03-10T08:50:07.437 INFO:teuthology.orchestra.run.vm08.stdout: "quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3",
2026-03-10T08:50:07.437 INFO:teuthology.orchestra.run.vm08.stdout: "quay.io/ceph/ceph@sha256:e8e55db8b4fd270dbec25bc764437a2a3abb707971c4dba5f559fb83018049dc"
2026-03-10T08:50:07.437 INFO:teuthology.orchestra.run.vm08.stdout: ]
2026-03-10T08:50:07.437 INFO:teuthology.orchestra.run.vm08.stdout:}
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout: "image_id": "5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf",
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout: "repo_digests": [
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout: "quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3",
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout: "quay.io/ceph/ceph@sha256:e8e55db8b4fd270dbec25bc764437a2a3abb707971c4dba5f559fb83018049dc"
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout: ]
2026-03-10T08:50:07.590 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T08:50:07.604 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /etc/ceph
2026-03-10T08:50:07.634 DEBUG:teuthology.orchestra.run.vm08:> sudo mkdir -p /etc/ceph
2026-03-10T08:50:07.661 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 777 /etc/ceph
2026-03-10T08:50:07.701 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 777 /etc/ceph
2026-03-10T08:50:07.728 INFO:tasks.cephadm:Writing seed config...
2026-03-10T08:50:07.728 INFO:tasks.cephadm: override: [osd] osd_class_default_list = *
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] osd_class_load_list = *
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] bdev async discard = True
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] bdev enable discard = True
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] debug ms = 1
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] debug osd = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [client] client mount timeout = 600
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [client] debug client = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [client] debug ms = 1
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] debug mds = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] debug ms = 1
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] mds debug frag = True
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] mds verify scatter = True
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mgr] debug mgr = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mgr] debug ms = 1
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mon] debug mon = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mon] debug ms = 1
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mon] debug paxos = 20
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300
2026-03-10T08:50:07.729 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120
2026-03-10T08:50:07.729 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:50:07.729 DEBUG:teuthology.orchestra.run.vm05:> dd of=/home/ubuntu/cephtest/seed.ceph.conf
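The "Writing seed config" step above just renders the per-section overrides into ini-style `ceph.conf` text and ships it by piping it into `dd of=/home/ubuntu/cephtest/seed.ceph.conf` on the remote host. The same shipping mechanism against a local temp file (paths and option values here are a small illustrative subset):

```shell
# Ship an ini-style seed config by piping a heredoc into dd,
# as the log does with dd of=/home/ubuntu/cephtest/seed.ceph.conf.
conf=$(mktemp)
dd of="$conf" status=none <<'EOF'
[osd]
debug osd = 20
osd objectstore = bluestore
[mon]
mon op complaint time = 120
EOF

# count the section headers we just wrote
n=$(grep -c '^\[' "$conf")
echo "wrote $n sections to $conf"
```

`dd` here is only a convenient "write stdin to this path" primitive; teuthology favors it because the same command works unprivileged or under `sudo`.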
2026-03-10T08:50:07.758 DEBUG:tasks.cephadm:Final config: [global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000  # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = 16587ed2-1c5e-11f1-90f6-35051361a039
mon pg warn min per osd = 0

[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180

[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1

[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660  # 11m
auth service ticket ttl = 240  # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120

[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true

[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900

[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-10T08:50:07.759 DEBUG:teuthology.orchestra.run.vm05:mon.vm05> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service
2026-03-10T08:50:07.800 INFO:tasks.cephadm:Bootstrapping...
2026-03-10T08:50:07.800 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 -v bootstrap --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.105 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-10T08:50:07.912 INFO:teuthology.orchestra.run.vm05.stdout:--------------------------------------------------------------------------------
2026-03-10T08:50:07.912 INFO:teuthology.orchestra.run.vm05.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.1', '-v', 'bootstrap', '--fsid', '16587ed2-1c5e-11f1-90f6-35051361a039', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.105', '--skip-admin-label']
2026-03-10T08:50:07.931 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5.8.0
2026-03-10T08:50:07.931 INFO:teuthology.orchestra.run.vm05.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-10T08:50:07.932 INFO:teuthology.orchestra.run.vm05.stdout:Verifying podman|docker is present...
2026-03-10T08:50:07.948 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5.8.0
2026-03-10T08:50:07.948 INFO:teuthology.orchestra.run.vm05.stdout:Verifying lvm2 is present...
2026-03-10T08:50:07.948 INFO:teuthology.orchestra.run.vm05.stdout:Verifying time synchronization is in place...
2026-03-10T08:50:07.955 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T08:50:07.955 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T08:50:07.961 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T08:50:07.961 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout inactive
2026-03-10T08:50:07.967 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout enabled
2026-03-10T08:50:07.972 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout active
2026-03-10T08:50:07.972 INFO:teuthology.orchestra.run.vm05.stdout:Unit chronyd.service is enabled and running
2026-03-10T08:50:07.972 INFO:teuthology.orchestra.run.vm05.stdout:Repeating the final host check...
2026-03-10T08:50:07.988 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5.8.0
2026-03-10T08:50:07.988 INFO:teuthology.orchestra.run.vm05.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-10T08:50:07.988 INFO:teuthology.orchestra.run.vm05.stdout:systemctl is present
2026-03-10T08:50:07.989 INFO:teuthology.orchestra.run.vm05.stdout:lvcreate is present
2026-03-10T08:50:07.994 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T08:50:07.994 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T08:50:07.999 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T08:50:07.999 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout inactive
2026-03-10T08:50:08.004 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout enabled
2026-03-10T08:50:08.009 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout active
2026-03-10T08:50:08.009 INFO:teuthology.orchestra.run.vm05.stdout:Unit chronyd.service is enabled and running
2026-03-10T08:50:08.009 INFO:teuthology.orchestra.run.vm05.stdout:Host looks OK
2026-03-10T08:50:08.009 INFO:teuthology.orchestra.run.vm05.stdout:Cluster fsid: 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T08:50:08.009 INFO:teuthology.orchestra.run.vm05.stdout:Acquiring lock 139756536574496 on /run/cephadm/16587ed2-1c5e-11f1-90f6-35051361a039.lock
2026-03-10T08:50:08.009 INFO:teuthology.orchestra.run.vm05.stdout:Lock 139756536574496 acquired on /run/cephadm/16587ed2-1c5e-11f1-90f6-35051361a039.lock
2026-03-10T08:50:08.010 INFO:teuthology.orchestra.run.vm05.stdout:Verifying IP 192.168.123.105 port 3300 ...
2026-03-10T08:50:08.010 INFO:teuthology.orchestra.run.vm05.stdout:Verifying IP 192.168.123.105 port 6789 ...
2026-03-10T08:50:08.010 INFO:teuthology.orchestra.run.vm05.stdout:Base mon IP(s) is [192.168.123.105:3300, 192.168.123.105:6789], mon addrv is [v2:192.168.123.105:3300,v1:192.168.123.105:6789]
2026-03-10T08:50:08.013 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.105 metric 100
2026-03-10T08:50:08.013 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.105 metric 100
2026-03-10T08:50:08.015 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-10T08:50:08.015 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-10T08:50:08.017 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-10T08:50:08.017 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-10T08:50:08.017 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T08:50:08.017 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-10T08:50:08.017 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:5/64 scope link noprefixroute
2026-03-10T08:50:08.017 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T08:50:08.018 INFO:teuthology.orchestra.run.vm05.stdout:Mon IP `192.168.123.105` is in CIDR network `192.168.123.0/24`
2026-03-10T08:50:08.018 INFO:teuthology.orchestra.run.vm05.stdout:Mon IP `192.168.123.105` is in CIDR network `192.168.123.0/24`
2026-03-10T08:50:08.018 INFO:teuthology.orchestra.run.vm05.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-10T08:50:08.018 INFO:teuthology.orchestra.run.vm05.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-10T08:50:08.019 INFO:teuthology.orchestra.run.vm05.stdout:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.1...
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Getting image source signatures
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Copying blob sha256:a733d3c618b71f19c168ebecd1953429dce2c1631835ca182e9551c36dce5989
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Copying blob sha256:7feca07754707458c3945cf0062cf4dabc512f6d90fe1a9a1370b362b6011124
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Copying config sha256:5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf
2026-03-10T08:50:09.196 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-10T08:50:09.378 INFO:teuthology.orchestra.run.vm05.stdout:ceph: stdout ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)
2026-03-10T08:50:09.379 INFO:teuthology.orchestra.run.vm05.stdout:Ceph version: ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)
2026-03-10T08:50:09.380 INFO:teuthology.orchestra.run.vm05.stdout:Extracting ceph user uid/gid from container image...
2026-03-10T08:50:09.475 INFO:teuthology.orchestra.run.vm05.stdout:stat: stdout 167 167
2026-03-10T08:50:09.475 INFO:teuthology.orchestra.run.vm05.stdout:Creating initial keys...
2026-03-10T08:50:09.594 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph-authtool: stdout AQBB269pm4WQIRAA2Dle7k8fGD+zkmWFIb7i9A==
2026-03-10T08:50:09.700 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph-authtool: stdout AQBB269psQ91JxAAkCMu4KQLFm/GaBcRvu1aog==
2026-03-10T08:50:09.807 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph-authtool: stdout AQBB269pVAvnLRAAbwnkiCUGUBLROCfmZOyKfw==
2026-03-10T08:50:09.807 INFO:teuthology.orchestra.run.vm05.stdout:Creating initial monmap...
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:monmaptool for vm05 [v2:192.168.123.105:3300,v1:192.168.123.105:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:setting min_mon_release = pacific
2026-03-10T08:50:09.930 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: set fsid to 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T08:50:09.931 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T08:50:09.931 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:50:09.931 INFO:teuthology.orchestra.run.vm05.stdout:Creating mon...
2026-03-10T08:50:10.061 INFO:teuthology.orchestra.run.vm05.stdout:create mon.vm05 on
2026-03-10T08:50:10.216 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-10T08:50:10.338 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-10T08:50:10.472 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-16587ed2-1c5e-11f1-90f6-35051361a039.target → /etc/systemd/system/ceph-16587ed2-1c5e-11f1-90f6-35051361a039.target.
2026-03-10T08:50:10.472 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-16587ed2-1c5e-11f1-90f6-35051361a039.target → /etc/systemd/system/ceph-16587ed2-1c5e-11f1-90f6-35051361a039.target.
2026-03-10T08:50:10.613 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05
2026-03-10T08:50:10.613 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to reset failed state of unit ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service: Unit ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service not loaded.
2026-03-10T08:50:10.760 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-16587ed2-1c5e-11f1-90f6-35051361a039.target.wants/ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service → /etc/systemd/system/ceph-16587ed2-1c5e-11f1-90f6-35051361a039@.service.
2026-03-10T08:50:10.930 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present
2026-03-10T08:50:10.930 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to enable service . firewalld.service is not available
2026-03-10T08:50:10.930 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mon to start...
2026-03-10T08:50:10.930 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mon...
2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout cluster: 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout id: 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout health: HEALTH_OK 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout services: 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm05 (age 0.155995s) 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mgr: no daemons active 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:11.140 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout data: 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout pgs: 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.068+0000 7fa177d28700 1 Processor -- start 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.069+0000 7fa177d28700 1 -- start start 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.069+0000 
7fa177d28700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 0x7fa17007bb80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.069+0000 7fa177d28700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa17007c150 con 0x7fa17007b760 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.069+0000 7fa175ac4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 0x7fa17007bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.069+0000 7fa175ac4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 0x7fa17007bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43042/0 (socket says 192.168.123.105:43042) 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.069+0000 7fa175ac4700 1 -- 192.168.123.105:0/623481494 learned_addr learned my addr 192.168.123.105:0/623481494 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.070+0000 7fa175ac4700 1 -- 192.168.123.105:0/623481494 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa17007c9b0 con 0x7fa17007b760 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.070+0000 7fa175ac4700 1 --2- 192.168.123.105:0/623481494 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 0x7fa17007bb80 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fa160009cf0 tx=0x7fa16000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=cb99276539717386 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:11.141 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.070+0000 7fa174ac2700 1 -- 192.168.123.105:0/623481494 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa160004030 con 0x7fa17007b760 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.070+0000 7fa174ac2700 1 -- 192.168.123.105:0/623481494 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fa160004190 con 0x7fa17007b760 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 -- 192.168.123.105:0/623481494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 msgr2=0x7fa17007bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 --2- 192.168.123.105:0/623481494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 0x7fa17007bb80 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fa160009cf0 tx=0x7fa16000b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 -- 192.168.123.105:0/623481494 shutdown_connections 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 --2- 192.168.123.105:0/623481494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa17007b760 0x7fa17007bb80 unknown :-1 s=CLOSED pgs=1 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 -- 192.168.123.105:0/623481494 >> 192.168.123.105:0/623481494 conn(0x7fa170103f50 msgr2=0x7fa170106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 -- 192.168.123.105:0/623481494 shutdown_connections 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 -- 192.168.123.105:0/623481494 wait complete. 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.071+0000 7fa177d28700 1 Processor -- start 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa177d28700 1 -- start start 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa177d28700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 0x7fa1701a8cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa177d28700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa17007c150 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa175ac4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 0x7fa1701a8cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T08:50:11.072+0000 7fa175ac4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 0x7fa1701a8cb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43046/0 (socket says 192.168.123.105:43046) 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa175ac4700 1 -- 192.168.123.105:0/3883336741 learned_addr learned my addr 192.168.123.105:0/3883336741 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa175ac4700 1 -- 192.168.123.105:0/3883336741 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa160009740 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.072+0000 7fa175ac4700 1 --2- 192.168.123.105:0/3883336741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 0x7fa1701a8cb0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fa17007c440 tx=0x7fa160004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.073+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa16000bd90 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.073+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fa1600036a0 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T08:50:11.073+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa1701a91f0 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.073+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa160003990 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.073+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa1701abe70 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.074+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fa160021020 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.074+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa1700623c0 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.074+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fa16001ad40 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.075+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 
0x7fa160029050 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.114+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7fa1701ac150 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.114+0000 7fa166ffd700 1 -- 192.168.123.105:0/3883336741 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7fa16001a440 con 0x7fa1701a8890 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 msgr2=0x7fa1701a8cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 --2- 192.168.123.105:0/3883336741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 0x7fa1701a8cb0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fa17007c440 tx=0x7fa160004750 comp rx=0 tx=0).stop 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 shutdown_connections 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 --2- 192.168.123.105:0/3883336741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa1701a8890 0x7fa1701a8cb0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 -- 
192.168.123.105:0/3883336741 >> 192.168.123.105:0/3883336741 conn(0x7fa170103f50 msgr2=0x7fa170104b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 shutdown_connections 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.116+0000 7fa177d28700 1 -- 192.168.123.105:0/3883336741 wait complete. 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:mon is available 2026-03-10T08:50:11.142 INFO:teuthology.orchestra.run.vm05.stdout:Assimilating anything we can from ceph.conf... 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [global] 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout fsid = 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.105:3300,v1:192.168.123.105:6789] 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-10T08:50:11.369 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [osd]
2026-03-10T08:50:11.369 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.268+0000 7ff0fd7fc700 1 Processor -- start
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.268+0000 7ff0fd7fc700 1 -- start start
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.269+0000 7ff0fd7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 0x7ff0f8108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.269+0000 7ff0fd7fc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0f8109370 con 0x7ff0f8108980
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.269+0000 7ff0f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 0x7ff0f8108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.269+0000 7ff0f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 0x7ff0f8108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43052/0 (socket says 192.168.123.105:43052)
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.269+0000 7ff0f6ffd700 1 -- 192.168.123.105:0/3162002801 learned_addr learned my addr 192.168.123.105:0/3162002801 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.269+0000 7ff0f6ffd700 1 -- 192.168.123.105:0/3162002801 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff0f8109b80 con 0x7ff0f8108980
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.270+0000 7ff0f6ffd700 1 --2- 192.168.123.105:0/3162002801 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 0x7ff0f8108da0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7ff0e0009cf0 tx=0x7ff0e000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=886de17caa283af9 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.270+0000 7ff0f5ffb700 1 -- 192.168.123.105:0/3162002801 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff0e0004030 con 0x7ff0f8108980
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.270+0000 7ff0f5ffb700 1 -- 192.168.123.105:0/3162002801 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7ff0e0004190 con 0x7ff0f8108980
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/3162002801 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 msgr2=0x7ff0f8108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 --2- 192.168.123.105:0/3162002801 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 0x7ff0f8108da0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7ff0e0009cf0 tx=0x7ff0e000b0e0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/3162002801 shutdown_connections
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 --2- 192.168.123.105:0/3162002801 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f8108980 0x7ff0f8108da0 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/3162002801 >> 192.168.123.105:0/3162002801 conn(0x7ff0f81044d0 msgr2=0x7ff0f81068c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/3162002801 shutdown_connections
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/3162002801 wait complete.
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.271+0000 7ff0fd7fc700 1 Processor -- start
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0fd7fc700 1 -- start start
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0fd7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 0x7ff0f819c0d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 0x7ff0f819c0d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 0x7ff0f819c0d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43060/0 (socket says 192.168.123.105:43060)
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0f6ffd700 1 -- 192.168.123.105:0/1498021466 learned_addr learned my addr 192.168.123.105:0/1498021466 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0fd7fc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0f8109370 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.272+0000 7ff0f6ffd700 1 -- 192.168.123.105:0/1498021466 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff0e0009740 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.273+0000 7ff0f6ffd700 1 --2- 192.168.123.105:0/1498021466 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 0x7ff0f819c0d0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7ff0e0004000 tx=0x7ff0e0004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.273+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff0e000bd90 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.273+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7ff0e00036a0 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.273+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff0e0003990 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.273+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff0f819c610 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.274+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7ff0e0021020 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.274+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff0f819f250 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.275+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff0e001a500 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.275+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff0f8195330 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.276+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7ff0e0043b10 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.312+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7ff0f802ce00 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.320+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7ff0e0003e30 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.320+0000 7ff0effff700 1 -- 192.168.123.105:0/1498021466 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7ff0e0032030 con 0x7ff0f819bc90
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.321+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 msgr2=0x7ff0f819c0d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.321+0000 7ff0fd7fc700 1 --2- 192.168.123.105:0/1498021466 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 0x7ff0f819c0d0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7ff0e0004000 tx=0x7ff0e0004750 comp rx=0 tx=0).stop
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.321+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 shutdown_connections
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.321+0000 7ff0fd7fc700 1 --2- 192.168.123.105:0/1498021466 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0f819bc90 0x7ff0f819c0d0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.321+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 >> 192.168.123.105:0/1498021466 conn(0x7ff0f81044d0 msgr2=0x7ff0f8190340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.322+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 shutdown_connections
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.322+0000 7ff0fd7fc700 1 -- 192.168.123.105:0/1498021466 wait complete.
2026-03-10T08:50:11.370 INFO:teuthology.orchestra.run.vm05.stdout:Generating new minimal ceph.conf...
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.495+0000 7f856eafe700 1 Processor -- start
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.496+0000 7f856eafe700 1 -- start start
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.496+0000 7f856eafe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568106bb0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.496+0000 7f856eafe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8568107180 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.496+0000 7f856c89a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568106bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.496+0000 7f856c89a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568106bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43070/0 (socket says 192.168.123.105:43070)
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.496+0000 7f856c89a700 1 -- 192.168.123.105:0/1571372611 learned_addr learned my addr 192.168.123.105:0/1571372611 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f856c89a700 1 -- 192.168.123.105:0/1571372611 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8568107990 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f856c89a700 1 --2- 192.168.123.105:0/1571372611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568106bb0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f8558009a90 tx=0x7f8558009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=948a3274a7c69715 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f85677fe700 1 -- 192.168.123.105:0/1571372611 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f855800fbf0 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f85677fe700 1 -- 192.168.123.105:0/1571372611 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f85580044a0 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f85677fe700 1 -- 192.168.123.105:0/1571372611 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8558017450 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f856eafe700 1 -- 192.168.123.105:0/1571372611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 msgr2=0x7f8568106bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.497+0000 7f856eafe700 1 --2- 192.168.123.105:0/1571372611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568106bb0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f8558009a90 tx=0x7f8558009da0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 -- 192.168.123.105:0/1571372611 shutdown_connections
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 --2- 192.168.123.105:0/1571372611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568106bb0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 -- 192.168.123.105:0/1571372611 >> 192.168.123.105:0/1571372611 conn(0x7f8568101d30 msgr2=0x7f8568104170 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 -- 192.168.123.105:0/1571372611 shutdown_connections
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 -- 192.168.123.105:0/1571372611 wait complete.
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 Processor -- start
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 -- start start
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568198140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.498+0000 7f856eafe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8568198680 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856c89a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568198140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856c89a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568198140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43080/0 (socket says 192.168.123.105:43080)
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856c89a700 1 -- 192.168.123.105:0/2811733678 learned_addr learned my addr 192.168.123.105:0/2811733678 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856c89a700 1 -- 192.168.123.105:0/2811733678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8558009740 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856c89a700 1 --2- 192.168.123.105:0/2811733678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568198140 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f8558012040 tx=0x7f8558003fa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f85580036a0 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f8558017b80 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8558020a60 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8568198880 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.499+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8568198ca0 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.500+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f855801e040 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.500+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f8558029d80 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.501+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8568192080 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.502+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f8558024020 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.537+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f856802d090 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.538+0000 7f8565ffb700 1 -- 192.168.123.105:0/2811733678 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 ==== 76+0+181 (secure 0 0 0) 0x7f855801c070 con 0x7f8568106790
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.539+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 msgr2=0x7f8568198140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.539+0000 7f856eafe700 1 --2- 192.168.123.105:0/2811733678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568198140 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f8558012040 tx=0x7f8558003fa0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.539+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 shutdown_connections
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.539+0000 7f856eafe700 1 --2- 192.168.123.105:0/2811733678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8568106790 0x7f8568198140 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.539+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 >> 192.168.123.105:0/2811733678 conn(0x7f8568101d30 msgr2=0x7f85681035a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.540+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 shutdown_connections
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:11.540+0000 7f856eafe700 1 -- 192.168.123.105:0/2811733678 wait complete.
2026-03-10T08:50:11.588 INFO:teuthology.orchestra.run.vm05.stdout:Restarting the monitor...
2026-03-10T08:50:11.825 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 systemd[1]: Starting Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039...
2026-03-10T08:50:11.870 INFO:teuthology.orchestra.run.vm05.stdout:Setting public_network to 192.168.123.0/24 in global config section
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 podman[49699]: 2026-03-10 08:50:11.825795083 +0000 UTC m=+0.016621648 container create 4cb0e74c858492fe7a9e643719fa5d270c6249137a89ab206b1ffa2780d304e0 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, io.buildah.version=1.29.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.license=GPLv2, org.label-schema.build-date=20240222, GIT_BRANCH=HEAD, maintainer=Guillaume Abrioux , RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image)
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 podman[49699]: 2026-03-10 08:50:11.85616449 +0000 UTC m=+0.046991065 container init 4cb0e74c858492fe7a9e643719fa5d270c6249137a89ab206b1ffa2780d304e0 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=HEAD, ceph=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20240222, maintainer=Guillaume Abrioux , GIT_CLEAN=True, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS)
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 podman[49699]: 2026-03-10 08:50:11.860578924 +0000 UTC m=+0.051405489 container start 4cb0e74c858492fe7a9e643719fa5d270c6249137a89ab206b1ffa2780d304e0 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, GIT_BRANCH=HEAD, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, io.buildah.version=1.29.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20240222)
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 bash[49699]: 4cb0e74c858492fe7a9e643719fa5d270c6249137a89ab206b1ffa2780d304e0
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 podman[49699]: 2026-03-10 08:50:11.818455302 +0000 UTC m=+0.009281877 image pull 5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf quay.io/ceph/ceph:v18.2.1
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 systemd[1]: Started Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039.
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: set uid:gid to 167:167 (ceph:ceph)
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable), process ceph-mon, pid 2
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: pidfile_write: ignore empty --pid-file
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: load: jerasure load: lrc
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: RocksDB version: 7.9.2
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Git sha 0
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Compile date 2023-12-11 22:07:34
2026-03-10T08:50:12.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: DB SUMMARY
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: DB Session ID: MYUDBZ93L0ONYYEN2PRE
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: CURRENT file: CURRENT
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: IDENTITY file: IDENTITY
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm05/store.db dir, Total Num: 1, files: 000008.sst
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm05/store.db: 000009.log size: 89048 ;
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.error_if_exists: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.create_if_missing: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.paranoid_checks: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.env: 0x5610c4cc7720
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.fs: PosixFileSystem
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.info_log: 0x5610c7819360
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_file_opening_threads: 16
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.statistics: (nil)
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.use_fsync: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_log_file_size: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.keep_log_file_num: 1000
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.recycle_log_file_num: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.allow_fallocate: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.allow_mmap_reads: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.allow_mmap_writes: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.use_direct_reads: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.create_missing_column_families: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.db_log_dir:
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.wal_dir:
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.advise_random_on_open: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.db_write_buffer_size: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.write_buffer_manager: 0x5610c6aa8320
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.rate_limiter: (nil)
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.wal_recovery_mode: 2
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.enable_thread_tracking: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.enable_pipelined_write: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.unordered_write: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.row_cache: None
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.wal_filter: None
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.allow_ingest_behind: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.two_write_queues: 0
2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.manual_wal_flush: 0
2026-03-10T08:50:12.079
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.wal_compression: 0 2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.atomic_flush: 0 2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T08:50:12.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.log_readahead_size: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 
ceph-mon[49713]: rocksdb: Options.max_background_jobs: 2 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_background_compactions: -1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_subcompactions: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_open_files: -1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: 
Options.wal_bytes_per_sync: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_background_flushes: -1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Compression algorithms supported: 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kZSTD supported: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kXpressCompression supported: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kZlibCompression supported: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kSnappyCompression supported: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kLZ4Compression supported: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: kBZip2Compression supported: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:50:11 vm05 ceph-mon[49713]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000010 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.merge_operator: 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_filter: None 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5610c7819460) 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T08:50:12.080 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_top_level_index_and_filter: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_type: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_index_type: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_shortening: 1 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: checksum: 4 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: no_block_cache: 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache: 0x5610c6b2b350 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_name: BinnedLRUCache 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_options: 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: capacity : 536870912 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_shard_bits : 4 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: strict_capacity_limit : 0 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: high_pri_pool_ratio: 0.000 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_compressed: (nil) 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: persistent_cache: (nil) 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size: 4096 2026-03-10T08:50:12.080 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size_deviation: 10 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_restart_interval: 16 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_block_restart_interval: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 
metadata_block_size: 4096 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: partition_filters: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: use_delta_encoding: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: filter_policy: bloomfilter 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: whole_key_filtering: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: verify_compression: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: read_amp_bytes_per_bit: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: format_version: 5 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_index_compression: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_align: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_auto_readahead_size: 262144 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: prepopulate_block_cache: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: initial_auto_readahead_size: 8192 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression: NoCompression 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T08:50:12.081 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.num_levels: 7 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T08:50:12.081 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T08:50:12.081 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 
ceph-mon[49713]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.table_properties_collectors: 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.inplace_update_support: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 
ceph-mon[49713]: rocksdb: Options.bloom_locality: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.max_successive_merges: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.ttl: 2592000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.enable_blob_files: false 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.min_blob_size: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T08:50:12.082 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 05836416-2294-42f5-b375-8f9e69647089 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773132611892393, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773132611893446, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84711, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 287, "table_properties": {"data_size": 82789, "index_size": 209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13288, "raw_average_key_size": 51, "raw_value_size": 75614, "raw_average_value_size": 293, "num_data_blocks": 9, "num_entries": 258, "num_filter_entries": 258, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773132611, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "05836416-2294-42f5-b375-8f9e69647089", "db_session_id": "MYUDBZ93L0ONYYEN2PRE", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773132611893485, "job": 1, "event": "recovery_finished"} 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: 
[db/version_set.cc:5047] Creating manifest 15 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm05/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5610c6bc8000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: rocksdb: DB pointer 0x5610c6bb4000 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: starting mon.vm05 rank 0 at public addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] at bind addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon_data /var/lib/ceph/mon/ceph-vm05 fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???) 
e1 preinit fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).mds e0 Unable to load 'last_metadata' 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).mds e1 new map 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).mds e1 print_map 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout: e1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout: legacy client fscid: -1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout: No filesystems configured 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).osd e1 crush map has features 
288514050185494528, adjusting msgr requires 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).mgr e0 loading version 1 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).mgr e1 active server: (0) 2026-03-10T08:50:12.082 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05@-1(???).mgr e1 mkfs or daemon transitioned to available, loading commands 2026-03-10T08:50:12.083 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mon.vm05 is new leader, mons vm05 in quorum (ranks 0) 2026-03-10T08:50:12.083 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: monmap e1: 1 mons at {vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-10T08:50:12.083 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: fsmap 2026-03-10T08:50:12.083 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: osdmap e1: 0 total, 0 up, 0 in 2026-03-10T08:50:12.083 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:11 vm05 ceph-mon[49713]: mgrmap e1: no daemons active 2026-03-10T08:50:12.083 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.003+0000 7f2b97851700 1 Processor -- start 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.004+0000 7f2b97851700 1 -- start start 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.005+0000 7f2b97851700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b9007bb80 unknown 
:-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.005+0000 7f2b97851700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b9007c150 con 0x7f2b9007b760 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.005+0000 7f2b955ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b9007bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.005+0000 7f2b955ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b9007bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43096/0 (socket says 192.168.123.105:43096) 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.005+0000 7f2b955ed700 1 -- 192.168.123.105:0/2963743575 learned_addr learned my addr 192.168.123.105:0/2963743575 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.006+0000 7f2b955ed700 1 -- 192.168.123.105:0/2963743575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b9007c9b0 con 0x7f2b9007b760 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.006+0000 7f2b955ed700 1 --2- 192.168.123.105:0/2963743575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b9007bb80 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 
crypto rx=0x7f2b80009cf0 tx=0x7f2b8000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c89f07b55eed793e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.006+0000 7f2b87fff700 1 -- 192.168.123.105:0/2963743575 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b80004030 con 0x7f2b9007b760 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.006+0000 7f2b87fff700 1 -- 192.168.123.105:0/2963743575 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f2b8000b810 con 0x7f2b9007b760 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 -- 192.168.123.105:0/2963743575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 msgr2=0x7f2b9007bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 --2- 192.168.123.105:0/2963743575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b9007bb80 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f2b80009cf0 tx=0x7f2b8000b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 -- 192.168.123.105:0/2963743575 shutdown_connections 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 --2- 192.168.123.105:0/2963743575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b9007bb80 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:12.084 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 -- 192.168.123.105:0/2963743575 >> 192.168.123.105:0/2963743575 conn(0x7f2b90103f50 msgr2=0x7f2b90106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 -- 192.168.123.105:0/2963743575 shutdown_connections 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.007+0000 7f2b97851700 1 -- 192.168.123.105:0/2963743575 wait complete. 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.008+0000 7f2b97851700 1 Processor -- start 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.008+0000 7f2b97851700 1 -- start start 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.008+0000 7f2b97851700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b901a0d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.008+0000 7f2b97851700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b9007c150 con 0x7f2b9007b760 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.009+0000 7f2b955ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b901a0d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.009+0000 7f2b955ed700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b901a0d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43104/0 (socket says 192.168.123.105:43104) 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.009+0000 7f2b955ed700 1 -- 192.168.123.105:0/1341143613 learned_addr learned my addr 192.168.123.105:0/1341143613 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:12.084 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.009+0000 7f2b955ed700 1 -- 192.168.123.105:0/1341143613 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b80009740 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.009+0000 7f2b955ed700 1 --2- 192.168.123.105:0/1341143613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b901a0d30 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f2b800116a0 tx=0x7f2b80011780 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.010+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b800119d0 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.010+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f2b8001a430 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.010+0000 7f2b867fc700 1 -- 
192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2b800243f0 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.010+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2b901a1270 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.010+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2b901a1690 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.011+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f2b80024550 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.011+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f2b8002d720 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.011+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2b9019a4c0 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.012+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f2b8001a5a0 con 0x7f2b9007b760 
2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.049+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f2b9004fa20 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.053+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2b80011e30 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.053+0000 7f2b867fc700 1 -- 192.168.123.105:0/1341143613 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3)=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f2b80024be0 con 0x7f2b9007b760 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.054+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 msgr2=0x7f2b901a0d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.054+0000 7f2b97851700 1 --2- 192.168.123.105:0/1341143613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b901a0d30 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f2b800116a0 tx=0x7f2b80011780 comp rx=0 tx=0).stop 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.055+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 shutdown_connections 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.055+0000 7f2b97851700 1 --2- 192.168.123.105:0/1341143613 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b9007b760 0x7f2b901a0d30 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.055+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 >> 192.168.123.105:0/1341143613 conn(0x7f2b90103f50 msgr2=0x7f2b90106260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.055+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 shutdown_connections 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:12.055+0000 7f2b97851700 1 -- 192.168.123.105:0/1341143613 wait complete. 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:Creating mgr... 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-10T08:50:12.085 INFO:teuthology.orchestra.run.vm05.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-10T08:50:12.235 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mgr.vm05.rxwgjc 2026-03-10T08:50:12.235 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to reset failed state of unit ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mgr.vm05.rxwgjc.service: Unit ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mgr.vm05.rxwgjc.service not loaded. 
2026-03-10T08:50:12.358 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-16587ed2-1c5e-11f1-90f6-35051361a039.target.wants/ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mgr.vm05.rxwgjc.service → /etc/systemd/system/ceph-16587ed2-1c5e-11f1-90f6-35051361a039@.service. 2026-03-10T08:50:12.531 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present 2026-03-10T08:50:12.531 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to enable service . firewalld.service is not available 2026-03-10T08:50:12.531 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present 2026-03-10T08:50:12.531 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-10T08:50:12.531 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr to start... 2026-03-10T08:50:12.531 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr... 2026-03-10T08:50:13.602 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:13 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/1341143613' entity='client.admin' 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "16587ed2-1c5e-11f1-90f6-35051361a039", 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05" 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 1, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 
2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.944 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T08:50:13.946 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T08:50:13.946 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T08:50:10.961861+0000", 
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.366+0000 7fd6fd465700 1 Processor -- start 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6fd465700 1 -- start start 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6fd465700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 0x7fd6f80725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6fd465700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6f8072bc0 con 0x7fd6f80721d0 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6f7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 0x7fd6f80725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6f7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 0x7fd6f80725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:43122/0 (socket says 192.168.123.105:43122) 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6f7fff700 1 -- 192.168.123.105:0/434512969 learned_addr learned my addr 192.168.123.105:0/434512969 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.367+0000 7fd6f7fff700 1 -- 192.168.123.105:0/434512969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6f810e1c0 con 0x7fd6f80721d0 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6f7fff700 1 --2- 192.168.123.105:0/434512969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 0x7fd6f80725f0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fd6e800ab30 tx=0x7fd6e8010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=812a3c8f7334fe5e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6f77fe700 1 -- 192.168.123.105:0/434512969 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd6e8010e00 con 0x7fd6f80721d0 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6f77fe700 1 -- 192.168.123.105:0/434512969 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fd6e80044d0 con 0x7fd6f80721d0 2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6fd465700 1 -- 192.168.123.105:0/434512969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 msgr2=0x7fd6f80725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:13.947 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6fd465700 1 --2- 192.168.123.105:0/434512969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 0x7fd6f80725f0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fd6e800ab30 tx=0x7fd6e8010730 comp rx=0 tx=0).stop
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6fd465700 1 -- 192.168.123.105:0/434512969 shutdown_connections
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6fd465700 1 --2- 192.168.123.105:0/434512969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f80721d0 0x7fd6f80725f0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.368+0000 7fd6fd465700 1 -- 192.168.123.105:0/434512969 >> 192.168.123.105:0/434512969 conn(0x7fd6f806d320 msgr2=0x7fd6f806f760 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6fd465700 1 -- 192.168.123.105:0/434512969 shutdown_connections
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6fd465700 1 -- 192.168.123.105:0/434512969 wait complete.
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6fd465700 1 Processor -- start
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6fd465700 1 -- start start
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6fd465700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 0x7fd6f81a94b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6fd465700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6e801a410 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6f7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 0x7fd6f81a94b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6f7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 0x7fd6f81a94b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43128/0 (socket says 192.168.123.105:43128)
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.369+0000 7fd6f7fff700 1 -- 192.168.123.105:0/2922061617 learned_addr learned my addr 192.168.123.105:0/2922061617 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.370+0000 7fd6f7fff700 1 -- 192.168.123.105:0/2922061617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6e800a7e0 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.370+0000 7fd6f7fff700 1 --2- 192.168.123.105:0/2922061617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 0x7fd6f81a94b0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd6e800bbd0 tx=0x7fd6e80044c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.370+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd6e801a7e0 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.370+0000 7fd6fd465700 1 -- 192.168.123.105:0/2922061617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6f81a99f0 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.370+0000 7fd6fd465700 1 -- 192.168.123.105:0/2922061617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6f81ac670 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.371+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fd6e800f070 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.371+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd6e8009430 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.371+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fd6e8018070 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.371+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fd6e8009b80 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.371+0000 7fd6fd465700 1 -- 192.168.123.105:0/2922061617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6d8005320 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.373+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fd6e8009d90 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.419+0000 7fd6fd465700 1 -- 192.168.123.105:0/2922061617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fd6d8005190 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.608+0000 7fd6f5ffb700 1 -- 192.168.123.105:0/2922061617 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fd6e8009650 con 0x7fd6f81a9090
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.609+0000 7fd6eeffd700 1 -- 192.168.123.105:0/2922061617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 msgr2=0x7fd6f81a94b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.609+0000 7fd6eeffd700 1 --2- 192.168.123.105:0/2922061617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 0x7fd6f81a94b0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd6e800bbd0 tx=0x7fd6e80044c0 comp rx=0 tx=0).stop
2026-03-10T08:50:13.947 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.609+0000 7fd6eeffd700 1 -- 192.168.123.105:0/2922061617 shutdown_connections
2026-03-10T08:50:13.948 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.609+0000 7fd6eeffd700 1 --2- 192.168.123.105:0/2922061617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6f81a9090 0x7fd6f81a94b0 secure :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd6e800bbd0 tx=0x7fd6e80044c0 comp rx=0 tx=0).stop
2026-03-10T08:50:13.948 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.609+0000 7fd6eeffd700 1 -- 192.168.123.105:0/2922061617 >> 192.168.123.105:0/2922061617 conn(0x7fd6f806d320 msgr2=0x7fd6f806dd10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:13.948 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.614+0000 7fd6eeffd700 1 -- 192.168.123.105:0/2922061617 shutdown_connections
2026-03-10T08:50:13.948 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:13.614+0000 7fd6eeffd700 1 -- 192.168.123.105:0/2922061617 wait complete.
2026-03-10T08:50:13.948 INFO:teuthology.orchestra.run.vm05.stdout:mgr not available, waiting (1/15)...
2026-03-10T08:50:14.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:14 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2922061617' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout {
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "16587ed2-1c5e-11f1-90f6-35051361a039",
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": {
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK",
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {},
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": []
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05"
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 4,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": {
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef",
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": {
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": {
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [],
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-10T08:50:16.174 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-10T08:50:16.176 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": false,
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful"
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T08:50:10.961861+0000",
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.089+0000 7fc63e80c700 1 Processor -- start
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.089+0000 7fc63e80c700 1 -- start start
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.089+0000 7fc63e80c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc638072630 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.089+0000 7fc63e80c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc638072c00 con 0x7fc638072210
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.091+0000 7fc637fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc638072630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.091+0000 7fc637fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc638072630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43140/0 (socket says 192.168.123.105:43140)
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.091+0000 7fc637fff700 1 -- 192.168.123.105:0/3896057391 learned_addr learned my addr 192.168.123.105:0/3896057391 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.091+0000 7fc637fff700 1 -- 192.168.123.105:0/3896057391 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc63810e1d0 con 0x7fc638072210
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.091+0000 7fc637fff700 1 --2- 192.168.123.105:0/3896057391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc638072630 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc62800d450 tx=0x7fc62800d760 comp rx=0 tx=0).ready entity=mon.0 client_cookie=af8dcc36ce7fee2a server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc636ffd700 1 -- 192.168.123.105:0/3896057391 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc628010070 con 0x7fc638072210
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc636ffd700 1 -- 192.168.123.105:0/3896057391 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc628004030 con 0x7fc638072210
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc636ffd700 1 -- 192.168.123.105:0/3896057391 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc6280165b0 con 0x7fc638072210
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 -- 192.168.123.105:0/3896057391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 msgr2=0x7fc638072630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 --2- 192.168.123.105:0/3896057391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc638072630 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc62800d450 tx=0x7fc62800d760 comp rx=0 tx=0).stop
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 -- 192.168.123.105:0/3896057391 shutdown_connections
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 --2- 192.168.123.105:0/3896057391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc638072630 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 -- 192.168.123.105:0/3896057391 >> 192.168.123.105:0/3896057391 conn(0x7fc63806d400 msgr2=0x7fc63806f840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 -- 192.168.123.105:0/3896057391 shutdown_connections
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.095+0000 7fc63e80c700 1 -- 192.168.123.105:0/3896057391 wait complete.
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc63e80c700 1 Processor -- start
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc63e80c700 1 -- start start
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc63e80c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc6381a0910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc63e80c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6381a0e50 con 0x7fc638072210
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc637fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc6381a0910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:16.177 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc637fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc6381a0910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43142/0 (socket says 192.168.123.105:43142)
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc637fff700 1 -- 192.168.123.105:0/4091465321 learned_addr learned my addr 192.168.123.105:0/4091465321 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc637fff700 1 -- 192.168.123.105:0/4091465321 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc62800d130 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.096+0000 7fc637fff700 1 --2- 192.168.123.105:0/4091465321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc6381a0910 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc628007070 tx=0x7fc628007ec0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc628010070 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc63e80c700 1 -- 192.168.123.105:0/4091465321 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc6381a1050 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc63e80c700 1 -- 192.168.123.105:0/4091465321 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6381a1470 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc628016830 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc628008560 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fc628014070 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.097+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc628028d80 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.098+0000 7fc63e80c700 1 -- 192.168.123.105:0/4091465321 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc618005320 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.099+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fc62800be60 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.137+0000 7fc63e80c700 1 -- 192.168.123.105:0/4091465321 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fc618005190 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.138+0000 7fc6357fa700 1 -- 192.168.123.105:0/4091465321 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fc628008970 con 0x7fc638072210
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 -- 192.168.123.105:0/4091465321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 msgr2=0x7fc6381a0910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 --2- 192.168.123.105:0/4091465321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc6381a0910 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc628007070 tx=0x7fc628007ec0 comp rx=0 tx=0).stop
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 -- 192.168.123.105:0/4091465321 shutdown_connections
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 --2- 192.168.123.105:0/4091465321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc638072210 0x7fc6381a0910 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 -- 192.168.123.105:0/4091465321 >> 192.168.123.105:0/4091465321 conn(0x7fc63806d400 msgr2=0x7fc63806e0b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 -- 192.168.123.105:0/4091465321 shutdown_connections
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:16.139+0000 7fc62effd700 1 -- 192.168.123.105:0/4091465321 wait complete.
2026-03-10T08:50:16.178 INFO:teuthology.orchestra.run.vm05.stdout:mgr not available, waiting (2/15)...
2026-03-10T08:50:16.320 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:16 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/4091465321' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: Activating manager daemon vm05.rxwgjc
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: mgrmap e2: vm05.rxwgjc(active, starting, since 0.00440185s)
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: Manager daemon vm05.rxwgjc is now available
2026-03-10T08:50:18.201 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:18.202 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch
2026-03-10T08:50:18.202 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch
2026-03-10T08:50:18.202 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:18.202 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:17 vm05 ceph-mon[49713]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout {
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "16587ed2-1c5e-11f1-90f6-35051361a039",
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": {
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK",
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {},
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": []
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5,
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05"
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 6,
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": {
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef",
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": {
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0,
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0,
2026-03-10T08:50:18.509 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": {
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [],
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": true,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful"
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T08:50:10.961861+0000",
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.304+0000 7fe269d7c700 1 Processor -- start 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.304+0000 7fe269d7c700 1 -- start start 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.304+0000 7fe269d7c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 0x7fe264108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.304+0000 7fe269d7c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe264109370 con 0x7fe264108980 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.305+0000 7fe2637fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 0x7fe264108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.305+0000 7fe2637fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 0x7fe264108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43224/0 (socket says 192.168.123.105:43224) 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.305+0000 7fe2637fe700 1 -- 192.168.123.105:0/2050549108 learned_addr learned my addr 192.168.123.105:0/2050549108 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:18.510 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.305+0000 7fe2637fe700 1 -- 192.168.123.105:0/2050549108 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe264109b80 con 0x7fe264108980 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.305+0000 7fe2637fe700 1 --2- 192.168.123.105:0/2050549108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 0x7fe264108da0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fe24c009cf0 tx=0x7fe24c00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=da71775334dfa932 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.306+0000 7fe2627fc700 1 -- 192.168.123.105:0/2050549108 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe24c004030 con 0x7fe264108980 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.306+0000 7fe2627fc700 1 -- 192.168.123.105:0/2050549108 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe24c00b810 con 0x7fe264108980 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.306+0000 7fe269d7c700 1 -- 192.168.123.105:0/2050549108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 msgr2=0x7fe264108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.306+0000 7fe269d7c700 1 --2- 192.168.123.105:0/2050549108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 0x7fe264108da0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fe24c009cf0 tx=0x7fe24c00b0e0 comp rx=0 tx=0).stop 
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 -- 192.168.123.105:0/2050549108 shutdown_connections 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 --2- 192.168.123.105:0/2050549108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe264108980 0x7fe264108da0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 -- 192.168.123.105:0/2050549108 >> 192.168.123.105:0/2050549108 conn(0x7fe2641044d0 msgr2=0x7fe2641068c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 -- 192.168.123.105:0/2050549108 shutdown_connections 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 -- 192.168.123.105:0/2050549108 wait complete. 
2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 Processor -- start 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.307+0000 7fe269d7c700 1 -- start start 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe269d7c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 0x7fe26419cc10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe2637fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 0x7fe26419cc10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe2637fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 0x7fe26419cc10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43230/0 (socket says 192.168.123.105:43230) 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe2637fe700 1 -- 192.168.123.105:0/1966810145 learned_addr learned my addr 192.168.123.105:0/1966810145 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:18.510 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe264109370 con 0x7fe26419c7f0 2026-03-10T08:50:18.510 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe2637fe700 1 -- 192.168.123.105:0/1966810145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe24c009740 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.308+0000 7fe2637fe700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 0x7fe26419cc10 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fe24c009cc0 tx=0x7fe24c003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.309+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe24c003ed0 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.309+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe24c0044d0 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.309+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe24c01ac60 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.309+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe26419d150 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.309+0000 7fe269d7c700 1 
-- 192.168.123.105:0/1966810145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe26407c300 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.310+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 3) v1 ==== 45161+0+0 (secure 0 0 0) 0x7fe24c004030 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.310+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe26404fa20 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.310+0000 7fe260ff9700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe250038220 0x7fe25003a6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.310+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fe24c04b2a0 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.313+0000 7fe262ffd700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe250038220 0x7fe25003a6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.313+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe24c0042e0 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.313+0000 7fe262ffd700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe250038220 0x7fe25003a6e0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fe254006fd0 tx=0x7fe254006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.453+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fe2640623c0 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.453+0000 7fe260ff9700 1 -- 192.168.123.105:0/1966810145 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7fe24c01f4f0 con 0x7fe26419c7f0 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.457+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe250038220 msgr2=0x7fe25003a6e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.457+0000 7fe269d7c700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe250038220 0x7fe25003a6e0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fe254006fd0 tx=0x7fe254006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:18.511 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.457+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 msgr2=0x7fe26419cc10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.457+0000 7fe269d7c700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 0x7fe26419cc10 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fe24c009cc0 tx=0x7fe24c003cb0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.458+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 shutdown_connections 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.458+0000 7fe269d7c700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe250038220 0x7fe25003a6e0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.458+0000 7fe269d7c700 1 --2- 192.168.123.105:0/1966810145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26419c7f0 0x7fe26419cc10 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.458+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 >> 192.168.123.105:0/1966810145 conn(0x7fe2641044d0 msgr2=0x7fe264106080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.458+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 shutdown_connections 
2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.458+0000 7fe269d7c700 1 -- 192.168.123.105:0/1966810145 wait complete. 2026-03-10T08:50:18.511 INFO:teuthology.orchestra.run.vm05.stdout:mgr is available 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [global] 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout fsid = 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [osd] 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 
osd_map_max_advance = 10 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.624+0000 7f05d106c700 1 Processor -- start 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.625+0000 7f05d106c700 1 -- start start 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.625+0000 7f05d106c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc1095b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.625+0000 7f05d106c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05cc074720 con 0x7f05cc1071c0 2026-03-10T08:50:18.789 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.625+0000 7f05cad9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc1095b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.626+0000 7f05cad9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc1095b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43242/0 (socket says 192.168.123.105:43242) 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.626+0000 7f05cad9d700 1 -- 192.168.123.105:0/3221371792 learned_addr learned my addr 
192.168.123.105:0/3221371792 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.626+0000 7f05cad9d700 1 -- 192.168.123.105:0/3221371792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05cc109af0 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.626+0000 7f05cad9d700 1 --2- 192.168.123.105:0/3221371792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc1095b0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f05b4009cf0 tx=0x7f05b400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=df51d0f7eb98c53f server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05c9d9b700 1 -- 192.168.123.105:0/3221371792 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05b4004030 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05c9d9b700 1 -- 192.168.123.105:0/3221371792 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f05b400b810 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05c9d9b700 1 -- 192.168.123.105:0/3221371792 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05b4003a90 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05d106c700 1 -- 192.168.123.105:0/3221371792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 msgr2=0x7f05cc1095b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05d106c700 1 --2- 192.168.123.105:0/3221371792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc1095b0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f05b4009cf0 tx=0x7f05b400b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05d106c700 1 -- 192.168.123.105:0/3221371792 shutdown_connections 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05d106c700 1 --2- 192.168.123.105:0/3221371792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc1095b0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.627+0000 7f05d106c700 1 -- 192.168.123.105:0/3221371792 >> 192.168.123.105:0/3221371792 conn(0x7f05cc100bd0 msgr2=0x7f05cc103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.628+0000 7f05d106c700 1 -- 192.168.123.105:0/3221371792 shutdown_connections 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.628+0000 7f05d106c700 1 -- 192.168.123.105:0/3221371792 wait complete. 
2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.628+0000 7f05d106c700 1 Processor -- start 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.628+0000 7f05d106c700 1 -- start start 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.628+0000 7f05d106c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc198250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.629+0000 7f05cad9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc198250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.629+0000 7f05cad9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc198250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43244/0 (socket says 192.168.123.105:43244) 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.629+0000 7f05cad9d700 1 -- 192.168.123.105:0/2632121436 learned_addr learned my addr 192.168.123.105:0/2632121436 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.629+0000 7f05d106c700 1 -- 192.168.123.105:0/2632121436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05cc198790 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.629+0000 7f05cad9d700 1 -- 192.168.123.105:0/2632121436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05b4009740 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.629+0000 7f05cad9d700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc198250 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f05b400be70 tx=0x7f05b400bf50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.630+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05b4004140 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.630+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f05b40042a0 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.630+0000 7f05d106c700 1 -- 192.168.123.105:0/2632121436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05cc198990 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.630+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f05b4004140 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.631+0000 7f05d106c700 1 
-- 192.168.123.105:0/2632121436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05cc198db0 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.631+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 3) v1 ==== 45161+0+0 (secure 0 0 0) 0x7f05b4004410 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.631+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f05cc04fa90 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.631+0000 7f05c3fff700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05b803c680 0x7f05b803eb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.631+0000 7f05ca59c700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05b803c680 0x7f05b803eb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.632+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f05b4018b80 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.635+0000 7f05ca59c700 1 --2- 192.168.123.105:0/2632121436 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05b803c680 0x7f05b803eb40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f05bc006fd0 tx=0x7f05bc006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.635+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f05b4011800 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.736+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f05cc0623c0 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.738+0000 7f05c3fff700 1 -- 192.168.123.105:0/2632121436 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7f05b4011800 con 0x7f05cc1071c0 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.741+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05b803c680 msgr2=0x7f05b803eb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.741+0000 7f05c1ffb700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05b803c680 0x7f05b803eb40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f05bc006fd0 tx=0x7f05bc006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:18.790 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.741+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 msgr2=0x7f05cc198250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.741+0000 7f05c1ffb700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc198250 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f05b400be70 tx=0x7f05b400bf50 comp rx=0 tx=0).stop 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.741+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 shutdown_connections 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.741+0000 7f05c1ffb700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05b803c680 0x7f05b803eb40 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.742+0000 7f05c1ffb700 1 --2- 192.168.123.105:0/2632121436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05cc1071c0 0x7f05cc198250 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.742+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 >> 192.168.123.105:0/2632121436 conn(0x7f05cc100bd0 msgr2=0x7f05cc1018b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.742+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 shutdown_connections 
2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.742+0000 7f05c1ffb700 1 -- 192.168.123.105:0/2632121436 wait complete. 2026-03-10T08:50:18.790 INFO:teuthology.orchestra.run.vm05.stdout:Enabling cephadm module... 2026-03-10T08:50:19.296 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.902+0000 7fc528c6f700 1 Processor -- start 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc528c6f700 1 -- start start 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc528c6f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc524079240 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc528c6f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc524079810 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc52259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc524079240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc52259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc524079240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43250/0 (socket says 192.168.123.105:43250) 2026-03-10T08:50:19.297 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc52259c700 1 -- 192.168.123.105:0/3033803066 learned_addr learned my addr 192.168.123.105:0/3033803066 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.903+0000 7fc52259c700 1 -- 192.168.123.105:0/3033803066 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc524079950 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc52259c700 1 --2- 192.168.123.105:0/3033803066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc524079240 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc518009a90 tx=0x7fc518009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=eed5318ea20005cf server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc52159a700 1 -- 192.168.123.105:0/3033803066 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc518004030 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc52159a700 1 -- 192.168.123.105:0/3033803066 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc51800b7e0 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc528c6f700 1 -- 192.168.123.105:0/3033803066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 msgr2=0x7fc524079240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 
7fc528c6f700 1 --2- 192.168.123.105:0/3033803066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc524079240 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc518009a90 tx=0x7fc518009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc528c6f700 1 -- 192.168.123.105:0/3033803066 shutdown_connections 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc528c6f700 1 --2- 192.168.123.105:0/3033803066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc524079240 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.904+0000 7fc528c6f700 1 -- 192.168.123.105:0/3033803066 >> 192.168.123.105:0/3033803066 conn(0x7fc524101ce0 msgr2=0x7fc524104140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc528c6f700 1 -- 192.168.123.105:0/3033803066 shutdown_connections 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc528c6f700 1 -- 192.168.123.105:0/3033803066 wait complete. 
2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc528c6f700 1 Processor -- start 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc528c6f700 1 -- start start 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc528c6f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc5241a0fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc528c6f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5241a1520 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc52259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc5241a0fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc52259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc5241a0fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43266/0 (socket says 192.168.123.105:43266) 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.905+0000 7fc52259c700 1 -- 192.168.123.105:0/998470969 learned_addr learned my addr 192.168.123.105:0/998470969 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:19.297 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.906+0000 7fc52259c700 1 -- 192.168.123.105:0/998470969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc518009740 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.906+0000 7fc52259c700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc5241a0fe0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc51800bd60 tx=0x7fc51800be40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.906+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc51801b650 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.906+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc51801bc50 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.906+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5241a1720 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.906+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5241a1b40 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.907+0000 7fc5137fe700 1 -- 
192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc518011c50 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.907+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 3) v1 ==== 45161+0+0 (secure 0 0 0) 0x7fc51802d430 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.908+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc504005320 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.909+0000 7fc5137fe700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc50c038250 0x7fc50c03a710 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.909+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc51804cca0 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.910+0000 7fc521d9b700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc50c038250 0x7fc50c03a710 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.911+0000 7fc521d9b700 1 --2- 192.168.123.105:0/998470969 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc50c038250 0x7fc50c03a710 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc514006fd0 tx=0x7fc514006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:18.911+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc51802d6e0 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.046+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7fc504005190 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.262+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 4) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fc51800fdb0 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.262+0000 7fc5137fe700 1 -- 192.168.123.105:0/998470969 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v4) v1 ==== 86+0+0 (secure 0 0 0) 0x7fc518043c80 con 0x7fc52407ade0 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc50c038250 msgr2=0x7fc50c03a710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:19.264+0000 7fc528c6f700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc50c038250 0x7fc50c03a710 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fc514006fd0 tx=0x7fc514006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 msgr2=0x7fc5241a0fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc5241a0fe0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc51800bd60 tx=0x7fc51800be40 comp rx=0 tx=0).stop 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 shutdown_connections 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc50c038250 0x7fc50c03a710 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 --2- 192.168.123.105:0/998470969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc52407ade0 0x7fc5241a0fe0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.264+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 >> 
192.168.123.105:0/998470969 conn(0x7fc524101ce0 msgr2=0x7fc5241029c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.265+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 shutdown_connections 2026-03-10T08:50:19.297 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.265+0000 7fc528c6f700 1 -- 192.168.123.105:0/998470969 wait complete. 2026-03-10T08:50:19.299 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:19 vm05 ceph-mon[49713]: mgrmap e3: vm05.rxwgjc(active, since 1.00806s) 2026-03-10T08:50:19.299 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:19 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1966810145' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T08:50:19.299 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:19 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2632121436' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-10T08:50:19.299 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:19 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/998470969' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-10T08:50:19.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 4, 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "active_name": "vm05.rxwgjc", 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.422+0000 7f35fc958700 1 Processor -- start 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.423+0000 7f35fc958700 1 -- start start 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.423+0000 7f35fc958700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 0x7f35f80725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.423+0000 7f35fc958700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35f8072bc0 con 0x7f35f80721d0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.423+0000 7f35f77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 0x7f35f80725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:19.623 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.423+0000 7f35f77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 0x7f35f80725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43292/0 (socket says 192.168.123.105:43292) 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.423+0000 7f35f77fe700 1 -- 192.168.123.105:0/3865795552 learned_addr learned my addr 192.168.123.105:0/3865795552 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.424+0000 7f35f77fe700 1 -- 192.168.123.105:0/3865795552 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35f810e1c0 con 0x7f35f80721d0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.424+0000 7f35f77fe700 1 --2- 192.168.123.105:0/3865795552 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 0x7f35f80725f0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f35e800d180 tx=0x7f35e800d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=220dfbdeffee91b1 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.424+0000 7f35f67fc700 1 -- 192.168.123.105:0/3865795552 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f35e8010070 con 0x7f35f80721d0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.424+0000 7f35f67fc700 1 -- 192.168.123.105:0/3865795552 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f35e8004030 con 0x7f35f80721d0 
2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 -- 192.168.123.105:0/3865795552 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 msgr2=0x7f35f80725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 --2- 192.168.123.105:0/3865795552 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 0x7f35f80725f0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f35e800d180 tx=0x7f35e800d490 comp rx=0 tx=0).stop 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 -- 192.168.123.105:0/3865795552 shutdown_connections 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 --2- 192.168.123.105:0/3865795552 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f80721d0 0x7f35f80725f0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 -- 192.168.123.105:0/3865795552 >> 192.168.123.105:0/3865795552 conn(0x7f35f806d320 msgr2=0x7f35f806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 -- 192.168.123.105:0/3865795552 shutdown_connections 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 -- 192.168.123.105:0/3865795552 wait complete. 
2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 Processor -- start 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.425+0000 7f35fc958700 1 -- start start 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.426+0000 7f35fc958700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 0x7f35f81a0d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.426+0000 7f35fc958700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35e8003bb0 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.426+0000 7f35f77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 0x7f35f81a0d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.426+0000 7f35f77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 0x7f35f81a0d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43308/0 (socket says 192.168.123.105:43308) 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.426+0000 7f35f77fe700 1 -- 192.168.123.105:0/1211704233 learned_addr learned my addr 192.168.123.105:0/1211704233 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:19.623 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.427+0000 7f35f77fe700 1 -- 192.168.123.105:0/1211704233 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35e80087c0 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.427+0000 7f35f77fe700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 0x7f35f81a0d40 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f35e8008c10 tx=0x7f35e8008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.428+0000 7f35f4ff9700 1 -- 192.168.123.105:0/1211704233 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f35e8010050 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.428+0000 7f35fc958700 1 -- 192.168.123.105:0/1211704233 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f35f81a1280 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.428+0000 7f35fc958700 1 -- 192.168.123.105:0/1211704233 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f35f81a1ef0 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.428+0000 7f35f4ff9700 1 -- 192.168.123.105:0/1211704233 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f35e8004620 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.428+0000 7f35f4ff9700 1 
-- 192.168.123.105:0/1211704233 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f35e8016440 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.429+0000 7f35fc958700 1 -- 192.168.123.105:0/1211704233 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f35e4005320 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.429+0000 7f35f4ff9700 1 -- 192.168.123.105:0/1211704233 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 4) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f35e8004180 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.429+0000 7f35f4ff9700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f35e0038390 0x7f35e003a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.430+0000 7f35f6ffd700 1 -- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f35e0038390 msgr2=0x7f35e003a850 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.430+0000 7f35f6ffd700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f35e0038390 0x7f35e003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.430+0000 7f35f4ff9700 1 -- 192.168.123.105:0/1211704233 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f35e80165a0 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.433+0000 7f35f4ff9700 1 -- 192.168.123.105:0/1211704233 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f35e800b2c0 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.563+0000 7f35fc958700 1 -- 192.168.123.105:0/1211704233 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f35e4005cc0 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.564+0000 7f35f4ff9700 1 -- 192.168.123.105:0/1211704233 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v4) v1 ==== 56+0+98 (secure 0 0 0) 0x7f35e801b070 con 0x7f35f81a0920 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 -- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f35e0038390 msgr2=0x7f35e003a850 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f35e0038390 0x7f35e003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 -- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 
msgr2=0x7f35f81a0d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 0x7f35f81a0d40 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f35e8008c10 tx=0x7f35e8008cf0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 -- 192.168.123.105:0/1211704233 shutdown_connections 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f35e0038390 0x7f35e003a850 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 --2- 192.168.123.105:0/1211704233 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f35f81a0920 0x7f35f81a0d40 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.570+0000 7f35de7fc700 1 -- 192.168.123.105:0/1211704233 >> 192.168.123.105:0/1211704233 conn(0x7f35f806d320 msgr2=0x7f35f806dda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.571+0000 7f35de7fc700 1 -- 192.168.123.105:0/1211704233 shutdown_connections 2026-03-10T08:50:19.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.571+0000 7f35de7fc700 1 -- 192.168.123.105:0/1211704233 wait complete. 
2026-03-10T08:50:19.624 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for the mgr to restart... 2026-03-10T08:50:19.624 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr epoch 4... 2026-03-10T08:50:20.669 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:20 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/998470969' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-10T08:50:20.669 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:20 vm05 ceph-mon[49713]: mgrmap e4: vm05.rxwgjc(active, since 2s) 2026-03-10T08:50:20.669 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:20 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1211704233' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T08:50:24.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: Active manager daemon vm05.rxwgjc restarted 2026-03-10T08:50:24.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: Activating manager daemon vm05.rxwgjc 2026-03-10T08:50:24.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: osdmap e2: 0 total, 0 up, 0 in 2026-03-10T08:50:24.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: mgrmap e5: vm05.rxwgjc(active, starting, since 0.00525899s) 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: Manager daemon vm05.rxwgjc is now available 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:24.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:23 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 6, 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.769+0000 7efccf700700 1 Processor -- start 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.769+0000 7efccf700700 1 -- start start 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.770+0000 7efccf700700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 0x7efcc8072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:24.884 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.770+0000 7efccf700700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efcc80727f0 con 0x7efcc8071e00 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.770+0000 7efcce6fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 0x7efcc8072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.770+0000 7efcce6fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 0x7efcc8072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55784/0 (socket says 192.168.123.105:55784) 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.770+0000 7efcce6fe700 1 -- 192.168.123.105:0/2131225494 learned_addr learned my addr 192.168.123.105:0/2131225494 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.771+0000 7efcce6fe700 1 -- 192.168.123.105:0/2131225494 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efcc810ddb0 con 0x7efcc8071e00 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.771+0000 7efcce6fe700 1 --2- 192.168.123.105:0/2131225494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 0x7efcc8072220 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efcbc009a90 tx=0x7efcbc009da0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=105375b56d5c395d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccd6fc700 1 -- 192.168.123.105:0/2131225494 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efcbc004030 con 0x7efcc8071e00 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccd6fc700 1 -- 192.168.123.105:0/2131225494 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7efcbc00b7e0 con 0x7efcc8071e00 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccf700700 1 -- 192.168.123.105:0/2131225494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 msgr2=0x7efcc8072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:24.884 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccf700700 1 --2- 192.168.123.105:0/2131225494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 0x7efcc8072220 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efcbc009a90 tx=0x7efcbc009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccf700700 1 -- 192.168.123.105:0/2131225494 shutdown_connections 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccf700700 1 --2- 192.168.123.105:0/2131225494 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc8071e00 0x7efcc8072220 secure :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efcbc009a90 tx=0x7efcbc009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 
7efccf700700 1 -- 192.168.123.105:0/2131225494 >> 192.168.123.105:0/2131225494 conn(0x7efcc806d320 msgr2=0x7efcc806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccf700700 1 -- 192.168.123.105:0/2131225494 shutdown_connections 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.772+0000 7efccf700700 1 -- 192.168.123.105:0/2131225494 wait complete. 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efccf700700 1 Processor -- start 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efccf700700 1 -- start start 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efccf700700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 0x7efcc811bc60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efccf700700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efcc80727f0 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efcce6fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 0x7efcc811bc60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efcce6fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 0x7efcc811bc60 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55796/0 (socket says 192.168.123.105:55796) 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efcce6fe700 1 -- 192.168.123.105:0/3334887938 learned_addr learned my addr 192.168.123.105:0/3334887938 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efcce6fe700 1 -- 192.168.123.105:0/3334887938 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efcbc009740 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.773+0000 7efcce6fe700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 0x7efcc811bc60 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7efcc8072aa0 tx=0x7efcbc003f40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.775+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efcbc00bed0 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.775+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efcc811d9f0 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.775+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7efcc811c370 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.775+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7efcbc024460 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.775+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efcbc01b440 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.776+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 4) v1 ==== 45278+0+0 (secure 0 0 0) 0x7efcbc0245d0 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.776+0000 7efcbb7fe700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.776+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7efcbc00bed0 con 0x7efcb40383e0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.776+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7efcbc04c130 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:19.776+0000 7efccdefd700 1 -- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.776+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.977+0000 7efccdefd700 1 -- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:19.977+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:20.377+0000 7efccdefd700 1 -- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:20.377+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=START_CONNECT pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:21.178+0000 7efccdefd700 1 -- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:21.178+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:22.780+0000 7efccdefd700 1 -- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:22.780+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:23.847+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mgrmap(e 5) v1 ==== 45045+0+0 (secure 0 0 0) 0x7efcbc02e430 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:23.847+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:23.847+0000 7efcbb7fe700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.849+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 6) v1 ==== 45172+0+0 (secure 0 0 0) 0x7efcbc00ecf0 con 0x7efcc811d5d0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.849+0000 7efcbb7fe700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.849+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7efcbc00bed0 con 0x7efcb40383e0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.852+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.852+0000 7efccdefd700 1 --2- 192.168.123.105:0/3334887938 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7efcc0003a10 tx=0x7efcc00092b0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.853+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7efcbc00bed0 con 0x7efcb40383e0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.856+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7efcc811db80 con 0x7efcb40383e0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.856+0000 7efcbb7fe700 1 -- 192.168.123.105:0/3334887938 <== mgr.14118 v2:192.168.123.105:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7efcc811db80 con 0x7efcb40383e0 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.856+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 msgr2=0x7efcb403a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:24.885 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.856+0000 7efccf700700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7efcc0003a10 tx=0x7efcc00092b0 comp rx=0 tx=0).stop 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.856+0000 7efccf700700 1 -- 
192.168.123.105:0/3334887938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 msgr2=0x7efcc811bc60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.856+0000 7efccf700700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 0x7efcc811bc60 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7efcc8072aa0 tx=0x7efcbc003f40 comp rx=0 tx=0).stop 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.857+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 shutdown_connections 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.857+0000 7efccf700700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efcb40383e0 0x7efcb403a8a0 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.857+0000 7efccf700700 1 --2- 192.168.123.105:0/3334887938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efcc811d5d0 0x7efcc811bc60 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.857+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 >> 192.168.123.105:0/3334887938 conn(0x7efcc806d320 msgr2=0x7efcc806dfe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.857+0000 7efccf700700 1 -- 192.168.123.105:0/3334887938 shutdown_connections 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:24.857+0000 
7efccf700700 1 -- 192.168.123.105:0/3334887938 wait complete. 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:mgr epoch 4 is available 2026-03-10T08:50:24.886 INFO:teuthology.orchestra.run.vm05.stdout:Setting orchestrator backend to cephadm... 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: Found migration_current of "None". Setting to last migration. 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:25.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:24 vm05 ceph-mon[49713]: mgrmap e6: vm05.rxwgjc(active, since 1.00742s) 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.010+0000 7f257f3f3700 1 Processor -- start 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257f3f3700 1 -- start start 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257f3f3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f25781053d0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257f3f3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2578074720 con 0x7f2578104fb0 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257d18f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f25781053d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257d18f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f25781053d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:55876/0 (socket says 192.168.123.105:55876) 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257d18f700 1 -- 192.168.123.105:0/3667181331 learned_addr learned my addr 192.168.123.105:0/3667181331 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257d18f700 1 -- 192.168.123.105:0/3667181331 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2578105910 con 0x7f2578104fb0 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.011+0000 7f257d18f700 1 --2- 192.168.123.105:0/3667181331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f25781053d0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f256c009a90 tx=0x7f256c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6c849b5439d299d6 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.012+0000 7f256bfff700 1 -- 192.168.123.105:0/3667181331 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f256c004030 con 0x7f2578104fb0 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.012+0000 7f256bfff700 1 -- 192.168.123.105:0/3667181331 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f256c00b7e0 con 0x7f2578104fb0 2026-03-10T08:50:25.203 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.012+0000 7f256bfff700 1 -- 192.168.123.105:0/3667181331 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f256c0039f0 con 0x7f2578104fb0 2026-03-10T08:50:25.204 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.013+0000 7f257f3f3700 1 -- 192.168.123.105:0/3667181331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 msgr2=0x7f25781053d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.013+0000 7f257f3f3700 1 --2- 192.168.123.105:0/3667181331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f25781053d0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f256c009a90 tx=0x7f256c009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.013+0000 7f257f3f3700 1 -- 192.168.123.105:0/3667181331 shutdown_connections 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.013+0000 7f257f3f3700 1 --2- 192.168.123.105:0/3667181331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f25781053d0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.013+0000 7f257f3f3700 1 -- 192.168.123.105:0/3667181331 >> 192.168.123.105:0/3667181331 conn(0x7f2578100bd0 msgr2=0x7f2578103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.014+0000 7f257f3f3700 1 -- 192.168.123.105:0/3667181331 shutdown_connections 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.014+0000 7f257f3f3700 1 -- 192.168.123.105:0/3667181331 wait complete. 
2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.014+0000 7f257f3f3700 1 Processor -- start 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.014+0000 7f257f3f3700 1 -- start start 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.014+0000 7f257f3f3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f2578198180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.014+0000 7f257f3f3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25781986c0 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257d18f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f2578198180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257d18f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f2578198180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55884/0 (socket says 192.168.123.105:55884) 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257d18f700 1 -- 192.168.123.105:0/3964699889 learned_addr learned my addr 192.168.123.105:0/3964699889 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:25.204 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257d18f700 1 -- 192.168.123.105:0/3964699889 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f256c009740 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257d18f700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f2578198180 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f256c00bef0 tx=0x7f256c003c60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f256c004080 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f256c01a430 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f256c011590 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25781988c0 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.015+0000 7f257f3f3700 1 
-- 192.168.123.105:0/3964699889 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2578198ce0 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.017+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 6) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f256c0116f0 con 0x7f2578104fb0 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.017+0000 7f256a7fc700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2564040ae0 0x7f2564042fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.204 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.017+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f256c04cf30 con 0x7f2578104fb0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.017+0000 7f257c98e700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2564040ae0 0x7f2564042fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.017+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2578191a30 con 0x7f2578104fb0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.020+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f256c0119a0 con 0x7f2578104fb0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.021+0000 7f257c98e700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2564040ae0 0x7f2564042fa0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f2574006fd0 tx=0x7f2574006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.138+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7f25780611d0 con 0x7f2564040ae0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.144+0000 7f256a7fc700 1 -- 192.168.123.105:0/3964699889 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f25780611d0 con 0x7f2564040ae0 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.149+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2564040ae0 msgr2=0x7f2564042fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2564040ae0 0x7f2564042fa0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f2574006fd0 tx=0x7f2574006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:25.205 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 msgr2=0x7f2578198180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f2578198180 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f256c00bef0 tx=0x7f256c003c60 comp rx=0 tx=0).stop 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 shutdown_connections 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2564040ae0 0x7f2564042fa0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 --2- 192.168.123.105:0/3964699889 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2578104fb0 0x7f2578198180 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 >> 192.168.123.105:0/3964699889 conn(0x7f2578100bd0 msgr2=0x7f257818c650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.150+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 shutdown_connections 
2026-03-10T08:50:25.205 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.151+0000 7f257f3f3700 1 -- 192.168.123.105:0/3964699889 wait complete. 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.325+0000 7f80c8e1a700 1 Processor -- start 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.325+0000 7f80c8e1a700 1 -- start start 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.325+0000 7f80c8e1a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c4108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.325+0000 7f80c8e1a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80c4109370 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c4108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c4108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55898/0 (socket says 192.168.123.105:55898) 2026-03-10T08:50:25.491 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c259c700 1 -- 192.168.123.105:0/568785901 learned_addr learned my addr 192.168.123.105:0/568785901 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c259c700 1 -- 192.168.123.105:0/568785901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80c4109b80 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c259c700 1 --2- 192.168.123.105:0/568785901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c4108da0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f80ac009a90 tx=0x7f80ac009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b9affa4eb9f7937f server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c159a700 1 -- 192.168.123.105:0/568785901 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f80ac004030 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.326+0000 7f80c159a700 1 -- 192.168.123.105:0/568785901 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f80ac00b7e0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 -- 192.168.123.105:0/568785901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 msgr2=0x7f80c4108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 
7f80c8e1a700 1 --2- 192.168.123.105:0/568785901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c4108da0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f80ac009a90 tx=0x7f80ac009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 -- 192.168.123.105:0/568785901 shutdown_connections 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 --2- 192.168.123.105:0/568785901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c4108da0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 -- 192.168.123.105:0/568785901 >> 192.168.123.105:0/568785901 conn(0x7f80c41044d0 msgr2=0x7f80c41068c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 -- 192.168.123.105:0/568785901 shutdown_connections 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 -- 192.168.123.105:0/568785901 wait complete. 
2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.327+0000 7f80c8e1a700 1 Processor -- start 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c8e1a700 1 -- start start 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c8e1a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c419c790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c8e1a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80c419ccd0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c419c790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c419c790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55900/0 (socket says 192.168.123.105:55900) 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c259c700 1 -- 192.168.123.105:0/1017663792 learned_addr learned my addr 192.168.123.105:0/1017663792 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:25.491 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c259c700 1 -- 192.168.123.105:0/1017663792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80ac009740 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.328+0000 7f80c259c700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c419c790 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f80ac003710 tx=0x7f80ac00beb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.329+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f80ac020c20 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.329+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f80ac024070 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.329+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f80ac0287c0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.329+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80c419ced0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.329+0000 7f80c8e1a700 1 
-- 192.168.123.105:0/1017663792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80c419d2f0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.330+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 6) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f80ac026030 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.330+0000 7f80bb7fe700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f80b0038270 0x7f80b003a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.330+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f80ac0549d0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.331+0000 7f80c1d9b700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f80b0038270 0x7f80b003a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.331+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80c40623c0 con 0x7f80c4108980 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.334+0000 7f80c1d9b700 1 --2- 192.168.123.105:0/1017663792 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f80b0038270 0x7f80b003a730 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f80b4006fd0 tx=0x7f80b4006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.491 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.334+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f80ac028920 con 0x7f80c4108980 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.437+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7f80c41069b0 con 0x7f80b0038270 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.438+0000 7f80bb7fe700 1 -- 192.168.123.105:0/1017663792 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7f80c41069b0 con 0x7f80b0038270 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.442+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f80b0038270 msgr2=0x7f80b003a730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.442+0000 7f80c8e1a700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f80b0038270 0x7f80b003a730 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f80b4006fd0 tx=0x7f80b4006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:25.492 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.442+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 msgr2=0x7f80c419c790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.442+0000 7f80c8e1a700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c419c790 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f80ac003710 tx=0x7f80ac00beb0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.442+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 shutdown_connections 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.442+0000 7f80c8e1a700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f80b0038270 0x7f80b003a730 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.443+0000 7f80c8e1a700 1 --2- 192.168.123.105:0/1017663792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80c4108980 0x7f80c419c790 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.443+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 >> 192.168.123.105:0/1017663792 conn(0x7f80c41044d0 msgr2=0x7f80c41062a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.443+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 shutdown_connections 
2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.443+0000 7f80c8e1a700 1 -- 192.168.123.105:0/1017663792 wait complete. 2026-03-10T08:50:25.492 INFO:teuthology.orchestra.run.vm05.stdout:Generating ssh key... 2026-03-10T08:50:25.834 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.604+0000 7f33e2037700 1 Processor -- start 2026-03-10T08:50:25.834 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.605+0000 7f33e2037700 1 -- start start 2026-03-10T08:50:25.834 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.605+0000 7f33e2037700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 0x7f33dc108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.605+0000 7f33e2037700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33dc109380 con 0x7f33dc108990 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.605+0000 7f33db7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 0x7f33dc108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.605+0000 7f33db7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 0x7f33dc108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55910/0 (socket says 192.168.123.105:55910) 2026-03-10T08:50:25.835 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.605+0000 7f33db7fe700 1 -- 192.168.123.105:0/213647072 learned_addr learned my addr 192.168.123.105:0/213647072 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.606+0000 7f33db7fe700 1 -- 192.168.123.105:0/213647072 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33dc109b90 con 0x7f33dc108990 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.606+0000 7f33db7fe700 1 --2- 192.168.123.105:0/213647072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 0x7f33dc108db0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f33c4009cf0 tx=0x7f33c400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e230df67e4764491 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.606+0000 7f33da7fc700 1 -- 192.168.123.105:0/213647072 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f33c4004030 con 0x7f33dc108990 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.606+0000 7f33da7fc700 1 -- 192.168.123.105:0/213647072 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f33c400b810 con 0x7f33dc108990 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.606+0000 7f33e2037700 1 -- 192.168.123.105:0/213647072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 msgr2=0x7f33dc108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.606+0000 
7f33e2037700 1 --2- 192.168.123.105:0/213647072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 0x7f33dc108db0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f33c4009cf0 tx=0x7f33c400b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 -- 192.168.123.105:0/213647072 shutdown_connections 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 --2- 192.168.123.105:0/213647072 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc108990 0x7f33dc108db0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 -- 192.168.123.105:0/213647072 >> 192.168.123.105:0/213647072 conn(0x7f33dc103f50 msgr2=0x7f33dc106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 -- 192.168.123.105:0/213647072 shutdown_connections 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 -- 192.168.123.105:0/213647072 wait complete. 
2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 Processor -- start 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.607+0000 7f33e2037700 1 -- start start 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33e2037700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 0x7f33dc19cbf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33e2037700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33dc109380 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33db7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 0x7f33dc19cbf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33db7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 0x7f33dc19cbf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55922/0 (socket says 192.168.123.105:55922) 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33db7fe700 1 -- 192.168.123.105:0/1925345481 learned_addr learned my addr 192.168.123.105:0/1925345481 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:25.835 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33db7fe700 1 -- 192.168.123.105:0/1925345481 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33c4009740 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33db7fe700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 0x7f33dc19cbf0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f33c4009cc0 tx=0x7f33c4003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.608+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f33c4003ed0 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.609+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f33c40044d0 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.609+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f33c401ac60 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.609+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33dc19d130 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.609+0000 7f33e2037700 1 
-- 192.168.123.105:0/1925345481 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33dc19fdb0 con 0x7f33dc19c7d0 2026-03-10T08:50:25.835 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.609+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 6) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f33c4011420 con 0x7f33dc19c7d0 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.610+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f33dc04fa90 con 0x7f33dc19c7d0 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.610+0000 7f33d8ff9700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f33c8038270 0x7f33c803a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.610+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f33c404b4a0 con 0x7f33dc19c7d0 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.612+0000 7f33daffd700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f33c8038270 0x7f33c803a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.613+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f33c402cda0 con 0x7f33dc19c7d0 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.613+0000 7f33daffd700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f33c8038270 0x7f33c803a730 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f33cc006fd0 tx=0x7f33cc006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.714+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f33dc1051f0 con 0x7f33c8038270 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.798+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1925345481 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f33dc1051f0 con 0x7f33c8038270 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f33c8038270 msgr2=0x7f33c803a730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f33c8038270 0x7f33c803a730 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f33cc006fd0 tx=0x7f33cc006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:25.836 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 msgr2=0x7f33dc19cbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 0x7f33dc19cbf0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f33c4009cc0 tx=0x7f33c4003cb0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 shutdown_connections 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f33c8038270 0x7f33c803a730 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 --2- 192.168.123.105:0/1925345481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc19c7d0 0x7f33dc19cbf0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 >> 192.168.123.105:0/1925345481 conn(0x7f33dc103f50 msgr2=0x7f33dc104ae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 shutdown_connections 
2026-03-10T08:50:25.836 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.801+0000 7f33e2037700 1 -- 192.168.123.105:0/1925345481 wait complete. 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDwebO0BEKOMDaI+d0wXskLAIqRq/l/8cSZ+vgs/FMPpC6V4pxW3M0JZm9XVo5UNZfcXqRceQofO4PmONsCzDgedN6ePamdwiR2s0k77ul/2J4TZgYsiHrELZymaDhh34J5kCphoFd9HA/NpDaDizb3BftH2VBM3x9PqpHZ5Lwy6rjFsWKZd8OinvSl76WVV7kLEOipKr8p0WRY+Iw7YpP/yaRdviUQvdui5xvw9a1RLjA+z/q4nDWif9BMFPn1DqeGFqAdFCGRU0qf77Ode08FP/MrLHPCpjyegkWd+tVtc9lh8Mjfa7xYbE32yrkOiG2Gi33Btq/eQWAgVAafm1dgQn0z1NJSD5lc+1vy+YgX65Lftw66S/CH4zAXz69FsEd7P53ic3P+RxWssK1kP9jLuGNMPsJ3KHIRrJu2mxVMk0/LfFqB7pUf0ngivfp4iGK/XBhe8iQCpg+f5VrxNRb7FrRQ67gMB38ssyzXF7innINqBJFXfu6q95NwlMpR19k= ceph-16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.965+0000 7f388f9e6700 1 Processor -- start 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.965+0000 7f388f9e6700 1 -- start start 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.966+0000 7f388f9e6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.966+0000 7f388f9e6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3888074720 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.967+0000 7f388d782700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 
l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.967+0000 7f388d782700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55928/0 (socket says 192.168.123.105:55928) 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.967+0000 7f388d782700 1 -- 192.168.123.105:0/928112780 learned_addr learned my addr 192.168.123.105:0/928112780 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.967+0000 7f388d782700 1 -- 192.168.123.105:0/928112780 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3888107920 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.968+0000 7f388d782700 1 --2- 192.168.123.105:0/928112780 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881073e0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f387c009a90 tx=0x7f387c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=63a105173227ec16 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.968+0000 7f387bfff700 1 -- 192.168.123.105:0/928112780 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f387c004030 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.968+0000 7f387bfff700 1 -- 192.168.123.105:0/928112780 <== mon.0 
v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f387c00b7e0 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.969+0000 7f388f9e6700 1 -- 192.168.123.105:0/928112780 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 msgr2=0x7f38881073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.969+0000 7f388f9e6700 1 --2- 192.168.123.105:0/928112780 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881073e0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f387c009a90 tx=0x7f387c009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.969+0000 7f388f9e6700 1 -- 192.168.123.105:0/928112780 shutdown_connections 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.969+0000 7f388f9e6700 1 --2- 192.168.123.105:0/928112780 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881073e0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.969+0000 7f388f9e6700 1 -- 192.168.123.105:0/928112780 >> 192.168.123.105:0/928112780 conn(0x7f3888100bd0 msgr2=0x7f3888103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.970+0000 7f388f9e6700 1 -- 192.168.123.105:0/928112780 shutdown_connections 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.970+0000 7f388f9e6700 1 -- 192.168.123.105:0/928112780 wait complete. 
2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.970+0000 7f388f9e6700 1 Processor -- start 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.970+0000 7f388f9e6700 1 -- start start 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.970+0000 7f388f9e6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881abb10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.971+0000 7f388f9e6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3888074720 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.971+0000 7f388d782700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881abb10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.971+0000 7f388d782700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881abb10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55938/0 (socket says 192.168.123.105:55938) 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.971+0000 7f388d782700 1 -- 192.168.123.105:0/1890290643 learned_addr learned my addr 192.168.123.105:0/1890290643 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:26.133 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.972+0000 7f388d782700 1 -- 192.168.123.105:0/1890290643 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f387c009740 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.972+0000 7f388d782700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881abb10 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f387c004000 tx=0x7f387c0040e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.972+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f387c004330 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.972+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f387c004490 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.972+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f387c011660 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.973+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38881ac050 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.973+0000 7f388f9e6700 1 
-- 192.168.123.105:0/1890290643 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38881ac3b0 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.974+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 6) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f387c028020 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.974+0000 7f387a7fc700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38740382c0 0x7f387403a780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.974+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f387c04bba0 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.974+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38880623c0 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.974+0000 7f388cf81700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38740382c0 0x7f387403a780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.977+0000 7f388cf81700 1 --2- 192.168.123.105:0/1890290643 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38740382c0 0x7f387403a780 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f3884006fd0 tx=0x7f3884006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:25.977+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f387c0117c0 con 0x7f3888104fb0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.080+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f38881ac660 con 0x7f38740382c0 2026-03-10T08:50:26.133 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.080+0000 7f387a7fc700 1 -- 192.168.123.105:0/1890290643 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7f38881ac660 con 0x7f38740382c0 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38740382c0 msgr2=0x7f387403a780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38740382c0 0x7f387403a780 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f3884006fd0 tx=0x7f3884006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:26.134 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 msgr2=0x7f38881abb10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881abb10 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f387c004000 tx=0x7f387c0040e0 comp rx=0 tx=0).stop 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 shutdown_connections 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38740382c0 0x7f387403a780 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 --2- 192.168.123.105:0/1890290643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3888104fb0 0x7f38881abb10 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 >> 192.168.123.105:0/1890290643 conn(0x7f3888100bd0 msgr2=0x7f3888103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 shutdown_connections 
2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.083+0000 7f388f9e6700 1 -- 192.168.123.105:0/1890290643 wait complete. 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:Adding key to root@localhost authorized_keys... 2026-03-10T08:50:26.134 INFO:teuthology.orchestra.run.vm05.stdout:Adding host vm05... 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:24] ENGINE Bus STARTING 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:24] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:24] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:24] ENGINE Bus STARTED 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='client.14122 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='client.14122 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='client.14130 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:26.253 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:26.253 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:26 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:27.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:27 vm05 ceph-mon[49713]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:27.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:27 vm05 ceph-mon[49713]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:27.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:27 vm05 ceph-mon[49713]: Generating ssh key... 
2026-03-10T08:50:27.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:27 vm05 ceph-mon[49713]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:27.276 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:27 vm05 ceph-mon[49713]: mgrmap e7: vm05.rxwgjc(active, since 2s) 2026-03-10T08:50:28.208 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Added host 'vm05' with addr '192.168.123.105' 2026-03-10T08:50:28.208 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.260+0000 7f3a18b9b700 1 Processor -- start 2026-03-10T08:50:28.208 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.261+0000 7f3a18b9b700 1 -- start start 2026-03-10T08:50:28.208 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.261+0000 7f3a18b9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 0x7f3a14105410 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.262+0000 7f3a18b9b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a140746e0 con 0x7f3a14104ff0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.262+0000 7f3a1259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 0x7f3a14105410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.262+0000 7f3a1259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 0x7f3a14105410 unknown :-1 s=HELLO_CONNECTING 
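The messenger traffic and mon/mgr dispatches above trace the cephadm SSH bootstrap: the client asks the active mgr to generate the cluster SSH key (`cephadm generate-key`), fetches the public half (`cephadm get-pub-key`), writes it to `/home/ubuntu/cephtest/ceph.pub`, authorizes it for root, and registers host `vm05`. A sketch of the equivalent manual CLI sequence, run against a live cluster (host name and address taken from this run):

```shell
# Ask the active mgr to generate the cluster-wide SSH key pair
ceph cephadm generate-key

# Fetch the public key and install it for root on the target host
ceph cephadm get-pub-key > ceph.pub
ssh-copy-id -f -i ceph.pub root@vm05

# Register the host with the orchestrator (addr as seen in this log)
ceph orch host add vm05 192.168.123.105
```

Each `ceph` invocation spawns a fresh client, which is why the log shows a repeated connect/subscribe/mark_down/`wait complete` cycle per command.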
pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55950/0 (socket says 192.168.123.105:55950) 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.262+0000 7f3a1259c700 1 -- 192.168.123.105:0/948895717 learned_addr learned my addr 192.168.123.105:0/948895717 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.262+0000 7f3a1259c700 1 -- 192.168.123.105:0/948895717 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a14105950 con 0x7f3a14104ff0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a1259c700 1 --2- 192.168.123.105:0/948895717 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 0x7f3a14105410 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f39fc009a90 tx=0x7f39fc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7eaa2cd33222f9cf server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a1159a700 1 -- 192.168.123.105:0/948895717 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f39fc004030 con 0x7f3a14104ff0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a1159a700 1 -- 192.168.123.105:0/948895717 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f39fc00b7e0 con 0x7f3a14104ff0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a1159a700 1 -- 192.168.123.105:0/948895717 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 
(secure 0 0 0) 0x7f39fc003a40 con 0x7f3a14104ff0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a18b9b700 1 -- 192.168.123.105:0/948895717 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 msgr2=0x7f3a14105410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a18b9b700 1 --2- 192.168.123.105:0/948895717 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 0x7f3a14105410 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f39fc009a90 tx=0x7f39fc009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.263+0000 7f3a18b9b700 1 -- 192.168.123.105:0/948895717 shutdown_connections 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.264+0000 7f3a18b9b700 1 --2- 192.168.123.105:0/948895717 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a14104ff0 0x7f3a14105410 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.264+0000 7f3a18b9b700 1 -- 192.168.123.105:0/948895717 >> 192.168.123.105:0/948895717 conn(0x7f3a14100bd0 msgr2=0x7f3a14103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.264+0000 7f3a18b9b700 1 -- 192.168.123.105:0/948895717 shutdown_connections 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.264+0000 7f3a18b9b700 1 -- 192.168.123.105:0/948895717 wait complete. 
2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.264+0000 7f3a18b9b700 1 Processor -- start 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.264+0000 7f3a18b9b700 1 -- start start 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a18b9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 0x7f3a1419cbf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a18b9b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a1419d130 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a1259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 0x7f3a1419cbf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a1259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 0x7f3a1419cbf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55966/0 (socket says 192.168.123.105:55966) 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a1259c700 1 -- 192.168.123.105:0/2566770083 learned_addr learned my addr 192.168.123.105:0/2566770083 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:28.209 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a1259c700 1 -- 192.168.123.105:0/2566770083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f39fc009740 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.265+0000 7f3a1259c700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 0x7f3a1419cbf0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f39fc000c00 tx=0x7f39fc003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.266+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f39fc003fc0 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.266+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f39fc01b440 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.266+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f39fc011420 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.266+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a1419d330 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.266+0000 7f3a18b9b700 1 
-- 192.168.123.105:0/2566770083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a1419ff80 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.267+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 6) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f39fc004120 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.267+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a14195f70 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.267+0000 7f3a0b7fe700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a00038220 0x7f3a0003a6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.267+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f39fc04bc50 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.269+0000 7f3a11d9b700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a00038220 0x7f3a0003a6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.269+0000 7f3a11d9b700 1 --2- 192.168.123.105:0/2566770083 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a00038220 0x7f3a0003a6e0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3a04006fd0 tx=0x7f3a04006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.270+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f39fc02ce20 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.371+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm05", "addr": "192.168.123.105", "target": ["mon-mgr", ""]}) v1 -- 0x7f3a140611d0 con 0x7f3a00038220 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:26.799+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f39fc011640 con 0x7f3a1419c7d0 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.161+0000 7f3a0b7fe700 1 -- 192.168.123.105:0/2566770083 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f3a140611d0 con 0x7f3a00038220 2026-03-10T08:50:28.209 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.163+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a00038220 msgr2=0x7f3a0003a6e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:28.163+0000 7f3a18b9b700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a00038220 0x7f3a0003a6e0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3a04006fd0 tx=0x7f3a04006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.163+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 msgr2=0x7f3a1419cbf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.163+0000 7f3a18b9b700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 0x7f3a1419cbf0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f39fc000c00 tx=0x7f39fc003b40 comp rx=0 tx=0).stop 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.164+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 shutdown_connections 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.164+0000 7f3a18b9b700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a00038220 0x7f3a0003a6e0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.164+0000 7f3a18b9b700 1 --2- 192.168.123.105:0/2566770083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a1419c7d0 0x7f3a1419cbf0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.164+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 >> 
192.168.123.105:0/2566770083 conn(0x7f3a14100bd0 msgr2=0x7f3a14190bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.164+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 shutdown_connections 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.164+0000 7f3a18b9b700 1 -- 192.168.123.105:0/2566770083 wait complete. 2026-03-10T08:50:28.210 INFO:teuthology.orchestra.run.vm05.stdout:Deploying mon service with default placement... 2026-03-10T08:50:28.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:28 vm05 ceph-mon[49713]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm05", "addr": "192.168.123.105", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:28.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:28 vm05 ceph-mon[49713]: Deploying cephadm binary to vm05 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled mon update... 
2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.358+0000 7f8a8ae4b700 1 Processor -- start 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.358+0000 7f8a8ae4b700 1 -- start start 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.358+0000 7f8a8ae4b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 0x7f8a84072100 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.358+0000 7f8a8ae4b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a840726d0 con 0x7f8a84071ce0 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a89e49700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 0x7f8a84072100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a89e49700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 0x7f8a84072100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55968/0 (socket says 192.168.123.105:55968) 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a89e49700 1 -- 192.168.123.105:0/143111173 learned_addr learned my addr 192.168.123.105:0/143111173 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:28.524 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a89e49700 1 -- 192.168.123.105:0/143111173 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a84072810 con 0x7f8a84071ce0 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a89e49700 1 --2- 192.168.123.105:0/143111173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 0x7f8a84072100 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f8a8000ac30 tx=0x7f8a80010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f2eeb12eddae0fe9 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a88e47700 1 -- 192.168.123.105:0/143111173 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a80010d40 con 0x7f8a84071ce0 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a88e47700 1 -- 192.168.123.105:0/143111173 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f8a800044c0 con 0x7f8a84071ce0 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/143111173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 msgr2=0x7f8a84072100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.524 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.359+0000 7f8a8ae4b700 1 --2- 192.168.123.105:0/143111173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 0x7f8a84072100 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f8a8000ac30 tx=0x7f8a80010730 comp rx=0 tx=0).stop 
2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.360+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/143111173 shutdown_connections 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.360+0000 7f8a8ae4b700 1 --2- 192.168.123.105:0/143111173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84071ce0 0x7f8a84072100 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/143111173 >> 192.168.123.105:0/143111173 conn(0x7f8a8406d320 msgr2=0x7f8a8406f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/143111173 shutdown_connections 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/143111173 wait complete. 
2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 Processor -- start 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 -- start start 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 0x7f8a840871f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a8ae4b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a8001a420 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a89e49700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 0x7f8a840871f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a89e49700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 0x7f8a840871f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55984/0 (socket says 192.168.123.105:55984) 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.361+0000 7f8a89e49700 1 -- 192.168.123.105:0/967961525 learned_addr learned my addr 192.168.123.105:0/967961525 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:28.525 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.362+0000 7f8a89e49700 1 -- 192.168.123.105:0/967961525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a8000a8e0 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.362+0000 7f8a89e49700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 0x7f8a840871f0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8a80004580 tx=0x7f8a800046f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.362+0000 7f8a7affd700 1 -- 192.168.123.105:0/967961525 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a80003720 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.362+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/967961525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a84087730 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.362+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/967961525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a840883a0 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.363+0000 7f8a7affd700 1 -- 192.168.123.105:0/967961525 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f8a8001a7a0 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.363+0000 7f8a7affd700 1 -- 
192.168.123.105:0/967961525 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a80022800 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.363+0000 7f8a7affd700 1 -- 192.168.123.105:0/967961525 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f8a80018070 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.364+0000 7f8a7affd700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8a70040bd0 0x7f8a70043090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.364+0000 7f8a89648700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8a70040bd0 0x7f8a70043090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.364+0000 7f8a7affd700 1 -- 192.168.123.105:0/967961525 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f8a8004c610 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.365+0000 7f8a89648700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8a70040bd0 0x7f8a70043090 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f8a7c00ad30 tx=0x7f8a7c0093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:28.365+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/967961525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8a68005320 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.368+0000 7f8a7affd700 1 -- 192.168.123.105:0/967961525 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8a8001e020 con 0x7f8a84086dd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.480+0000 7f8a8ae4b700 1 -- 192.168.123.105:0/967961525 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f8a68000bf0 con 0x7f8a70040bd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.485+0000 7f8a7affd700 1 -- 192.168.123.105:0/967961525 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f8a68000bf0 con 0x7f8a70040bd0 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 -- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8a70040bd0 msgr2=0x7f8a70043090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8a70040bd0 0x7f8a70043090 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f8a7c00ad30 tx=0x7f8a7c0093f0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 -- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 msgr2=0x7f8a840871f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 0x7f8a840871f0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f8a80004580 tx=0x7f8a800046f0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 -- 192.168.123.105:0/967961525 shutdown_connections 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8a70040bd0 0x7f8a70043090 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 --2- 192.168.123.105:0/967961525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a84086dd0 0x7f8a840871f0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 -- 192.168.123.105:0/967961525 >> 192.168.123.105:0/967961525 conn(0x7f8a8406d320 msgr2=0x7f8a8406dcd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 -- 192.168.123.105:0/967961525 shutdown_connections 2026-03-10T08:50:28.525 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.489+0000 7f8a78ff9700 1 -- 192.168.123.105:0/967961525 wait complete. 2026-03-10T08:50:28.525 INFO:teuthology.orchestra.run.vm05.stdout:Deploying mgr service with default placement... 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.683+0000 7fc006e7c700 1 Processor -- start 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.684+0000 7fc006e7c700 1 -- start start 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.684+0000 7fc006e7c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 0x7fc000108bf0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.684+0000 7fc006e7c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0001091c0 con 0x7fc0001087d0 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.684+0000 7fc004c18700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 0x7fc000108bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.890 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.684+0000 7fc004c18700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 0x7fc000108bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:55998/0 (socket says 192.168.123.105:55998) 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.684+0000 7fc004c18700 1 -- 192.168.123.105:0/3303620712 learned_addr learned my addr 192.168.123.105:0/3303620712 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.685+0000 7fc004c18700 1 -- 192.168.123.105:0/3303620712 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc0001099d0 con 0x7fc0001087d0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.685+0000 7fc004c18700 1 --2- 192.168.123.105:0/3303620712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 0x7fc000108bf0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fbff4009a90 tx=0x7fbff4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f919a6212170fbd6 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.685+0000 7fbfff7fe700 1 -- 192.168.123.105:0/3303620712 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbff4004030 con 0x7fc0001087d0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.686+0000 7fbfff7fe700 1 -- 192.168.123.105:0/3303620712 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fbff400b7e0 con 0x7fc0001087d0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.686+0000 7fc006e7c700 1 -- 192.168.123.105:0/3303620712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 msgr2=0x7fc000108bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.891 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.686+0000 7fc006e7c700 1 --2- 192.168.123.105:0/3303620712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 0x7fc000108bf0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fbff4009a90 tx=0x7fbff4009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.686+0000 7fc006e7c700 1 -- 192.168.123.105:0/3303620712 shutdown_connections 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.686+0000 7fc006e7c700 1 --2- 192.168.123.105:0/3303620712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001087d0 0x7fc000108bf0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.686+0000 7fc006e7c700 1 -- 192.168.123.105:0/3303620712 >> 192.168.123.105:0/3303620712 conn(0x7fc00007bbb0 msgr2=0x7fc0001064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.687+0000 7fc006e7c700 1 -- 192.168.123.105:0/3303620712 shutdown_connections 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.687+0000 7fc006e7c700 1 -- 192.168.123.105:0/3303620712 wait complete. 
2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.687+0000 7fc006e7c700 1 Processor -- start 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.687+0000 7fc006e7c700 1 -- start start 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.687+0000 7fc006e7c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 0x7fc000199e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.687+0000 7fc006e7c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0001091c0 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.688+0000 7fc004c18700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 0x7fc000199e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.688+0000 7fc004c18700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 0x7fc000199e10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:56010/0 (socket says 192.168.123.105:56010) 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.688+0000 7fc004c18700 1 -- 192.168.123.105:0/3175105652 learned_addr learned my addr 192.168.123.105:0/3175105652 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:28.891 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.688+0000 7fc004c18700 1 -- 192.168.123.105:0/3175105652 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbff4009740 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.689+0000 7fc004c18700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 0x7fc000199e10 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbff400bd00 tx=0x7fbff400bde0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.689+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbff401a670 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.689+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fbff401ac70 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.689+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbff40044e0 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.689+0000 7fc006e7c700 1 -- 192.168.123.105:0/3175105652 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc00019a350 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.689+0000 7fc006e7c700 1 
-- 192.168.123.105:0/3175105652 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc000198080 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.691+0000 7fc006e7c700 1 -- 192.168.123.105:0/3175105652 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbfe4005320 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.691+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fbff4011420 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.692+0000 7fbffdffb700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbff00383c0 0x7fbff003a880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.692+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fbff404c770 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.695+0000 7fbffffff700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbff00383c0 0x7fbff003a880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.695+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbff4010970 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.695+0000 7fbffffff700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbff00383c0 0x7fbff003a880 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fbfec006fd0 tx=0x7fbfec006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.756+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mon.0 v2:192.168.123.105:3300/0 7 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbff4003c60 con 0x7fc0001999f0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.844+0000 7fc006e7c700 1 -- 192.168.123.105:0/3175105652 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7fbfe4000bf0 con 0x7fbff00383c0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.851+0000 7fbffdffb700 1 -- 192.168.123.105:0/3175105652 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fbfe4000bf0 con 0x7fbff00383c0 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.854+0000 7fbfeb7fe700 1 -- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbff00383c0 msgr2=0x7fbff003a880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.854+0000 7fbfeb7fe700 
1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbff00383c0 0x7fbff003a880 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fbfec006fd0 tx=0x7fbfec006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.854+0000 7fbfeb7fe700 1 -- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 msgr2=0x7fc000199e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.854+0000 7fbfeb7fe700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 0x7fc000199e10 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fbff400bd00 tx=0x7fbff400bde0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.854+0000 7fbfeb7fe700 1 -- 192.168.123.105:0/3175105652 shutdown_connections 2026-03-10T08:50:28.891 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.854+0000 7fbfeb7fe700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbff00383c0 0x7fbff003a880 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.892 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.855+0000 7fbfeb7fe700 1 --2- 192.168.123.105:0/3175105652 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0001999f0 0x7fc000199e10 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:28.892 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.855+0000 7fbfeb7fe700 1 -- 192.168.123.105:0/3175105652 >> 192.168.123.105:0/3175105652 conn(0x7fc00007bbb0 
msgr2=0x7fc000105c60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:28.892 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.855+0000 7fbfeb7fe700 1 -- 192.168.123.105:0/3175105652 shutdown_connections 2026-03-10T08:50:28.892 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:28.855+0000 7fbfeb7fe700 1 -- 192.168.123.105:0/3175105652 wait complete. 2026-03-10T08:50:28.892 INFO:teuthology.orchestra.run.vm05.stdout:Deploying crash service with default placement... 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.047+0000 7f10429f8700 1 Processor -- start 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.047+0000 7f10429f8700 1 -- start start 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.047+0000 7f10429f8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 0x7f103c0722d0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.047+0000 7f10429f8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f103c0728a0 con 0x7f103c071eb0 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.047+0000 7f103bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 0x7f103c0722d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:29.047+0000 7f103bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 0x7f103c0722d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:56016/0 (socket says 192.168.123.105:56016) 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.047+0000 7f103bfff700 1 -- 192.168.123.105:0/2659205715 learned_addr learned my addr 192.168.123.105:0/2659205715 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.048+0000 7f103bfff700 1 -- 192.168.123.105:0/2659205715 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f103c079120 con 0x7f103c071eb0 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.048+0000 7f103bfff700 1 --2- 192.168.123.105:0/2659205715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 0x7f103c0722d0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f102c00ac30 tx=0x7f102c010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a283fed40590d92c server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f103affd700 1 -- 192.168.123.105:0/2659205715 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f102c010d40 con 0x7f103c071eb0 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f103affd700 1 -- 192.168.123.105:0/2659205715 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f102c004500 con 0x7f103c071eb0 2026-03-10T08:50:29.255 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f10429f8700 1 -- 192.168.123.105:0/2659205715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 msgr2=0x7f103c0722d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f10429f8700 1 --2- 192.168.123.105:0/2659205715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 0x7f103c0722d0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f102c00ac30 tx=0x7f102c010730 comp rx=0 tx=0).stop 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f10429f8700 1 -- 192.168.123.105:0/2659205715 shutdown_connections 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f10429f8700 1 --2- 192.168.123.105:0/2659205715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c071eb0 0x7f103c0722d0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.255 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.049+0000 7f10429f8700 1 -- 192.168.123.105:0/2659205715 >> 192.168.123.105:0/2659205715 conn(0x7f103c06d5d0 msgr2=0x7f103c06fa30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f10429f8700 1 -- 192.168.123.105:0/2659205715 shutdown_connections 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f10429f8700 1 -- 192.168.123.105:0/2659205715 wait complete. 
2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f10429f8700 1 Processor -- start 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f10429f8700 1 -- start start 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f10429f8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 0x7f103c087390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f10429f8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f102c01a420 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f103bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 0x7f103c087390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f103bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 0x7f103c087390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:56032/0 (socket says 192.168.123.105:56032) 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f103bfff700 1 -- 192.168.123.105:0/3783504428 learned_addr learned my addr 192.168.123.105:0/3783504428 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:29.256 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.050+0000 7f103bfff700 1 -- 192.168.123.105:0/3783504428 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f102c00a8e0 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.051+0000 7f103bfff700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 0x7f103c087390 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f102c003c20 tx=0x7f102c003d00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.051+0000 7f10397fa700 1 -- 192.168.123.105:0/3783504428 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f102c01b880 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.051+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f103c0878d0 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.051+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f103c088540 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.052+0000 7f10397fa700 1 -- 192.168.123.105:0/3783504428 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f102c01bec0 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.052+0000 7f10397fa700 1 
-- 192.168.123.105:0/3783504428 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f102c0228f0 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.052+0000 7f10397fa700 1 -- 192.168.123.105:0/3783504428 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f102c018070 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.053+0000 7f10397fa700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10240383c0 0x7f102403a880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.053+0000 7f103b7fe700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10240383c0 0x7f102403a880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.053+0000 7f10397fa700 1 -- 192.168.123.105:0/3783504428 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f102c04c790 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.053+0000 7f103b7fe700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10240383c0 0x7f102403a880 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f103400ad30 tx=0x7f10340093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:29.054+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1028005320 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.056+0000 7f10397fa700 1 -- 192.168.123.105:0/3783504428 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f102c01e020 con 0x7f103c086f70 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.189+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f1028000bf0 con 0x7f10240383c0 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.195+0000 7f10397fa700 1 -- 192.168.123.105:0/3783504428 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f1028000bf0 con 0x7f10240383c0 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10240383c0 msgr2=0x7f102403a880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10240383c0 0x7f102403a880 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f103400ad30 tx=0x7f10340093f0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.256 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 msgr2=0x7f103c087390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 0x7f103c087390 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f102c003c20 tx=0x7f102c003d00 comp rx=0 tx=0).stop 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 shutdown_connections 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10240383c0 0x7f102403a880 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 --2- 192.168.123.105:0/3783504428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f103c086f70 0x7f103c087390 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.201+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 >> 192.168.123.105:0/3783504428 conn(0x7f103c06d5d0 msgr2=0x7f103c06ef20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.202+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 shutdown_connections 
2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.202+0000 7f10429f8700 1 -- 192.168.123.105:0/3783504428 wait complete. 2026-03-10T08:50:29.256 INFO:teuthology.orchestra.run.vm05.stdout:Deploying ceph-exporter service with default placement... 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: Added host vm05 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:29.290 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:29 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 
2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.409+0000 7ff52df61700 1 Processor -- start 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.409+0000 7ff52df61700 1 -- start start 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.409+0000 7ff52df61700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 0x7ff5280722a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.409+0000 7ff52df61700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff528072870 con 0x7ff528071e80 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.410+0000 7ff5277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 0x7ff5280722a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.410+0000 7ff5277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 0x7ff5280722a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:56038/0 (socket says 192.168.123.105:56038) 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.410+0000 7ff5277fe700 1 -- 192.168.123.105:0/2313003517 learned_addr learned my addr 192.168.123.105:0/2313003517 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:29.579 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.411+0000 7ff5277fe700 1 -- 192.168.123.105:0/2313003517 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff5280729b0 con 0x7ff528071e80 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.411+0000 7ff5277fe700 1 --2- 192.168.123.105:0/2313003517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 0x7ff5280722a0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7ff51800d0d0 tx=0x7ff51800d3e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=1b43fc3fabbc081f server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.411+0000 7ff5267fc700 1 -- 192.168.123.105:0/2313003517 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff518010070 con 0x7ff528071e80 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.411+0000 7ff5267fc700 1 -- 192.168.123.105:0/2313003517 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff518004030 con 0x7ff528071e80 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 -- 192.168.123.105:0/2313003517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 msgr2=0x7ff5280722a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 --2- 192.168.123.105:0/2313003517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 0x7ff5280722a0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7ff51800d0d0 tx=0x7ff51800d3e0 comp rx=0 tx=0).stop 
2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 -- 192.168.123.105:0/2313003517 shutdown_connections 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 --2- 192.168.123.105:0/2313003517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528071e80 0x7ff5280722a0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 -- 192.168.123.105:0/2313003517 >> 192.168.123.105:0/2313003517 conn(0x7ff52806d660 msgr2=0x7ff52806fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 -- 192.168.123.105:0/2313003517 shutdown_connections 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.412+0000 7ff52df61700 1 -- 192.168.123.105:0/2313003517 wait complete. 
2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff52df61700 1 Processor -- start 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff52df61700 1 -- start start 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff52df61700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 0x7ff528087550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff52df61700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff528087a90 con 0x7ff528087130 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff5277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 0x7ff528087550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff5277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 0x7ff528087550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:56044/0 (socket says 192.168.123.105:56044) 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff5277fe700 1 -- 192.168.123.105:0/263185953 learned_addr learned my addr 192.168.123.105:0/263185953 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:29.579 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.413+0000 7ff5277fe700 1 -- 192.168.123.105:0/263185953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff5180088c0 con 0x7ff528087130 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.414+0000 7ff5277fe700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 0x7ff528087550 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7ff518003a10 tx=0x7ff518017730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.415+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff518010040 con 0x7ff528087130 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.415+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff518017c10 con 0x7ff528087130 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.415+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff51801f690 con 0x7ff528087130 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.415+0000 7ff52df61700 1 -- 192.168.123.105:0/263185953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff528087c90 con 0x7ff528087130 2026-03-10T08:50:29.579 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.415+0000 7ff52df61700 1 -- 
192.168.123.105:0/263185953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff5280888d0 con 0x7ff528087130 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.416+0000 7ff52df61700 1 -- 192.168.123.105:0/263185953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff52804fa20 con 0x7ff528087130 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.419+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7ff51801d070 con 0x7ff528087130 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.419+0000 7ff524ff9700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff5100383c0 0x7ff51003a880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.419+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7ff51804c0b0 con 0x7ff528087130 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.421+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff518017d80 con 0x7ff528087130 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.421+0000 7ff526ffd700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff5100383c0 0x7ff51003a880 unknown 
:-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.422+0000 7ff526ffd700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff5100383c0 0x7ff51003a880 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7ff52000ad30 tx=0x7ff5200093f0 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.537+0000 7ff52df61700 1 -- 192.168.123.105:0/263185953 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7ff52806f800 con 0x7ff5100383c0 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.546+0000 7ff524ff9700 1 -- 192.168.123.105:0/263185953 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7ff52806f800 con 0x7ff5100383c0 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.548+0000 7ff50e7fc700 1 -- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff5100383c0 msgr2=0x7ff51003a880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.548+0000 7ff50e7fc700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff5100383c0 0x7ff51003a880 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7ff52000ad30 tx=0x7ff5200093f0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T08:50:29.548+0000 7ff50e7fc700 1 -- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 msgr2=0x7ff528087550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.548+0000 7ff50e7fc700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 0x7ff528087550 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7ff518003a10 tx=0x7ff518017730 comp rx=0 tx=0).stop 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.549+0000 7ff50e7fc700 1 -- 192.168.123.105:0/263185953 shutdown_connections 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.549+0000 7ff50e7fc700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff5100383c0 0x7ff51003a880 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.549+0000 7ff50e7fc700 1 --2- 192.168.123.105:0/263185953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff528087130 0x7ff528087550 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.549+0000 7ff50e7fc700 1 -- 192.168.123.105:0/263185953 >> 192.168.123.105:0/263185953 conn(0x7ff52806d660 msgr2=0x7ff52806f0f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.549+0000 7ff50e7fc700 1 -- 192.168.123.105:0/263185953 shutdown_connections 2026-03-10T08:50:29.580 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.549+0000 7ff50e7fc700 1 -- 192.168.123.105:0/263185953 wait complete. 2026-03-10T08:50:29.580 INFO:teuthology.orchestra.run.vm05.stdout:Deploying prometheus service with default placement... 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.715+0000 7fb2d0115700 1 Processor -- start 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.715+0000 7fb2d0115700 1 -- start start 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.716+0000 7fb2d0115700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c8079240 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.716+0000 7fb2d0115700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2c8079810 con 0x7fb2c807ade0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.716+0000 7fb2cdeb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c8079240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.716+0000 7fb2cdeb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c8079240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:53688/0 (socket says 192.168.123.105:53688) 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.716+0000 7fb2cdeb1700 1 -- 192.168.123.105:0/3903405966 learned_addr learned my addr 192.168.123.105:0/3903405966 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.716+0000 7fb2cdeb1700 1 -- 192.168.123.105:0/3903405966 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2c8079950 con 0x7fb2c807ade0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2cdeb1700 1 --2- 192.168.123.105:0/3903405966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c8079240 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fb2b8009cf0 tx=0x7fb2b800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3d1f2e999fdd0883 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2cceaf700 1 -- 192.168.123.105:0/3903405966 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb2b8004030 con 0x7fb2c807ade0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2cceaf700 1 -- 192.168.123.105:0/3903405966 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb2b800b810 con 0x7fb2c807ade0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2d0115700 1 -- 192.168.123.105:0/3903405966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 msgr2=0x7fb2c8079240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.918 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2d0115700 1 --2- 192.168.123.105:0/3903405966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c8079240 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fb2b8009cf0 tx=0x7fb2b800b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2d0115700 1 -- 192.168.123.105:0/3903405966 shutdown_connections 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2d0115700 1 --2- 192.168.123.105:0/3903405966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c8079240 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.717+0000 7fb2d0115700 1 -- 192.168.123.105:0/3903405966 >> 192.168.123.105:0/3903405966 conn(0x7fb2c8101c80 msgr2=0x7fb2c81040e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.718+0000 7fb2d0115700 1 -- 192.168.123.105:0/3903405966 shutdown_connections 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.718+0000 7fb2d0115700 1 -- 192.168.123.105:0/3903405966 wait complete. 
2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.718+0000 7fb2d0115700 1 Processor -- start 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.718+0000 7fb2d0115700 1 -- start start 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.718+0000 7fb2d0115700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c81a0e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2cdeb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c81a0e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2cdeb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c81a0e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53690/0 (socket says 192.168.123.105:53690) 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2cdeb1700 1 -- 192.168.123.105:0/2236371636 learned_addr learned my addr 192.168.123.105:0/2236371636 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.718+0000 7fb2d0115700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2c8079810 con 0x7fb2c807ade0 2026-03-10T08:50:29.918 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2cdeb1700 1 -- 192.168.123.105:0/2236371636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2b8009740 con 0x7fb2c807ade0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2cdeb1700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c81a0e20 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fb2b8006e90 tx=0x7fb2b8003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.918 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb2b8003f40 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb2b8004580 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb2b801ae60 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb2c81a1360 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.719+0000 7fb2d0115700 1 
-- 192.168.123.105:0/2236371636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb2c81a1780 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.720+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fb2b801a430 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.720+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb2c819b1a0 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.720+0000 7fb2beffd700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2b4038390 0x7fb2b403a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.720+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb2b804bae0 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.721+0000 7fb2cd6b0700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2b4038390 0x7fb2b403a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.723+0000 7fb2cd6b0700 1 --2- 192.168.123.105:0/2236371636 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2b4038390 0x7fb2b403a850 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb2c4006fd0 tx=0x7fb2c4006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.723+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb2b801a730 con 0x7fb2c807ade0 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.861+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7fb2c802cfc0 con 0x7fb2b4038390 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.869+0000 7fb2beffd700 1 -- 192.168.123.105:0/2236371636 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7fb2c802cfc0 con 0x7fb2b4038390 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2b4038390 msgr2=0x7fb2b403a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2b4038390 0x7fb2b403a850 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb2c4006fd0 tx=0x7fb2c4006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:29.919 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 msgr2=0x7fb2c81a0e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c81a0e20 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fb2b8006e90 tx=0x7fb2b8003d30 comp rx=0 tx=0).stop 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 shutdown_connections 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2b4038390 0x7fb2b403a850 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 --2- 192.168.123.105:0/2236371636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2c807ade0 0x7fb2c81a0e20 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 >> 192.168.123.105:0/2236371636 conn(0x7fb2c8101c80 msgr2=0x7fb2c8102960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 shutdown_connections 
2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:29.879+0000 7fb2d0115700 1 -- 192.168.123.105:0/2236371636 wait complete. 2026-03-10T08:50:29.919 INFO:teuthology.orchestra.run.vm05.stdout:Deploying grafana service with default placement... 2026-03-10T08:50:30.197 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.070+0000 7fb32238d700 1 Processor -- start 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.071+0000 7fb32238d700 1 -- start start 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.071+0000 7fb32238d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 0x7fb31c105450 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.071+0000 7fb32238d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb31c0746a0 con 0x7fb31c105030 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.072+0000 7fb31bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 0x7fb31c105450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:30.072+0000 7fb31bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 0x7fb31c105450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53698/0 (socket says 192.168.123.105:53698) 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.072+0000 7fb31bfff700 1 -- 192.168.123.105:0/3842419274 learned_addr learned my addr 192.168.123.105:0/3842419274 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.072+0000 7fb31bfff700 1 -- 192.168.123.105:0/3842419274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb31c105990 con 0x7fb31c105030 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.072+0000 7fb31bfff700 1 --2- 192.168.123.105:0/3842419274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 0x7fb31c105450 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb304009cf0 tx=0x7fb30400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e3fda88db496dd66 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.073+0000 7fb31affd700 1 -- 192.168.123.105:0/3842419274 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb304004030 con 0x7fb31c105030 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.073+0000 7fb31affd700 1 -- 192.168.123.105:0/3842419274 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb30400b810 con 0x7fb31c105030 2026-03-10T08:50:30.265 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.073+0000 7fb31affd700 1 -- 192.168.123.105:0/3842419274 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb304003b10 con 0x7fb31c105030 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.073+0000 7fb32238d700 1 -- 192.168.123.105:0/3842419274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 msgr2=0x7fb31c105450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.073+0000 7fb32238d700 1 --2- 192.168.123.105:0/3842419274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 0x7fb31c105450 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb304009cf0 tx=0x7fb30400b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.074+0000 7fb32238d700 1 -- 192.168.123.105:0/3842419274 shutdown_connections 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.074+0000 7fb32238d700 1 --2- 192.168.123.105:0/3842419274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c105030 0x7fb31c105450 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.074+0000 7fb32238d700 1 -- 192.168.123.105:0/3842419274 >> 192.168.123.105:0/3842419274 conn(0x7fb31c100bd0 msgr2=0x7fb31c103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.074+0000 7fb32238d700 1 -- 192.168.123.105:0/3842419274 shutdown_connections 2026-03-10T08:50:30.265 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.074+0000 7fb32238d700 1 -- 192.168.123.105:0/3842419274 wait complete. 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb32238d700 1 Processor -- start 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb32238d700 1 -- start start 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb32238d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 0x7fb31c1079f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb32238d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb31c107f30 con 0x7fb31c1075d0 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb31bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 0x7fb31c1079f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb31bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 0x7fb31c1079f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53700/0 (socket says 192.168.123.105:53700) 2026-03-10T08:50:30.265 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.075+0000 7fb31bfff700 1 -- 192.168.123.105:0/3129278701 learned_addr 
learned my addr 192.168.123.105:0/3129278701 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.076+0000 7fb31bfff700 1 -- 192.168.123.105:0/3129278701 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb304009740 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.076+0000 7fb31bfff700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 0x7fb31c1079f0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb304000c00 tx=0x7fb30401b6c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.077+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb30401b8e0 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.077+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb304025470 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.077+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb30401c450 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.077+0000 7fb32238d700 1 -- 192.168.123.105:0/3129278701 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb31c108130 con 0x7fb31c1075d0 
2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.077+0000 7fb32238d700 1 -- 192.168.123.105:0/3129278701 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb31c10ad80 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.078+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fb304022070 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.078+0000 7fb32238d700 1 -- 192.168.123.105:0/3129278701 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb31c195fe0 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.078+0000 7fb3197fa700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3080383a0 0x7fb30803a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.078+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb30404c130 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.081+0000 7fb31b7fe700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3080383a0 0x7fb30803a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:30.266 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.081+0000 7fb31b7fe700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3080383a0 0x7fb30803a860 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fb30c006fd0 tx=0x7fb30c006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.081+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb304025ac0 con 0x7fb31c1075d0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.205+0000 7fb32238d700 1 -- 192.168.123.105:0/3129278701 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7fb31c02cfe0 con 0x7fb3080383a0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.210+0000 7fb3197fa700 1 -- 192.168.123.105:0/3129278701 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7fb31c02cfe0 con 0x7fb3080383a0 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 -- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3080383a0 msgr2=0x7fb30803a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3080383a0 0x7fb30803a860 secure 
:-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fb30c006fd0 tx=0x7fb30c006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 -- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 msgr2=0x7fb31c1079f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 0x7fb31c1079f0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb304000c00 tx=0x7fb30401b6c0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 -- 192.168.123.105:0/3129278701 shutdown_connections 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3080383a0 0x7fb30803a860 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 --2- 192.168.123.105:0/3129278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb31c1075d0 0x7fb31c1079f0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.217+0000 7fb312ffd700 1 -- 192.168.123.105:0/3129278701 >> 192.168.123.105:0/3129278701 conn(0x7fb31c100bd0 msgr2=0x7fb31c190bf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:30.266 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.219+0000 7fb312ffd700 1 -- 192.168.123.105:0/3129278701 shutdown_connections 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.219+0000 7fb312ffd700 1 -- 192.168.123.105:0/3129278701 wait complete. 2026-03-10T08:50:30.266 INFO:teuthology.orchestra.run.vm05.stdout:Deploying node-exporter service with default placement... 2026-03-10T08:50:30.445 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: Saving service mon spec with placement count:5 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: Saving service mgr spec with placement count:2 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: Saving service crash spec with placement * 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:30.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:30 vm05 ceph-mon[49713]: 
from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.432+0000 7f4a5d695700 1 Processor -- start 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.433+0000 7f4a5d695700 1 -- start start 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.435+0000 7f4a5d695700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a58108990 0x7f4a58108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.435+0000 7f4a5d695700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a58109380 con 0x7f4a58108990 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.436+0000 7f4a56ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a58108990 0x7f4a58108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.436+0000 7f4a56ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a58108990 0x7f4a58108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53708/0 (socket says 192.168.123.105:53708) 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.436+0000 7f4a56ffd700 1 -- 
192.168.123.105:0/2944884358 learned_addr learned my addr 192.168.123.105:0/2944884358 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.436+0000 7f4a56ffd700 1 -- 192.168.123.105:0/2944884358 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a58109b90 con 0x7f4a58108990 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.436+0000 7f4a56ffd700 1 --2- 192.168.123.105:0/2944884358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a58108990 0x7f4a58108db0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4a48009cf0 tx=0x7f4a4800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=27281aecd75f7953 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.437+0000 7f4a567fc700 1 -- 192.168.123.105:0/2944884358 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4a48004030 con 0x7f4a58108990 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.437+0000 7f4a567fc700 1 -- 192.168.123.105:0/2944884358 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a4800b810 con 0x7f4a58108990 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.437+0000 7f4a5d695700 1 -- 192.168.123.105:0/2944884358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a58108990 msgr2=0x7f4a58108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.437+0000 7f4a5d695700 1 --2- 192.168.123.105:0/2944884358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f4a58108990 0x7f4a58108db0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4a48009cf0 tx=0x7f4a4800b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.437+0000 7f4a567fc700 1 -- 192.168.123.105:0/2944884358 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4a48003b10 con 0x7f4a58108990 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.438+0000 7f4a5d695700 1 -- 192.168.123.105:0/2944884358 shutdown_connections 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.438+0000 7f4a5d695700 1 --2- 192.168.123.105:0/2944884358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a58108990 0x7f4a58108db0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.438+0000 7f4a5d695700 1 -- 192.168.123.105:0/2944884358 >> 192.168.123.105:0/2944884358 conn(0x7f4a58103f50 msgr2=0x7f4a58106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.438+0000 7f4a5d695700 1 -- 192.168.123.105:0/2944884358 shutdown_connections 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.438+0000 7f4a5d695700 1 -- 192.168.123.105:0/2944884358 wait complete. 
2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a5d695700 1 Processor -- start 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a5d695700 1 -- start start 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a5d695700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 0x7f4a5819cd50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a5d695700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a5819d290 con 0x7f4a5819c930 2026-03-10T08:50:30.708 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a56ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 0x7f4a5819cd50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a56ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 0x7f4a5819cd50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53710/0 (socket says 192.168.123.105:53710) 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.439+0000 7f4a56ffd700 1 -- 192.168.123.105:0/1808277951 learned_addr learned my addr 192.168.123.105:0/1808277951 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:30.709 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a56ffd700 1 -- 192.168.123.105:0/1808277951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a48009740 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a56ffd700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 0x7f4a5819cd50 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f4a4800b560 tx=0x7f4a48003e30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4a48004050 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a4801b450 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4a4801c450 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a5819d490 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.440+0000 7f4a5d695700 1 
-- 192.168.123.105:0/1808277951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a581a00e0 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.441+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a580623c0 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.442+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f4a48022020 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.442+0000 7f4a54ff9700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4a440383e0 0x7f4a4403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.442+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f4a4804bbb0 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.445+0000 7f4a4e5ff700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4a440383e0 0x7f4a4403a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.447+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4a48018350 con 0x7f4a5819c930 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.448+0000 7f4a4e5ff700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4a440383e0 0x7f4a4403a8a0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f4a40006fd0 tx=0x7f4a40006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.571+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f4a581053d0 con 0x7f4a440383e0 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.641+0000 7f4a54ff9700 1 -- 192.168.123.105:0/1808277951 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f4a581053d0 con 0x7f4a440383e0 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4a440383e0 msgr2=0x7f4a4403a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4a440383e0 0x7f4a4403a8a0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f4a40006fd0 tx=0x7f4a40006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:30.709 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 msgr2=0x7f4a5819cd50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 0x7f4a5819cd50 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f4a4800b560 tx=0x7f4a48003e30 comp rx=0 tx=0).stop 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 shutdown_connections 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4a440383e0 0x7f4a4403a8a0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 --2- 192.168.123.105:0/1808277951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a5819c930 0x7f4a5819cd50 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 >> 192.168.123.105:0/1808277951 conn(0x7f4a58103f50 msgr2=0x7f4a58104cc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 shutdown_connections 
2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.644+0000 7f4a5d695700 1 -- 192.168.123.105:0/1808277951 wait complete. 2026-03-10T08:50:30.709 INFO:teuthology.orchestra.run.vm05.stdout:Deploying alertmanager service with default placement... 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f82604e4700 1 Processor -- start 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f82604e4700 1 -- start start 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f82604e4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 0x7f82580721c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f82604e4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8258072700 con 0x7f8258071da0 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f825e280700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 0x7f82580721c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f825e280700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 0x7f82580721c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53716/0 (socket says 192.168.123.105:53716) 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.848+0000 7f825e280700 1 -- 192.168.123.105:0/3665043624 learned_addr learned my addr 192.168.123.105:0/3665043624 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f825e280700 1 -- 192.168.123.105:0/3665043624 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8258072840 con 0x7f8258071da0 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f825e280700 1 --2- 192.168.123.105:0/3665043624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 0x7f82580721c0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f8254009cf0 tx=0x7f825400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f9e65341193267e8 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f825d27e700 1 -- 192.168.123.105:0/3665043624 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8254004030 con 0x7f8258071da0 2026-03-10T08:50:31.014 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f825d27e700 1 -- 192.168.123.105:0/3665043624 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f825400b810 con 0x7f8258071da0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f825d27e700 1 -- 192.168.123.105:0/3665043624 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8254003b10 con 0x7f8258071da0 2026-03-10T08:50:31.015 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f82604e4700 1 -- 192.168.123.105:0/3665043624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 msgr2=0x7f82580721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.849+0000 7f82604e4700 1 --2- 192.168.123.105:0/3665043624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 0x7f82580721c0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f8254009cf0 tx=0x7f825400b0e0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 -- 192.168.123.105:0/3665043624 shutdown_connections 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 --2- 192.168.123.105:0/3665043624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8258071da0 0x7f82580721c0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 -- 192.168.123.105:0/3665043624 >> 192.168.123.105:0/3665043624 conn(0x7f825806d400 msgr2=0x7f825806f840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 -- 192.168.123.105:0/3665043624 shutdown_connections 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 -- 192.168.123.105:0/3665043624 wait complete. 
2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 Processor -- start 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 -- start start 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 0x7f825811bc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f82604e4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f825811d9c0 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f825e280700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 0x7f825811bc30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f825e280700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 0x7f825811bc30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53728/0 (socket says 192.168.123.105:53728) 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.850+0000 7f825e280700 1 -- 192.168.123.105:0/2323867133 learned_addr learned my addr 192.168.123.105:0/2323867133 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:31.015 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f825e280700 1 -- 192.168.123.105:0/2323867133 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8254009740 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f825e280700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 0x7f825811bc30 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f8254009cc0 tx=0x7f8254011770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f82540118d0 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f825401a440 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f825401b440 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f82604e4700 1 -- 192.168.123.105:0/2323867133 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f825811c170 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.851+0000 7f82604e4700 1 
-- 192.168.123.105:0/2323867133 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f825811c590 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.852+0000 7f82604e4700 1 -- 192.168.123.105:0/2323867133 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f825804f000 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.855+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f825401aa30 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.855+0000 7f824f7fe700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f82440383a0 0x7f824403a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.855+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f825402b030 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.855+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f825404c850 con 0x7f825811d5a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.862+0000 7f825da7f700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f82440383a0 
0x7f824403a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.864+0000 7f825da7f700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f82440383a0 0x7f824403a860 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f8248006fd0 tx=0x7f8248006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.969+0000 7f82604e4700 1 -- 192.168.123.105:0/2323867133 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7f825811db00 con 0x7f82440383a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.975+0000 7f824f7fe700 1 -- 192.168.123.105:0/2323867133 <== mgr.14118 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7f8248009e20 con 0x7f82440383a0 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.979+0000 7f824d7fa700 1 -- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f82440383a0 msgr2=0x7f824403a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.979+0000 7f824d7fa700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f82440383a0 0x7f824403a860 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f8248006fd0 tx=0x7f8248006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:31.015 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.979+0000 7f824d7fa700 1 -- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 msgr2=0x7f825811bc30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.979+0000 7f824d7fa700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 0x7f825811bc30 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f8254009cc0 tx=0x7f8254011770 comp rx=0 tx=0).stop 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.980+0000 7f824d7fa700 1 -- 192.168.123.105:0/2323867133 shutdown_connections 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.980+0000 7f824d7fa700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f82440383a0 0x7f824403a860 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.980+0000 7f824d7fa700 1 --2- 192.168.123.105:0/2323867133 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f825811d5a0 0x7f825811bc30 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.980+0000 7f824d7fa700 1 -- 192.168.123.105:0/2323867133 >> 192.168.123.105:0/2323867133 conn(0x7f825806d400 msgr2=0x7f825806e0a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.980+0000 7f824d7fa700 1 -- 192.168.123.105:0/2323867133 shutdown_connections 
2026-03-10T08:50:31.015 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:30.980+0000 7f824d7fa700 1 -- 192.168.123.105:0/2323867133 wait complete. 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: Saving service ceph-exporter spec with placement * 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: Saving service prometheus spec with placement count:1 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: Saving service grafana spec with placement count:1 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:31.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:31 vm05 ceph-mon[49713]: 
from='mgr.14118 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.164+0000 7ff42e9d8700 1 Processor -- start 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.164+0000 7ff42e9d8700 1 -- start start 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.164+0000 7ff42e9d8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff428108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.164+0000 7ff42e9d8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff428109380 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.165+0000 7ff427fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff428108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.165+0000 7ff427fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff428108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53734/0 (socket says 192.168.123.105:53734) 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.165+0000 7ff427fff700 1 -- 192.168.123.105:0/4116071535 learned_addr learned my addr 192.168.123.105:0/4116071535 (peer_addr_for_me v2:192.168.123.105:0/0) 
2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.165+0000 7ff427fff700 1 -- 192.168.123.105:0/4116071535 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff428109b90 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.165+0000 7ff427fff700 1 --2- 192.168.123.105:0/4116071535 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff428108db0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff410009a90 tx=0x7ff410009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=32df57e97e8c9d7a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff426ffd700 1 -- 192.168.123.105:0/4116071535 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff410004030 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff426ffd700 1 -- 192.168.123.105:0/4116071535 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff41000b7e0 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff42e9d8700 1 -- 192.168.123.105:0/4116071535 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 msgr2=0x7ff428108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff42e9d8700 1 --2- 192.168.123.105:0/4116071535 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff428108db0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7ff410009a90 tx=0x7ff410009da0 comp rx=0 
tx=0).stop 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff42e9d8700 1 -- 192.168.123.105:0/4116071535 shutdown_connections 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff42e9d8700 1 --2- 192.168.123.105:0/4116071535 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff428108db0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff42e9d8700 1 -- 192.168.123.105:0/4116071535 >> 192.168.123.105:0/4116071535 conn(0x7ff428103f50 msgr2=0x7ff428106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.166+0000 7ff42e9d8700 1 -- 192.168.123.105:0/4116071535 shutdown_connections 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.167+0000 7ff42e9d8700 1 -- 192.168.123.105:0/4116071535 wait complete. 
2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.167+0000 7ff42e9d8700 1 Processor -- start 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.167+0000 7ff42e9d8700 1 -- start start 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.167+0000 7ff42e9d8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff42819c780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.167+0000 7ff42e9d8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff428109380 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.167+0000 7ff427fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff42819c780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff427fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff42819c780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53744/0 (socket says 192.168.123.105:53744) 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff427fff700 1 -- 192.168.123.105:0/2297424064 learned_addr learned my addr 192.168.123.105:0/2297424064 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:31.340 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff427fff700 1 -- 192.168.123.105:0/2297424064 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff410009740 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff427fff700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff42819c780 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff41000bf20 tx=0x7ff410003da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff410004240 con 0x7ff428108990 2026-03-10T08:50:31.340 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff4100043a0 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff42819ccc0 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff410011680 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.168+0000 7ff42e9d8700 1 
-- 192.168.123.105:0/2297424064 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff42819d0e0 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.169+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7ff4100118f0 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.169+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff42804fa20 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.171+0000 7ff4257fa700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff4140383e0 0x7ff41403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.171+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7ff41004cfd0 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.172+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff410029330 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.172+0000 7ff4277fe700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff4140383e0 
0x7ff41403a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.172+0000 7ff4277fe700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff4140383e0 0x7ff41403a8a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7ff418006fd0 tx=0x7ff418006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.281+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7ff42819ff20 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.287+0000 7ff4257fa700 1 -- 192.168.123.105:0/2297424064 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7ff410018b40 con 0x7ff428108990 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff4140383e0 msgr2=0x7ff41403a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff4140383e0 0x7ff41403a8a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7ff418006fd0 tx=0x7ff418006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:31.341 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 msgr2=0x7ff42819c780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff42819c780 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff41000bf20 tx=0x7ff410003da0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 shutdown_connections 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff4140383e0 0x7ff41403a8a0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 --2- 192.168.123.105:0/2297424064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff428108990 0x7ff42819c780 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 >> 192.168.123.105:0/2297424064 conn(0x7ff428103f50 msgr2=0x7ff428104c70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 shutdown_connections 
2026-03-10T08:50:31.341 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.291+0000 7ff42e9d8700 1 -- 192.168.123.105:0/2297424064 wait complete. 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.470+0000 7f4e114b6700 1 Processor -- start 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.470+0000 7f4e114b6700 1 -- start start 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.470+0000 7f4e114b6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c104fb0 0x7f4e0c1073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.470+0000 7f4e114b6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e0c074720 con 0x7f4e0c104fb0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.471+0000 7f4e0affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c104fb0 0x7f4e0c1073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.471+0000 7f4e0affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c104fb0 0x7f4e0c1073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53754/0 (socket says 192.168.123.105:53754) 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.471+0000 7f4e0affd700 1 -- 
192.168.123.105:0/2404498478 learned_addr learned my addr 192.168.123.105:0/2404498478 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.471+0000 7f4e0affd700 1 -- 192.168.123.105:0/2404498478 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e0c107920 con 0x7f4e0c104fb0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.471+0000 7f4e0affd700 1 --2- 192.168.123.105:0/2404498478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c104fb0 0x7f4e0c1073e0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f4df4009a90 tx=0x7f4df4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d6c17b77961f43d1 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.471+0000 7f4e09ffb700 1 -- 192.168.123.105:0/2404498478 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4df4004030 con 0x7f4e0c104fb0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e09ffb700 1 -- 192.168.123.105:0/2404498478 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4df400b7e0 con 0x7f4e0c104fb0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 -- 192.168.123.105:0/2404498478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c104fb0 msgr2=0x7f4e0c1073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 --2- 192.168.123.105:0/2404498478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f4e0c104fb0 0x7f4e0c1073e0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f4df4009a90 tx=0x7f4df4009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 -- 192.168.123.105:0/2404498478 shutdown_connections 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 --2- 192.168.123.105:0/2404498478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c104fb0 0x7f4e0c1073e0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 -- 192.168.123.105:0/2404498478 >> 192.168.123.105:0/2404498478 conn(0x7f4e0c100bd0 msgr2=0x7f4e0c103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 -- 192.168.123.105:0/2404498478 shutdown_connections 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.472+0000 7f4e114b6700 1 -- 192.168.123.105:0/2404498478 wait complete. 
2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e114b6700 1 Processor -- start 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e114b6700 1 -- start start 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e114b6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 0x7f4e0c1a1010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e114b6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e0c074720 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e0affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 0x7f4e0c1a1010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e0affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 0x7f4e0c1a1010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53764/0 (socket says 192.168.123.105:53764) 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.473+0000 7f4e0affd700 1 -- 192.168.123.105:0/4122827266 learned_addr learned my addr 192.168.123.105:0/4122827266 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:31.669 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e0affd700 1 -- 192.168.123.105:0/4122827266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4df4009740 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e0affd700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 0x7f4e0c1a1010 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f4df400bdb0 tx=0x7f4df400be90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4df4003fb0 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4df40045f0 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4df401ad90 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4e0c1a1550 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.474+0000 7f4e114b6700 1 
-- 192.168.123.105:0/4122827266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4e0c1a4190 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.475+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f4df402c430 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.475+0000 7f4e03fff700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4df80383c0 0x7f4df803a880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.475+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f4df404c7d0 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.475+0000 7f4e0a7fc700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4df80383c0 0x7f4df803a880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.476+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4e0c19a290 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.478+0000 7f4e0a7fc700 1 --2- 192.168.123.105:0/4122827266 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4df80383c0 0x7f4df803a880 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4dfc006fd0 tx=0x7f4dfc006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.478+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4df4018940 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.584+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7f4e0c0623c0 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.623+0000 7f4e03fff700 1 -- 192.168.123.105:0/4122827266 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7f4df404a030 con 0x7f4e0c1a0bd0 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4df80383c0 msgr2=0x7f4df803a880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.669 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4df80383c0 0x7f4df803a880 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4dfc006fd0 tx=0x7f4dfc006e40 comp rx=0 tx=0).stop 
2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 msgr2=0x7f4e0c1a1010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 0x7f4e0c1a1010 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f4df400bdb0 tx=0x7f4df400be90 comp rx=0 tx=0).stop 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 shutdown_connections 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4df80383c0 0x7f4df803a880 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 --2- 192.168.123.105:0/4122827266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4e0c1a0bd0 0x7f4e0c1a1010 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 >> 192.168.123.105:0/4122827266 conn(0x7f4e0c100bd0 msgr2=0x7f4e0c1071d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 
shutdown_connections 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.625+0000 7f4e114b6700 1 -- 192.168.123.105:0/4122827266 wait complete. 2026-03-10T08:50:31.670 INFO:teuthology.orchestra.run.vm05.stdout:Enabling the dashboard module... 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: Saving service node-exporter spec with placement * 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: Saving service alertmanager spec with placement count:1 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2297424064' entity='client.admin' 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/4122827266' entity='client.admin' 2026-03-10T08:50:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:32 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/1292032290' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-10T08:50:32.951 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.781+0000 7f9c5bf98700 1 Processor -- start 2026-03-10T08:50:32.951 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.781+0000 7f9c5bf98700 1 -- start start 2026-03-10T08:50:32.951 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c5bf98700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:32.951 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c5bf98700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c54079ec0 con 0x7f9c54104fb0 2026-03-10T08:50:32.951 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c59d34700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:32.951 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c59d34700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53766/0 (socket says 192.168.123.105:53766) 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c59d34700 1 -- 192.168.123.105:0/3883107327 learned_addr learned my addr 
192.168.123.105:0/3883107327 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c59d34700 1 -- 192.168.123.105:0/3883107327 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c54107920 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c59d34700 1 --2- 192.168.123.105:0/3883107327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541073e0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f9c50009a90 tx=0x7f9c50009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=591f240cb27474c6 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c58d32700 1 -- 192.168.123.105:0/3883107327 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9c50004030 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.782+0000 7f9c58d32700 1 -- 192.168.123.105:0/3883107327 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9c5000b7e0 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 -- 192.168.123.105:0/3883107327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 msgr2=0x7f9c541073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 --2- 192.168.123.105:0/3883107327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541073e0 secure :-1 s=READY pgs=65 
cs=0 l=1 rev1=1 crypto rx=0x7f9c50009a90 tx=0x7f9c50009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 -- 192.168.123.105:0/3883107327 shutdown_connections 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 --2- 192.168.123.105:0/3883107327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541073e0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 -- 192.168.123.105:0/3883107327 >> 192.168.123.105:0/3883107327 conn(0x7f9c54100bf0 msgr2=0x7f9c54103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 -- 192.168.123.105:0/3883107327 shutdown_connections 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.783+0000 7f9c5bf98700 1 -- 192.168.123.105:0/3883107327 wait complete. 
2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c5bf98700 1 Processor -- start 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c5bf98700 1 -- start start 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c5bf98700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541a0e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c5bf98700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c54079ec0 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c59d34700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541a0e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c59d34700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541a0e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53774/0 (socket says 192.168.123.105:53774) 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c59d34700 1 -- 192.168.123.105:0/1292032290 learned_addr learned my addr 192.168.123.105:0/1292032290 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:32.952 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c59d34700 1 -- 192.168.123.105:0/1292032290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c50009740 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c59d34700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541a0e50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f9c5000bd80 tx=0x7f9c5000be60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9c50003900 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.784+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c541a1390 con 0x7f9c54104fb0 2026-03-10T08:50:32.952 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.785+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c541a17b0 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.785+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9c500044c0 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.785+0000 7f9c46ffd700 1 
-- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9c50024d80 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.785+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f9c5001b460 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.785+0000 7f9c46ffd700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9c400383e0 0x7f9c4003a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.785+0000 7f9c59533700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9c400383e0 0x7f9c4003a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.786+0000 7f9c59533700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9c400383e0 0x7f9c4003a8a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f9c48006fd0 tx=0x7f9c48006e40 comp rx=0 tx=0).ready entity=mgr.14118 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.786+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f9c5004cf90 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:31.786+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c38005320 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.789+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9c5001f070 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:31.920+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7f9c380059f0 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.908+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 8) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f9c500246b0 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.913+0000 7f9c46ffd700 1 -- 192.168.123.105:0/1292032290 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v8) v1 ==== 88+0+0 (secure 0 0 0) 0x7f9c5002dd00 con 0x7f9c54104fb0 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9c400383e0 msgr2=0x7f9c4003a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 
--2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9c400383e0 0x7f9c4003a8a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f9c48006fd0 tx=0x7f9c48006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 msgr2=0x7f9c541a0e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541a0e50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f9c5000bd80 tx=0x7f9c5000be60 comp rx=0 tx=0).stop 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 shutdown_connections 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9c400383e0 0x7f9c4003a8a0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 --2- 192.168.123.105:0/1292032290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c54104fb0 0x7f9c541a0e50 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 >> 192.168.123.105:0/1292032290 conn(0x7f9c54100bf0 
msgr2=0x7f9c54103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 shutdown_connections 2026-03-10T08:50:32.953 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:32.916+0000 7f9c5bf98700 1 -- 192.168.123.105:0/1292032290 wait complete. 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 8, 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "active_name": "vm05.rxwgjc", 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.127+0000 7f2363245700 1 Processor -- start 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.127+0000 7f2363245700 1 -- start start 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.127+0000 7f2363245700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c1069e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.127+0000 7f2363245700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f235c106fb0 con 0x7f235c1065c0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:33.128+0000 7f2360fe1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c1069e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.128+0000 7f2360fe1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c1069e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53810/0 (socket says 192.168.123.105:53810) 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.128+0000 7f2360fe1700 1 -- 192.168.123.105:0/974136417 learned_addr learned my addr 192.168.123.105:0/974136417 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.128+0000 7f2360fe1700 1 -- 192.168.123.105:0/974136417 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f235c1077c0 con 0x7f235c1065c0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.129+0000 7f2360fe1700 1 --2- 192.168.123.105:0/974136417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c1069e0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f2350009a90 tx=0x7f2350009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=91bc39fbf21717e0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.129+0000 7f235b7fe700 1 -- 192.168.123.105:0/974136417 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2350004030 con 
0x7f235c1065c0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.129+0000 7f235b7fe700 1 -- 192.168.123.105:0/974136417 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f235000b7e0 con 0x7f235c1065c0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.129+0000 7f235b7fe700 1 -- 192.168.123.105:0/974136417 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2350003ae0 con 0x7f235c1065c0 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.129+0000 7f2363245700 1 -- 192.168.123.105:0/974136417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 msgr2=0x7f235c1069e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:33.313 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.129+0000 7f2363245700 1 --2- 192.168.123.105:0/974136417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c1069e0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f2350009a90 tx=0x7f2350009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:33.314 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 -- 192.168.123.105:0/974136417 shutdown_connections 2026-03-10T08:50:33.314 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 --2- 192.168.123.105:0/974136417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c1069e0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 -- 192.168.123.105:0/974136417 >> 192.168.123.105:0/974136417 
conn(0x7f235c101ba0 msgr2=0x7f235c103fc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 -- 192.168.123.105:0/974136417 shutdown_connections 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 -- 192.168.123.105:0/974136417 wait complete. 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 Processor -- start 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.131+0000 7f2363245700 1 -- start start 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.132+0000 7f2363245700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c19a2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.132+0000 7f2363245700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f235c19a7f0 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.132+0000 7f2360fe1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c19a2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.132+0000 7f2360fe1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c19a2b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53826/0 (socket says 192.168.123.105:53826) 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.132+0000 7f2360fe1700 1 -- 192.168.123.105:0/1684121940 learned_addr learned my addr 192.168.123.105:0/1684121940 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.133+0000 7f2360fe1700 1 -- 192.168.123.105:0/1684121940 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2350009740 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.133+0000 7f2360fe1700 1 --2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c19a2b0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f2350009710 tx=0x7f235000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.135+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f23500041a0 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.135+0000 7f2363245700 1 -- 192.168.123.105:0/1684121940 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f235c19a9f0 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.135+0000 7f2363245700 1 -- 192.168.123.105:0/1684121940 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f235c19ae10 con 0x7f235c1065c0 
2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.136+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2350004300 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.136+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f23500114a0 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.136+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f2350011710 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.136+0000 7f2359ffb700 1 --2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f234c038490 0x7f234c03a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.136+0000 7f235bfff700 1 -- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f234c038490 msgr2=0x7f234c03a950 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.137+0000 7f235bfff700 1 --2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f234c038490 0x7f234c03a950 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:50:33.315 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.137+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f235004d1a0 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.137+0000 7f2363245700 1 -- 192.168.123.105:0/1684121940 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2340005320 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.140+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f23500119c0 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.276+0000 7f2363245700 1 -- 192.168.123.105:0/1684121940 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f2340006200 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.277+0000 7f2359ffb700 1 -- 192.168.123.105:0/1684121940 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v8) v1 ==== 56+0+98 (secure 0 0 0) 0x7f2350018b40 con 0x7f235c1065c0 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 -- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f234c038490 msgr2=0x7f234c03a950 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 
--2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f234c038490 0x7f234c03a950 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 -- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 msgr2=0x7f235c19a2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 --2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c19a2b0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f2350009710 tx=0x7f235000bfa0 comp rx=0 tx=0).stop 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 -- 192.168.123.105:0/1684121940 shutdown_connections 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 --2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f234c038490 0x7f234c03a950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 --2- 192.168.123.105:0/1684121940 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f235c1065c0 0x7f235c19a2b0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 -- 192.168.123.105:0/1684121940 >> 192.168.123.105:0/1684121940 conn(0x7f235c101ba0 msgr2=0x7f235c1027c0 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 -- 192.168.123.105:0/1684121940 shutdown_connections 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.283+0000 7f23477fe700 1 -- 192.168.123.105:0/1684121940 wait complete. 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for the mgr to restart... 2026-03-10T08:50:33.315 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr epoch 8... 2026-03-10T08:50:34.166 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:33 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1292032290' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished 2026-03-10T08:50:34.167 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:33 vm05 ceph-mon[49713]: mgrmap e8: vm05.rxwgjc(active, since 9s) 2026-03-10T08:50:34.167 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:33 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/1684121940' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T08:50:37.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: Active manager daemon vm05.rxwgjc restarted 2026-03-10T08:50:37.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: Activating manager daemon vm05.rxwgjc 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: osdmap e3: 0 total, 0 up, 0 in 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: mgrmap e9: vm05.rxwgjc(active, starting, since 0.00448935s) 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: Manager daemon vm05.rxwgjc is now available 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:50:38.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:37 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 10, 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.476+0000 7f4afda1b700 1 Processor -- start 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.476+0000 7f4afda1b700 1 -- start start 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.476+0000 7f4afda1b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 0x7f4af80721c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:38.695 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.476+0000 7f4afda1b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4af8072700 con 0x7f4af8071da0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.477+0000 7f4af6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 0x7f4af80721c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.477+0000 7f4af6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 0x7f4af80721c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53834/0 (socket says 192.168.123.105:53834) 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.477+0000 7f4af6ffd700 1 -- 192.168.123.105:0/2834243942 learned_addr learned my addr 192.168.123.105:0/2834243942 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.477+0000 7f4af6ffd700 1 -- 192.168.123.105:0/2834243942 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4af8072840 con 0x7f4af8071da0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.477+0000 7f4af6ffd700 1 --2- 192.168.123.105:0/2834243942 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 0x7f4af80721c0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f4ae8009a90 tx=0x7f4ae8009da0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=591db86077f20f67 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4af5ffb700 1 -- 192.168.123.105:0/2834243942 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4ae8004030 con 0x7f4af8071da0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4af5ffb700 1 -- 192.168.123.105:0/2834243942 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4ae800b7e0 con 0x7f4af8071da0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4afda1b700 1 -- 192.168.123.105:0/2834243942 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 msgr2=0x7f4af80721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4afda1b700 1 --2- 192.168.123.105:0/2834243942 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 0x7f4af80721c0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f4ae8009a90 tx=0x7f4ae8009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4afda1b700 1 -- 192.168.123.105:0/2834243942 shutdown_connections 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4afda1b700 1 --2- 192.168.123.105:0/2834243942 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af8071da0 0x7f4af80721c0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.481+0000 7f4afda1b700 1 -- 
192.168.123.105:0/2834243942 >> 192.168.123.105:0/2834243942 conn(0x7f4af806d400 msgr2=0x7f4af806f840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.482+0000 7f4afda1b700 1 -- 192.168.123.105:0/2834243942 shutdown_connections 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.482+0000 7f4afda1b700 1 -- 192.168.123.105:0/2834243942 wait complete. 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.482+0000 7f4afda1b700 1 Processor -- start 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.482+0000 7f4afda1b700 1 -- start start 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.482+0000 7f4afda1b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af81a9290 0x7f4af81a96b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.482+0000 7f4afda1b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4af8072700 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.483+0000 7f4af6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af81a9290 0x7f4af81a96b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.483+0000 7f4af6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af81a9290 0x7f4af81a96b0 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53844/0 (socket says 192.168.123.105:53844) 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.483+0000 7f4af6ffd700 1 -- 192.168.123.105:0/3141812439 learned_addr learned my addr 192.168.123.105:0/3141812439 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.483+0000 7f4af6ffd700 1 -- 192.168.123.105:0/3141812439 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ae8009740 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.483+0000 7f4af6ffd700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af81a9290 0x7f4af81a96b0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f4ae800bdb0 tx=0x7f4ae800be90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.483+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4ae8003750 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.484+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4af81a9bf0 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.484+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) 
v3 -- 0x7f4af807aff0 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.485+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4ae8004440 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.485+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4ae801aa40 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.485+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f4ae801ad10 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.485+0000 7f4afca19700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.485+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f4ae003af60 con 0x7f4ae0038390 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.485+0000 7f4af67fc700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 msgr2=0x7f4ae003a850 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T08:50:33.485+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.486+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f4ae8011940 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.685+0000 7f4af67fc700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 msgr2=0x7f4ae003a850 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:33.686+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:34.086+0000 7f4af67fc700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 msgr2=0x7f4ae003a850 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:34.086+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._fault waiting 0.800000 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:34.887+0000 7f4af67fc700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 msgr2=0x7f4ae003a850 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:34.887+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:36.489+0000 7f4af67fc700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 msgr2=0x7f4ae003a850 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:36.489+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:37.637+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mgrmap(e 9) v1 ==== 45058+0+0 (secure 0 0 0) 0x7f4ae801ad10 con 0x7f4af81a9290 2026-03-10T08:50:38.695 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:37.637+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 
msgr2=0x7f4ae003a850 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:37.637+0000 7f4afca19700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.642+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 10) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f4ae801ad10 con 0x7f4af81a9290 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.642+0000 7f4afca19700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.642+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f4ae003af60 con 0x7f4ae0038390 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.643+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.644+0000 7f4af67fc700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 
0x7f4ae003a850 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f4aec003a10 tx=0x7f4aec0092b0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.644+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f4ae003af60 con 0x7f4ae0038390 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.647+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f4af810ec20 con 0x7f4ae0038390 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.647+0000 7f4afca19700 1 -- 192.168.123.105:0/3141812439 <== mgr.14162 v2:192.168.123.105:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7f4ae003af60 con 0x7f4ae0038390 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 msgr2=0x7f4ae003a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f4aec003a10 tx=0x7f4aec0092b0 comp rx=0 tx=0).stop 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f4af81a9290 msgr2=0x7f4af81a96b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af81a9290 0x7f4af81a96b0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f4ae800bdb0 tx=0x7f4ae800be90 comp rx=0 tx=0).stop 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 shutdown_connections 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ae0038390 0x7f4ae003a850 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 --2- 192.168.123.105:0/3141812439 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4af81a9290 0x7f4af81a96b0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.650+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 >> 192.168.123.105:0/3141812439 conn(0x7f4af806d400 msgr2=0x7f4af806df60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.651+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 shutdown_connections 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.651+0000 7f4afda1b700 1 -- 192.168.123.105:0/3141812439 wait complete. 
2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:mgr epoch 8 is available 2026-03-10T08:50:38.696 INFO:teuthology.orchestra.run.vm05.stdout:Generating a dashboard self-signed certificate... 2026-03-10T08:50:39.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-10T08:50:39.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.834+0000 7f8853fff700 1 Processor -- start 2026-03-10T08:50:39.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.834+0000 7f8853fff700 1 -- start start 2026-03-10T08:50:39.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.834+0000 7f8853fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f8854072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.834+0000 7f8853fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88540727f0 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8852ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f8854072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8852ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f8854072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53930/0 (socket says 192.168.123.105:53930) 
2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8852ffd700 1 -- 192.168.123.105:0/567663904 learned_addr learned my addr 192.168.123.105:0/567663904 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8852ffd700 1 -- 192.168.123.105:0/567663904 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f885410ddb0 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8852ffd700 1 --2- 192.168.123.105:0/567663904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f8854072220 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f8844009a90 tx=0x7f8844009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=cbcbd47ab6a97c71 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8851ffb700 1 -- 192.168.123.105:0/567663904 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8844004030 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8851ffb700 1 -- 192.168.123.105:0/567663904 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f884400b7e0 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.835+0000 7f8851ffb700 1 -- 192.168.123.105:0/567663904 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8844003ae0 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.836+0000 7f8853fff700 
1 -- 192.168.123.105:0/567663904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 msgr2=0x7f8854072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.836+0000 7f8853fff700 1 --2- 192.168.123.105:0/567663904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f8854072220 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f8844009a90 tx=0x7f8844009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.837+0000 7f8853fff700 1 -- 192.168.123.105:0/567663904 shutdown_connections 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.837+0000 7f8853fff700 1 --2- 192.168.123.105:0/567663904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f8854072220 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.837+0000 7f8853fff700 1 -- 192.168.123.105:0/567663904 >> 192.168.123.105:0/567663904 conn(0x7f885406d320 msgr2=0x7f885406f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.837+0000 7f8853fff700 1 -- 192.168.123.105:0/567663904 shutdown_connections 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.838+0000 7f8853fff700 1 -- 192.168.123.105:0/567663904 wait complete. 
2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.838+0000 7f8853fff700 1 Processor -- start 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.838+0000 7f8853fff700 1 -- start start 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.838+0000 7f8853fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f885411d320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.838+0000 7f8853fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f885411d860 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.839+0000 7f8852ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f885411d320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.839+0000 7f8852ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f885411d320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53932/0 (socket says 192.168.123.105:53932) 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.839+0000 7f8852ffd700 1 -- 192.168.123.105:0/2844997271 learned_addr learned my addr 192.168.123.105:0/2844997271 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:39.027 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.839+0000 7f8852ffd700 1 -- 192.168.123.105:0/2844997271 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8844009740 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.840+0000 7f8852ffd700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f885411d320 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f8844009710 tx=0x7f8844004080 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.840+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8844004220 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.840+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8844004380 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.840+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f88440114a0 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.840+0000 7f8853fff700 1 -- 192.168.123.105:0/2844997271 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f885411da60 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.840+0000 7f8853fff700 1 
-- 192.168.123.105:0/2844997271 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f885411bb40 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.841+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 10) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f8844028020 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.841+0000 7f883bfff700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f883c038320 0x7f883c03a7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.841+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f884404bd90 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.841+0000 7f8853fff700 1 -- 192.168.123.105:0/2844997271 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f885404f000 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.842+0000 7f88527fc700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f883c038320 0x7f883c03a7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.843+0000 7f88527fc700 1 --2- 192.168.123.105:0/2844997271 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f883c038320 0x7f883c03a7e0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f8848006fd0 tx=0x7f8848006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.844+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8844019e20 con 0x7f8854071e00 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.948+0000 7f8853fff700 1 -- 192.168.123.105:0/2844997271 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7f885411c270 con 0x7f883c038320 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.984+0000 7f883bfff700 1 -- 192.168.123.105:0/2844997271 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f885411c270 con 0x7f883c038320 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 -- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f883c038320 msgr2=0x7f883c03a7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f883c038320 0x7f883c03a7e0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f8848006fd0 tx=0x7f8848006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:39.027 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 -- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 msgr2=0x7f885411d320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f885411d320 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f8844009710 tx=0x7f8844004080 comp rx=0 tx=0).stop 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 -- 192.168.123.105:0/2844997271 shutdown_connections 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f883c038320 0x7f883c03a7e0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 --2- 192.168.123.105:0/2844997271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8854071e00 0x7f885411d320 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 -- 192.168.123.105:0/2844997271 >> 192.168.123.105:0/2844997271 conn(0x7f885406d320 msgr2=0x7f885406deb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.986+0000 7f8839ffb700 1 -- 192.168.123.105:0/2844997271 shutdown_connections 
2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:38.987+0000 7f8839ffb700 1 -- 192.168.123.105:0/2844997271 wait complete. 2026-03-10T08:50:39.027 INFO:teuthology.orchestra.run.vm05.stdout:Creating initial admin user... 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$tIBACf8Q.d1Rav4xTzH2M.PuONw68QVFPetFXlAm6kHTVfhw3oyIC", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773132639, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.161+0000 7f7922eeb700 1 Processor -- start 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.161+0000 7f7922eeb700 1 -- start start 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.161+0000 7f7922eeb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 0x7f791c0725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.161+0000 7f7922eeb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f791c072bc0 con 0x7f791c0721d0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7921ee9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 0x7f791c0725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7921ee9700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 0x7f791c0725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53936/0 (socket says 192.168.123.105:53936) 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7921ee9700 1 -- 192.168.123.105:0/3512775646 learned_addr learned my addr 192.168.123.105:0/3512775646 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7921ee9700 1 -- 192.168.123.105:0/3512775646 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f791c10e1c0 con 0x7f791c0721d0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7921ee9700 1 --2- 192.168.123.105:0/3512775646 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 0x7f791c0725f0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f79180075a0 tx=0x7f791800c050 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c28f46ef0eba2625 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7920ee7700 1 -- 192.168.123.105:0/3512775646 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f791800f070 con 0x7f791c0721d0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.164+0000 7f7920ee7700 1 -- 192.168.123.105:0/3512775646 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7918004510 con 0x7f791c0721d0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 
7f7922eeb700 1 -- 192.168.123.105:0/3512775646 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 msgr2=0x7f791c0725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 7f7922eeb700 1 --2- 192.168.123.105:0/3512775646 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 0x7f791c0725f0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f79180075a0 tx=0x7f791800c050 comp rx=0 tx=0).stop 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 7f7922eeb700 1 -- 192.168.123.105:0/3512775646 shutdown_connections 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 7f7922eeb700 1 --2- 192.168.123.105:0/3512775646 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c0721d0 0x7f791c0725f0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 7f7922eeb700 1 -- 192.168.123.105:0/3512775646 >> 192.168.123.105:0/3512775646 conn(0x7f791c06d320 msgr2=0x7f791c06f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 7f7922eeb700 1 -- 192.168.123.105:0/3512775646 shutdown_connections 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.165+0000 7f7922eeb700 1 -- 192.168.123.105:0/3512775646 wait complete. 
2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7922eeb700 1 Processor -- start 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7922eeb700 1 -- start start 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7922eeb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 0x7f791c1a0ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7922eeb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f791800cd10 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7921ee9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 0x7f791c1a0ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7921ee9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 0x7f791c1a0ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53940/0 (socket says 192.168.123.105:53940) 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.166+0000 7f7921ee9700 1 -- 192.168.123.105:0/657194862 learned_addr learned my addr 192.168.123.105:0/657194862 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:39.487 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.167+0000 7f7921ee9700 1 -- 192.168.123.105:0/657194862 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7918007250 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.167+0000 7f7921ee9700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 0x7f791c1a0ed0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f7918016040 tx=0x7f791800a870 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.167+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f791800f040 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.167+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f791801e070 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.168+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7918013430 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.168+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f791c1a1410 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.168+0000 7f7922eeb700 1 -- 
192.168.123.105:0/657194862 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f791c07aff0 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.168+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 10) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f7918008040 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.168+0000 7f7912ffd700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7908037fa0 0x7f790803a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.168+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f791802a020 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.169+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f791c04f000 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.169+0000 7f79216e8700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7908037fa0 0x7f790803a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.171+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7918018070 con 0x7f791c1a0ab0 2026-03-10T08:50:39.487 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.172+0000 7f79216e8700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7908037fa0 0x7f790803a460 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f790c006fd0 tx=0x7f790c006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.283+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7f791c0621e0 con 0x7f7908037fa0 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.455+0000 7f7912ffd700 1 -- 192.168.123.105:0/657194862 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7f791c0621e0 con 0x7f7908037fa0 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.457+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7908037fa0 msgr2=0x7f790803a460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.457+0000 7f7922eeb700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7908037fa0 0x7f790803a460 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f790c006fd0 tx=0x7f790c006e40 comp rx=0 tx=0).stop 
2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.457+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 msgr2=0x7f791c1a0ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.457+0000 7f7922eeb700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 0x7f791c1a0ed0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f7918016040 tx=0x7f791800a870 comp rx=0 tx=0).stop 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.458+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 shutdown_connections 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.458+0000 7f7922eeb700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7908037fa0 0x7f790803a460 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.458+0000 7f7922eeb700 1 --2- 192.168.123.105:0/657194862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f791c1a0ab0 0x7f791c1a0ed0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.458+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 >> 192.168.123.105:0/657194862 conn(0x7f791c06d320 msgr2=0x7f791c06e090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.458+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 
shutdown_connections 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.458+0000 7f7922eeb700 1 -- 192.168.123.105:0/657194862 wait complete. 2026-03-10T08:50:39.488 INFO:teuthology.orchestra.run.vm05.stdout:Fetching dashboard port number... 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:38] ENGINE Bus STARTING 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:38] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:38] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: [10/Mar/2026:08:50:38] ENGINE Bus STARTED 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: mgrmap e10: vm05.rxwgjc(active, since 1.008s) 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: from='client.14166 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: from='client.14166 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: from='client.14174 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: 
from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:39.539 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:39 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 8443 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.601+0000 7ff2289ce700 1 Processor -- start 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff2289ce700 1 -- start start 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff2289ce700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff2289ce700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff220074720 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff22676a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff22676a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53942/0 (socket says 192.168.123.105:53942) 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff22676a700 1 -- 192.168.123.105:0/3411785515 learned_addr learned my addr 192.168.123.105:0/3411785515 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff22676a700 1 -- 192.168.123.105:0/3411785515 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff220107920 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff22676a700 1 --2- 192.168.123.105:0/3411785515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201073e0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7ff218009a90 tx=0x7ff218009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=518f309fd9474882 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.602+0000 7ff225768700 1 -- 192.168.123.105:0/3411785515 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff218004030 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff225768700 1 -- 192.168.123.105:0/3411785515 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff21800b7e0 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 -- 192.168.123.105:0/3411785515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 
msgr2=0x7ff2201073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 --2- 192.168.123.105:0/3411785515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201073e0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7ff218009a90 tx=0x7ff218009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 -- 192.168.123.105:0/3411785515 shutdown_connections 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 --2- 192.168.123.105:0/3411785515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201073e0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 -- 192.168.123.105:0/3411785515 >> 192.168.123.105:0/3411785515 conn(0x7ff220100bd0 msgr2=0x7ff220103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 -- 192.168.123.105:0/3411785515 shutdown_connections 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.603+0000 7ff2289ce700 1 -- 192.168.123.105:0/3411785515 wait complete. 
2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2289ce700 1 Processor -- start 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2289ce700 1 -- start start 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2289ce700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201a0b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2289ce700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff220074720 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff22676a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201a0b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff22676a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201a0b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53954/0 (socket says 192.168.123.105:53954) 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff22676a700 1 -- 192.168.123.105:0/2169525058 learned_addr learned my addr 192.168.123.105:0/2169525058 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:39.758 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff22676a700 1 -- 192.168.123.105:0/2169525058 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff218009740 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff22676a700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201a0b00 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7ff218003a10 tx=0x7ff218003e50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2137fe700 1 -- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff218004330 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2201a1040 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2201a1460 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2137fe700 1 -- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff218004490 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.604+0000 7ff2137fe700 1 
-- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff218011620 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.605+0000 7ff2137fe700 1 -- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 10) v1 ==== 45185+0+0 (secure 0 0 0) 0x7ff218011780 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.605+0000 7ff2137fe700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff20c038300 0x7ff20c03a7c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.605+0000 7ff225f69700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff20c038300 0x7ff20c03a7c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.605+0000 7ff225f69700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff20c038300 0x7ff20c03a7c0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7ff214006fd0 tx=0x7ff214006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.606+0000 7ff2137fe700 1 -- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff21804bfe0 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:39.606+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff2040052f0 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.609+0000 7ff2137fe700 1 -- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff218011a80 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.711+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7ff204005f40 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.711+0000 7ff2137fe700 1 -- 192.168.123.105:0/2169525058 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7ff218018350 con 0x7ff220104fb0 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff20c038300 msgr2=0x7ff20c03a7c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff20c038300 0x7ff20c03a7c0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7ff214006fd0 tx=0x7ff214006e40 comp rx=0 tx=0).stop 
2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 msgr2=0x7ff2201a0b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:39.758 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201a0b00 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7ff218003a10 tx=0x7ff218003e50 comp rx=0 tx=0).stop 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 shutdown_connections 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff20c038300 0x7ff20c03a7c0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 --2- 192.168.123.105:0/2169525058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff220104fb0 0x7ff2201a0b00 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 >> 192.168.123.105:0/2169525058 conn(0x7ff220100bd0 msgr2=0x7ff220103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 
shutdown_connections 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.714+0000 7ff2289ce700 1 -- 192.168.123.105:0/2169525058 wait complete. 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout:Ceph Dashboard is now available at: 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout: URL: https://vm05.local:8443/ 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout: User: admin 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout: Password: gkwtl8xkkj 2026-03-10T08:50:39.759 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:39.760 INFO:teuthology.orchestra.run.vm05.stdout:Saving cluster configuration to /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config directory 2026-03-10T08:50:39.760 INFO:teuthology.orchestra.run.vm05.stdout:Enabling autotune for osd_memory_target 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.878+0000 7f59d35f3700 1 Processor -- start 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.878+0000 7f59d35f3700 1 -- start start 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.879+0000 7f59d35f3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.879+0000 7f59d35f3700 1 -- --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f59cc074720 con 0x7f59cc104fb0 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.879+0000 7f59d138f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.879+0000 7f59d138f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44424/0 (socket says 192.168.123.105:44424) 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.879+0000 7f59d138f700 1 -- 192.168.123.105:0/2539508858 learned_addr learned my addr 192.168.123.105:0/2539508858 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.880+0000 7f59d138f700 1 -- 192.168.123.105:0/2539508858 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f59cc107920 con 0x7f59cc104fb0 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.880+0000 7f59d138f700 1 --2- 192.168.123.105:0/2539508858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1073e0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f59c0009a90 tx=0x7f59c0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=212c453cbe3a9e50 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:40.039 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.880+0000 7f59bffff700 1 -- 192.168.123.105:0/2539508858 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f59c0004030 con 0x7f59cc104fb0 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.880+0000 7f59bffff700 1 -- 192.168.123.105:0/2539508858 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f59c000b7e0 con 0x7f59cc104fb0 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.880+0000 7f59bffff700 1 -- 192.168.123.105:0/2539508858 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f59c0003b30 con 0x7f59cc104fb0 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 -- 192.168.123.105:0/2539508858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 msgr2=0x7f59cc1073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 --2- 192.168.123.105:0/2539508858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1073e0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f59c0009a90 tx=0x7f59c0009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 -- 192.168.123.105:0/2539508858 shutdown_connections 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 --2- 192.168.123.105:0/2539508858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1073e0 unknown :-1 s=CLOSED pgs=86 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 -- 192.168.123.105:0/2539508858 >> 192.168.123.105:0/2539508858 conn(0x7f59cc100bd0 msgr2=0x7f59cc103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 -- 192.168.123.105:0/2539508858 shutdown_connections 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.881+0000 7f59d35f3700 1 -- 192.168.123.105:0/2539508858 wait complete. 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d35f3700 1 Processor -- start 2026-03-10T08:50:40.039 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d35f3700 1 -- start start 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d35f3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1a09e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d35f3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f59cc1a0f20 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d138f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1a09e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:40.040 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d138f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1a09e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44428/0 (socket says 192.168.123.105:44428) 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.882+0000 7f59d138f700 1 -- 192.168.123.105:0/2813585108 learned_addr learned my addr 192.168.123.105:0/2813585108 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59d138f700 1 -- 192.168.123.105:0/2813585108 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f59c0009740 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59d138f700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1a09e0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f59c000bef0 tx=0x7f59c000bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f59c00040f0 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f59c0004250 con 0x7f59cc104fb0 
2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f59c00115c0 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f59cc1a1120 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.883+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f59cc1a1540 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.884+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 10) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f59c0011720 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.884+0000 7f59be7fc700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59b80382d0 0x7f59b803a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.884+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f59c004cae0 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.884+0000 7f59d0b8e700 1 --2- 192.168.123.105:0/2813585108 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59b80382d0 0x7f59b803a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.884+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f59cc19a290 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.885+0000 7f59d0b8e700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59b80382d0 0x7f59b803a790 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f59c8006fd0 tx=0x7f59c8006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.887+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f59c001e070 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.988+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f59cc0623c0 con 0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.988+0000 7f59be7fc700 1 -- 192.168.123.105:0/2813585108 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f59c004b0c0 con
0x7f59cc104fb0 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.990+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59b80382d0 msgr2=0x7f59b803a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.990+0000 7f59d35f3700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59b80382d0 0x7f59b803a790 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f59c8006fd0 tx=0x7f59c8006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 msgr2=0x7f59cc1a09e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1a09e0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f59c000bef0 tx=0x7f59c000bfd0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 shutdown_connections 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59b80382d0 0x7f59b803a790 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:39.991+0000 7f59d35f3700 1 --2- 192.168.123.105:0/2813585108 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59cc104fb0 0x7f59cc1a09e0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 >> 192.168.123.105:0/2813585108 conn(0x7f59cc100bd0 msgr2=0x7f59cc103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 shutdown_connections 2026-03-10T08:50:40.040 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:39.991+0000 7f59d35f3700 1 -- 192.168.123.105:0/2813585108 wait complete. 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.151+0000 7f783d56a700 1 Processor -- start 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.151+0000 7f783d56a700 1 -- start start 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.151+0000 7f783d56a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 0x7f78381073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.151+0000 7f783d56a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7838074720 con 0x7f7838104fb0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.152+0000 7f7836ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 0x7f78381073e0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.152+0000 7f7836ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 0x7f78381073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44432/0 (socket says 192.168.123.105:44432) 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.152+0000 7f7836ffd700 1 -- 192.168.123.105:0/1783482901 learned_addr learned my addr 192.168.123.105:0/1783482901 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.152+0000 7f7836ffd700 1 -- 192.168.123.105:0/1783482901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7838107920 con 0x7f7838104fb0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.153+0000 7f7836ffd700 1 --2- 192.168.123.105:0/1783482901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 0x7f78381073e0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7820009a90 tx=0x7f7820009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4127936795dfdbd9 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.153+0000 7f7835ffb700 1 -- 192.168.123.105:0/1783482901 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7820004030 con 0x7f7838104fb0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.153+0000 7f7835ffb700 1 -- 
192.168.123.105:0/1783482901 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f782000b7e0 con 0x7f7838104fb0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.153+0000 7f783d56a700 1 -- 192.168.123.105:0/1783482901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 msgr2=0x7f78381073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.153+0000 7f783d56a700 1 --2- 192.168.123.105:0/1783482901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 0x7f78381073e0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7820009a90 tx=0x7f7820009da0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 -- 192.168.123.105:0/1783482901 shutdown_connections 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 --2- 192.168.123.105:0/1783482901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7838104fb0 0x7f78381073e0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 -- 192.168.123.105:0/1783482901 >> 192.168.123.105:0/1783482901 conn(0x7f7838100bd0 msgr2=0x7f7838103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 -- 192.168.123.105:0/1783482901 shutdown_connections 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 -- 192.168.123.105:0/1783482901 wait 
complete. 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 Processor -- start 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.154+0000 7f783d56a700 1 -- start start 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f783d56a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 0x7f78381a1010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f7836ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 0x7f78381a1010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f7836ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 0x7f78381a1010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44442/0 (socket says 192.168.123.105:44442) 2026-03-10T08:50:40.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f7836ffd700 1 -- 192.168.123.105:0/2742705317 learned_addr learned my addr 192.168.123.105:0/2742705317 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f783d56a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7838074720 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f7836ffd700 1 -- 192.168.123.105:0/2742705317 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7820009740 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f7836ffd700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 0x7f78381a1010 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7820009130 tx=0x7f782000bfc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f782ffff700 1 -- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7820003e80 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.155+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f78381a1550 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.156+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f78381a4190 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.156+0000 7f782ffff700 1 -- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f78200044c0 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.156+0000 7f782ffff700 1 
-- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f782001ad50 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.156+0000 7f782ffff700 1 -- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 10) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f7820011420 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.156+0000 7f782ffff700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7824038310 0x7f782403a7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.157+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7818005320 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.157+0000 7f78367fc700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7824038310 0x7f782403a7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.157+0000 7f78367fc700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7824038310 0x7f782403a7d0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f7828006fd0 tx=0x7f7828006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:40.367 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.157+0000 7f782ffff700 1 -- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f782004b640 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.160+0000 7f782ffff700 1 -- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f78200116d0 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.316+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f7818005f70 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.319+0000 7f782ffff700 1 -- 192.168.123.105:0/2742705317 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f7820010380 con 0x7f78381a0bd0 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.321+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7824038310 msgr2=0x7f782403a7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.321+0000 7f783d56a700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7824038310 0x7f782403a7d0 secure :-1 s=READY pgs=11 cs=0 l=1
rev1=1 crypto rx=0x7f7828006fd0 tx=0x7f7828006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.321+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 msgr2=0x7f78381a1010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.321+0000 7f783d56a700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 0x7f78381a1010 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7820009130 tx=0x7f782000bfc0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.322+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 shutdown_connections 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.322+0000 7f783d56a700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7824038310 0x7f782403a7d0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.322+0000 7f783d56a700 1 --2- 192.168.123.105:0/2742705317 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78381a0bd0 0x7f78381a1010 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.322+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 >> 192.168.123.105:0/2742705317 conn(0x7f7838100bd0 msgr2=0x7f78381071d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T08:50:40.322+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 shutdown_connections 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T08:50:40.322+0000 7f783d56a700 1 -- 192.168.123.105:0/2742705317 wait complete. 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:Or, if you are only running a single cluster on this host: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: ceph telemetry on 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout:For more information see: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-10T08:50:40.367 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:50:40.367 
INFO:teuthology.orchestra.run.vm05.stdout:Bootstrap complete. 2026-03-10T08:50:40.390 INFO:tasks.cephadm:Fetching config... 2026-03-10T08:50:40.390 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:50:40.390 DEBUG:teuthology.orchestra.run.vm05:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-10T08:50:40.467 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-10T08:50:40.467 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:50:40.467 DEBUG:teuthology.orchestra.run.vm05:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-10T08:50:40.523 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-10T08:50:40.523 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:50:40.523 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/keyring of=/dev/stdout 2026-03-10T08:50:40.586 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:40 vm05 ceph-mon[49713]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:40.586 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:40 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:40.586 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:40 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2169525058' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-10T08:50:40.586 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:40 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2742705317' entity='client.admin' 2026-03-10T08:50:40.590 INFO:tasks.cephadm:Fetching pub ssh key... 
2026-03-10T08:50:40.590 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:50:40.590 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-10T08:50:40.644 INFO:tasks.cephadm:Installing pub ssh key for root users... 2026-03-10T08:50:40.644 DEBUG:teuthology.orchestra.run.vm05:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDwebO0BEKOMDaI+d0wXskLAIqRq/l/8cSZ+vgs/FMPpC6V4pxW3M0JZm9XVo5UNZfcXqRceQofO4PmONsCzDgedN6ePamdwiR2s0k77ul/2J4TZgYsiHrELZymaDhh34J5kCphoFd9HA/NpDaDizb3BftH2VBM3x9PqpHZ5Lwy6rjFsWKZd8OinvSl76WVV7kLEOipKr8p0WRY+Iw7YpP/yaRdviUQvdui5xvw9a1RLjA+z/q4nDWif9BMFPn1DqeGFqAdFCGRU0qf77Ode08FP/MrLHPCpjyegkWd+tVtc9lh8Mjfa7xYbE32yrkOiG2Gi33Btq/eQWAgVAafm1dgQn0z1NJSD5lc+1vy+YgX65Lftw66S/CH4zAXz69FsEd7P53ic3P+RxWssK1kP9jLuGNMPsJ3KHIRrJu2mxVMk0/LfFqB7pUf0ngivfp4iGK/XBhe8iQCpg+f5VrxNRb7FrRQ67gMB38ssyzXF7innINqBJFXfu6q95NwlMpR19k= ceph-16587ed2-1c5e-11f1-90f6-35051361a039' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T08:50:40.717 INFO:teuthology.orchestra.run.vm05.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDwebO0BEKOMDaI+d0wXskLAIqRq/l/8cSZ+vgs/FMPpC6V4pxW3M0JZm9XVo5UNZfcXqRceQofO4PmONsCzDgedN6ePamdwiR2s0k77ul/2J4TZgYsiHrELZymaDhh34J5kCphoFd9HA/NpDaDizb3BftH2VBM3x9PqpHZ5Lwy6rjFsWKZd8OinvSl76WVV7kLEOipKr8p0WRY+Iw7YpP/yaRdviUQvdui5xvw9a1RLjA+z/q4nDWif9BMFPn1DqeGFqAdFCGRU0qf77Ode08FP/MrLHPCpjyegkWd+tVtc9lh8Mjfa7xYbE32yrkOiG2Gi33Btq/eQWAgVAafm1dgQn0z1NJSD5lc+1vy+YgX65Lftw66S/CH4zAXz69FsEd7P53ic3P+RxWssK1kP9jLuGNMPsJ3KHIRrJu2mxVMk0/LfFqB7pUf0ngivfp4iGK/XBhe8iQCpg+f5VrxNRb7FrRQ67gMB38ssyzXF7innINqBJFXfu6q95NwlMpR19k= ceph-16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:40.728 DEBUG:teuthology.orchestra.run.vm08:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDwebO0BEKOMDaI+d0wXskLAIqRq/l/8cSZ+vgs/FMPpC6V4pxW3M0JZm9XVo5UNZfcXqRceQofO4PmONsCzDgedN6ePamdwiR2s0k77ul/2J4TZgYsiHrELZymaDhh34J5kCphoFd9HA/NpDaDizb3BftH2VBM3x9PqpHZ5Lwy6rjFsWKZd8OinvSl76WVV7kLEOipKr8p0WRY+Iw7YpP/yaRdviUQvdui5xvw9a1RLjA+z/q4nDWif9BMFPn1DqeGFqAdFCGRU0qf77Ode08FP/MrLHPCpjyegkWd+tVtc9lh8Mjfa7xYbE32yrkOiG2Gi33Btq/eQWAgVAafm1dgQn0z1NJSD5lc+1vy+YgX65Lftw66S/CH4zAXz69FsEd7P53ic3P+RxWssK1kP9jLuGNMPsJ3KHIRrJu2mxVMk0/LfFqB7pUf0ngivfp4iGK/XBhe8iQCpg+f5VrxNRb7FrRQ67gMB38ssyzXF7innINqBJFXfu6q95NwlMpR19k= ceph-16587ed2-1c5e-11f1-90f6-35051361a039' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T08:50:40.760 INFO:teuthology.orchestra.run.vm08.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDwebO0BEKOMDaI+d0wXskLAIqRq/l/8cSZ+vgs/FMPpC6V4pxW3M0JZm9XVo5UNZfcXqRceQofO4PmONsCzDgedN6ePamdwiR2s0k77ul/2J4TZgYsiHrELZymaDhh34J5kCphoFd9HA/NpDaDizb3BftH2VBM3x9PqpHZ5Lwy6rjFsWKZd8OinvSl76WVV7kLEOipKr8p0WRY+Iw7YpP/yaRdviUQvdui5xvw9a1RLjA+z/q4nDWif9BMFPn1DqeGFqAdFCGRU0qf77Ode08FP/MrLHPCpjyegkWd+tVtc9lh8Mjfa7xYbE32yrkOiG2Gi33Btq/eQWAgVAafm1dgQn0z1NJSD5lc+1vy+YgX65Lftw66S/CH4zAXz69FsEd7P53ic3P+RxWssK1kP9jLuGNMPsJ3KHIRrJu2mxVMk0/LfFqB7pUf0ngivfp4iGK/XBhe8iQCpg+f5VrxNRb7FrRQ67gMB38ssyzXF7innINqBJFXfu6q95NwlMpR19k= ceph-16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:50:40.769 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-10T08:50:40.906 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:50:41.228 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.227+0000 7facd7fff700 1 -- 192.168.123.105:0/3239270436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7facd00a4a10 msgr2=0x7facd00a4e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:41.228 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.227+0000 7facd7fff700 1 --2- 192.168.123.105:0/3239270436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd00a4e30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7facc8009b00 tx=0x7facc8009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 -- 192.168.123.105:0/3239270436 shutdown_connections 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 --2- 192.168.123.105:0/3239270436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd00a4e30 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 -- 192.168.123.105:0/3239270436 >> 192.168.123.105:0/3239270436 conn(0x7facd009fed0 msgr2=0x7facd00a2330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 -- 192.168.123.105:0/3239270436 shutdown_connections 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 -- 192.168.123.105:0/3239270436 wait complete. 
2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 Processor -- start 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 -- start start 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd014ad80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.230+0000 7facd7fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7facc8012070 con 0x7facd00a4a10 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facd6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd014ad80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facd6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd014ad80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44448/0 (socket says 192.168.123.105:44448) 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facd6ffd700 1 -- 192.168.123.105:0/437678197 learned_addr learned my addr 192.168.123.105:0/437678197 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facd6ffd700 1 -- 192.168.123.105:0/437678197 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7facc80097e0 con 0x7facd00a4a10 2026-03-10T08:50:41.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facd6ffd700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd014ad80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7facc80055d0 tx=0x7facc80056b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:41.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7facc801d070 con 0x7facd00a4a10 2026-03-10T08:50:41.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7facc800b810 con 0x7facd00a4a10 2026-03-10T08:50:41.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.231+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7facc800f460 con 0x7facd00a4a10 2026-03-10T08:50:41.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.232+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7facd014b380 con 0x7facd00a4a10 2026-03-10T08:50:41.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.232+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7facd014b7a0 con 0x7facd00a4a10 2026-03-10T08:50:41.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.233+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 11) v1 ==== 45291+0+0 (secure 0 0 0) 0x7facc8003680 con 0x7facd00a4a10 2026-03-10T08:50:41.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.233+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7facd0004080 con 0x7facd00a4a10 2026-03-10T08:50:41.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.233+0000 7facdc9f0700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7facc0038080 0x7facc003a540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:41.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.233+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7facc8051600 con 0x7facd00a4a10 2026-03-10T08:50:41.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.236+0000 7facd67fc700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7facc0038080 0x7facc003a540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:41.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.236+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7facc8052050 con 0x7facd00a4a10 2026-03-10T08:50:41.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.236+0000 7facd67fc700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7facc0038080 0x7facc003a540 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7faccc006fd0 tx=0x7faccc006e40 comp rx=0 
tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:41.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.346+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7facd0004b00 con 0x7facd00a4a10 2026-03-10T08:50:41.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.353+0000 7facdc9f0700 1 -- 192.168.123.105:0/437678197 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7facc8026020 con 0x7facd00a4a10 2026-03-10T08:50:41.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.362+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7facc0038080 msgr2=0x7facc003a540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:41.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.362+0000 7facd7fff700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7facc0038080 0x7facc003a540 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7faccc006fd0 tx=0x7faccc006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:41.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.362+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 msgr2=0x7facd014ad80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:41.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.362+0000 7facd7fff700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd014ad80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7facc80055d0 tx=0x7facc80056b0 comp rx=0 tx=0).stop 
2026-03-10T08:50:41.364 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.364+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 shutdown_connections 2026-03-10T08:50:41.364 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.364+0000 7facd7fff700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7facc0038080 0x7facc003a540 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:41.364 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.364+0000 7facd7fff700 1 --2- 192.168.123.105:0/437678197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7facd00a4a10 0x7facd014ad80 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:41.364 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.364+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 >> 192.168.123.105:0/437678197 conn(0x7facd009fed0 msgr2=0x7facd00a0b40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:41.364 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.364+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 shutdown_connections 2026-03-10T08:50:41.364 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.364+0000 7facd7fff700 1 -- 192.168.123.105:0/437678197 wait complete. 
2026-03-10T08:50:41.428 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-10T08:50:41.428 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-10T08:50:41.586 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:50:41.646 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:41 vm05 ceph-mon[49713]: mgrmap e11: vm05.rxwgjc(active, since 2s) 2026-03-10T08:50:41.646 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:41 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/437678197' entity='client.admin' 2026-03-10T08:50:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.843+0000 7fb6fee06700 1 -- 192.168.123.105:0/3036475002 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f8072340 msgr2=0x7fb6f8072760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.843+0000 7fb6fee06700 1 --2- 192.168.123.105:0/3036475002 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f8072340 0x7fb6f8072760 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fb6e8009b50 tx=0x7fb6e8009e60 comp rx=0 tx=0).stop 2026-03-10T08:50:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.845+0000 7fb6fee06700 1 -- 192.168.123.105:0/3036475002 shutdown_connections 2026-03-10T08:50:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.845+0000 7fb6fee06700 1 --2- 192.168.123.105:0/3036475002 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f8072340 0x7fb6f8072760 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:50:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.845+0000 7fb6fee06700 1 -- 192.168.123.105:0/3036475002 >> 192.168.123.105:0/3036475002 conn(0x7fb6f806d800 msgr2=0x7fb6f806fc60 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.845+0000 7fb6fee06700 1 -- 192.168.123.105:0/3036475002 shutdown_connections
2026-03-10T08:50:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.845+0000 7fb6fee06700 1 -- 192.168.123.105:0/3036475002 wait complete.
2026-03-10T08:50:41.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fee06700 1 Processor -- start
2026-03-10T08:50:41.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fee06700 1 -- start start
2026-03-10T08:50:41.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fee06700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 0x7fb6f819c580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:41.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fee06700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb6e8023070 con 0x7fb6f819c160
2026-03-10T08:50:41.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fcba2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 0x7fb6f819c580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:41.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fcba2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 0x7fb6f819c580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44474/0 (socket says 192.168.123.105:44474)
2026-03-10T08:50:41.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.847+0000 7fb6fcba2700 1 -- 192.168.123.105:0/2006043739 learned_addr learned my addr 192.168.123.105:0/2006043739 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:41.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.849+0000 7fb6fcba2700 1 -- 192.168.123.105:0/2006043739 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6e80097e0 con 0x7fb6f819c160
2026-03-10T08:50:41.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.849+0000 7fb6fcba2700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 0x7fb6f819c580 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fb6e8009790 tx=0x7fb6e80049c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:41.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.850+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb6e802e070 con 0x7fb6f819c160
2026-03-10T08:50:41.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.850+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb6e801cb50 con 0x7fb6f819c160
2026-03-10T08:50:41.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.850+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb6e801f3f0 con 0x7fb6f819c160
2026-03-10T08:50:41.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.851+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb6f819cac0 con 0x7fb6f819c160
2026-03-10T08:50:41.852 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.851+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb6f819f740 con 0x7fb6f819c160
2026-03-10T08:50:41.852 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.852+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb6f8195dd0 con 0x7fb6f819c160
2026-03-10T08:50:41.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.852+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45291+0+0 (secure 0 0 0) 0x7fb6e8004ae0 con 0x7fb6f819c160
2026-03-10T08:50:41.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.852+0000 7fb6f5ffb700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6e00380d0 0x7fb6e003a590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:41.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.852+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb6e805cd50 con 0x7fb6f819c160
2026-03-10T08:50:41.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.855+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb6e8031d40 con 0x7fb6f819c160
2026-03-10T08:50:41.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.855+0000 7fb6f7fff700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6e00380d0 0x7fb6e003a590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:41.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.855+0000 7fb6f7fff700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6e00380d0 0x7fb6e003a590 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fb6f000ad30 tx=0x7fb6f00093f0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:41.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.980+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7fb6f802d090 con 0x7fb6e00380d0
2026-03-10T08:50:41.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.986+0000 7fb6f5ffb700 1 -- 192.168.123.105:0/2006043739 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fb6f802d090 con 0x7fb6e00380d0
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6e00380d0 msgr2=0x7fb6e003a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6e00380d0 0x7fb6e003a590 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fb6f000ad30 tx=0x7fb6f00093f0 comp rx=0 tx=0).stop
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 msgr2=0x7fb6f819c580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 0x7fb6f819c580 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fb6e8009790 tx=0x7fb6e80049c0 comp rx=0 tx=0).stop
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 shutdown_connections
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6e00380d0 0x7fb6e003a590 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 --2- 192.168.123.105:0/2006043739 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb6f819c160 0x7fb6f819c580 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 >> 192.168.123.105:0/2006043739 conn(0x7fb6f806d800 msgr2=0x7fb6f806e4e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:41.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 shutdown_connections
2026-03-10T08:50:41.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:41.989+0000 7fb6fee06700 1 -- 192.168.123.105:0/2006043739 wait complete.
2026-03-10T08:50:42.088 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm08
2026-03-10T08:50:42.088 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:50:42.088 DEBUG:teuthology.orchestra.run.vm08:> dd of=/etc/ceph/ceph.conf
2026-03-10T08:50:42.105 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:50:42.106 DEBUG:teuthology.orchestra.run.vm08:> dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T08:50:42.163 INFO:tasks.cephadm:Adding host vm08 to orchestrator...
2026-03-10T08:50:42.163 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch host add vm08
2026-03-10T08:50:42.313 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config
2026-03-10T08:50:42.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.569+0000 7f65219cc700 1 -- 192.168.123.105:0/1216064064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 msgr2=0x7f651c072650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:42.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.569+0000 7f65219cc700 1 --2- 192.168.123.105:0/1216064064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c072650 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f650c009b50 tx=0x7f650c009e60 comp rx=0 tx=0).stop
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.571+0000 7f65219cc700 1 -- 192.168.123.105:0/1216064064 shutdown_connections
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.571+0000 7f65219cc700 1 --2- 192.168.123.105:0/1216064064 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c072650 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.571+0000 7f65219cc700 1 -- 192.168.123.105:0/1216064064 >> 192.168.123.105:0/1216064064 conn(0x7f651c06d660 msgr2=0x7f651c06fac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.572+0000 7f65219cc700 1 -- 192.168.123.105:0/1216064064 shutdown_connections
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.572+0000 7f65219cc700 1 -- 192.168.123.105:0/1216064064 wait complete.
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.572+0000 7f65219cc700 1 Processor -- start
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.572+0000 7f65219cc700 1 -- start start
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65219cc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c1afa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65219cc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f651c117ab0 con 0x7f651c072230
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65209ca700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c1afa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65209ca700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c1afa00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44488/0 (socket says 192.168.123.105:44488)
2026-03-10T08:50:42.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65209ca700 1 -- 192.168.123.105:0/713970218 learned_addr learned my addr 192.168.123.105:0/713970218 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:42.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65209ca700 1 -- 192.168.123.105:0/713970218 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f650c0097e0 con 0x7f651c072230
2026-03-10T08:50:42.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f65209ca700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c1afa00 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f650c005950 tx=0x7f650c00b860 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:42.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f650c01c070 con 0x7f651c072230
2026-03-10T08:50:42.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f650c0056f0 con 0x7f651c072230
2026-03-10T08:50:42.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.573+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f650c00f460 con 0x7f651c072230
2026-03-10T08:50:42.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.574+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f651c1aff40 con 0x7f651c072230
2026-03-10T08:50:42.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.574+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f651c1b0320 con 0x7f651c072230
2026-03-10T08:50:42.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.575+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f650c021850 con 0x7f651c072230
2026-03-10T08:50:42.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.575+0000 7f6519ffb700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6504038430 0x7f650403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:42.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.575+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f650c04c330 con 0x7f651c072230
2026-03-10T08:50:42.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.576+0000 7f651bfff700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6504038430 0x7f650403a8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:42.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.576+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f651c110fd0 con 0x7f651c072230
2026-03-10T08:50:42.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.579+0000 7f651bfff700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6504038430 0x7f650403a8f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f6510006fd0 tx=0x7f6510006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:42.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.580+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f650c026030 con 0x7f651c072230
2026-03-10T08:50:42.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:42.712+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm08", "target": ["mon-mgr", ""]}) v1 -- 0x7f651c02cf80 con 0x7f6504038430
2026-03-10T08:50:43.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='client.14186 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T08:50:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:50:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:50:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:42 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.conf
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm08", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.client.admin.keyring
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-10T08:50:44.233 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:44 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:50:44.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.361+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f650c017d20 con 0x7f651c072230
2026-03-10T08:50:44.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.470+0000 7f6519ffb700 1 -- 192.168.123.105:0/713970218 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f651c02cf80 con 0x7f6504038430
2026-03-10T08:50:44.472 INFO:teuthology.orchestra.run.vm05.stdout:Added host 'vm08' with addr '192.168.123.108'
2026-03-10T08:50:44.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.473+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6504038430 msgr2=0x7f650403a8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:44.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.473+0000 7f65219cc700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6504038430 0x7f650403a8f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f6510006fd0 tx=0x7f6510006e40 comp rx=0 tx=0).stop
2026-03-10T08:50:44.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.473+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 msgr2=0x7f651c1afa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:44.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.473+0000 7f65219cc700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c1afa00 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f650c005950 tx=0x7f650c00b860 comp rx=0 tx=0).stop
2026-03-10T08:50:44.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.475+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 shutdown_connections
2026-03-10T08:50:44.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.475+0000 7f65219cc700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6504038430 0x7f650403a8f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:44.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.475+0000 7f65219cc700 1 --2- 192.168.123.105:0/713970218 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f651c072230 0x7f651c1afa00 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:44.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.475+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 >> 192.168.123.105:0/713970218 conn(0x7f651c06d660 msgr2=0x7f651c06e340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:44.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.475+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 shutdown_connections
2026-03-10T08:50:44.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.475+0000 7f65219cc700 1 -- 192.168.123.105:0/713970218 wait complete.
2026-03-10T08:50:44.524 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch host ls --format=json
2026-03-10T08:50:44.709 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config
2026-03-10T08:50:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 -- 192.168.123.105:0/1254200351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c107ff0 msgr2=0x7f6d2c10edf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 --2- 192.168.123.105:0/1254200351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c107ff0 0x7f6d2c10edf0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f6d28009b50 tx=0x7f6d28009e60 comp rx=0 tx=0).stop
2026-03-10T08:50:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 -- 192.168.123.105:0/1254200351 shutdown_connections
2026-03-10T08:50:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 --2- 192.168.123.105:0/1254200351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c107ff0 0x7f6d2c10edf0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 -- 192.168.123.105:0/1254200351 >> 192.168.123.105:0/1254200351 conn(0x7f6d2c06c970 msgr2=0x7f6d2c06cd80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 -- 192.168.123.105:0/1254200351 shutdown_connections
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.962+0000 7f6d3327f700 1 -- 192.168.123.105:0/1254200351 wait complete.
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3327f700 1 Processor -- start
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3327f700 1 -- start start
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3327f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 0x7f6d2c19c7b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3327f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d28023070 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3227d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 0x7f6d2c19c7b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3227d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 0x7f6d2c19c7b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44508/0 (socket says 192.168.123.105:44508)
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3227d700 1 -- 192.168.123.105:0/3251356536 learned_addr learned my addr 192.168.123.105:0/3251356536 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3227d700 1 -- 192.168.123.105:0/3251356536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d280097e0 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.963+0000 7f6d3227d700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 0x7f6d2c19c7b0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f6d28009790 tx=0x7f6d280049c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.964+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6d2802e070 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.964+0000 7f6d3327f700 1 -- 192.168.123.105:0/3251356536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d2c19cd50 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.964+0000 7f6d3327f700 1 -- 192.168.123.105:0/3251356536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d2c1a0d30 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.965+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6d2801cb50 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.966+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6d2801f3f0 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.966+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f6d2801f610 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.966+0000 7f6d3327f700 1 -- 192.168.123.105:0/3251356536 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d2c04f000 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.969+0000 7f6d237fe700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d18038520 0x7f6d1803a9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:50:44.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.969+0000 7f6d31a7c700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d18038520 0x7f6d1803a9e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:50:44.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.970+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f6d2805df00 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.970+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f6d280623b0 con 0x7f6d2c19c3d0
2026-03-10T08:50:44.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:44.970+0000 7f6d31a7c700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d18038520 0x7f6d1803a9e0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f6d2400ad30 tx=0x7f6d240093f0 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:50:45.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.080+0000 7f6d3327f700 1 -- 192.168.123.105:0/3251356536 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f6d2c1a0fe0 con 0x7f6d18038520
2026-03-10T08:50:45.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.081+0000 7f6d237fe700 1 -- 192.168.123.105:0/3251356536 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7f6d2c1a0fe0 con 0x7f6d18038520
2026-03-10T08:50:45.082 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:50:45.082 INFO:teuthology.orchestra.run.vm05.stdout:[{"addr": "192.168.123.105", "hostname": "vm05", "labels": [], "status": ""}, {"addr": "192.168.123.108", "hostname": "vm08", "labels": [], "status": ""}]
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 -- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d18038520 msgr2=0x7f6d1803a9e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d18038520 0x7f6d1803a9e0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f6d2400ad30 tx=0x7f6d240093f0 comp rx=0 tx=0).stop
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 -- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 msgr2=0x7f6d2c19c7b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 0x7f6d2c19c7b0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f6d28009790 tx=0x7f6d280049c0 comp rx=0 tx=0).stop
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 -- 192.168.123.105:0/3251356536 shutdown_connections
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d18038520 0x7f6d1803a9e0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 --2- 192.168.123.105:0/3251356536 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c19c3d0 0x7f6d2c19c7b0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:50:45.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 -- 192.168.123.105:0/3251356536 >> 192.168.123.105:0/3251356536 conn(0x7f6d2c06c970 msgr2=0x7f6d2c10b6e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:50:45.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 -- 192.168.123.105:0/3251356536 shutdown_connections
2026-03-10T08:50:45.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.084+0000 7f6d217fa700 1 -- 192.168.123.105:0/3251356536 wait complete.
2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: Deploying cephadm binary to vm08 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: mgrmap e12: vm05.rxwgjc(active, since 6s) 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: Added host vm08 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: 
from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:50:45.110 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:45 vm05 ceph-mon[49713]: Deploying daemon crash.vm05 on vm05 2026-03-10T08:50:45.141 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-10T08:50:45.141 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd crush tunables default 2026-03-10T08:50:45.337 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:50:45.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.727+0000 7f571a63d700 1 -- 192.168.123.105:0/1060044864 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714102e70 msgr2=0x7f5714103250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:45.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.727+0000 7f571a63d700 1 --2- 192.168.123.105:0/1060044864 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714102e70 0x7f5714103250 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f5704009b00 tx=0x7f5704009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:45.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 -- 192.168.123.105:0/1060044864 shutdown_connections 2026-03-10T08:50:45.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 --2- 
192.168.123.105:0/1060044864 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714102e70 0x7f5714103250 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:45.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 -- 192.168.123.105:0/1060044864 >> 192.168.123.105:0/1060044864 conn(0x7f57140fe760 msgr2=0x7f5714100b80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 -- 192.168.123.105:0/1060044864 shutdown_connections 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 -- 192.168.123.105:0/1060044864 wait complete. 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 Processor -- start 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 -- start start 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 0x7f571406bb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.728+0000 7f571a63d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5704012070 con 0x7f5714072b20 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.729+0000 7f5713fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 0x7f571406bb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.729+0000 
7f5713fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 0x7f571406bb40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44514/0 (socket says 192.168.123.105:44514) 2026-03-10T08:50:45.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.729+0000 7f5713fff700 1 -- 192.168.123.105:0/2632564922 learned_addr learned my addr 192.168.123.105:0/2632564922 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:50:45.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.730+0000 7f5713fff700 1 -- 192.168.123.105:0/2632564922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57040097e0 con 0x7f5714072b20 2026-03-10T08:50:45.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.730+0000 7f5713fff700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 0x7f571406bb40 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5704006010 tx=0x7f570400bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:45.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.730+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f570401c070 con 0x7f5714072b20 2026-03-10T08:50:45.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.730+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f571406c080 con 0x7f5714072b20 2026-03-10T08:50:45.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.730+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7f571406c4a0 con 0x7f5714072b20 2026-03-10T08:50:45.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.731+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5704003d70 con 0x7f5714072b20 2026-03-10T08:50:45.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.731+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5704017440 con 0x7f5714072b20 2026-03-10T08:50:45.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.731+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5700005320 con 0x7f5714072b20 2026-03-10T08:50:45.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.735+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f570400f6f0 con 0x7f5714072b20 2026-03-10T08:50:45.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.735+0000 7f57117fa700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56fc038480 0x7f56fc03a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:45.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.736+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f570401fe50 con 0x7f5714072b20 2026-03-10T08:50:45.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.736+0000 7f57137fe700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56fc038480 0x7f56fc03a940 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:45.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.737+0000 7f57137fe700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56fc038480 0x7f56fc03a940 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f5708006fd0 tx=0x7f5708006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:45.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.738+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f57040175a0 con 0x7f5714072b20 2026-03-10T08:50:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:45.865+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f5700005190 con 0x7f5714072b20 2026-03-10T08:50:46.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.786+0000 7f57117fa700 1 -- 192.168.123.105:0/2632564922 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f5704029b80 con 0x7f5714072b20 2026-03-10T08:50:46.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.789+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56fc038480 msgr2=0x7f56fc03a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:46.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.789+0000 7f571a63d700 1 --2- 192.168.123.105:0/2632564922 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56fc038480 0x7f56fc03a940 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f5708006fd0 tx=0x7f5708006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.789+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 msgr2=0x7f571406bb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.789+0000 7f571a63d700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 0x7f571406bb40 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5704006010 tx=0x7f570400bba0 comp rx=0 tx=0).stop 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.790+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 shutdown_connections 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.790+0000 7f571a63d700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56fc038480 0x7f56fc03a940 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.790+0000 7f571a63d700 1 --2- 192.168.123.105:0/2632564922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5714072b20 0x7f571406bb40 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.790+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 >> 192.168.123.105:0/2632564922 conn(0x7f57140fe760 msgr2=0x7f57141070e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.790+0000 7f571a63d700 1 
-- 192.168.123.105:0/2632564922 shutdown_connections 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:50:46.790+0000 7f571a63d700 1 -- 192.168.123.105:0/2632564922 wait complete. 2026-03-10T08:50:46.790 INFO:teuthology.orchestra.run.vm05.stderr:adjusted tunables profile to default 2026-03-10T08:50:46.834 INFO:tasks.cephadm:Adding mon.vm05 on vm05 2026-03-10T08:50:46.834 INFO:tasks.cephadm:Adding mon.vm08 on vm08 2026-03-10T08:50:46.834 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch apply mon '2;vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08' 2026-03-10T08:50:46.963 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:46.996 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: from='client.14191 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T08:50:47.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:47.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:47.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:47.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:47.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: Deploying daemon 
node-exporter.vm05 on vm05 2026-03-10T08:50:47.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:46 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2632564922' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-10T08:50:47.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:47 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2632564922' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-10T08:50:47.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:47 vm05 ceph-mon[49713]: osdmap e4: 0 total, 0 up, 0 in 2026-03-10T08:50:47.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:47 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.108+0000 7fb03afa7700 1 -- 192.168.123.108:0/2198684996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 msgr2=0x7fb0341030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.108+0000 7fb03afa7700 1 --2- 192.168.123.108:0/2198684996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb0341030d0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fb024009b00 tx=0x7fb024009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.109+0000 7fb03afa7700 1 -- 192.168.123.108:0/2198684996 shutdown_connections 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.109+0000 7fb03afa7700 1 --2- 192.168.123.108:0/2198684996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb0341030d0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:48.110 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.109+0000 7fb03afa7700 1 -- 192.168.123.108:0/2198684996 >> 192.168.123.108:0/2198684996 conn(0x7fb0340fe250 msgr2=0x7fb034100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.109+0000 7fb03afa7700 1 -- 192.168.123.108:0/2198684996 shutdown_connections 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.109+0000 7fb03afa7700 1 -- 192.168.123.108:0/2198684996 wait complete. 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.109+0000 7fb03afa7700 1 Processor -- start 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb03afa7700 1 -- start start 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb03afa7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb034197e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:48.110 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb03afa7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb034198360 con 0x7fb034102cb0 2026-03-10T08:50:48.111 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb038d43700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb034197e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:48.111 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb038d43700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb034197e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:58662/0 (socket says 192.168.123.108:58662) 2026-03-10T08:50:48.111 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb038d43700 1 -- 192.168.123.108:0/589126516 learned_addr learned my addr 192.168.123.108:0/589126516 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:48.111 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.110+0000 7fb038d43700 1 -- 192.168.123.108:0/589126516 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb0240097e0 con 0x7fb034102cb0 2026-03-10T08:50:48.111 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.111+0000 7fb038d43700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb034197e20 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fb024004d40 tx=0x7fb024004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:48.111 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.111+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb02401c070 con 0x7fb034102cb0 2026-03-10T08:50:48.112 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.111+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb0240056f0 con 0x7fb034102cb0 2026-03-10T08:50:48.112 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.111+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb034198560 con 0x7fb034102cb0 2026-03-10T08:50:48.112 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.111+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mon.0 
v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb024017440 con 0x7fb034102cb0 2026-03-10T08:50:48.112 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.111+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb034198a00 con 0x7fb034102cb0 2026-03-10T08:50:48.113 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.112+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fb02400f460 con 0x7fb034102cb0 2026-03-10T08:50:48.113 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.112+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb034191a40 con 0x7fb034102cb0 2026-03-10T08:50:48.113 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.112+0000 7fb031ffb700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb01c0383f0 0x7fb01c03a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:48.113 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.112+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb02404bf80 con 0x7fb034102cb0 2026-03-10T08:50:48.113 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.113+0000 7fb033fff700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb01c0383f0 0x7fb01c03a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:48.115 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.115+0000 7fb033fff700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb01c0383f0 0x7fb01c03a8b0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fb028006fd0 tx=0x7fb028006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:48.115 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.115+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb02400f920 con 0x7fb034102cb0 2026-03-10T08:50:48.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.229+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08", "target": ["mon-mgr", ""]}) v1 -- 0x7fb034061270 con 0x7fb01c0383f0 2026-03-10T08:50:48.236 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.235+0000 7fb031ffb700 1 -- 192.168.123.108:0/589126516 <== mgr.14162 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fb034061270 con 0x7fb01c0383f0 2026-03-10T08:50:48.236 INFO:teuthology.orchestra.run.vm08.stdout:Scheduled mon update... 
2026-03-10T08:50:48.238 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb01c0383f0 msgr2=0x7fb01c03a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:48.238 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb01c0383f0 0x7fb01c03a8b0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fb028006fd0 tx=0x7fb028006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:48.238 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 msgr2=0x7fb034197e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb034197e20 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fb024004d40 tx=0x7fb024004e20 comp rx=0 tx=0).stop 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 shutdown_connections 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb01c0383f0 0x7fb01c03a8b0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 --2- 192.168.123.108:0/589126516 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb034102cb0 0x7fb034197e20 unknown 
:-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 >> 192.168.123.108:0/589126516 conn(0x7fb0340fe250 msgr2=0x7fb0340fef10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 shutdown_connections 2026-03-10T08:50:48.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.238+0000 7fb03afa7700 1 -- 192.168.123.108:0/589126516 wait complete. 2026-03-10T08:50:48.304 DEBUG:teuthology.orchestra.run.vm08:mon.vm08> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm08.service 2026-03-10T08:50:48.305 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:50:48.305 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:48.474 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:48.513 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.743+0000 7fbf609e5700 1 -- 192.168.123.108:0/1215527384 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 msgr2=0x7fbf581039a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.743+0000 7fbf609e5700 1 --2- 192.168.123.108:0/1215527384 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf581039a0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fbf50009b00 tx=0x7fbf50009e10 comp rx=0 tx=0).stop 
2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.743+0000 7fbf609e5700 1 -- 192.168.123.108:0/1215527384 shutdown_connections 2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.743+0000 7fbf609e5700 1 --2- 192.168.123.108:0/1215527384 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf581039a0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.743+0000 7fbf609e5700 1 -- 192.168.123.108:0/1215527384 >> 192.168.123.108:0/1215527384 conn(0x7fbf580faf00 msgr2=0x7fbf580fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.744+0000 7fbf609e5700 1 -- 192.168.123.108:0/1215527384 shutdown_connections 2026-03-10T08:50:48.744 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.744+0000 7fbf609e5700 1 -- 192.168.123.108:0/1215527384 wait complete. 
2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.744+0000 7fbf609e5700 1 Processor -- start 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.744+0000 7fbf609e5700 1 -- start start 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.744+0000 7fbf609e5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf58195b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.744+0000 7fbf609e5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf58196070 con 0x7fbf581015b0 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf5e781700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf58195b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf5e781700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf58195b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:58682/0 (socket says 192.168.123.108:58682) 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf5e781700 1 -- 192.168.123.108:0/1817156138 learned_addr learned my addr 192.168.123.108:0/1817156138 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:48.745 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf5e781700 1 -- 192.168.123.108:0/1817156138 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf500097e0 con 0x7fbf581015b0 2026-03-10T08:50:48.746 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf5e781700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf58195b30 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fbf50004f40 tx=0x7fbf50005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:48.746 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbf5001c070 con 0x7fbf581015b0 2026-03-10T08:50:48.746 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf500053b0 con 0x7fbf581015b0 2026-03-10T08:50:48.746 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf58196270 con 0x7fbf581015b0 2026-03-10T08:50:48.746 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.745+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf58196710 con 0x7fbf581015b0 2026-03-10T08:50:48.746 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.746+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbf5000f460 con 0x7fbf581015b0 2026-03-10T08:50:48.747 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.746+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fbf50021470 con 0x7fbf581015b0 2026-03-10T08:50:48.747 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.746+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf5818f800 con 0x7fbf581015b0 2026-03-10T08:50:48.747 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.746+0000 7fbf4b7fe700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf440383f0 0x7fbf4403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:48.747 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.746+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fbf5004c3b0 con 0x7fbf581015b0 2026-03-10T08:50:48.749 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.749+0000 7fbf5df80700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf440383f0 0x7fbf4403a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:48.750 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.749+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbf50029b50 con 0x7fbf581015b0 2026-03-10T08:50:48.750 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.749+0000 7fbf5df80700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf440383f0 0x7fbf4403a8b0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto 
rx=0x7fbf4c006fd0 tx=0x7fbf4c006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:48.897 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.896+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fbf580623c0 con 0x7fbf581015b0 2026-03-10T08:50:48.898 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.897+0000 7fbf4b7fe700 1 -- 192.168.123.108:0/1817156138 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fbf50030300 con 0x7fbf581015b0 2026-03-10T08:50:48.898 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:48.898 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf440383f0 msgr2=0x7fbf4403a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:48.900 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf440383f0 0x7fbf4403a8b0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fbf4c006fd0 tx=0x7fbf4c006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 msgr2=0x7fbf58195b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf58195b30 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fbf50004f40 tx=0x7fbf50005e70 comp rx=0 tx=0).stop 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 shutdown_connections 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf440383f0 0x7fbf4403a8b0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 --2- 192.168.123.108:0/1817156138 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf581015b0 0x7fbf58195b30 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 >> 192.168.123.108:0/1817156138 conn(0x7fbf580faf00 msgr2=0x7fbf580fbbc0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.899+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 shutdown_connections 2026-03-10T08:50:48.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:48.900+0000 7fbf609e5700 1 -- 192.168.123.108:0/1817156138 wait complete. 2026-03-10T08:50:48.901 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:49.681 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: from='client.14195 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: Saving service mon spec with placement vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08;count:2 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 ceph-mon[49713]: Deploying daemon alertmanager.vm05 on vm05 2026-03-10T08:50:49.682 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:49 vm05 
ceph-mon[49713]: from='client.? 192.168.123.108:0/1817156138' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:49.966 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:50:49.966 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:50.095 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:50.130 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:50.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.371+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/2978309095 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 msgr2=0x7f5cf41030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.371+0000 7f5cfb7d1700 1 --2- 192.168.123.108:0/2978309095 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf41030d0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f5ce4009b00 tx=0x7f5ce4009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/2978309095 shutdown_connections 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 --2- 192.168.123.108:0/2978309095 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf41030d0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/2978309095 >> 192.168.123.108:0/2978309095 
conn(0x7f5cf40fe250 msgr2=0x7f5cf4100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/2978309095 shutdown_connections 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/2978309095 wait complete. 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 Processor -- start 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.372+0000 7f5cfb7d1700 1 -- start start 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cfb7d1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf4197e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cfb7d1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5cf4198360 con 0x7f5cf4102cb0 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cf956d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf4197e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:50.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cf956d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf4197e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:58708/0 (socket says 192.168.123.108:58708) 2026-03-10T08:50:50.373 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cf956d700 1 -- 192.168.123.108:0/4237666407 learned_addr learned my addr 192.168.123.108:0/4237666407 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:50.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cf956d700 1 -- 192.168.123.108:0/4237666407 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5ce40097e0 con 0x7f5cf4102cb0 2026-03-10T08:50:50.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cf956d700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf4197e20 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f5ce4004d40 tx=0x7f5ce4004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:50.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5ce401c070 con 0x7f5cf4102cb0 2026-03-10T08:50:50.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ce40056f0 con 0x7f5cf4102cb0 2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.373+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5cf4198560 con 0x7f5cf4102cb0 2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.374+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5cf4198a00 con 0x7f5cf4102cb0 
2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.374+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5ce4017440 con 0x7f5cf4102cb0 2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.374+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f5ce400f920 con 0x7f5cf4102cb0 2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.374+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5cd8005320 con 0x7f5cf4102cb0 2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.374+0000 7f5cea7fc700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ce0038440 0x7f5ce003a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:50.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.374+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f5ce404bf00 con 0x7f5cf4102cb0 2026-03-10T08:50:50.377 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.377+0000 7f5cf8d6c700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ce0038440 0x7f5ce003a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:50.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.377+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5ce4028360 con 0x7f5cf4102cb0 2026-03-10T08:50:50.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.377+0000 7f5cf8d6c700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ce0038440 0x7f5ce003a900 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f5cf0006fd0 tx=0x7f5cf0006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:50.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.522+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f5cd80059f0 con 0x7f5cf4102cb0 2026-03-10T08:50:50.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.523+0000 7f5cea7fc700 1 -- 192.168.123.108:0/4237666407 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f5ce4025030 con 0x7f5cf4102cb0 2026-03-10T08:50:50.525 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:50.525 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ce0038440 msgr2=0x7f5ce003a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ce0038440 0x7f5ce003a900 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f5cf0006fd0 tx=0x7f5cf0006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 msgr2=0x7f5cf4197e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf4197e20 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f5ce4004d40 tx=0x7f5ce4004e20 comp rx=0 tx=0).stop 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 shutdown_connections 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 
7f5cfb7d1700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ce0038440 0x7f5ce003a900 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 --2- 192.168.123.108:0/4237666407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5cf4102cb0 0x7f5cf4197e20 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 >> 192.168.123.108:0/4237666407 conn(0x7f5cf40fe250 msgr2=0x7f5cf40fef10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.526+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 shutdown_connections 2026-03-10T08:50:50.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:50.527+0000 7f5cfb7d1700 1 -- 192.168.123.108:0/4237666407 wait complete. 2026-03-10T08:50:50.528 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:50.882 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:50 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/4237666407' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:51.590 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:50:51.590 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:51.721 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:51.756 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.030+0000 7f37bca61700 1 -- 192.168.123.108:0/1534885018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 msgr2=0x7f37b8100e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.030+0000 7f37bca61700 1 --2- 192.168.123.108:0/1534885018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8100e80 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f37a0009b00 tx=0x7f37a0009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.031+0000 7f37bca61700 1 -- 192.168.123.108:0/1534885018 shutdown_connections 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.031+0000 7f37bca61700 1 --2- 192.168.123.108:0/1534885018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8100e80 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.031+0000 7f37bca61700 1 -- 192.168.123.108:0/1534885018 >> 192.168.123.108:0/1534885018 conn(0x7f37b80fc000 msgr2=0x7f37b80fe440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.031+0000 7f37bca61700 1 -- 192.168.123.108:0/1534885018 
shutdown_connections 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.031+0000 7f37bca61700 1 -- 192.168.123.108:0/1534885018 wait complete. 2026-03-10T08:50:52.032 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37bca61700 1 Processor -- start 2026-03-10T08:50:52.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37bca61700 1 -- start start 2026-03-10T08:50:52.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37bca61700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:52.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37bca61700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37b81982e0 con 0x7f37b8100a60 2026-03-10T08:50:52.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37b659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:52.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37b659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:58730/0 (socket says 192.168.123.108:58730) 2026-03-10T08:50:52.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.032+0000 7f37b659c700 1 -- 192.168.123.108:0/2248434128 learned_addr learned my addr 192.168.123.108:0/2248434128 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:52.033 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37b659c700 1 -- 192.168.123.108:0/2248434128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f37a00097e0 con 0x7f37b8100a60 2026-03-10T08:50:52.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37b659c700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8197da0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f37a0004750 tx=0x7f37a0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:52.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f37a001c070 con 0x7f37b8100a60 2026-03-10T08:50:52.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f37a0021470 con 0x7f37b8100a60 2026-03-10T08:50:52.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f37a000f460 con 0x7f37b8100a60 2026-03-10T08:50:52.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f37b81984e0 con 0x7f37b8100a60 2026-03-10T08:50:52.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.033+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f37b8198980 con 
0x7f37b8100a60 2026-03-10T08:50:52.035 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.034+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f37a000f5c0 con 0x7f37b8100a60 2026-03-10T08:50:52.035 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.035+0000 7f37af7fe700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f37a4038440 0x7f37a403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:52.036 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.035+0000 7f37b5d9b700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f37a4038440 0x7f37a403a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:52.036 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.035+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f37b8191bf0 con 0x7f37b8100a60 2026-03-10T08:50:52.036 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.035+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f37a004d440 con 0x7f37b8100a60 2026-03-10T08:50:52.036 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.036+0000 7f37b5d9b700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f37a4038440 0x7f37a403a900 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f37a8006fd0 tx=0x7f37a8006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:52.039 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.038+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f37a0029950 con 0x7f37b8100a60 2026-03-10T08:50:52.189 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.188+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f37b80623c0 con 0x7f37b8100a60 2026-03-10T08:50:52.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.189+0000 7f37af7fe700 1 -- 192.168.123.108:0/2248434128 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f37a0026030 con 0x7f37b8100a60 2026-03-10T08:50:52.191 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:52.191 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.192+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f37a4038440 msgr2=0x7f37a403a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.192+0000 7f37bca61700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f37a4038440 0x7f37a403a900 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f37a8006fd0 tx=0x7f37a8006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.192+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 msgr2=0x7f37b8197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.192+0000 7f37bca61700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8197da0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f37a0004750 tx=0x7f37a0005dc0 comp rx=0 tx=0).stop 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.193+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 shutdown_connections 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.193+0000 7f37bca61700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f37a4038440 0x7f37a403a900 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:52.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.193+0000 7f37bca61700 1 --2- 192.168.123.108:0/2248434128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f37b8100a60 0x7f37b8197da0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:52.194 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.193+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 >> 192.168.123.108:0/2248434128 conn(0x7f37b80fc000 msgr2=0x7f37b80fe440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:52.194 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.193+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 shutdown_connections 2026-03-10T08:50:52.194 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:52.193+0000 7f37bca61700 1 -- 192.168.123.108:0/2248434128 wait complete. 2026-03-10T08:50:52.195 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: Deploying daemon grafana.vm05 on vm05 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/2248434128' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:53.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:52 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:50:53.262 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:50:53.262 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:53.397 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:53.435 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.677+0000 7fbba3201700 1 -- 192.168.123.108:0/4195892870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 msgr2=0x7fbb9c1039a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.677+0000 7fbba3201700 1 --2- 192.168.123.108:0/4195892870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c1039a0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fbb88009b00 tx=0x7fbb88009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.677+0000 7fbba3201700 1 -- 192.168.123.108:0/4195892870 shutdown_connections 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.677+0000 7fbba3201700 1 --2- 192.168.123.108:0/4195892870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c1039a0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.677+0000 7fbba3201700 1 -- 192.168.123.108:0/4195892870 >> 192.168.123.108:0/4195892870 conn(0x7fbb9c0faf00 msgr2=0x7fbb9c0fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.678+0000 7fbba3201700 1 -- 192.168.123.108:0/4195892870 
shutdown_connections 2026-03-10T08:50:53.678 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.678+0000 7fbba3201700 1 -- 192.168.123.108:0/4195892870 wait complete. 2026-03-10T08:50:53.679 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.678+0000 7fbba3201700 1 Processor -- start 2026-03-10T08:50:53.679 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.679+0000 7fbba3201700 1 -- start start 2026-03-10T08:50:53.679 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.679+0000 7fbba3201700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c195b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:53.679 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.679+0000 7fbba3201700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbb9c196070 con 0x7fbb9c1015b0 2026-03-10T08:50:53.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.679+0000 7fbba0f9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c195b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:53.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.679+0000 7fbba0f9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c195b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:58754/0 (socket says 192.168.123.108:58754) 2026-03-10T08:50:53.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.679+0000 7fbba0f9d700 1 -- 192.168.123.108:0/4101202812 learned_addr learned my addr 192.168.123.108:0/4101202812 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:53.680 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.680+0000 7fbba0f9d700 1 -- 192.168.123.108:0/4101202812 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbb880097e0 con 0x7fbb9c1015b0 2026-03-10T08:50:53.680 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.680+0000 7fbba0f9d700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c195b30 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fbb88004f40 tx=0x7fbb88005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:53.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.680+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbb8801c070 con 0x7fbb9c1015b0 2026-03-10T08:50:53.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.680+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbb9c196270 con 0x7fbb9c1015b0 2026-03-10T08:50:53.681 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.680+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbb9c196710 con 0x7fbb9c1015b0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbb880053b0 con 0x7fbb9c1015b0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbb8800f550 con 
0x7fbb9c1015b0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fbb8800f6b0 con 0x7fbb9c1015b0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbb99ffb700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbb8c03c850 0x7fbb8c03ed10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fbb8804d4b0 con 0x7fbb9c1015b0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbb9c18f800 con 0x7fbb9c1015b0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.681+0000 7fbb9bfff700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbb8c03c850 0x7fbb8c03ed10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:53.682 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.682+0000 7fbb9bfff700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbb8c03c850 0x7fbb8c03ed10 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fbb90006fd0 tx=0x7fbb90006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:53.685 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.684+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbb88026070 con 0x7fbb9c1015b0 2026-03-10T08:50:53.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.830+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fbb9c0623c0 con 0x7fbb9c1015b0 2026-03-10T08:50:53.832 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.831+0000 7fbb99ffb700 1 -- 192.168.123.108:0/4101202812 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fbb88029720 con 0x7fbb9c1015b0 2026-03-10T08:50:53.833 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:53.833 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:53.835 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbb8c03c850 msgr2=0x7fbb8c03ed10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbb8c03c850 0x7fbb8c03ed10 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fbb90006fd0 tx=0x7fbb90006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 msgr2=0x7fbb9c195b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c195b30 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fbb88004f40 tx=0x7fbb88005e70 comp rx=0 tx=0).stop 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 shutdown_connections 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbb8c03c850 0x7fbb8c03ed10 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.835+0000 7fbba3201700 1 --2- 192.168.123.108:0/4101202812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbb9c1015b0 0x7fbb9c195b30 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:53.836 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.836+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 >> 192.168.123.108:0/4101202812 conn(0x7fbb9c0faf00 msgr2=0x7fbb9c0fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:53.836 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.836+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 shutdown_connections 2026-03-10T08:50:53.837 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:53.836+0000 7fbba3201700 1 -- 192.168.123.108:0/4101202812 wait complete. 2026-03-10T08:50:53.837 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:54.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:53 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/4101202812' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:54.897 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:50:54.897 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:55.024 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:55.057 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:55.275 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.274+0000 7f13606e1700 1 -- 192.168.123.108:0/1446890245 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 msgr2=0x7f13581030c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:55.275 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.274+0000 7f13606e1700 1 --2- 192.168.123.108:0/1446890245 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f13581030c0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto 
rx=0x7f1348009b00 tx=0x7f1348009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:55.275 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 -- 192.168.123.108:0/1446890245 shutdown_connections 2026-03-10T08:50:55.275 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 --2- 192.168.123.108:0/1446890245 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f13581030c0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 -- 192.168.123.108:0/1446890245 >> 192.168.123.108:0/1446890245 conn(0x7f13580fe220 msgr2=0x7f1358100680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 -- 192.168.123.108:0/1446890245 shutdown_connections 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 -- 192.168.123.108:0/1446890245 wait complete. 
2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 Processor -- start 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.275+0000 7f13606e1700 1 -- start start 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f13606e1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f1358197d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f13606e1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13581982a0 con 0x7f1358102ca0 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f135e47d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f1358197d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f135e47d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f1358197d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:41160/0 (socket says 192.168.123.108:41160) 2026-03-10T08:50:55.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f135e47d700 1 -- 192.168.123.108:0/2669032100 learned_addr learned my addr 192.168.123.108:0/2669032100 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:55.277 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f135e47d700 1 -- 192.168.123.108:0/2669032100 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13480097e0 con 0x7f1358102ca0 2026-03-10T08:50:55.277 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.276+0000 7f135e47d700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f1358197d60 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f1348004750 tx=0x7f1348005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.277+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f134801c070 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.277+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1348021470 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.277+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f134800f460 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.277+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f13581984a0 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.277+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1358198940 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.277+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f1348021ac0 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.278+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1358191a30 con 0x7f1358102ca0 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.278+0000 7f134f7fe700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13440383f0 0x7f134403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:55.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.278+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f134804c3b0 con 0x7f1358102ca0 2026-03-10T08:50:55.281 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.280+0000 7f135dc7c700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13440383f0 0x7f134403a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:55.281 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.280+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f134800f5c0 con 0x7f1358102ca0 2026-03-10T08:50:55.281 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.281+0000 7f135dc7c700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13440383f0 0x7f134403a8b0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto 
rx=0x7f1354006fd0 tx=0x7f1354006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:55.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.422+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f135802cc70 con 0x7f1358102ca0 2026-03-10T08:50:55.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.424+0000 7f134f7fe700 1 -- 192.168.123.108:0/2669032100 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f1348030300 con 0x7f1358102ca0 2026-03-10T08:50:55.424 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:55.425 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:55.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13440383f0 msgr2=0x7f134403a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:55.427 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13440383f0 0x7f134403a8b0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f1354006fd0 tx=0x7f1354006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:55.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 msgr2=0x7f1358197d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:55.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f1358197d60 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f1348004750 tx=0x7f1348005dc0 comp rx=0 tx=0).stop 2026-03-10T08:50:55.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 shutdown_connections 2026-03-10T08:50:55.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13440383f0 0x7f134403a8b0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:55.427 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 --2- 192.168.123.108:0/2669032100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1358102ca0 0x7f1358197d60 secure :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f1348004750 tx=0x7f1348005dc0 comp rx=0 tx=0).stop 2026-03-10T08:50:55.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 >> 192.168.123.108:0/2669032100 conn(0x7f13580fe220 msgr2=0x7f13580fef00 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:55.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 shutdown_connections 2026-03-10T08:50:55.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:55.426+0000 7f13606e1700 1 -- 192.168.123.108:0/2669032100 wait complete. 2026-03-10T08:50:55.428 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:55.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:55 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/2669032100' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:56.489 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:50:56.489 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:56.626 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:56.664 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:56.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.899+0000 7fbdea65a700 1 -- 192.168.123.108:0/967814258 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde40ff2e0 msgr2=0x7fbde40ff700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:56.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.899+0000 7fbdea65a700 1 --2- 192.168.123.108:0/967814258 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde40ff2e0 0x7fbde40ff700 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fbdd4009b00 tx=0x7fbdd4009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:56.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.899+0000 7fbdea65a700 1 -- 
192.168.123.108:0/967814258 shutdown_connections 2026-03-10T08:50:56.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.899+0000 7fbdea65a700 1 --2- 192.168.123.108:0/967814258 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde40ff2e0 0x7fbde40ff700 secure :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fbdd4009b00 tx=0x7fbdd4009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:56.900 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.899+0000 7fbdea65a700 1 -- 192.168.123.108:0/967814258 >> 192.168.123.108:0/967814258 conn(0x7fbde40faf00 msgr2=0x7fbde40fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.899+0000 7fbdea65a700 1 -- 192.168.123.108:0/967814258 shutdown_connections 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.900+0000 7fbdea65a700 1 -- 192.168.123.108:0/967814258 wait complete. 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.900+0000 7fbdea65a700 1 Processor -- start 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.900+0000 7fbdea65a700 1 -- start start 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.900+0000 7fbdea65a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 0x7fbde4198370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.900+0000 7fbdea65a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbde41988b0 con 0x7fbde4197f50 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbde3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 0x7fbde4198370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbde3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 0x7fbde4198370 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:41186/0 (socket says 192.168.123.108:41186) 2026-03-10T08:50:56.901 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbde3fff700 1 -- 192.168.123.108:0/3497925346 learned_addr learned my addr 192.168.123.108:0/3497925346 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:56.902 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbde3fff700 1 -- 192.168.123.108:0/3497925346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbdd40097e0 con 0x7fbde4197f50 2026-03-10T08:50:56.902 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbde3fff700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 0x7fbde4198370 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fbdd400b5c0 tx=0x7fbdd4005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:56.902 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbdd401c070 con 0x7fbde4197f50 2026-03-10T08:50:56.902 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.901+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbde4198ab0 con 0x7fbde4197f50 2026-03-10T08:50:56.902 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.902+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbde419b700 con 0x7fbde4197f50 2026-03-10T08:50:56.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.902+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbdd4021470 con 0x7fbde4197f50 2026-03-10T08:50:56.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.902+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbdd400f460 con 0x7fbde4197f50 2026-03-10T08:50:56.903 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.902+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbde41919e0 con 0x7fbde4197f50 2026-03-10T08:50:56.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.903+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fbdd4021ac0 con 0x7fbde4197f50 2026-03-10T08:50:56.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.903+0000 7fbde17fa700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbdcc038490 0x7fbdcc03a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:56.904 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.904+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fbdd404ca00 con 0x7fbde4197f50 2026-03-10T08:50:56.906 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.905+0000 7fbde37fe700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbdcc038490 0x7fbdcc03a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:56.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.905+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbdd4029dc0 con 0x7fbde4197f50 2026-03-10T08:50:56.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:56.906+0000 7fbde37fe700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbdcc038490 0x7fbdcc03a950 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fbdd8006fd0 tx=0x7fbdd8006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:57.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.062+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fbde40623c0 con 0x7fbde4197f50 2026-03-10T08:50:57.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.063+0000 7fbde17fa700 1 -- 192.168.123.108:0/3497925346 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fbdd4026030 con 0x7fbde4197f50 2026-03-10T08:50:57.063 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:57.063 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbdcc038490 msgr2=0x7fbdcc03a950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbdcc038490 0x7fbdcc03a950 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fbdd8006fd0 tx=0x7fbdd8006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 msgr2=0x7fbde4198370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 0x7fbde4198370 secure :-1 s=READY pgs=114 cs=0 
l=1 rev1=1 crypto rx=0x7fbdd400b5c0 tx=0x7fbdd4005e70 comp rx=0 tx=0).stop 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 shutdown_connections 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbdcc038490 0x7fbdcc03a950 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 --2- 192.168.123.108:0/3497925346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbde4197f50 0x7fbde4198370 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 >> 192.168.123.108:0/3497925346 conn(0x7fbde40faf00 msgr2=0x7fbde40fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 shutdown_connections 2026-03-10T08:50:57.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:57.065+0000 7fbdea65a700 1 -- 192.168.123.108:0/3497925346 wait complete. 2026-03-10T08:50:57.067 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:58.113 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:50:58.113 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:58.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:57 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/3497925346' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:58.256 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:58.293 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.540+0000 7f0d11c08700 1 -- 192.168.123.108:0/1107289697 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 msgr2=0x7f0d0c1030c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.540+0000 7f0d11c08700 1 --2- 192.168.123.108:0/1107289697 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c1030c0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f0cf4009b00 tx=0x7f0cf4009e10 comp rx=0 tx=0).stop 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.541+0000 7f0d11c08700 1 -- 192.168.123.108:0/1107289697 shutdown_connections 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.541+0000 7f0d11c08700 1 --2- 192.168.123.108:0/1107289697 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c1030c0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.541+0000 7f0d11c08700 1 -- 192.168.123.108:0/1107289697 >> 192.168.123.108:0/1107289697 conn(0x7f0d0c0fe220 msgr2=0x7f0d0c100680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.541+0000 7f0d11c08700 1 -- 192.168.123.108:0/1107289697 shutdown_connections 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.541+0000 7f0d11c08700 1 -- 192.168.123.108:0/1107289697 
wait complete. 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d11c08700 1 Processor -- start 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d11c08700 1 -- start start 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d11c08700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c197d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:58.542 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d11c08700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d0c1982a0 con 0x7f0d0c102ca0 2026-03-10T08:50:58.543 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d0b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c197d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:58.543 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d0b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c197d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:41204/0 (socket says 192.168.123.108:41204) 2026-03-10T08:50:58.543 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d0b7fe700 1 -- 192.168.123.108:0/3669775754 learned_addr learned my addr 192.168.123.108:0/3669775754 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:50:58.543 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d0b7fe700 1 -- 192.168.123.108:0/3669775754 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0cf40097e0 con 0x7f0d0c102ca0 2026-03-10T08:50:58.543 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.542+0000 7f0d0b7fe700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c197d60 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f0cf4004750 tx=0x7f0cf4005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.543+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0cf401c070 con 0x7f0d0c102ca0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.543+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0cf4021470 con 0x7f0d0c102ca0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.543+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0cf400f460 con 0x7f0d0c102ca0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.543+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d0c1984a0 con 0x7f0d0c102ca0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.543+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d0c198940 con 0x7f0d0c102ca0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.544+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0d0c191a30 con 0x7f0d0c102ca0 2026-03-10T08:50:58.544 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.544+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f0cf4021a80 con 0x7f0d0c102ca0 2026-03-10T08:50:58.545 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.544+0000 7f0d08ff9700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0cf8038440 0x7f0cf803a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:50:58.545 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.544+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f0cf404c390 con 0x7f0d0c102ca0 2026-03-10T08:50:58.547 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.546+0000 7f0d0affd700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0cf8038440 0x7f0cf803a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:50:58.547 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.547+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0cf4029360 con 0x7f0d0c102ca0 2026-03-10T08:50:58.551 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.549+0000 7f0d0affd700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0cf8038440 0x7f0cf803a900 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto 
rx=0x7f0cfc006fd0 tx=0x7f0cfc006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:50:58.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.692+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0d0c02cc70 con 0x7f0d0c102ca0 2026-03-10T08:50:58.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.695+0000 7f0d08ff9700 1 -- 192.168.123.108:0/3669775754 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f0cf4026030 con 0x7f0d0c102ca0 2026-03-10T08:50:58.696 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:50:58.696 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0cf8038440 msgr2=0x7f0cf803a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:58.698 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0cf8038440 0x7f0cf803a900 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f0cfc006fd0 tx=0x7f0cfc006e40 comp rx=0 tx=0).stop 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 msgr2=0x7f0d0c197d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c197d60 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f0cf4004750 tx=0x7f0cf4005dc0 comp rx=0 tx=0).stop 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 shutdown_connections 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0cf8038440 0x7f0cf803a900 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.697+0000 7f0d11c08700 1 --2- 192.168.123.108:0/3669775754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d0c102ca0 0x7f0d0c197d60 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.698+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 >> 192.168.123.108:0/3669775754 conn(0x7f0d0c0fe220 msgr2=0x7f0d0c0fef00 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.698+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 shutdown_connections 2026-03-10T08:50:58.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:50:58.698+0000 7f0d11c08700 1 -- 192.168.123.108:0/3669775754 wait complete. 2026-03-10T08:50:58.699 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:50:59.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:58 vm05 ceph-mon[49713]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:50:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:50:58 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/3669775754' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:50:59.765 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:50:59.765 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:50:59.910 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:50:59.946 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:00.187 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.186+0000 7f09fd8c1700 1 -- 192.168.123.108:0/2930076666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 msgr2=0x7f09f8100e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.186+0000 7f09fd8c1700 1 --2- 192.168.123.108:0/2930076666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8100e80 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f09e0009b00 tx=0x7f09e0009e10 comp rx=0 tx=0).stop 
2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 -- 192.168.123.108:0/2930076666 shutdown_connections 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 --2- 192.168.123.108:0/2930076666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8100e80 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 -- 192.168.123.108:0/2930076666 >> 192.168.123.108:0/2930076666 conn(0x7f09f80fc000 msgr2=0x7f09f80fe440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 -- 192.168.123.108:0/2930076666 shutdown_connections 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 -- 192.168.123.108:0/2930076666 wait complete. 
2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 Processor -- start 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 -- start start 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8074b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.187+0000 7f09fd8c1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09f8075070 con 0x7f09f8100a60 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8074b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8074b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:41222/0 (socket says 192.168.123.108:41222) 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09f6ffd700 1 -- 192.168.123.108:0/1579360048 learned_addr learned my addr 192.168.123.108:0/1579360048 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:00.188 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09f6ffd700 1 -- 192.168.123.108:0/1579360048 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09e00097e0 con 0x7f09f8100a60 2026-03-10T08:51:00.189 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09f6ffd700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8074b30 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f09e0004750 tx=0x7f09e0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:00.189 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f09e001c070 con 0x7f09f8100a60 2026-03-10T08:51:00.189 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09f8073180 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09f8073620 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f09e0021470 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.188+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f09e000f460 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.189+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f09e000f6d0 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.189+0000 7f09fc8bf700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f09e4038440 0x7f09e403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.189+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f09e004d470 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.189+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f09d8005320 con 0x7f09f8100a60 2026-03-10T08:51:00.190 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.190+0000 7f09f67fc700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f09e4038440 0x7f09e403a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:00.191 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.190+0000 7f09f67fc700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f09e4038440 0x7f09e403a900 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f09e8006fd0 tx=0x7f09e8006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:00.193 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.192+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f09e0026070 con 0x7f09f8100a60 2026-03-10T08:51:00.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.347+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f09d8005190 con 0x7f09f8100a60 2026-03-10T08:51:00.350 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.349+0000 7f09fc8bf700 1 -- 192.168.123.108:0/1579360048 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f09e0029720 con 0x7f09f8100a60 2026-03-10T08:51:00.350 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:00.350 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:00.352 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f09e4038440 msgr2=0x7f09e403a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:00.353 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f09e4038440 0x7f09e403a900 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f09e8006fd0 tx=0x7f09e8006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 msgr2=0x7f09f8074b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8074b30 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f09e0004750 tx=0x7f09e0005dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 shutdown_connections 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f09e4038440 0x7f09e403a900 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 --2- 192.168.123.108:0/1579360048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f09f8100a60 0x7f09f8074b30 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 >> 192.168.123.108:0/1579360048 conn(0x7f09f80fc000 msgr2=0x7f09f80fccc0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 shutdown_connections 2026-03-10T08:51:00.353 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:00.352+0000 7f09fd8c1700 1 -- 192.168.123.108:0/1579360048 wait complete. 2026-03-10T08:51:00.354 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:01.411 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:51:01.411 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:01.539 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:01.572 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:01.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:01 vm05 ceph-mon[49713]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:01.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:01 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/1579360048' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.802+0000 7f23bef7f700 1 -- 192.168.123.108:0/2245679136 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 msgr2=0x7f23b81038f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.802+0000 7f23bef7f700 1 --2- 192.168.123.108:0/2245679136 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b81038f0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f23a8009b00 tx=0x7f23a8009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.803+0000 7f23bef7f700 1 -- 192.168.123.108:0/2245679136 shutdown_connections 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.803+0000 7f23bef7f700 1 --2- 192.168.123.108:0/2245679136 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b81038f0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.803+0000 7f23bef7f700 1 -- 192.168.123.108:0/2245679136 >> 192.168.123.108:0/2245679136 conn(0x7f23b80faf30 msgr2=0x7f23b80fd350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.803+0000 7f23bef7f700 1 -- 192.168.123.108:0/2245679136 shutdown_connections 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.803+0000 7f23bef7f700 1 -- 192.168.123.108:0/2245679136 wait complete. 
2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.804+0000 7f23bef7f700 1 Processor -- start 2026-03-10T08:51:01.804 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.804+0000 7f23bef7f700 1 -- start start 2026-03-10T08:51:01.805 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.804+0000 7f23bef7f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b8197d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:01.805 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.804+0000 7f23bef7f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23b8198280 con 0x7f23b8101500 2026-03-10T08:51:01.805 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.804+0000 7f23bcd1b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b8197d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:01.805 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.805+0000 7f23bcd1b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b8197d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:41232/0 (socket says 192.168.123.108:41232) 2026-03-10T08:51:01.805 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.805+0000 7f23bcd1b700 1 -- 192.168.123.108:0/2023199599 learned_addr learned my addr 192.168.123.108:0/2023199599 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:01.805 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.805+0000 7f23bcd1b700 1 -- 192.168.123.108:0/2023199599 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23a80097e0 con 0x7f23b8101500 2026-03-10T08:51:01.806 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.805+0000 7f23bcd1b700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b8197d40 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f23a8004f40 tx=0x7f23a8005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:01.806 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.805+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f23a801c070 con 0x7f23b8101500 2026-03-10T08:51:01.806 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.805+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f23b8198480 con 0x7f23b8101500 2026-03-10T08:51:01.806 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.806+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f23b8198920 con 0x7f23b8101500 2026-03-10T08:51:01.807 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.806+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f23a80053b0 con 0x7f23b8101500 2026-03-10T08:51:01.807 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.806+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f23a800f460 con 0x7f23b8101500 2026-03-10T08:51:01.807 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.806+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f23a8005520 con 0x7f23b8101500 2026-03-10T08:51:01.807 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.807+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f23a4005320 con 0x7f23b8101500 2026-03-10T08:51:01.808 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.807+0000 7f23b5ffb700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23a0038440 0x7f23a003a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:01.808 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.807+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f23a804c2e0 con 0x7f23b8101500 2026-03-10T08:51:01.808 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.807+0000 7f23b7fff700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23a0038440 0x7f23a003a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:01.808 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.807+0000 7f23b7fff700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23a0038440 0x7f23a003a900 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f23ac006fd0 tx=0x7f23ac006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:01.810 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.809+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f23a800f5c0 con 0x7f23b8101500 2026-03-10T08:51:01.963 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.961+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f23a4005190 con 0x7f23b8101500 2026-03-10T08:51:01.963 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.962+0000 7f23b5ffb700 1 -- 192.168.123.108:0/2023199599 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f23a8026030 con 0x7f23b8101500 2026-03-10T08:51:01.964 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:01.964 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:01.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.965+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23a0038440 msgr2=0x7f23a003a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:01.966 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23a0038440 0x7f23a003a900 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f23ac006fd0 tx=0x7f23ac006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:01.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 msgr2=0x7f23b8197d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:01.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b8197d40 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f23a8004f40 tx=0x7f23a8005e70 comp rx=0 tx=0).stop 2026-03-10T08:51:01.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 shutdown_connections 2026-03-10T08:51:01.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23a0038440 0x7f23a003a900 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:01.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 --2- 192.168.123.108:0/2023199599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23b8101500 0x7f23b8197d40 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:01.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.966+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 >> 192.168.123.108:0/2023199599 conn(0x7f23b80faf30 msgr2=0x7f23b80fd320 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:51:01.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.967+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 shutdown_connections 2026-03-10T08:51:01.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:01.967+0000 7f23bef7f700 1 -- 192.168.123.108:0/2023199599 wait complete. 2026-03-10T08:51:01.968 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:03.034 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:51:03.035 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:03.036 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:02 vm05 ceph-mon[49713]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:03.036 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:02 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/2023199599' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:03.168 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:03.204 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:03.489 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.488+0000 7f0aad53c700 1 -- 192.168.123.108:0/3104336698 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 msgr2=0x7f0aa81030b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.488+0000 7f0aad53c700 1 --2- 192.168.123.108:0/3104336698 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa81030b0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f0a90009b00 tx=0x7f0a90009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 -- 192.168.123.108:0/3104336698 shutdown_connections 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 --2- 192.168.123.108:0/3104336698 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa81030b0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 -- 192.168.123.108:0/3104336698 >> 192.168.123.108:0/3104336698 conn(0x7f0aa80fe230 msgr2=0x7f0aa8100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 -- 192.168.123.108:0/3104336698 shutdown_connections 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 -- 192.168.123.108:0/3104336698 
wait complete. 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 Processor -- start 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.489+0000 7f0aad53c700 1 -- start start 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aad53c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa8197d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aad53c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0aa8198290 con 0x7f0aa8102c90 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aa6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa8197d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aa6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa8197d50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:41250/0 (socket says 192.168.123.108:41250) 2026-03-10T08:51:03.490 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aa6ffd700 1 -- 192.168.123.108:0/726069246 learned_addr learned my addr 192.168.123.108:0/726069246 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:03.491 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aa6ffd700 1 -- 192.168.123.108:0/726069246 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a900097e0 con 0x7f0aa8102c90 2026-03-10T08:51:03.491 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aa6ffd700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa8197d50 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f0a90004750 tx=0x7f0a90005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:03.491 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a9001c070 con 0x7f0aa8102c90 2026-03-10T08:51:03.491 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0a90021470 con 0x7f0aa8102c90 2026-03-10T08:51:03.491 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.490+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0aa8198490 con 0x7f0aa8102c90 2026-03-10T08:51:03.492 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.491+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0aa8198930 con 0x7f0aa8102c90 2026-03-10T08:51:03.492 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.491+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a9000f460 con 0x7f0aa8102c90 2026-03-10T08:51:03.492 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.491+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f0a9000f620 con 0x7f0aa8102c90 2026-03-10T08:51:03.492 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.492+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0a88005320 con 0x7f0aa8102c90 2026-03-10T08:51:03.493 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.492+0000 7f0a9ffff700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a94038440 0x7f0a9403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:03.493 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.492+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f0a9004d4c0 con 0x7f0aa8102c90 2026-03-10T08:51:03.493 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.492+0000 7f0aa67fc700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a94038440 0x7f0a9403a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:03.493 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.492+0000 7f0aa67fc700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a94038440 0x7f0a9403a900 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f0a98006fd0 tx=0x7f0a98006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:03.495 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.495+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0a90029bb0 con 0x7f0aa8102c90 2026-03-10T08:51:03.642 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.642+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0a88005190 con 0x7f0aa8102c90 2026-03-10T08:51:03.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.642+0000 7f0a9ffff700 1 -- 192.168.123.108:0/726069246 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f0a90026030 con 0x7f0aa8102c90 2026-03-10T08:51:03.644 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:03.644 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:03.646 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.645+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a94038440 msgr2=0x7f0a9403a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:03.646 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.646+0000 7f0aad53c700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a94038440 0x7f0a9403a900 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f0a98006fd0 tx=0x7f0a98006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:03.646 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.646+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 msgr2=0x7f0aa8197d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:03.647 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.646+0000 7f0aad53c700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa8197d50 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f0a90004750 tx=0x7f0a90005dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:03.647 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.646+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 shutdown_connections 2026-03-10T08:51:03.647 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.646+0000 7f0aad53c700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a94038440 0x7f0a9403a900 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:03.647 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.646+0000 7f0aad53c700 1 --2- 192.168.123.108:0/726069246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0aa8102c90 0x7f0aa8197d50 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:03.647 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.647+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 >> 192.168.123.108:0/726069246 conn(0x7f0aa80fe230 msgr2=0x7f0aa80feef0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:51:03.647 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.647+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 shutdown_connections 2026-03-10T08:51:03.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:03.647+0000 7f0aad53c700 1 -- 192.168.123.108:0/726069246 wait complete. 2026-03-10T08:51:03.649 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: Deploying daemon prometheus.vm05 on vm05 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: 
from='client.? 192.168.123.108:0/726069246' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:04.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:04 vm05 ceph-mon[49713]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:04.708 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:51:04.708 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:04.844 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:04.877 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.271+0000 7f291f05c700 1 -- 192.168.123.108:0/2187166232 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 msgr2=0x7f29181074b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.271+0000 7f291f05c700 1 --2- 192.168.123.108:0/2187166232 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f29181074b0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f2908009b00 tx=0x7f2908009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.272+0000 7f291f05c700 1 -- 192.168.123.108:0/2187166232 shutdown_connections 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.272+0000 7f291f05c700 1 --2- 192.168.123.108:0/2187166232 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f29181074b0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:05.273 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.272+0000 7f291f05c700 1 -- 192.168.123.108:0/2187166232 >> 192.168.123.108:0/2187166232 conn(0x7f2918076040 msgr2=0x7f29180784a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.272+0000 7f291f05c700 1 -- 192.168.123.108:0/2187166232 shutdown_connections 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.272+0000 7f291f05c700 1 -- 192.168.123.108:0/2187166232 wait complete. 2026-03-10T08:51:05.273 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.273+0000 7f291f05c700 1 Processor -- start 2026-03-10T08:51:05.274 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.273+0000 7f291f05c700 1 -- start start 2026-03-10T08:51:05.274 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.273+0000 7f291f05c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f291819c1a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:05.274 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.273+0000 7f291f05c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f291819c6e0 con 0x7f2918107090 2026-03-10T08:51:05.274 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.274+0000 7f291cdf8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f291819c1a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:05.274 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.274+0000 7f291cdf8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f291819c1a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46956/0 (socket says 192.168.123.108:46956) 2026-03-10T08:51:05.274 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.274+0000 7f291cdf8700 1 -- 192.168.123.108:0/290381747 learned_addr learned my addr 192.168.123.108:0/290381747 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:05.275 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.274+0000 7f291cdf8700 1 -- 192.168.123.108:0/290381747 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29080097e0 con 0x7f2918107090 2026-03-10T08:51:05.275 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.274+0000 7f291cdf8700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f291819c1a0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f2908004750 tx=0x7f2908005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:05.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.275+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f290801c070 con 0x7f2918107090 2026-03-10T08:51:05.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.275+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2908021470 con 0x7f2918107090 2026-03-10T08:51:05.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.275+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f290800f460 con 0x7f2918107090 2026-03-10T08:51:05.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.275+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f291819c8e0 con 0x7f2918107090 2026-03-10T08:51:05.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.275+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f291819cd80 con 0x7f2918107090 2026-03-10T08:51:05.276 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.276+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2918195e20 con 0x7f2918107090 2026-03-10T08:51:05.277 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.276+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f2908021a80 con 0x7f2918107090 2026-03-10T08:51:05.277 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.276+0000 7f2915ffb700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f29000383f0 0x7f290003a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:05.277 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.276+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f290804c2d0 con 0x7f2918107090 2026-03-10T08:51:05.278 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.277+0000 7f2917fff700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f29000383f0 0x7f290003a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:05.278 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.278+0000 7f2917fff700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f29000383f0 0x7f290003a8b0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f290c006fd0 tx=0x7f290c006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:05.279 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.279+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2908029360 con 0x7f2918107090 2026-03-10T08:51:05.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.427+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f29180623c0 con 0x7f2918107090 2026-03-10T08:51:05.428 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.427+0000 7f2915ffb700 1 -- 192.168.123.108:0/290381747 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2908026030 con 0x7f2918107090 2026-03-10T08:51:05.428 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:05.428 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:05.430 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.429+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f29000383f0 msgr2=0x7f290003a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:05.430 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.429+0000 7f291f05c700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f29000383f0 0x7f290003a8b0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f290c006fd0 tx=0x7f290c006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:05.430 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.429+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 msgr2=0x7f291819c1a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:05.430 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 7f291f05c700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f291819c1a0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f2908004750 tx=0x7f2908005dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 shutdown_connections 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 
7f291f05c700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f29000383f0 0x7f290003a8b0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 7f291f05c700 1 --2- 192.168.123.108:0/290381747 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918107090 0x7f291819c1a0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 >> 192.168.123.108:0/290381747 conn(0x7f2918076040 msgr2=0x7f2918076d20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 shutdown_connections 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:05.430+0000 7f291f05c700 1 -- 192.168.123.108:0/290381747 wait complete. 2026-03-10T08:51:05.431 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:05.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:05 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/290381747' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:06.474 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:06.474 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:06.605 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:06.641 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:06 vm05 ceph-mon[49713]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.227+0000 7f914ac67700 1 -- 192.168.123.108:0/2465318738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 msgr2=0x7f91441030b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.227+0000 7f914ac67700 1 --2- 192.168.123.108:0/2465318738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f91441030b0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f9138009b00 tx=0x7f9138009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.228+0000 7f914ac67700 1 -- 192.168.123.108:0/2465318738 shutdown_connections 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.228+0000 7f914ac67700 1 --2- 192.168.123.108:0/2465318738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f91441030b0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.228+0000 7f914ac67700 1 -- 192.168.123.108:0/2465318738 >> 192.168.123.108:0/2465318738 conn(0x7f91440fe230 msgr2=0x7f9144100670 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.228+0000 7f914ac67700 1 -- 192.168.123.108:0/2465318738 shutdown_connections 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.228+0000 7f914ac67700 1 -- 192.168.123.108:0/2465318738 wait complete. 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f914ac67700 1 Processor -- start 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f914ac67700 1 -- start start 2026-03-10T08:51:07.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f914ac67700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f9144197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:07.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f914ac67700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91441982e0 con 0x7f9144102c90 2026-03-10T08:51:07.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f9148a03700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f9144197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:07.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f9148a03700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f9144197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46974/0 (socket says 192.168.123.108:46974) 2026-03-10T08:51:07.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.229+0000 7f9148a03700 1 -- 
192.168.123.108:0/2256372784 learned_addr learned my addr 192.168.123.108:0/2256372784 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:07.230 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.230+0000 7f9148a03700 1 -- 192.168.123.108:0/2256372784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91380097e0 con 0x7f9144102c90 2026-03-10T08:51:07.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.230+0000 7f9148a03700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f9144197da0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f9138004750 tx=0x7f9138005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:07.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.230+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f913801c070 con 0x7f9144102c90 2026-03-10T08:51:07.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.230+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9138021470 con 0x7f9144102c90 2026-03-10T08:51:07.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.230+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91441984e0 con 0x7f9144102c90 2026-03-10T08:51:07.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.230+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f913800f460 con 0x7f9144102c90 2026-03-10T08:51:07.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.231+0000 7f914ac67700 1 
-- 192.168.123.108:0/2256372784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9144198980 con 0x7f9144102c90 2026-03-10T08:51:07.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.231+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f9138021a80 con 0x7f9144102c90 2026-03-10T08:51:07.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.231+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9144191a20 con 0x7f9144102c90 2026-03-10T08:51:07.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.232+0000 7f9141ffb700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f912c040c50 0x7f912c043110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:07.233 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.232+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f913804c390 con 0x7f9144102c90 2026-03-10T08:51:07.233 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.232+0000 7f9143fff700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f912c040c50 0x7f912c043110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:07.233 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.232+0000 7f9143fff700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f912c040c50 0x7f912c043110 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f9134006fd0 
tx=0x7f9134006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:07.235 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.235+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9138029360 con 0x7f9144102c90 2026-03-10T08:51:07.380 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.379+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f91440623c0 con 0x7f9144102c90 2026-03-10T08:51:07.381 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.380+0000 7f9141ffb700 1 -- 192.168.123.108:0/2256372784 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f9138026030 con 0x7f9144102c90 2026-03-10T08:51:07.381 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:07.381 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:07.383 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f912c040c50 msgr2=0x7f912c043110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f912c040c50 0x7f912c043110 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f9134006fd0 tx=0x7f9134006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 msgr2=0x7f9144197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f9144197da0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f9138004750 tx=0x7f9138005dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 shutdown_connections 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f912c040c50 0x7f912c043110 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 --2- 192.168.123.108:0/2256372784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9144102c90 0x7f9144197da0 unknown :-1 s=CLOSED 
pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.382+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 >> 192.168.123.108:0/2256372784 conn(0x7f91440fe230 msgr2=0x7f91440feef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.383+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 shutdown_connections 2026-03-10T08:51:07.383 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:07.383+0000 7f914ac67700 1 -- 192.168.123.108:0/2256372784 wait complete. 2026-03-10T08:51:07.384 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:08.447 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:51:08.448 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:08.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:08 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/2256372784' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:08.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:08 vm05 ceph-mon[49713]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:08.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:08 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:08.582 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:08.620 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:08.930 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.929+0000 7f42de4a5700 1 -- 192.168.123.108:0/3299964207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 msgr2=0x7f42d8103040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.929+0000 7f42de4a5700 1 --2- 192.168.123.108:0/3299964207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8103040 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f42c0009b00 tx=0x7f42c0009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 -- 192.168.123.108:0/3299964207 shutdown_connections 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 --2- 192.168.123.108:0/3299964207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8103040 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 -- 192.168.123.108:0/3299964207 >> 192.168.123.108:0/3299964207 conn(0x7f42d80fe1a0 msgr2=0x7f42d8100600 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 -- 192.168.123.108:0/3299964207 shutdown_connections 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 -- 192.168.123.108:0/3299964207 wait complete. 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 Processor -- start 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 -- start start 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8197d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.930+0000 7f42de4a5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42d8198270 con 0x7f42d8102c20 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.931+0000 7f42d7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8197d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.931+0000 7f42d7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8197d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46986/0 (socket says 192.168.123.108:46986) 2026-03-10T08:51:08.931 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.931+0000 7f42d7fff700 1 -- 
192.168.123.108:0/3330003635 learned_addr learned my addr 192.168.123.108:0/3330003635 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:08.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.931+0000 7f42d7fff700 1 -- 192.168.123.108:0/3330003635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42c00097e0 con 0x7f42d8102c20 2026-03-10T08:51:08.932 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.931+0000 7f42d7fff700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8197d30 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f42c0004750 tx=0x7f42c0005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.932+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f42c001c070 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.932+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f42c0021470 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.932+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f42c000f460 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.932+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42d8198470 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.932+0000 7f42de4a5700 1 
-- 192.168.123.108:0/3330003635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42d8198850 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.932+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f42c0021ac0 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.933+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f42d81919b0 con 0x7f42d8102c20 2026-03-10T08:51:08.933 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.933+0000 7f42d57fa700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f42c40383f0 0x7f42c403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:08.934 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.933+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f42c004c3b0 con 0x7f42d8102c20 2026-03-10T08:51:08.934 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.933+0000 7f42d77fe700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f42c40383f0 0x7f42c403a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:08.934 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.933+0000 7f42d77fe700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f42c40383f0 0x7f42c403a8b0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f42c8006fd0 
tx=0x7f42c8006e40 comp rx=0 tx=0).ready entity=mgr.14162 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:08.936 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:08.935+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f42c000f5c0 con 0x7f42d8102c20 2026-03-10T08:51:09.076 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.075+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f42d804fa20 con 0x7f42d8102c20 2026-03-10T08:51:09.076 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.076+0000 7f42d57fa700 1 -- 192.168.123.108:0/3330003635 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f42c0030300 con 0x7f42d8102c20 2026-03-10T08:51:09.076 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:09.077 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:09.079 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f42c40383f0 msgr2=0x7f42c403a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f42c40383f0 0x7f42c403a8b0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f42c8006fd0 tx=0x7f42c8006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 msgr2=0x7f42d8197d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8197d30 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f42c0004750 tx=0x7f42c0005dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 shutdown_connections 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f42c40383f0 0x7f42c403a8b0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 --2- 192.168.123.108:0/3330003635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42d8102c20 0x7f42d8197d30 unknown :-1 s=CLOSED 
pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.078+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 >> 192.168.123.108:0/3330003635 conn(0x7f42d80fe1a0 msgr2=0x7f42d80fee80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.079+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 shutdown_connections 2026-03-10T08:51:09.079 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:09.079+0000 7f42de4a5700 1 -- 192.168.123.108:0/3330003635 wait complete. 2026-03-10T08:51:09.080 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:09.316 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:09 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/3330003635' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:10.137 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:10.137 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:10.271 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:10.306 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:10.803 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:10 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:10.803 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:10 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:10.803 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:10 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:10.803 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:10 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-10T08:51:10.803 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:10 vm05 ceph-mon[49713]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.962+0000 7fbe6cea3700 1 -- 192.168.123.108:0/3346381613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 msgr2=0x7fbe681030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.962+0000 7fbe6cea3700 1 --2- 192.168.123.108:0/3346381613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe681030d0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fbe50009b00 tx=0x7fbe50009e10 comp rx=0 
tx=0).stop 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.963+0000 7fbe6cea3700 1 -- 192.168.123.108:0/3346381613 shutdown_connections 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.963+0000 7fbe6cea3700 1 --2- 192.168.123.108:0/3346381613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe681030d0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.963+0000 7fbe6cea3700 1 -- 192.168.123.108:0/3346381613 >> 192.168.123.108:0/3346381613 conn(0x7fbe680fe230 msgr2=0x7fbe68100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.963+0000 7fbe6cea3700 1 -- 192.168.123.108:0/3346381613 shutdown_connections 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.963+0000 7fbe6cea3700 1 -- 192.168.123.108:0/3346381613 wait complete. 
2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6cea3700 1 Processor -- start 2026-03-10T08:51:10.964 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6cea3700 1 -- start start 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6cea3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe68197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6cea3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe681982e0 con 0x7fbe68102cb0 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe68197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe68197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:47000/0 (socket says 192.168.123.108:47000) 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.964+0000 7fbe6659c700 1 -- 192.168.123.108:0/788533123 learned_addr learned my addr 192.168.123.108:0/788533123 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe6659c700 1 -- 192.168.123.108:0/788533123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbe500097e0 con 0x7fbe68102cb0 2026-03-10T08:51:10.965 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe6659c700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe68197da0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fbe50004750 tx=0x7fbe50005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:10.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbe5001c070 con 0x7fbe68102cb0 2026-03-10T08:51:10.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbe50021470 con 0x7fbe68102cb0 2026-03-10T08:51:10.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbe5000f460 con 0x7fbe68102cb0 2026-03-10T08:51:10.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbe681984e0 con 0x7fbe68102cb0 2026-03-10T08:51:10.966 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.965+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbe68198980 con 0x7fbe68102cb0 2026-03-10T08:51:10.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.966+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 13) v1 ==== 45351+0+0 (secure 0 0 0) 0x7fbe50021ac0 con 0x7fbe68102cb0 2026-03-10T08:51:10.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.966+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbe68191a40 con 0x7fbe68102cb0 2026-03-10T08:51:10.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.966+0000 7fbe5f7fe700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe54038490 0x7fbe5403a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:10.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.966+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fbe5004c410 con 0x7fbe68102cb0 2026-03-10T08:51:10.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.966+0000 7fbe65d9b700 1 -- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe54038490 msgr2=0x7fbe5403a950 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:51:10.967 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.966+0000 7fbe65d9b700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe54038490 0x7fbe5403a950 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:51:10.969 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:10.969+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbe50005290 con 0x7fbe68102cb0 2026-03-10T08:51:11.114 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.113+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fbe680623c0 con 0x7fbe68102cb0 2026-03-10T08:51:11.115 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.114+0000 7fbe5f7fe700 1 -- 192.168.123.108:0/788533123 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fbe50026030 con 0x7fbe68102cb0 2026-03-10T08:51:11.115 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:11.115 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:11.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe54038490 msgr2=0x7fbe5403a950 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:51:11.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7fbe54038490 0x7fbe5403a950 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:11.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 msgr2=0x7fbe68197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:11.117 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe68197da0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fbe50004750 tx=0x7fbe50005dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:11.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 shutdown_connections 2026-03-10T08:51:11.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe54038490 0x7fbe5403a950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:11.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 --2- 192.168.123.108:0/788533123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe68102cb0 0x7fbe68197da0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:11.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 >> 192.168.123.108:0/788533123 conn(0x7fbe680fe230 msgr2=0x7fbe680fef10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:11.118 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 shutdown_connections 2026-03-10T08:51:11.118 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:11.117+0000 7fbe6cea3700 1 -- 192.168.123.108:0/788533123 wait complete. 2026-03-10T08:51:11.119 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:12.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:11 vm05 ceph-mon[49713]: from='mgr.14162 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-10T08:51:12.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:11 vm05 ceph-mon[49713]: mgrmap e13: vm05.rxwgjc(active, since 33s) 2026-03-10T08:51:12.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:11 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/788533123' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:12.157 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:51:12.157 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:12.289 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:12.323 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.563+0000 7f97cae9c700 1 -- 192.168.123.108:0/1534737220 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 msgr2=0x7f97c4103900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.563+0000 7f97cae9c700 1 --2- 192.168.123.108:0/1534737220 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4103900 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f97b4009b00 
tx=0x7f97b4009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.564+0000 7f97cae9c700 1 -- 192.168.123.108:0/1534737220 shutdown_connections 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.564+0000 7f97cae9c700 1 --2- 192.168.123.108:0/1534737220 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4103900 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.564+0000 7f97cae9c700 1 -- 192.168.123.108:0/1534737220 >> 192.168.123.108:0/1534737220 conn(0x7f97c40faf20 msgr2=0x7f97c40fd360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.564+0000 7f97cae9c700 1 -- 192.168.123.108:0/1534737220 shutdown_connections 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.564+0000 7f97cae9c700 1 -- 192.168.123.108:0/1534737220 wait complete. 
2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.564+0000 7f97cae9c700 1 Processor -- start 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97cae9c700 1 -- start start 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97cae9c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4100e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:12.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97cae9c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97c41013b0 con 0x7f97c4101510 2026-03-10T08:51:12.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97c8c38700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4100e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:12.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97c8c38700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4100e70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:47018/0 (socket says 192.168.123.108:47018) 2026-03-10T08:51:12.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97c8c38700 1 -- 192.168.123.108:0/124827751 learned_addr learned my addr 192.168.123.108:0/124827751 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:12.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.565+0000 7f97c8c38700 1 -- 192.168.123.108:0/124827751 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f97b40097e0 con 0x7f97c4101510 2026-03-10T08:51:12.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.566+0000 7f97c8c38700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4100e70 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f97b4004f40 tx=0x7f97b4005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:12.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.566+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f97b401c070 con 0x7f97c4101510 2026-03-10T08:51:12.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.566+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f97b40053b0 con 0x7f97c4101510 2026-03-10T08:51:12.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.566+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97c40ff4c0 con 0x7f97c4101510 2026-03-10T08:51:12.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.566+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f97b400f460 con 0x7f97c4101510 2026-03-10T08:51:12.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.566+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97c40ff960 con 0x7f97c4101510 2026-03-10T08:51:12.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.567+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 13) v1 ==== 45351+0+0 (secure 0 0 0) 0x7f97b4021470 con 0x7f97c4101510 2026-03-10T08:51:12.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.567+0000 7f97c1ffb700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97ac038440 0x7f97ac03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:12.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.567+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f97b404c3a0 con 0x7f97c4101510 2026-03-10T08:51:12.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.567+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f97c404fa90 con 0x7f97c4101510 2026-03-10T08:51:12.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.567+0000 7f97c3fff700 1 -- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97ac038440 msgr2=0x7f97ac03a900 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T08:51:12.568 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.567+0000 7f97c3fff700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97ac038440 0x7f97ac03a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:51:12.571 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.570+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f97b4026070 con 0x7f97c4101510 2026-03-10T08:51:12.716 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.713+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f97c40623c0 con 0x7f97c4101510 2026-03-10T08:51:12.716 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.715+0000 7f97c1ffb700 1 -- 192.168.123.108:0/124827751 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f97b40217c0 con 0x7f97c4101510 2026-03-10T08:51:12.716 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:12.716 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:12.718 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.717+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97ac038440 msgr2=0x7f97ac03a900 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:51:12.718 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.717+0000 7f97cae9c700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f97ac038440 0x7f97ac03a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:12.718 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.717+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 msgr2=0x7f97c4100e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:12.718 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.717+0000 7f97cae9c700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4100e70 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f97b4004f40 tx=0x7f97b4005e70 comp rx=0 tx=0).stop 2026-03-10T08:51:12.718 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.718+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 shutdown_connections 2026-03-10T08:51:12.719 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.718+0000 7f97cae9c700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97ac038440 0x7f97ac03a900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:12.719 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.718+0000 7f97cae9c700 1 --2- 192.168.123.108:0/124827751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97c4101510 0x7f97c4100e70 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:12.719 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.718+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 >> 192.168.123.108:0/124827751 conn(0x7f97c40faf20 msgr2=0x7f97c40fbbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:12.719 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.718+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 shutdown_connections 2026-03-10T08:51:12.719 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:12.718+0000 7f97cae9c700 1 -- 192.168.123.108:0/124827751 wait complete. 2026-03-10T08:51:12.719 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:13.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:13 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/124827751' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:13.765 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T08:51:13.765 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:13.894 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:13.929 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:14.165 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.164+0000 7faaf3558700 1 -- 192.168.123.108:0/1377639688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec102cb0 msgr2=0x7faaec1030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:14.165 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.164+0000 7faaf3558700 1 --2- 192.168.123.108:0/1377639688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec102cb0 0x7faaec1030d0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7faadc009b00 tx=0x7faadc009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:14.165 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.164+0000 7faaf3558700 1 -- 192.168.123.108:0/1377639688 shutdown_connections 2026-03-10T08:51:14.165 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.164+0000 7faaf3558700 1 --2- 192.168.123.108:0/1377639688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7faaec102cb0 0x7faaec1030d0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:14.165 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.164+0000 7faaf3558700 1 -- 192.168.123.108:0/1377639688 >> 192.168.123.108:0/1377639688 conn(0x7faaec0fe250 msgr2=0x7faaec100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:14.165 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.165+0000 7faaf3558700 1 -- 192.168.123.108:0/1377639688 shutdown_connections 2026-03-10T08:51:14.166 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.165+0000 7faaf3558700 1 -- 192.168.123.108:0/1377639688 wait complete. 2026-03-10T08:51:14.166 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.165+0000 7faaf3558700 1 Processor -- start 2026-03-10T08:51:14.166 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.165+0000 7faaf3558700 1 -- start start 2026-03-10T08:51:14.166 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.166+0000 7faaf3558700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec198040 0x7faaec198460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:14.166 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.166+0000 7faaf3558700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faaec1989a0 con 0x7faaec198040 2026-03-10T08:51:14.166 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.166+0000 7faaf12f4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec198040 0x7faaec198460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:14.167 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.166+0000 7faaf12f4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7faaec198040 0x7faaec198460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:47040/0 (socket says 192.168.123.108:47040) 2026-03-10T08:51:14.167 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.166+0000 7faaf12f4700 1 -- 192.168.123.108:0/2117043548 learned_addr learned my addr 192.168.123.108:0/2117043548 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:14.167 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.166+0000 7faaf12f4700 1 -- 192.168.123.108:0/2117043548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faadc0097e0 con 0x7faaec198040 2026-03-10T08:51:14.167 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.167+0000 7faaf12f4700 1 --2- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec198040 0x7faaec198460 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7faaec103d10 tx=0x7faadc004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:14.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.167+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faadc01c070 con 0x7faaec198040 2026-03-10T08:51:14.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.167+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faadc0056f0 con 0x7faaec198040 2026-03-10T08:51:14.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.167+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7faadc021e00 con 0x7faaec198040 2026-03-10T08:51:14.168 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.167+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faaec198ba0 con 0x7faaec198040 2026-03-10T08:51:14.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.167+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faaec075420 con 0x7faaec198040 2026-03-10T08:51:14.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.168+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45351+0+0 (secure 0 0 0) 0x7faadc021470 con 0x7faaec198040 2026-03-10T08:51:14.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.168+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faaec191a40 con 0x7faaec198040 2026-03-10T08:51:14.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.168+0000 7faae27fc700 1 --2- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faad80383f0 0x7faad803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:14.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.168+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7faadc04c2f0 con 0x7faaec198040 2026-03-10T08:51:14.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.168+0000 7faaf0af3700 1 -- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faad80383f0 msgr2=0x7faad803a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to 
v2:192.168.123.105:6800/2 2026-03-10T08:51:14.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.168+0000 7faaf0af3700 1 --2- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faad80383f0 0x7faad803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:51:14.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.171+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7faadc02a590 con 0x7faaec198040 2026-03-10T08:51:14.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.312+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7faaec04fa20 con 0x7faaec198040 2026-03-10T08:51:14.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.312+0000 7faae27fc700 1 -- 192.168.123.108:0/2117043548 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7faadc026020 con 0x7faaec198040 2026-03-10T08:51:14.315 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:14.315 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faad80383f0 msgr2=0x7faad803a8b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 --2- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faad80383f0 0x7faad803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec198040 msgr2=0x7faaec198460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 --2- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec198040 0x7faaec198460 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7faaec103d10 tx=0x7faadc004dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 shutdown_connections 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 --2- 
192.168.123.108:0/2117043548 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faad80383f0 0x7faad803a8b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 --2- 192.168.123.108:0/2117043548 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faaec198040 0x7faaec198460 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.316+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 >> 192.168.123.108:0/2117043548 conn(0x7faaec0fe250 msgr2=0x7faaec0fef10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.317+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 shutdown_connections 2026-03-10T08:51:14.317 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:14.317+0000 7faaf3558700 1 -- 192.168.123.108:0/2117043548 wait complete. 2026-03-10T08:51:14.318 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:14.533 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:14 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/2117043548' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:15.354 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:15.354 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:15.519 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:15.550 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: Active manager daemon vm05.rxwgjc restarted 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: Activating manager daemon vm05.rxwgjc 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: mgrmap e14: vm05.rxwgjc(active, starting, since 0.0497453s) 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:51:15.713 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: Manager daemon vm05.rxwgjc is now available 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:51:15.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:51:16.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 -- 192.168.123.108:0/2703060964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9760095b30 msgr2=0x7f9760095f50 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:16.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 --2- 192.168.123.108:0/2703060964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9760095b30 0x7f9760095f50 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f975c009b00 tx=0x7f975c009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:16.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 -- 192.168.123.108:0/2703060964 shutdown_connections 2026-03-10T08:51:16.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 --2- 192.168.123.108:0/2703060964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9760095b30 0x7f9760095f50 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:16.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 -- 192.168.123.108:0/2703060964 >> 192.168.123.108:0/2703060964 conn(0x7f97600910d0 msgr2=0x7f9760093510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:16.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 -- 192.168.123.108:0/2703060964 shutdown_connections 2026-03-10T08:51:16.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.007+0000 7f976dceb700 1 -- 192.168.123.108:0/2703060964 wait complete. 
2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976dceb700 1 Processor -- start 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976dceb700 1 -- start start 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976dceb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 0x7f97601426e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976dceb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f975c012070 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976cce9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 0x7f97601426e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976cce9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 0x7f97601426e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:39094/0 (socket says 192.168.123.108:39094) 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976cce9700 1 -- 192.168.123.108:0/1226630927 learned_addr learned my addr 192.168.123.108:0/1226630927 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.008+0000 7f976cce9700 1 -- 192.168.123.108:0/1226630927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f975c0097e0 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.009+0000 7f976cce9700 1 --2- 192.168.123.108:0/1226630927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 0x7f97601426e0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f975c00c010 tx=0x7f975c00ba40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.009+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f975c01c070 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.009+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9760142c20 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.009+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97601458a0 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.010+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f975c003d70 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.010+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f975c017440 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.010+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 14) v1 ==== 45072+0+0 (secure 0 0 0) 0x7f975c0175a0 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.011+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f975c04cd60 con 0x7f97601422c0 2026-03-10T08:51:16.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.011+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f974c005320 con 0x7f97601422c0 2026-03-10T08:51:16.014 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.014+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f975c02a430 con 0x7f97601422c0 2026-03-10T08:51:16.167 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:16.168 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:16.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.165+0000 7f976dceb700 
1 -- 192.168.123.108:0/1226630927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f974c005190 con 0x7f97601422c0 2026-03-10T08:51:16.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.165+0000 7f9765ffb700 1 -- 192.168.123.108:0/1226630927 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f975c01fb60 con 0x7f97601422c0 2026-03-10T08:51:16.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 msgr2=0x7f97601426e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:16.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 --2- 192.168.123.108:0/1226630927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 0x7f97601426e0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f975c00c010 tx=0x7f975c00ba40 comp rx=0 tx=0).stop 2026-03-10T08:51:16.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 shutdown_connections 2026-03-10T08:51:16.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 --2- 192.168.123.108:0/1226630927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97601422c0 0x7f97601426e0 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:16.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 >> 192.168.123.108:0/1226630927 conn(0x7f97600910d0 msgr2=0x7f97600934c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:16.170 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 shutdown_connections 2026-03-10T08:51:16.170 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:16.169+0000 7f976dceb700 1 -- 192.168.123.108:0/1226630927 wait complete. 2026-03-10T08:51:16.170 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: [10/Mar/2026:08:51:15] ENGINE Bus STARTING 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: [10/Mar/2026:08:51:15] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: [10/Mar/2026:08:51:15] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: [10/Mar/2026:08:51:15] ENGINE Bus STARTED 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/1226630927' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:16.624 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:16 vm05 ceph-mon[49713]: mgrmap e15: vm05.rxwgjc(active, since 1.05468s) 2026-03-10T08:51:17.212 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:17.212 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:17.356 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:17.395 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T08:51:17.644 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.642+0000 7f36d1afd700 1 -- 192.168.123.108:0/4226063502 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 msgr2=0x7f36cc0ffd80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:17.644 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.642+0000 7f36d1afd700 1 --2- 192.168.123.108:0/4226063502 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc0ffd80 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f36b4009b00 tx=0x7f36b4009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:17.644 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.644+0000 7f36d1afd700 1 -- 192.168.123.108:0/4226063502 shutdown_connections 2026-03-10T08:51:17.644 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.644+0000 7f36d1afd700 1 --2- 192.168.123.108:0/4226063502 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc0ffd80 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:17.644 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.644+0000 7f36d1afd700 1 -- 192.168.123.108:0/4226063502 >> 192.168.123.108:0/4226063502 conn(0x7f36cc0faf00 msgr2=0x7f36cc0fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:17.646 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.646+0000 7f36d1afd700 1 -- 192.168.123.108:0/4226063502 
shutdown_connections 2026-03-10T08:51:17.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.647+0000 7f36d1afd700 1 -- 192.168.123.108:0/4226063502 wait complete. 2026-03-10T08:51:17.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.647+0000 7f36d1afd700 1 Processor -- start 2026-03-10T08:51:17.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.647+0000 7f36d1afd700 1 -- start start 2026-03-10T08:51:17.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.647+0000 7f36d1afd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc18b590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:17.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.647+0000 7f36d1afd700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f36cc18bad0 con 0x7f36cc0ff960 2026-03-10T08:51:17.649 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.648+0000 7f36cb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc18b590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:17.649 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.648+0000 7f36cb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc18b590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:39124/0 (socket says 192.168.123.108:39124) 2026-03-10T08:51:17.649 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.648+0000 7f36cb7fe700 1 -- 192.168.123.108:0/3771288788 learned_addr learned my addr 192.168.123.108:0/3771288788 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:17.649 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36cb7fe700 1 -- 192.168.123.108:0/3771288788 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f36b40097e0 con 0x7f36cc0ff960 2026-03-10T08:51:17.649 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36cb7fe700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc18b590 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f36b401caa0 tx=0x7f36b4005470 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:17.649 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f36b402d070 con 0x7f36cc0ff960 2026-03-10T08:51:17.650 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f36cc18bcd0 con 0x7f36cc0ff960 2026-03-10T08:51:17.650 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f36cc18c170 con 0x7f36cc0ff960 2026-03-10T08:51:17.650 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f36b4003b60 con 0x7f36cc0ff960 2026-03-10T08:51:17.650 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.649+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f36b4028c30 con 
0x7f36cc0ff960 2026-03-10T08:51:17.650 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.650+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 15) v1 ==== 45199+0+0 (secure 0 0 0) 0x7f36b403b430 con 0x7f36cc0ff960 2026-03-10T08:51:17.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.650+0000 7f36c8ff9700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36b8038310 0x7f36b803a7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:17.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.650+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f36b405d7d0 con 0x7f36cc0ff960 2026-03-10T08:51:17.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.650+0000 7f36caffd700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36b8038310 0x7f36b803a7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:17.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.651+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f36ac005320 con 0x7f36cc0ff960 2026-03-10T08:51:17.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.651+0000 7f36caffd700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36b8038310 0x7f36b803a7d0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f36bc006fd0 tx=0x7f36bc006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:17.654 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.654+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f36b4036070 con 0x7f36cc0ff960 2026-03-10T08:51:17.797 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.795+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f36ac005190 con 0x7f36cc0ff960 2026-03-10T08:51:17.798 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.796+0000 7f36c8ff9700 1 -- 192.168.123.108:0/3771288788 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f36b401f3f0 con 0x7f36cc0ff960 2026-03-10T08:51:17.798 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:17.798 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36b8038310 msgr2=0x7f36b803a7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36b8038310 0x7f36b803a7d0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f36bc006fd0 tx=0x7f36bc006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 msgr2=0x7f36cc18b590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc18b590 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f36b401caa0 tx=0x7f36b4005470 comp rx=0 tx=0).stop 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 shutdown_connections 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36b8038310 0x7f36b803a7d0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 --2- 192.168.123.108:0/3771288788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36cc0ff960 0x7f36cc18b590 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:17.800 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 >> 192.168.123.108:0/3771288788 conn(0x7f36cc0faf00 msgr2=0x7f36cc068830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 shutdown_connections 2026-03-10T08:51:17.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:17.799+0000 7f36d1afd700 1 -- 192.168.123.108:0/3771288788 wait complete. 2026-03-10T08:51:17.801 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:51:18.888 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:18.889 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/3771288788' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:18.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:18.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:18.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:51:18.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:18.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:18.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:18 vm05 ceph-mon[49713]: mgrmap e16: 
vm05.rxwgjc(active, since 2s) 2026-03-10T08:51:19.037 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:51:19.325 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.324+0000 7f4ea76fc700 1 -- 192.168.123.108:0/130534933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 msgr2=0x7f4ea010edf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:19.325 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.324+0000 7f4ea76fc700 1 --2- 192.168.123.108:0/130534933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea010edf0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f4e94009b00 tx=0x7f4e94009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:19.325 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.324+0000 7f4ea76fc700 1 -- 192.168.123.108:0/130534933 shutdown_connections 2026-03-10T08:51:19.325 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.324+0000 7f4ea76fc700 1 --2- 192.168.123.108:0/130534933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea010edf0 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:19.325 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.324+0000 7f4ea76fc700 1 -- 192.168.123.108:0/130534933 >> 192.168.123.108:0/130534933 conn(0x7f4ea006c970 msgr2=0x7f4ea006cd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:19.325 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.324+0000 7f4ea76fc700 1 -- 192.168.123.108:0/130534933 shutdown_connections 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.325+0000 7f4ea76fc700 1 -- 192.168.123.108:0/130534933 wait complete. 
2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.325+0000 7f4ea76fc700 1 Processor -- start 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.325+0000 7f4ea76fc700 1 -- start start 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.325+0000 7f4ea76fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea0114ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.325+0000 7f4ea76fc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ea0117b80 con 0x7f4ea0107ff0 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.325+0000 7f4ea66fa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea0114ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.326+0000 7f4ea66fa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea0114ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:39140/0 (socket says 192.168.123.108:39140) 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.326+0000 7f4ea66fa700 1 -- 192.168.123.108:0/706834512 learned_addr learned my addr 192.168.123.108:0/706834512 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:19.326 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.326+0000 7f4ea66fa700 1 -- 192.168.123.108:0/706834512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e940097e0 con 0x7f4ea0107ff0 2026-03-10T08:51:19.327 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.326+0000 7f4ea66fa700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea0114ba0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f4e94006010 tx=0x7f4e94004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:19.327 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.326+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4e9401c070 con 0x7f4ea0107ff0 2026-03-10T08:51:19.328 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.326+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4ea0115140 con 0x7f4ea0107ff0 2026-03-10T08:51:19.328 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.327+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4e94021470 con 0x7f4ea0107ff0 2026-03-10T08:51:19.328 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.327+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4e9400f5e0 con 0x7f4ea0107ff0 2026-03-10T08:51:19.328 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.327+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4ea01155c0 con 0x7f4ea0107ff0 2026-03-10T08:51:19.331 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.330+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 16) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f4e940215e0 con 0x7f4ea0107ff0 2026-03-10T08:51:19.331 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.330+0000 7f4e937fe700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4e8c038530 0x7f4e8c03a9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:19.331 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.330+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f4e9404cc70 con 0x7f4ea0107ff0 2026-03-10T08:51:19.331 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.331+0000 7f4ea5ef9700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4e8c038530 0x7f4e8c03a9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:19.332 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.331+0000 7f4e917fa700 1 -- 192.168.123.108:0/706834512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4ea004f000 con 0x7f4ea0107ff0 2026-03-10T08:51:19.332 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.331+0000 7f4ea5ef9700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4e8c038530 0x7f4e8c03a9f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4e98006fd0 tx=0x7f4e98006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:19.338 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.337+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4e94026030 con 0x7f4ea0107ff0 2026-03-10T08:51:19.516 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.514+0000 7f4e917fa700 1 -- 192.168.123.108:0/706834512 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4ea00623c0 con 0x7f4ea0107ff0 2026-03-10T08:51:19.516 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.515+0000 7f4e937fe700 1 -- 192.168.123.108:0/706834512 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4e9404a020 con 0x7f4ea0107ff0 2026-03-10T08:51:19.516 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:19.516 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4e8c038530 msgr2=0x7f4e8c03a9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:19.518 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4e8c038530 0x7f4e8c03a9f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4e98006fd0 tx=0x7f4e98006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 msgr2=0x7f4ea0114ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea0114ba0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f4e94006010 tx=0x7f4e94004dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 shutdown_connections 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4e8c038530 0x7f4e8c03a9f0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 --2- 192.168.123.108:0/706834512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4ea0107ff0 0x7f4ea0114ba0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 >> 192.168.123.108:0/706834512 conn(0x7f4ea006c970 msgr2=0x7f4ea010b640 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 shutdown_connections 2026-03-10T08:51:19.518 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:19.517+0000 7f4ea76fc700 1 -- 192.168.123.108:0/706834512 wait complete. 2026-03-10T08:51:19.519 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:19.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:19.963 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:19.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:19 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/706834512' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:20.685 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
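The polling loop above repeatedly runs `ceph mon dump -f json` and logs "Waiting for 2 mons in monmap..." until the monmap grows to the expected size. A minimal sketch of that readiness check, parsing the monmap JSON shown in the dumps above (abridged; field names taken from the logged output, the helper name is illustrative, not teuthology's actual code):

```python
import json

# Monmap as dumped by `ceph mon dump -f json` (abridged from the log above).
monmap = json.loads("""
{"epoch": 1,
 "fsid": "16587ed2-1c5e-11f1-90f6-35051361a039",
 "min_mon_release": 18,
 "min_mon_release_name": "reef",
 "mons": [{"rank": 0,
           "name": "vm05",
           "addr": "192.168.123.105:6789/0"}],
 "quorum": [0]}
""")

def mons_in_monmap(m):
    """Number of monitors currently present in the monmap."""
    return len(m["mons"])

# The task keeps re-dumping the monmap until this reaches the target (2 here);
# with only vm05 in the map, the loop logs "Waiting for 2 mons..." and retries.
print(mons_in_monmap(monmap))  # 1 -> keep waiting
```

With only `vm05` present and `quorum` at `[0]`, the check stays below the target of 2, which is why the same dump/wait cycle repeats throughout this section until the second monitor joins.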
2026-03-10T08:51:20.685 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:20.948 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:51:20.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T08:51:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:21.242 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.241+0000 7ff6998f1700 1 -- 192.168.123.108:0/3182479719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 msgr2=0x7ff68c0aab50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:21.242 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.241+0000 7ff6998f1700 1 --2- 192.168.123.108:0/3182479719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0aab50 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7ff688009b00 tx=0x7ff688009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:21.242 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 -- 192.168.123.108:0/3182479719 shutdown_connections 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 --2- 192.168.123.108:0/3182479719 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0aab50 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 -- 192.168.123.108:0/3182479719 >> 192.168.123.108:0/3182479719 conn(0x7ff68c01a430 msgr2=0x7ff68c01a840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 -- 192.168.123.108:0/3182479719 shutdown_connections 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 -- 192.168.123.108:0/3182479719 wait complete. 
2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 Processor -- start 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 -- start start 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0b2490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.242+0000 7ff6998f1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff688012070 con 0x7ff68c0aa770 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6988ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0b2490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6988ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0b2490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:39168/0 (socket says 192.168.123.108:39168) 2026-03-10T08:51:21.243 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6988ef700 1 -- 192.168.123.108:0/878989185 learned_addr learned my addr 192.168.123.108:0/878989185 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:21.244 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6988ef700 1 -- 192.168.123.108:0/878989185 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6880097e0 con 0x7ff68c0aa770 2026-03-10T08:51:21.244 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6988ef700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0b2490 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7ff688006010 tx=0x7ff6880052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:21.244 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff68801d070 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6998f1700 1 -- 192.168.123.108:0/878989185 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff68c0b2a30 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.243+0000 7ff6998f1700 1 -- 192.168.123.108:0/878989185 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff68c0b2eb0 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.244+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff688005640 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.244+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff68800f460 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.244+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 16) v1 ==== 45397+0+0 (secure 0 0 0) 0x7ff68800f6d0 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.245+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff6740052f0 con 0x7ff68c0aa770 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.245+0000 7ff691ffb700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff684038520 0x7ff68403a9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:21.245 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.245+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7ff68804c980 con 0x7ff68c0aa770 2026-03-10T08:51:21.248 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.247+0000 7ff693fff700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff684038520 0x7ff68403a9e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:21.248 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.247+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff688015350 con 0x7ff68c0aa770 2026-03-10T08:51:21.248 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.248+0000 7ff693fff700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff684038520 0x7ff68403a9e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff680006fd0 tx=0x7ff680006e40 comp rx=0 
tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:21.393 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.392+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff674005160 con 0x7ff68c0aa770 2026-03-10T08:51:21.395 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:21.395 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:21.395 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.394+0000 7ff691ffb700 1 -- 192.168.123.108:0/878989185 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff688026070 con 0x7ff68c0aa770 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.396+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff684038520 msgr2=0x7ff68403a9e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.396+0000 
7ff67f7fe700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff684038520 0x7ff68403a9e0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7ff680006fd0 tx=0x7ff680006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.396+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 msgr2=0x7ff68c0b2490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.396+0000 7ff67f7fe700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0b2490 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7ff688006010 tx=0x7ff6880052e0 comp rx=0 tx=0).stop 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.397+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 shutdown_connections 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.397+0000 7ff67f7fe700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff684038520 0x7ff68403a9e0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.397+0000 7ff67f7fe700 1 --2- 192.168.123.108:0/878989185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff68c0aa770 0x7ff68c0b2490 secure :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7ff688006010 tx=0x7ff6880052e0 comp rx=0 tx=0).stop 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.397+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 >> 192.168.123.108:0/878989185 conn(0x7ff68c01a430 msgr2=0x7ff68c0a4160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:21.398 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.397+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 shutdown_connections 2026-03-10T08:51:21.398 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:21.397+0000 7ff67f7fe700 1 -- 192.168.123.108:0/878989185 wait complete. 2026-03-10T08:51:21.404 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:21.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:21 vm05 ceph-mon[49713]: Deploying daemon crash.vm08 on vm08 2026-03-10T08:51:21.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:21 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:21 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:21 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:21 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:21 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/878989185' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:22.453 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:22.453 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:22.578 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:51:22.823 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.822+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2816740177 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 msgr2=0x7f9bac0ffd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:22.823 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.822+0000 7f9bb0b4a700 1 --2- 192.168.123.108:0/2816740177 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac0ffd90 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f9b94009b00 tx=0x7f9b94009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:22.823 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.822+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2816740177 shutdown_connections 2026-03-10T08:51:22.823 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.822+0000 7f9bb0b4a700 1 --2- 192.168.123.108:0/2816740177 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac0ffd90 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:22.823 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.822+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2816740177 >> 192.168.123.108:0/2816740177 conn(0x7f9bac0fb430 msgr2=0x7f9bac0fd850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:22.823 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.823+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2816740177 shutdown_connections 2026-03-10T08:51:22.823 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.823+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2816740177 wait complete. 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.823+0000 7f9bb0b4a700 1 Processor -- start 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.823+0000 7f9bb0b4a700 1 -- start start 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.823+0000 7f9bb0b4a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac198220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.823+0000 7f9bb0b4a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9bac198760 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9baa59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac198220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9baa59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac198220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:39194/0 (socket says 192.168.123.108:39194) 2026-03-10T08:51:22.824 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9baa59c700 1 -- 192.168.123.108:0/2734984510 learned_addr learned my addr 192.168.123.108:0/2734984510 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:22.825 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9baa59c700 1 -- 192.168.123.108:0/2734984510 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9b940097e0 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.825 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9baa59c700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac198220 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f9b94006010 tx=0x7f9b94004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:22.825 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9b9401c070 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.825 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9b94021470 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.825 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9bac198960 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.826 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.824+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9bac198d80 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.826 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.825+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9b9400f460 con 
0x7f9bac0ff9b0 2026-03-10T08:51:22.826 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.825+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 16) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f9b9400f680 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.826 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.825+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9bac04fa20 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.827 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.826+0000 7f9ba37fe700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9b98040d40 0x7f9b98043200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:22.827 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.826+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f9b9404d5a0 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.829 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.828+0000 7f9ba9d9b700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9b98040d40 0x7f9b98043200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:22.829 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.828+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9b94029bc0 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.831 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.831+0000 7f9ba9d9b700 1 --2- 
192.168.123.108:0/2734984510 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9b98040d40 0x7f9b98043200 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f9b9c006fd0 tx=0x7f9b9c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:22.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:22 vm05 ceph-mon[49713]: Deploying daemon node-exporter.vm08 on vm08 2026-03-10T08:51:22.974 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.973+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9bac0623c0 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.974 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.973+0000 7f9ba37fe700 1 -- 192.168.123.108:0/2734984510 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f9b94026030 con 0x7f9bac0ff9b0 2026-03-10T08:51:22.974 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:22.974 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:22.976 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9b98040d40 msgr2=0x7f9b98043200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:22.976 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9b98040d40 0x7f9b98043200 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f9b9c006fd0 tx=0x7f9b9c006e40 comp rx=0 tx=0).stop 2026-03-10T08:51:22.976 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 msgr2=0x7f9bac198220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:22.976 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac198220 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f9b94006010 tx=0x7f9b94004dc0 comp rx=0 tx=0).stop 2026-03-10T08:51:22.976 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 shutdown_connections 2026-03-10T08:51:22.976 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9b98040d40 0x7f9b98043200 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:22.977 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 --2- 192.168.123.108:0/2734984510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9bac0ff9b0 0x7f9bac198220 unknown :-1 s=CLOSED pgs=154 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:22.977 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 >> 192.168.123.108:0/2734984510 conn(0x7f9bac0fb430 msgr2=0x7f9bac0fcb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:22.977 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 shutdown_connections 2026-03-10T08:51:22.977 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:22.976+0000 7f9bb0b4a700 1 -- 192.168.123.108:0/2734984510 wait complete. 2026-03-10T08:51:22.977 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:23.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:23 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/2734984510' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:24.031 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:24.031 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:24.177 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:51:24.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.433+0000 7f855c8a8700 1 -- 192.168.123.108:0/1669582293 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810c210 msgr2=0x7f855810c5f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:24.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.433+0000 7f855c8a8700 1 --2- 192.168.123.108:0/1669582293 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810c210 0x7f855810c5f0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f8548007780 tx=0x7f854800c050 comp rx=0 tx=0).stop 2026-03-10T08:51:24.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.433+0000 7f855c8a8700 1 -- 192.168.123.108:0/1669582293 shutdown_connections 2026-03-10T08:51:24.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.433+0000 7f855c8a8700 1 --2- 192.168.123.108:0/1669582293 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810c210 0x7f855810c5f0 secure :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f8548007780 tx=0x7f854800c050 comp rx=0 tx=0).stop 2026-03-10T08:51:24.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.433+0000 7f855c8a8700 1 -- 192.168.123.108:0/1669582293 >> 192.168.123.108:0/1669582293 conn(0x7f855806c970 msgr2=0x7f855806cd80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.436+0000 7f855c8a8700 1 -- 192.168.123.108:0/1669582293 shutdown_connections 
2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.436+0000 7f855c8a8700 1 -- 192.168.123.108:0/1669582293 wait complete. 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f855c8a8700 1 Processor -- start 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f855c8a8700 1 -- start start 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f855c8a8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 0x7f8558135100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f855c8a8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8548003680 con 0x7f8558134d20 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f8556d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 0x7f8558135100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f8556d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 0x7f8558135100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:39212/0 (socket says 192.168.123.108:39212) 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.437+0000 7f8556d9d700 1 -- 192.168.123.108:0/1980359446 learned_addr learned my addr 192.168.123.108:0/1980359446 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:24.438 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f8556d9d700 1 -- 192.168.123.108:0/1980359446 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8548007430 con 0x7f8558134d20 2026-03-10T08:51:24.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f8556d9d700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 0x7f8558135100 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f854800a040 tx=0x7f854800c9e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:24.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f854800f050 con 0x7f8558134d20 2026-03-10T08:51:24.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f854800ce70 con 0x7f8558134d20 2026-03-10T08:51:24.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8548008470 con 0x7f8558134d20 2026-03-10T08:51:24.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f855c8a8700 1 -- 192.168.123.108:0/1980359446 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85581356a0 con 0x7f8558134d20 2026-03-10T08:51:24.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.438+0000 7f855c8a8700 1 -- 192.168.123.108:0/1980359446 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8558136770 con 
0x7f8558134d20 2026-03-10T08:51:24.440 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.439+0000 7f855c8a8700 1 -- 192.168.123.108:0/1980359446 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f855804f000 con 0x7f8558134d20 2026-03-10T08:51:24.440 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.440+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 16) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f854801a040 con 0x7f8558134d20 2026-03-10T08:51:24.441 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.440+0000 7f853ffff700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85400385c0 0x7f854003aa80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:24.441 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.440+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f854804b780 con 0x7f8558134d20 2026-03-10T08:51:24.443 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.442+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f854802e7d0 con 0x7f8558134d20 2026-03-10T08:51:24.443 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.443+0000 7f855659c700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85400385c0 0x7f854003aa80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:24.444 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.443+0000 7f855659c700 1 --2- 
192.168.123.108:0/1980359446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85400385c0 0x7f854003aa80 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f855000ad30 tx=0x7f85500093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:24.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.634+0000 7f855c8a8700 1 -- 192.168.123.108:0/1980359446 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f85580623c0 con 0x7f8558134d20 2026-03-10T08:51:24.636 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:24.636 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:50:09.891602Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T08:51:24.636 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.635+0000 7f853ffff700 1 -- 192.168.123.108:0/1980359446 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8548020090 con 0x7f8558134d20 2026-03-10T08:51:24.639 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 -- 192.168.123.108:0/1980359446 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85400385c0 msgr2=0x7f854003aa80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:24.639 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85400385c0 0x7f854003aa80 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f855000ad30 tx=0x7f85500093f0 comp rx=0 tx=0).stop 2026-03-10T08:51:24.639 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 -- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 msgr2=0x7f8558135100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:24.639 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 0x7f8558135100 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f854800a040 tx=0x7f854800c9e0 comp rx=0 tx=0).stop 2026-03-10T08:51:24.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 -- 192.168.123.108:0/1980359446 shutdown_connections 2026-03-10T08:51:24.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85400385c0 0x7f854003aa80 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:24.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 --2- 192.168.123.108:0/1980359446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8558134d20 0x7f8558135100 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:24.640 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 -- 192.168.123.108:0/1980359446 >> 192.168.123.108:0/1980359446 conn(0x7f855806c970 msgr2=0x7f8558108e80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:24.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 -- 192.168.123.108:0/1980359446 shutdown_connections 2026-03-10T08:51:24.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:24.639+0000 7f853dffb700 1 -- 192.168.123.108:0/1980359446 wait complete. 2026-03-10T08:51:24.641 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T08:51:25.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: Deploying daemon mgr.vm08.rpongu on vm08 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/1980359446' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:25.715 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T08:51:25.715 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mon dump -f json 2026-03-10T08:51:25.995 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:51:26.356 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:26 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:51:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:26 vm05 ceph-mon[49713]: Deploying daemon mon.vm08 on vm08 2026-03-10T08:51:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: mon.vm05 calling monitor election 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: mon.vm08 calling monitor election 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.? 
192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: monmap e2: 2 mons at {vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],vm08=[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: fsmap 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: mgrmap e16: vm05.rxwgjc(active, since 16s) 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: Standby manager daemon vm08.rpongu started 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.? 
192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: overall HEALTH_OK 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.? 192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.? 192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:51:31.679 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: mon.vm05 calling monitor election 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.712 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: mon.vm08 calling monitor election 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.? 192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:51:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: monmap e2: 2 mons at {vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],vm08=[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-10T08:51:31.713 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: fsmap 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: mgrmap e16: vm05.rxwgjc(active, since 16s) 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: Standby manager daemon vm08.rpongu started 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.? 192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: overall HEALTH_OK 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.? 192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.? 
192.168.123.108:0/1640129495' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:51:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 -- 192.168.123.108:0/452000571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23e40062a0 msgr2=0x7f23e4008710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 --2- 192.168.123.108:0/452000571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23e40062a0 0x7f23e4008710 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f23fc06cee0 tx=0x7f23ec00fe90 comp rx=0 tx=0).stop 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 -- 192.168.123.108:0/452000571 shutdown_connections 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 --2- 192.168.123.108:0/452000571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23e40062a0 0x7f23e4008710 secure :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f23fc06cee0 tx=0x7f23ec00fe90 comp rx=0 tx=0).stop 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 -- 192.168.123.108:0/452000571 >> 192.168.123.108:0/452000571 conn(0x7f23fc06ba60 msgr2=0x7f23fc06be70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 -- 192.168.123.108:0/452000571 shutdown_connections 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.355+0000 7f2401993700 1 -- 192.168.123.108:0/452000571 wait 
complete. 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f2401993700 1 Processor -- start 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f2401993700 1 -- start start 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f2401993700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23fc1b3860 0x7f23fc1b3c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f2401993700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f2401993700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23fc1ae6d0 con 0x7f23fc1b3860 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f2401993700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23fc1ae810 con 0x7f23fc1add10 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f23fbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f23fbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I 
am v2:192.168.123.108:54910/0 (socket says 192.168.123.108:54910) 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.356+0000 7f23fbfff700 1 -- 192.168.123.108:0/1227465901 learned_addr learned my addr 192.168.123.108:0/1227465901 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f2400991700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23fc1b3860 0x7f23fc1b3c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f23fbfff700 1 -- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 msgr2=0x7f23fc1ae190 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f23fbfff700 1 -- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 msgr2=0x7f23fc1ae190 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f23fbfff700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f23fbfff700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f2400991700 1 -- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 msgr2=0x7f23fc1ae190 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f2400991700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f2400991700 1 -- 192.168.123.108:0/1227465901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23ec00f8e0 con 0x7f23fc1b3860 2026-03-10T08:51:32.358 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f2400991700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23fc1b3860 0x7f23fc1b3c40 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f23ec010010 tx=0x7f23ec005320 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:32.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23ec01dc50 con 0x7f23fc1b3860 2026-03-10T08:51:32.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.357+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f23fc1aea30 con 0x7f23fc1b3860 2026-03-10T08:51:32.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.358+0000 7f2401993700 1 -- 
192.168.123.108:0/1227465901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f23fc1b8650 con 0x7f23fc1b3860 2026-03-10T08:51:32.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.358+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f23fc1b3670 con 0x7f23fc1b3860 2026-03-10T08:51:32.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.358+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f23ec01d8f0 con 0x7f23fc1b3860 2026-03-10T08:51:32.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.358+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23ec011870 con 0x7f23fc1b3860 2026-03-10T08:51:32.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.361+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f23ec01d410 con 0x7f23fc1b3860 2026-03-10T08:51:32.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.361+0000 7f23f9ffb700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23e806c5b0 0x7f23e806ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:32.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.362+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f23ec08d840 con 0x7f23fc1b3860 2026-03-10T08:51:32.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.362+0000 7f23fbfff700 1 --2- 192.168.123.108:0/1227465901 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23e806c5b0 0x7f23e806ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:32.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.362+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f23ec05cca0 con 0x7f23fc1b3860 2026-03-10T08:51:32.365 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.365+0000 7f23fbfff700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23e806c5b0 0x7f23e806ea70 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f23f4009400 tx=0x7f23f4007040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:32.520 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.520+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f23fc04f000 con 0x7f23fc1b3860 2026-03-10T08:51:32.521 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.520+0000 7f23f9ffb700 1 -- 192.168.123.108:0/1227465901 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7f23ec05c830 con 0x7f23fc1b3860 2026-03-10T08:51:32.521 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:51:32.521 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":2,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","modified":"2026-03-10T08:51:26.309295Z","created":"2026-03-10T08:50:09.891602Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm08","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:3300","nonce":0},{"type":"v1","addr":"192.168.123.108:6789","nonce":0}]},"addr":"192.168.123.108:6789/0","public_addr":"192.168.123.108:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-10T08:51:32.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23e806c5b0 msgr2=0x7f23e806ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:32.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23e806c5b0 0x7f23e806ea70 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f23f4009400 tx=0x7f23f4007040 comp rx=0 tx=0).stop 2026-03-10T08:51:32.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23fc1b3860 msgr2=0x7f23fc1b3c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:32.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23fc1b3860 0x7f23fc1b3c40 secure :-1 s=READY pgs=162 
cs=0 l=1 rev1=1 crypto rx=0x7f23ec010010 tx=0x7f23ec005320 comp rx=0 tx=0).stop 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 shutdown_connections 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f23e806c5b0 0x7f23e806ea70 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23fc1b3860 0x7f23fc1b3c40 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 --2- 192.168.123.108:0/1227465901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23fc1add10 0x7f23fc1ae190 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.523+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 >> 192.168.123.108:0/1227465901 conn(0x7f23fc06ba60 msgr2=0x7f23fc107d90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.524+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 shutdown_connections 2026-03-10T08:51:32.524 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:51:32.524+0000 7f2401993700 1 -- 192.168.123.108:0/1227465901 wait complete. 2026-03-10T08:51:32.525 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 2 2026-03-10T08:51:32.576 INFO:tasks.cephadm:Generating final ceph.conf file... 
2026-03-10T08:51:32.576 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph config generate-minimal-conf
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: mgrmap e17: vm05.rxwgjc(active, since 16s), standbys: vm08.rpongu
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:51:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:32 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T08:51:32.728 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config
2026-03-10T08:51:32.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: mgrmap e17: vm05.rxwgjc(active, since 16s), standbys: vm08.rpongu
2026-03-10T08:51:32.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch
2026-03-10T08:51:32.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:32.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:32.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:32.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:51:32.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:32 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.016+0000 7f3c72756700 1 -- 192.168.123.105:0/3401438209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c105790 msgr2=0x7f3c6c105b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.016+0000 7f3c72756700 1 --2- 192.168.123.105:0/3401438209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c105790 0x7f3c6c105b70 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f3c5c009b00 tx=0x7f3c5c009e10 comp rx=0 tx=0).stop
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.018+0000 7f3c72756700 1 -- 192.168.123.105:0/3401438209 shutdown_connections
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.018+0000 7f3c72756700 1 --2- 192.168.123.105:0/3401438209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c105790 0x7f3c6c105b70 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.018+0000 7f3c72756700 1 -- 192.168.123.105:0/3401438209 >> 192.168.123.105:0/3401438209 conn(0x7f3c6c075190 msgr2=0x7f3c6c0755a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 -- 192.168.123.105:0/3401438209 shutdown_connections
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 -- 192.168.123.105:0/3401438209 wait complete.
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 Processor -- start
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 -- start start
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c194540 0x7f3c6c1987c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c6c198d00 con 0x7f3c6c194540
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.019+0000 7f3c72756700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c6c198e70 con 0x7f3c6c105790
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:47240/0 (socket says 192.168.123.105:47240)
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 -- 192.168.123.105:0/2800984050 learned_addr learned my addr 192.168.123.105:0/2800984050 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6b7fe700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c194540 0x7f3c6c1987c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 -- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 msgr2=0x7f3c6c194000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 -- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 msgr2=0x7f3c6c194000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6bfff700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6b7fe700 1 -- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 msgr2=0x7f3c6c194000 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6b7fe700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6b7fe700 1 -- 192.168.123.105:0/2800984050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c5c0097e0 con 0x7f3c6c194540
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.020+0000 7f3c6b7fe700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c194540 0x7f3c6c1987c0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f3c6000b700 tx=0x7f3c6000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:51:33.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.021+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c60010820 con 0x7f3c6c194540
2026-03-10T08:51:33.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.021+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3c60010e60 con 0x7f3c6c194540
2026-03-10T08:51:33.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.021+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c60017570 con 0x7f3c6c194540
2026-03-10T08:51:33.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.021+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3c6c199150 con 0x7f3c6c194540
2026-03-10T08:51:33.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.021+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3c6c19ea60 con 0x7f3c6c194540
2026-03-10T08:51:33.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.023+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f3c600176d0 con 0x7f3c6c194540
2026-03-10T08:51:33.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.023+0000 7f3c697fa700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3c5406c5d0 0x7f3c5406ea90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:51:33.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.023+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f3c6008b190 con 0x7f3c6c194540
2026-03-10T08:51:33.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.024+0000 7f3c6bfff700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3c5406c5d0 0x7f3c5406ea90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:51:33.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.024+0000 7f3c6bfff700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3c5406c5d0 0x7f3c5406ea90 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3c5c00b5c0 tx=0x7f3c5c005fd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:51:33.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.024+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3c6c193e10 con 0x7f3c6c194540
2026-03-10T08:51:33.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.028+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3c60056a60 con 0x7f3c6c194540
2026-03-10T08:51:33.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.165+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f3c6c0623c0 con 0x7f3c6c194540
2026-03-10T08:51:33.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.166+0000 7f3c697fa700 1 -- 192.168.123.105:0/2800984050 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f3c6001e020 con 0x7f3c6c194540
2026-03-10T08:51:33.167 INFO:teuthology.orchestra.run.vm05.stdout:# minimal ceph.conf for 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T08:51:33.167 INFO:teuthology.orchestra.run.vm05.stdout:[global]
2026-03-10T08:51:33.167 INFO:teuthology.orchestra.run.vm05.stdout: fsid = 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T08:51:33.167 INFO:teuthology.orchestra.run.vm05.stdout: mon_host = [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3c5406c5d0 msgr2=0x7f3c5406ea90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3c5406c5d0 0x7f3c5406ea90 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3c5c00b5c0 tx=0x7f3c5c005fd0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c194540 msgr2=0x7f3c6c1987c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c194540 0x7f3c6c1987c0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f3c6000b700 tx=0x7f3c6000bac0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 shutdown_connections
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3c5406c5d0 0x7f3c5406ea90 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c6c105790 0x7f3c6c194000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 --2- 192.168.123.105:0/2800984050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c6c194540 0x7f3c6c1987c0 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 >> 192.168.123.105:0/2800984050 conn(0x7f3c6c075190 msgr2=0x7f3c6c101290 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 shutdown_connections
2026-03-10T08:51:33.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:33.169+0000 7f3c72756700 1 -- 192.168.123.105:0/2800984050 wait complete.
2026-03-10T08:51:33.218 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring...
2026-03-10T08:51:33.218 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:51:33.218 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.conf
2026-03-10T08:51:33.250 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:51:33.250 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T08:51:33.321 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:51:33.321 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.conf
2026-03-10T08:51:33.347 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:51:33.347 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T08:51:33.411 INFO:tasks.cephadm:Deploying OSDs...
2026-03-10T08:51:33.411 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T08:51:33.411 DEBUG:teuthology.orchestra.run.vm05:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T08:51:33.439 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T08:51:33.439 DEBUG:teuthology.orchestra.run.vm05:> ls /dev/[sv]d?
2026-03-10T08:51:33.504 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vda
2026-03-10T08:51:33.504 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdb
2026-03-10T08:51:33.504 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdc
2026-03-10T08:51:33.504 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdd
2026-03-10T08:51:33.504 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vde
2026-03-10T08:51:33.504 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T08:51:33.504 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T08:51:33.504 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdb
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdb
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 08:50:41.671067242 +0000
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 08:45:30.102000000 +0000
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 08:45:30.102000000 +0000
2026-03-10T08:51:33.559 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 08:45:28.259000000 +0000
2026-03-10T08:51:33.560 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.conf
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.conf
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/1227465901' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: Reconfiguring mon.vm05 (unknown last config time)...
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: Reconfiguring daemon mon.vm05 on vm05
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2800984050' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T08:51:33.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:33.625 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T08:51:33.625 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T08:51:33.625 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000111699 s, 4.6 MB/s
2026-03-10T08:51:33.626 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T08:51:33.690 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdc
2026-03-10T08:51:33.747 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdc
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 08:50:41.723067318 +0000
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 08:45:30.103000000 +0000
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 08:45:30.103000000 +0000
2026-03-10T08:51:33.748 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 08:45:28.268000000 +0000
2026-03-10T08:51:33.748 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T08:51:33.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.conf
2026-03-10T08:51:33.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.conf
2026-03-10T08:51:33.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf
2026-03-10T08:51:33.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='client.? 192.168.123.108:0/1227465901' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: Reconfiguring mon.vm05 (unknown last config time)...
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: Reconfiguring daemon mon.vm05 on vm05
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2800984050' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T08:51:33.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:51:33.827 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T08:51:33.827 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T08:51:33.827 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000132248 s, 3.9 MB/s
2026-03-10T08:51:33.828 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T08:51:33.890 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdd
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdd
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 08:50:41.786067409 +0000
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 08:45:30.111000000 +0000
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 08:45:30.111000000 +0000
2026-03-10T08:51:33.952 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 08:45:28.278000000 +0000
2026-03-10T08:51:33.953 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T08:51:34.027 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T08:51:34.027 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T08:51:34.027 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000126176 s, 4.1 MB/s
2026-03-10T08:51:34.028 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T08:51:34.086 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vde
2026-03-10T08:51:34.158 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vde
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 08:50:41.841067489 +0000
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 08:45:30.111000000 +0000
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 08:45:30.111000000 +0000
2026-03-10T08:51:34.159 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 08:45:28.282000000 +0000
2026-03-10T08:51:34.159 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T08:51:34.244 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T08:51:34.244 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T08:51:34.244 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000158777 s, 3.2 MB/s
2026-03-10T08:51:34.245 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T08:51:34.269 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T08:51:34.269 DEBUG:teuthology.orchestra.run.vm08:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T08:51:34.283 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T08:51:34.283 DEBUG:teuthology.orchestra.run.vm08:> ls /dev/[sv]d?
2026-03-10T08:51:34.339 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vda 2026-03-10T08:51:34.339 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdb 2026-03-10T08:51:34.339 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdc 2026-03-10T08:51:34.339 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdd 2026-03-10T08:51:34.339 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vde 2026-03-10T08:51:34.339 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-10T08:51:34.339 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-10T08:51:34.339 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdb 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdb 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 221 Links: 1 Device type: fc,10 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 08:51:17.535202624 +0000 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 08:45:55.123000000 +0000 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 08:45:55.123000000 +0000 2026-03-10T08:51:34.396 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 08:45:53.194000000 +0000 2026-03-10T08:51:34.396 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-10T08:51:34.461 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-10T08:51:34.461 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-10T08:51:34.461 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 
0.000155822 s, 3.3 MB/s 2026-03-10T08:51:34.463 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-10T08:51:34.519 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdc 2026-03-10T08:51:34.576 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdc 2026-03-10T08:51:34.576 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T08:51:34.576 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 222 Links: 1 Device type: fc,20 2026-03-10T08:51:34.576 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T08:51:34.576 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T08:51:34.576 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 08:51:17.589202672 +0000 2026-03-10T08:51:34.577 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 08:45:55.105000000 +0000 2026-03-10T08:51:34.577 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 08:45:55.105000000 +0000 2026-03-10T08:51:34.577 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 08:45:53.206000000 +0000 2026-03-10T08:51:34.577 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-10T08:51:34.640 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-10T08:51:34.640 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-10T08:51:34.640 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000232895 s, 2.2 MB/s 2026-03-10T08:51:34.642 DEBUG:teuthology.orchestra.run.vm08:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-10T08:51:34.702 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdd 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdd 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 225 Links: 1 Device type: fc,30 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 08:51:17.640202716 +0000 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 08:45:55.121000000 +0000 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 08:45:55.121000000 +0000 2026-03-10T08:51:34.759 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 08:45:53.220000000 +0000 2026-03-10T08:51:34.759 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: Reconfiguring mgr.vm05.rxwgjc (unknown last config time)... 
2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: Reconfiguring daemon mgr.vm05.rxwgjc on vm05 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: 
dispatch 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: Reconfiguring mgr.vm05.rxwgjc (unknown last config time)... 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: Reconfiguring daemon mgr.vm05.rxwgjc on vm05 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 
2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-10T08:51:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:51:34.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:34.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:34.827 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records 
in 2026-03-10T08:51:34.827 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-10T08:51:34.827 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000155641 s, 3.3 MB/s 2026-03-10T08:51:34.828 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-10T08:51:34.886 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vde 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vde 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 226 Links: 1 Device type: fc,40 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 08:51:17.707202776 +0000 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 08:45:55.123000000 +0000 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 08:45:55.123000000 +0000 2026-03-10T08:51:34.943 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 08:45:53.226000000 +0000 2026-03-10T08:51:34.943 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-10T08:51:35.008 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in 2026-03-10T08:51:35.008 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out 2026-03-10T08:51:35.008 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000197619 s, 2.6 MB/s 2026-03-10T08:51:35.009 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-10T08:51:35.065 INFO:tasks.cephadm:Deploying osd.0 on vm05 with /dev/vde... 
2026-03-10T08:51:35.065 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- lvm zap /dev/vde 2026-03-10T08:51:35.250 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:51:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:35 vm08 ceph-mon[57559]: Reconfiguring crash.vm05 (monmap changed)... 2026-03-10T08:51:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:35 vm08 ceph-mon[57559]: Reconfiguring daemon crash.vm05 on vm05 2026-03-10T08:51:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:35 vm08 ceph-mon[57559]: Reconfiguring alertmanager.vm05 (dependencies changed)... 2026-03-10T08:51:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:35 vm08 ceph-mon[57559]: Reconfiguring daemon alertmanager.vm05 on vm05 2026-03-10T08:51:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:35 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:35 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:35 vm05 ceph-mon[49713]: Reconfiguring crash.vm05 (monmap changed)... 2026-03-10T08:51:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:35 vm05 ceph-mon[49713]: Reconfiguring daemon crash.vm05 on vm05 2026-03-10T08:51:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:35 vm05 ceph-mon[49713]: Reconfiguring alertmanager.vm05 (dependencies changed)... 
2026-03-10T08:51:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:35 vm05 ceph-mon[49713]: Reconfiguring daemon alertmanager.vm05 on vm05 2026-03-10T08:51:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:35 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:35 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:35.773 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:51:35.785 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch daemon add osd vm05:/dev/vde 2026-03-10T08:51:35.997 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:51:36.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.316+0000 7fbf52b67700 1 -- 192.168.123.105:0/1646521616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c10a700 msgr2=0x7fbf4c10cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.316+0000 7fbf52b67700 1 --2- 192.168.123.105:0/1646521616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c10a700 0x7fbf4c10cb90 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7fbf3c009b00 tx=0x7fbf3c009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.316+0000 7fbf52b67700 1 -- 192.168.123.105:0/1646521616 shutdown_connections 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.316+0000 7fbf52b67700 1 --2- 192.168.123.105:0/1646521616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7fbf4c10a700 0x7fbf4c10cb90 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.316+0000 7fbf52b67700 1 --2- 192.168.123.105:0/1646521616 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 0x7fbf4c10a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.316+0000 7fbf52b67700 1 -- 192.168.123.105:0/1646521616 >> 192.168.123.105:0/1646521616 conn(0x7fbf4c06daa0 msgr2=0x7fbf4c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 -- 192.168.123.105:0/1646521616 shutdown_connections 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 -- 192.168.123.105:0/1646521616 wait complete. 
2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 Processor -- start 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 -- start start 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 0x7fbf4c1a55c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c1a5b00 0x7fbf4c1aab70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf4c1a6010 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.317+0000 7fbf52b67700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf4c1a6180 con 0x7fbf4c107d90 2026-03-10T08:51:36.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51b65700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 0x7fbf4c1a55c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:36.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51b65700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 0x7fbf4c1a55c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:47252/0 (socket says 192.168.123.105:47252) 2026-03-10T08:51:36.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51b65700 1 -- 192.168.123.105:0/2745240927 learned_addr learned my addr 192.168.123.105:0/2745240927 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:51:36.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51364700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c1a5b00 0x7fbf4c1aab70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51364700 1 -- 192.168.123.105:0/2745240927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 msgr2=0x7fbf4c1a55c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51364700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 0x7fbf4c1a55c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.319+0000 7fbf51364700 1 -- 192.168.123.105:0/2745240927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf3c0097e0 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf51364700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c1a5b00 0x7fbf4c1aab70 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fbf3c008000 tx=0x7fbf3c004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf3c01d070 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf3c022470 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf4c10f4c0 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf4c10f980 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf3c00f670 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.320+0000 7fbf40ff9700 1 -- 192.168.123.105:0/2745240927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf340052f0 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.323 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.323+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 90252+0+0 (secure 0 0 0) 0x7fbf3c00baa0 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.323 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.323+0000 
7fbf42ffd700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf3806c1f0 0x7fbf3806e6b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:36.323 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.323+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fbf3c08c2c0 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.325+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbf3c05b720 con 0x7fbf4c1a5b00 2026-03-10T08:51:36.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.325+0000 7fbf51b65700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf3806c1f0 0x7fbf3806e6b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:36.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.328+0000 7fbf51b65700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf3806c1f0 0x7fbf3806e6b0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fbf48005950 tx=0x7fbf4800b500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:36.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:36.455+0000 7fbf40ff9700 1 -- 192.168.123.105:0/2745240927 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7fbf34000bc0 con 0x7fbf3806c1f0 2026-03-10T08:51:36.802 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: Reconfiguring grafana.vm05 (dependencies changed)... 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: Reconfiguring daemon grafana.vm05 on vm05 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:51:36.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:36.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: Reconfiguring grafana.vm05 (dependencies changed)... 
2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: Reconfiguring daemon grafana.vm05 on vm05 2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:51:36.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:37.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:37 vm05 ceph-mon[49713]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:51:37.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:37 vm05 ceph-mon[49713]: Reconfiguring prometheus.vm05 (dependencies changed)... 
2026-03-10T08:51:37.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:37 vm05 ceph-mon[49713]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T08:51:37.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:37 vm08 ceph-mon[57559]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:51:37.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:37 vm08 ceph-mon[57559]: Reconfiguring prometheus.vm05 (dependencies changed)... 2026-03-10T08:51:37.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:37 vm08 ceph-mon[57559]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T08:51:38.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:38 vm08 ceph-mon[57559]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:38 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/565880557' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0e25ea50-b19b-4e07-85f6-5d48c19d3a4f"}]: dispatch 2026-03-10T08:51:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:38 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/565880557' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0e25ea50-b19b-4e07-85f6-5d48c19d3a4f"}]': finished 2026-03-10T08:51:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:38 vm08 ceph-mon[57559]: osdmap e6: 1 total, 0 up, 1 in 2026-03-10T08:51:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:38 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:38 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/572237450' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:51:38.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:38 vm05 ceph-mon[49713]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:38.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:38 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/565880557' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0e25ea50-b19b-4e07-85f6-5d48c19d3a4f"}]: dispatch 2026-03-10T08:51:38.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:38 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/565880557' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0e25ea50-b19b-4e07-85f6-5d48c19d3a4f"}]': finished 2026-03-10T08:51:38.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:38 vm05 ceph-mon[49713]: osdmap e6: 1 total, 0 up, 1 in 2026-03-10T08:51:38.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:38 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:38.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:38 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/572237450' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:51:40.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:39 vm05 ceph-mon[49713]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:40.300 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:39 vm08 ceph-mon[57559]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:42.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: Reconfiguring ceph-exporter.vm08 (monmap changed)... 
2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: Reconfiguring daemon ceph-exporter.vm08 on vm08 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": 
"config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:51:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:42 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: Reconfiguring ceph-exporter.vm08 (monmap changed)... 
2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: Reconfiguring daemon ceph-exporter.vm08 on vm08 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": 
"config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:51:42.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:42 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:43.710 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: Deploying daemon osd.0 on vm05 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: Reconfiguring crash.vm08 (monmap changed)... 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: Reconfiguring daemon crash.vm08 on vm08 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: Reconfiguring mgr.vm08.rpongu (monmap changed)... 
2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: Reconfiguring daemon mgr.vm08.rpongu on vm08 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: 
dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.711 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:43 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:43.937 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: Deploying daemon osd.0 on vm05 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: Reconfiguring crash.vm08 (monmap changed)... 
2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: Reconfiguring daemon crash.vm08 on vm08 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: Reconfiguring mgr.vm08.rpongu (monmap changed)... 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: Reconfiguring daemon mgr.vm08.rpongu on vm08 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard 
get-alertmanager-api-host"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:43.938 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:43 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": 
"json"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: Reconfiguring mon.vm08 (monmap changed)... 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: Reconfiguring daemon mon.vm08 on vm08 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T08:51:44.785 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:44.786 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:44.786 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:44.786 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:44 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: Reconfiguring mon.vm08 (monmap changed)... 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: Reconfiguring daemon mon.vm08 on vm08 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:44 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 0 on host 'vm05' 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.269+0000 7fbf42ffd700 1 -- 192.168.123.105:0/2745240927 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fbf34000bc0 con 0x7fbf3806c1f0 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf3806c1f0 msgr2=0x7fbf3806e6b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 --2- 
192.168.123.105:0/2745240927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf3806c1f0 0x7fbf3806e6b0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fbf48005950 tx=0x7fbf4800b500 comp rx=0 tx=0).stop 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c1a5b00 msgr2=0x7fbf4c1aab70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c1a5b00 0x7fbf4c1aab70 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fbf3c008000 tx=0x7fbf3c004930 comp rx=0 tx=0).stop 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 shutdown_connections 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf3806c1f0 0x7fbf3806e6b0 secure :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fbf48005950 tx=0x7fbf4800b500 comp rx=0 tx=0).stop 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf4c107d90 0x7fbf4c1a55c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 --2- 192.168.123.105:0/2745240927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf4c1a5b00 0x7fbf4c1aab70 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.272+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 >> 192.168.123.105:0/2745240927 conn(0x7fbf4c06daa0 msgr2=0x7fbf4c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.273+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 shutdown_connections 2026-03-10T08:51:45.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:45.273+0000 7fbf52b67700 1 -- 192.168.123.105:0/2745240927 wait complete. 2026-03-10T08:51:45.347 DEBUG:teuthology.orchestra.run.vm05:osd.0> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0.service 2026-03-10T08:51:45.348 INFO:tasks.cephadm:Deploying osd.1 on vm05 with /dev/vdd... 2026-03-10T08:51:45.348 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- lvm zap /dev/vdd 2026-03-10T08:51:45.605 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:51:45.976 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.976 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.976 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:45.976 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": 
"client.admin"}]: dispatch 2026-03-10T08:51:45.976 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:45.977 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:45 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.159 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:51:46.173 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch daemon add osd vm05:/dev/vdd 2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:46.302 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:46.303 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.303 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.303 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:51:46.303 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.303 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:45 vm05 
ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:46.355 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.627+0000 7f80763bb700 1 -- 192.168.123.105:0/478218205 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80701082d0 msgr2=0x7f8070108750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.627+0000 7f80763bb700 1 --2- 192.168.123.105:0/478218205 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80701082d0 0x7f8070108750 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f8060009b00 tx=0x7f8060009e10 comp rx=0 tx=0).stop 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.628+0000 7f80763bb700 1 -- 192.168.123.105:0/478218205 shutdown_connections 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.628+0000 7f80763bb700 1 --2- 192.168.123.105:0/478218205 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80701082d0 0x7f8070108750 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.628+0000 7f80763bb700 1 --2- 192.168.123.105:0/478218205 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8070072b00 0x7f8070107d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.628+0000 7f80763bb700 1 -- 192.168.123.105:0/478218205 >> 192.168.123.105:0/478218205 conn(0x7f807006daa0 msgr2=0x7f807006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:46.629 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.628+0000 7f80763bb700 1 -- 192.168.123.105:0/478218205 shutdown_connections 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.628+0000 7f80763bb700 1 -- 192.168.123.105:0/478218205 wait complete. 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.629+0000 7f80763bb700 1 Processor -- start 2026-03-10T08:51:46.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.629+0000 7f80763bb700 1 -- start start 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.629+0000 7f80763bb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8070072b00 0x7f80701a0f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.629+0000 7f80763bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80701082d0 0x7f80701a1460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.629+0000 7f80763bb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80701a1a80 con 0x7f8070072b00 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.629+0000 7f80763bb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80701a1bc0 con 0x7f80701082d0 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f8074bb8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80701082d0 0x7f80701a1460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:46.631 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f8074bb8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80701082d0 0x7f80701a1460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48376/0 (socket says 192.168.123.105:48376) 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f8074bb8700 1 -- 192.168.123.105:0/2770427722 learned_addr learned my addr 192.168.123.105:0/2770427722 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f80753b9700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8070072b00 0x7f80701a0f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f80753b9700 1 -- 192.168.123.105:0/2770427722 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80701082d0 msgr2=0x7f80701a1460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f80753b9700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80701082d0 0x7f80701a1460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f80753b9700 1 -- 192.168.123.105:0/2770427722 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80600097e0 con 0x7f8070072b00 2026-03-10T08:51:46.631 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.630+0000 7f80753b9700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8070072b00 0x7f80701a0f20 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f806c00d8d0 tx=0x7f806c00dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.631+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f806c00f840 con 0x7f8070072b00 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.631+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f806c00fe80 con 0x7f8070072b00 2026-03-10T08:51:46.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.631+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f806c00e5c0 con 0x7f8070072b00 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.631+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80701a6670 con 0x7f8070072b00 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.631+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80701a6b90 con 0x7f8070072b00 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.632+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f806c010460 con 0x7f8070072b00 
2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.633+0000 7f80667fc700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f805c06c580 0x7f805c06ea40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.633+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(6..6 src has 1..6) v4 ==== 1313+0+0 (secure 0 0 0) 0x7f806c021030 con 0x7f8070072b00 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.633+0000 7f8074bb8700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f805c06c580 0x7f805c06ea40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.633+0000 7f8074bb8700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f805c06c580 0x7f805c06ea40 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f806000b5c0 tx=0x7f8060005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:46.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.633+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f807010dd70 con 0x7f8070072b00 2026-03-10T08:51:46.636 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.636+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f806c059520 con 0x7f8070072b00 
2026-03-10T08:51:46.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:46.762+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f807002cfa0 con 0x7f805c06c580 2026-03-10T08:51:46.843 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:51:46 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[68780]: 2026-03-10T08:51:46.838+0000 7f588fea6640 -1 osd.0 0 log_to_monitors true 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='client.14278 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='osd.0 
[v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T08:51:47.096 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:46 vm05 ceph-mon[49713]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='client.14278 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='osd.0 [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 
2026-03-10T08:51:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:46 vm08 ceph-mon[57559]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T08:51:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1477571733' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "65d8a731-173e-4188-b03d-f0602d504870"}]: dispatch 2026-03-10T08:51:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T08:51:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1477571733' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "65d8a731-173e-4188-b03d-f0602d504870"}]': finished 2026-03-10T08:51:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: osdmap e7: 2 total, 0 up, 2 in 2026-03-10T08:51:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='osd.0 [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": 
["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:51:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:48 vm05 ceph-mon[49713]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/1477571733' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "65d8a731-173e-4188-b03d-f0602d504870"}]: dispatch 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/1477571733' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "65d8a731-173e-4188-b03d-f0602d504870"}]': finished 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: osdmap e7: 2 total, 0 up, 2 in 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='osd.0 [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:51:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:48 vm08 ceph-mon[57559]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:51:48.962 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:51:48 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[68780]: 2026-03-10T08:51:48.649+0000 7f588650c700 -1 osd.0 0 waiting for initial osdmap 2026-03-10T08:51:48.962 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:51:48 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[68780]: 2026-03-10T08:51:48.653+0000 7f58812ff700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T08:51:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:49 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/3001862004' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:51:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:49 vm08 ceph-mon[57559]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T08:51:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:49 vm08 ceph-mon[57559]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T08:51:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:49 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:49 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:49 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:49 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/3001862004' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:51:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:49 vm05 ceph-mon[49713]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T08:51:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:49 vm05 ceph-mon[49713]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T08:51:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:49 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:49 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:49 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:50.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: purged_snaps scrub starts 2026-03-10T08:51:50.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: purged_snaps scrub ok 2026-03-10T08:51:50.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:50.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: osd.0 [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] boot 2026-03-10T08:51:50.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T08:51:50.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:50.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:50 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: purged_snaps scrub starts 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: purged_snaps scrub ok 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: osd.0 [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] boot 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:51:51.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:50 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:51.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:51 vm05 ceph-mon[49713]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T08:51:51.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:51 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:52.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:51 vm08 ceph-mon[57559]: osdmap e10: 2 total, 1 up, 2 in 
2026-03-10T08:51:52.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:51 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: Detected new or changed devices on vm05 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:52.999 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:52.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: Detected new or changed devices on vm05 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.949 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:53 vm05 ceph-mon[49713]: Deploying daemon osd.1 on vm05 2026-03-10T08:51:53.949 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.949 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:53.949 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T08:51:53.949 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:53.949 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:54.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:53 vm08 ceph-mon[57559]: Deploying daemon osd.1 on vm05 2026-03-10T08:51:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:54.965 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:54 vm05 ceph-mon[49713]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:54.965 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:54.965 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:54.965 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:55.256 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 1 on host 'vm05' 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.253+0000 7f80667fc700 1 -- 192.168.123.105:0/2770427722 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f807002cfa0 con 0x7f805c06c580 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f805c06c580 msgr2=0x7f805c06ea40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f805c06c580 0x7f805c06ea40 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f806000b5c0 tx=0x7f8060005fb0 comp rx=0 tx=0).stop 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8070072b00 msgr2=0x7f80701a0f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8070072b00 0x7f80701a0f20 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f806c00d8d0 tx=0x7f806c00dbe0 comp rx=0 tx=0).stop 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 -- 
192.168.123.105:0/2770427722 shutdown_connections 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f805c06c580 0x7f805c06ea40 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8070072b00 0x7f80701a0f20 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 --2- 192.168.123.105:0/2770427722 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80701082d0 0x7f80701a1460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 >> 192.168.123.105:0/2770427722 conn(0x7f807006daa0 msgr2=0x7f807010aeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.256+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 shutdown_connections 2026-03-10T08:51:55.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:55.257+0000 7f80763bb700 1 -- 192.168.123.105:0/2770427722 wait complete. 
2026-03-10T08:51:55.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:54 vm08 ceph-mon[57559]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:55.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:55.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:55.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:55.321 DEBUG:teuthology.orchestra.run.vm05:osd.1> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.1.service 2026-03-10T08:51:55.323 INFO:tasks.cephadm:Deploying osd.2 on vm05 with /dev/vdc... 2026-03-10T08:51:55.323 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- lvm zap /dev/vdc 2026-03-10T08:51:55.573 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:51:56.094 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.094 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.094 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.094 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:55 vm05 ceph-mon[49713]: from='mgr.14223 
192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.094 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:55 vm05 ceph-mon[49713]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:56.144 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:51:56.157 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch daemon add osd vm05:/dev/vdc 2026-03-10T08:51:56.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:55 vm08 ceph-mon[57559]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:56.392 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:51:56.396 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:51:56 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[76319]: 2026-03-10T08:51:56.304+0000 7fade2332640 -1 osd.1 0 log_to_monitors true 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.662+0000 7feb2d826700 1 -- 192.168.123.105:0/4235760334 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28072b20 
msgr2=0x7feb28072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.662+0000 7feb2d826700 1 --2- 192.168.123.105:0/4235760334 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28072b20 0x7feb28072f40 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7feb18008790 tx=0x7feb18008aa0 comp rx=0 tx=0).stop 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.663+0000 7feb2d826700 1 -- 192.168.123.105:0/4235760334 shutdown_connections 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.663+0000 7feb2d826700 1 --2- 192.168.123.105:0/4235760334 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb28075a10 0x7feb28077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.663+0000 7feb2d826700 1 --2- 192.168.123.105:0/4235760334 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28072b20 0x7feb28072f40 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.663+0000 7feb2d826700 1 -- 192.168.123.105:0/4235760334 >> 192.168.123.105:0/4235760334 conn(0x7feb2806daa0 msgr2=0x7feb2806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.663+0000 7feb2d826700 1 -- 192.168.123.105:0/4235760334 shutdown_connections 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.663+0000 7feb2d826700 1 -- 192.168.123.105:0/4235760334 wait complete. 
2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2d826700 1 Processor -- start 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2d826700 1 -- start start 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2d826700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 0x7feb28083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2d826700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb280835c0 0x7feb281bb8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2d826700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb28083b00 con 0x7feb28075a10 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2d826700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb28083c70 con 0x7feb280835c0 2026-03-10T08:51:56.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2c824700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 0x7feb28083080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2c824700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 0x7feb28083080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:60814/0 (socket says 192.168.123.105:60814) 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.665+0000 7feb2c824700 1 -- 192.168.123.105:0/3535178122 learned_addr learned my addr 192.168.123.105:0/3535178122 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb27fff700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb280835c0 0x7feb281bb8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb2c824700 1 -- 192.168.123.105:0/3535178122 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb280835c0 msgr2=0x7feb281bb8a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb2c824700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb280835c0 0x7feb281bb8a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb2c824700 1 -- 192.168.123.105:0/3535178122 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb18008440 con 0x7feb28075a10 2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb2c824700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 0x7feb28083080 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7feb18003df0 tx=0x7feb1800fc50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:51:56.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb18004170 con 0x7feb28075a10 2026-03-10T08:51:56.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb2d826700 1 -- 192.168.123.105:0/3535178122 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feb281aeb70 con 0x7feb28075a10 2026-03-10T08:51:56.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.666+0000 7feb2d826700 1 -- 192.168.123.105:0/3535178122 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feb281af010 con 0x7feb28075a10 2026-03-10T08:51:56.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.667+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feb18009ed0 con 0x7feb28075a10 2026-03-10T08:51:56.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.667+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb18017410 con 0x7feb28075a10 2026-03-10T08:51:56.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.668+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 90252+0+0 (secure 0 0 0) 0x7feb180042d0 con 0x7feb28075a10 2026-03-10T08:51:56.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.668+0000 7feb25ffb700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb1006c290 0x7feb1006e750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:51:56.669 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.668+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(10..10 src has 1..10) v4 ==== 1915+0+0 (secure 0 0 0) 0x7feb1808b2f0 con 0x7feb28075a10 2026-03-10T08:51:56.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.669+0000 7feb27fff700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb1006c290 0x7feb1006e750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:51:56.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.669+0000 7feb27fff700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb1006c290 0x7feb1006e750 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7feb200079a0 tx=0x7feb2000d040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:51:56.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.670+0000 7feb2d826700 1 -- 192.168.123.105:0/3535178122 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feb14005320 con 0x7feb28075a10 2026-03-10T08:51:56.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.673+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7feb18056970 con 0x7feb28075a10 2026-03-10T08:51:56.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:51:56.780+0000 7feb2d826700 1 -- 192.168.123.105:0/3535178122 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7feb14000bf0 con 
0x7feb1006c290 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:51:56.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:57.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T08:51:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:51:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:51:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:51:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:57.948 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:51:57 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[76319]: 2026-03-10T08:51:57.704+0000 7fadd7195700 -1 osd.1 0 waiting for initial osdmap 2026-03-10T08:51:57.948 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:51:57 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[76319]: 2026-03-10T08:51:57.725+0000 7fadd378b700 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 
ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:51:58.214 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/468841973' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3a3adfaf-6208-4836-b16d-7bbb2065933b"}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/468841973' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "3a3adfaf-6208-4836-b16d-7bbb2065933b"}]': finished 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: osdmap e12: 3 total, 1 up, 3 in 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:58.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:58.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/468841973' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3a3adfaf-6208-4836-b16d-7bbb2065933b"}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/468841973' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "3a3adfaf-6208-4836-b16d-7bbb2065933b"}]': finished 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: osdmap e12: 3 total, 1 up, 3 in 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/3127831136' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] boot 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:51:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:51:59.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/3127831136' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: osd.1 [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] boot 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:51:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:51:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:00 vm08 ceph-mon[57559]: purged_snaps scrub starts 2026-03-10T08:52:00.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:00 vm08 ceph-mon[57559]: purged_snaps scrub ok 2026-03-10T08:52:00.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:00 vm08 ceph-mon[57559]: pgmap v23: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:00.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:00 vm05 ceph-mon[49713]: purged_snaps scrub starts 2026-03-10T08:52:00.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:00 vm05 ceph-mon[49713]: purged_snaps scrub ok 2026-03-10T08:52:00.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:00 vm05 ceph-mon[49713]: pgmap v23: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:01 vm05 ceph-mon[49713]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T08:52:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:01 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:01 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:01 vm08 ceph-mon[57559]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T08:52:01.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:01 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:01.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:01 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:02.450 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:02 vm05 ceph-mon[49713]: pgmap v25: 0 pgs: ; 0 B data, 
52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:02.450 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:02 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T08:52:02.450 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:02 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:02.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:02 vm08 ceph-mon[57559]: pgmap v25: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:02.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:02 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T08:52:02.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:02 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:03.287 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:03 vm05 ceph-mon[49713]: Deploying daemon osd.2 on vm05 2026-03-10T08:52:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:03 vm08 ceph-mon[57559]: Deploying daemon osd.2 on vm05 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 2 on host 'vm05' 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.317+0000 7feb25ffb700 1 -- 192.168.123.105:0/3535178122 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7feb14000bf0 con 0x7feb1006c290 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.320+0000 7feb0f7fe700 1 -- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb1006c290 msgr2=0x7feb1006e750 secure 
:-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.320+0000 7feb0f7fe700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb1006c290 0x7feb1006e750 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7feb200079a0 tx=0x7feb2000d040 comp rx=0 tx=0).stop 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.320+0000 7feb0f7fe700 1 -- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 msgr2=0x7feb28083080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.320+0000 7feb0f7fe700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 0x7feb28083080 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7feb18003df0 tx=0x7feb1800fc50 comp rx=0 tx=0).stop 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 -- 192.168.123.105:0/3535178122 shutdown_connections 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb1006c290 0x7feb1006e750 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 --2- 192.168.123.105:0/3535178122 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb28075a10 0x7feb28083080 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 --2- 192.168.123.105:0/3535178122 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb280835c0 0x7feb281bb8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 -- 192.168.123.105:0/3535178122 >> 192.168.123.105:0/3535178122 conn(0x7feb2806daa0 msgr2=0x7feb2806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 -- 192.168.123.105:0/3535178122 shutdown_connections 2026-03-10T08:52:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:04.321+0000 7feb0f7fe700 1 -- 192.168.123.105:0/3535178122 wait complete. 2026-03-10T08:52:04.383 DEBUG:teuthology.orchestra.run.vm05:osd.2> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.2.service 2026-03-10T08:52:04.385 INFO:tasks.cephadm:Deploying osd.3 on vm08 with /dev/vde... 2026-03-10T08:52:04.385 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- lvm zap /dev/vde 2026-03-10T08:52:04.526 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:04.578 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:04 vm05 ceph-mon[49713]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:04.578 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:04 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.578 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:04 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.578 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:04 vm05 
ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:04.578 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:04 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.578 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:04 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.659 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:04 vm08 ceph-mon[57559]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:04.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:04 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:04 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:04 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:04.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:04 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:04.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:04 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:05.014 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:52:05.026 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch daemon add osd vm08:/dev/vde 2026-03-10T08:52:05.166 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config 
/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:05.415 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.413+0000 7f958e362700 1 -- 192.168.123.108:0/4273056436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588104340 msgr2=0x7f95881047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:05.415 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.413+0000 7f958e362700 1 --2- 192.168.123.108:0/4273056436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588104340 0x7f95881047a0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f957c009b00 tx=0x7f957c009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.414+0000 7f958e362700 1 -- 192.168.123.108:0/4273056436 shutdown_connections 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.414+0000 7f958e362700 1 --2- 192.168.123.108:0/4273056436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588104340 0x7f95881047a0 secure :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f957c009b00 tx=0x7f957c009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.414+0000 7f958e362700 1 --2- 192.168.123.108:0/4273056436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9588103140 0x7f9588103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.414+0000 7f958e362700 1 -- 192.168.123.108:0/4273056436 >> 192.168.123.108:0/4273056436 conn(0x7f95880fe6c0 msgr2=0x7f9588100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.414+0000 7f958e362700 1 -- 192.168.123.108:0/4273056436 shutdown_connections 2026-03-10T08:52:05.416 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.414+0000 7f958e362700 1 -- 192.168.123.108:0/4273056436 wait complete. 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f958e362700 1 Processor -- start 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f958e362700 1 -- start start 2026-03-10T08:52:05.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f958e362700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9588103140 0x7f9588078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:05.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f958e362700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 0x7f95880755a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:05.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f958e362700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9588075ae0 con 0x7f9588103140 2026-03-10T08:52:05.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f958e362700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9588075c50 con 0x7f9588079080 2026-03-10T08:52:05.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 0x7f95880755a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:05.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 0x7f95880755a0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:53628/0 (socket says 192.168.123.108:53628) 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 -- 192.168.123.108:0/2177712738 learned_addr learned my addr 192.168.123.108:0/2177712738 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 -- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9588103140 msgr2=0x7f9588078b40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f9587fff700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9588103140 0x7f9588078b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9588103140 0x7f9588078b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 -- 192.168.123.108:0/2177712738 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f957c0097e0 con 0x7f9588079080 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.416+0000 7f95877fe700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 0x7f95880755a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto 
rx=0x7f957c000c00 tx=0x7f957c004a40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.417+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f957c01d070 con 0x7f9588079080 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.417+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9588075e70 con 0x7f9588079080 2026-03-10T08:52:05.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.417+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95881a30c0 con 0x7f9588079080 2026-03-10T08:52:05.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.417+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f957c00bc50 con 0x7f9588079080 2026-03-10T08:52:05.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.417+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f957c00f670 con 0x7f9588079080 2026-03-10T08:52:05.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.418+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 17) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f957c00f8b0 con 0x7f9588079080 2026-03-10T08:52:05.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.418+0000 7f95857fa700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f957006c4b0 0x7f957006e970 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:05.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.418+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(14..14 src has 1..14) v4 ==== 2347+0+0 (secure 0 0 0) 0x7f957c08d1b0 con 0x7f9588079080 2026-03-10T08:52:05.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.420+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9574005320 con 0x7f9588079080 2026-03-10T08:52:05.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.422+0000 7f9587fff700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f957006c4b0 0x7f957006e970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:05.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.422+0000 7f9587fff700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f957006c4b0 0x7f957006e970 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9578005950 tx=0x7f95780058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:05.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.423+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f957c05c1c0 con 0x7f9588079080 2026-03-10T08:52:05.530 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:05.529+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": 
"vm08:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f9574000bf0 con 0x7f957006c4b0 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: from='client.24137 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:52:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:06 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:06.463 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:06 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[83525]: 2026-03-10T08:52:06.123+0000 7f5a24692640 -1 osd.2 0 log_to_monitors true 2026-03-10T08:52:06.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:06.463 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:06.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:06.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: from='client.24137 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:06.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:52:06.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:52:06.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:06 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:07.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: Detected new or changed devices on vm05 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": 
"osd_memory_target"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/4144089123' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3e86065a-202f-4640-9f03-2490c913e09b"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3e86065a-202f-4640-9f03-2490c913e09b"}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "3e86065a-202f-4640-9f03-2490c913e09b"}]': finished 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: osdmap e15: 4 total, 2 up, 4 in 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:52:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:07 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/3817813820' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: Detected new or changed devices on vm05 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:07.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='client.? 192.168.123.108:0/4144089123' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3e86065a-202f-4640-9f03-2490c913e09b"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3e86065a-202f-4640-9f03-2490c913e09b"}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "3e86065a-202f-4640-9f03-2490c913e09b"}]': finished 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: osdmap e15: 4 total, 2 up, 4 in 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:52:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:07 vm08 ceph-mon[57559]: from='client.? 
192.168.123.108:0/3817813820' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:52:07.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:07 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[83525]: 2026-03-10T08:52:07.365+0000 7f5a194f5700 -1 osd.2 0 waiting for initial osdmap 2026-03-10T08:52:07.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:07 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[83525]: 2026-03-10T08:52:07.375+0000 7f5a15aeb700 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T08:52:08.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:08 vm05 ceph-mon[49713]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:08.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:08 vm05 ceph-mon[49713]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T08:52:08.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:08 vm05 ceph-mon[49713]: osdmap e16: 4 total, 2 up, 4 in 2026-03-10T08:52:08.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:08 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:08.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:08 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:08.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:08 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:08 vm08 
ceph-mon[57559]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T08:52:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:08 vm08 ceph-mon[57559]: from='osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T08:52:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:08 vm08 ceph-mon[57559]: osdmap e16: 4 total, 2 up, 4 in 2026-03-10T08:52:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:08 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:08 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:08 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:09.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:09 vm05 ceph-mon[49713]: purged_snaps scrub ok 2026-03-10T08:52:09.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:09 vm05 ceph-mon[49713]: osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] boot 2026-03-10T08:52:09.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:09 vm05 ceph-mon[49713]: osdmap e17: 4 total, 3 up, 4 in 2026-03-10T08:52:09.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:09 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:09.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:09 vm05 ceph-mon[49713]: from='mgr.14223 
192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:09.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:09 vm08 ceph-mon[57559]: purged_snaps scrub ok 2026-03-10T08:52:09.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:09 vm08 ceph-mon[57559]: osd.2 [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] boot 2026-03-10T08:52:09.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:09 vm08 ceph-mon[57559]: osdmap e17: 4 total, 3 up, 4 in 2026-03-10T08:52:09.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:09 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:52:09.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:09 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:10.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:10 vm05 ceph-mon[49713]: pgmap v32: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:10.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:10 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T08:52:10.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:10 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T08:52:10.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:10 vm05 ceph-mon[49713]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T08:52:10.713 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:10 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:10.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:10 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T08:52:10.714 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:10 vm08 ceph-mon[57559]: pgmap v32: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:10.714 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:10 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T08:52:10.714 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:10 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T08:52:10.714 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:10 vm08 ceph-mon[57559]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T08:52:10.714 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:10 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:10.714 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:10 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T08:52:11.712 
INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88077]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdd 2026-03-10T08:52:11.712 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88077]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T08:52:11.712 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88077]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-10T08:52:11.713 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88077]: pam_unix(sudo:session): session closed for user root 2026-03-10T08:52:11.713 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88074]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vde 2026-03-10T08:52:11.713 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88074]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T08:52:11.713 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88074]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-10T08:52:11.713 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88074]: pam_unix(sudo:session): session closed for user root 2026-03-10T08:52:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T08:52:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 ceph-mon[49713]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T08:52:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:11.713 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T08:52:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 ceph-mon[49713]: Deploying daemon osd.3 on vm08 2026-03-10T08:52:11.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:11 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T08:52:11.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:11 vm08 ceph-mon[57559]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T08:52:11.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:11 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:11.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:11 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T08:52:11.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:11 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:11.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:11 vm08 ceph-mon[57559]: Deploying daemon osd.3 on vm08 2026-03-10T08:52:12.212 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88080]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdc 2026-03-10T08:52:12.212 
INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88080]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T08:52:12.212 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88080]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-10T08:52:12.213 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88080]: pam_unix(sudo:session): session closed for user root 2026-03-10T08:52:12.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88083]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda 2026-03-10T08:52:12.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88083]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T08:52:12.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88083]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-10T08:52:12.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:11 vm05 sudo[88083]: pam_unix(sudo:session): session closed for user root 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 sudo[63390]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 sudo[63390]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 sudo[63390]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 sudo[63390]: pam_unix(sudo:session): session closed for user root 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: pgmap v35: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 
GiB avail 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", 
"format": "json"}]: dispatch 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:12.439 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:12 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: pgmap v35: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='admin socket' 
entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:12 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:13.010 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.009+0000 7f95857fa700 1 -- 192.168.123.108:0/2177712738 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f9574000bf0 con 0x7f957006c4b0 2026-03-10T08:52:13.010 INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 3 on host 'vm08' 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f957006c4b0 msgr2=0x7f957006e970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:13.012 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f957006c4b0 0x7f957006e970 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9578005950 tx=0x7f95780058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 msgr2=0x7f95880755a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 0x7f95880755a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f957c000c00 tx=0x7f957c004a40 comp rx=0 tx=0).stop 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 shutdown_connections 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f957006c4b0 0x7f957006e970 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9588103140 0x7f9588078b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 --2- 192.168.123.108:0/2177712738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9588079080 0x7f95880755a0 unknown 
:-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.011+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 >> 192.168.123.108:0/2177712738 conn(0x7f95880fe6c0 msgr2=0x7f9588107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.012+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 shutdown_connections 2026-03-10T08:52:13.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:13.012+0000 7f958e362700 1 -- 192.168.123.108:0/2177712738 wait complete. 2026-03-10T08:52:13.102 DEBUG:teuthology.orchestra.run.vm08:osd.3> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.3.service 2026-03-10T08:52:13.104 INFO:tasks.cephadm:Deploying osd.4 on vm08 with /dev/vdd... 2026-03-10T08:52:13.104 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- lvm zap /dev/vdd 2026-03-10T08:52:13.300 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:14.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.137 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:14.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.138 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:14.138 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:14.138 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:14.138 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:13 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.329 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:52:14.341 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch daemon add osd vm08:/dev/vdd 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: pgmap v37: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:14.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:13 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:14.470 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:14.492 
INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:52:14 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[63369]: 2026-03-10T08:52:14.134+0000 7f3e739bf640 -1 osd.3 0 log_to_monitors true 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.687+0000 7fac56d40700 1 -- 192.168.123.108:0/1144358586 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 msgr2=0x7fac501047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.687+0000 7fac56d40700 1 --2- 192.168.123.108:0/1144358586 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac501047a0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fac44009b50 tx=0x7fac44009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.688+0000 7fac56d40700 1 -- 192.168.123.108:0/1144358586 shutdown_connections 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.688+0000 7fac56d40700 1 --2- 192.168.123.108:0/1144358586 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac501047a0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.688+0000 7fac56d40700 1 --2- 192.168.123.108:0/1144358586 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac50103140 0x7fac50103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.688+0000 7fac56d40700 1 -- 192.168.123.108:0/1144358586 >> 192.168.123.108:0/1144358586 conn(0x7fac500fe6c0 msgr2=0x7fac50100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.688+0000 7fac56d40700 1 -- 
192.168.123.108:0/1144358586 shutdown_connections 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.688+0000 7fac56d40700 1 -- 192.168.123.108:0/1144358586 wait complete. 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac56d40700 1 Processor -- start 2026-03-10T08:52:14.689 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac56d40700 1 -- start start 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac56d40700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac50103140 0x7fac50198a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac56d40700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac50198f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac56d40700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac50199570 con 0x7fac50103140 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac56d40700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac501996b0 con 0x7fac50104340 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac4ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac50198f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac4ffff700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac50198f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:52160/0 (socket says 192.168.123.108:52160) 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.689+0000 7fac4ffff700 1 -- 192.168.123.108:0/3320315060 learned_addr learned my addr 192.168.123.108:0/3320315060 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4ffff700 1 -- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac50103140 msgr2=0x7fac50198a10 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4ffff700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac50103140 0x7fac50198a10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4ffff700 1 -- 192.168.123.108:0/3320315060 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac440097e0 con 0x7fac50104340 2026-03-10T08:52:14.690 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4ffff700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac50198f50 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fac44006010 tx=0x7fac4400b920 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:14.691 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mon.1 
v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac4401d070 con 0x7fac50104340 2026-03-10T08:52:14.691 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fac4400bbf0 con 0x7fac50104340 2026-03-10T08:52:14.691 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac4400f970 con 0x7fac50104340 2026-03-10T08:52:14.691 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac5019e100 con 0x7fac50104340 2026-03-10T08:52:14.691 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.690+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac50075530 con 0x7fac50104340 2026-03-10T08:52:14.692 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.691+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fac50066e80 con 0x7fac50104340 2026-03-10T08:52:14.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.694+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fac44022c70 con 0x7fac50104340 2026-03-10T08:52:14.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.694+0000 7fac4dffb700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fac3806c600 0x7fac3806eac0 unknown :-1 
s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:14.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.694+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(20..20 src has 1..20) v4 ==== 3165+0+0 (secure 0 0 0) 0x7fac4408d060 con 0x7fac50104340 2026-03-10T08:52:14.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.694+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fac4408d4e0 con 0x7fac50104340 2026-03-10T08:52:14.695 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.695+0000 7fac54adc700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fac3806c600 0x7fac3806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:14.698 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.698+0000 7fac54adc700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fac3806c600 0x7fac3806eac0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fac40005950 tx=0x7fac400058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:14.799 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:14.797+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7fac50108c90 con 0x7fac3806c600 2026-03-10T08:52:15.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: Detected new or changed devices on vm08 2026-03-10T08:52:15.052 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: mgrmap e18: vm05.rxwgjc(active, since 58s), standbys: vm08.rpongu 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='osd.3 [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='client.24161 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: 
dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:52:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:15 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: Detected new or changed devices on vm08 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: mgrmap e18: vm05.rxwgjc(active, since 58s), standbys: vm08.rpongu 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='osd.3 [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:52:15 vm05 ceph-mon[49713]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='client.24161 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:52:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:15 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:15.932 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:52:15 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[63369]: 2026-03-10T08:52:15.592+0000 7f3e68822700 -1 osd.3 0 waiting for initial osdmap 2026-03-10T08:52:15.932 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:52:15 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[63369]: 2026-03-10T08:52:15.598+0000 7f3e62e14700 -1 osd.3 22 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: osdmap e21: 4 total, 3 up, 4 in 
2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='osd.3 [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='client.? 192.168.123.108:0/4078589474' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2499eecc-b6be-48ac-ba73-53ff8a0686a4"}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2499eecc-b6be-48ac-ba73-53ff8a0686a4"}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "2499eecc-b6be-48ac-ba73-53ff8a0686a4"}]': finished 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: osdmap e22: 5 total, 3 up, 5 in 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:16 vm08 ceph-mon[57559]: from='client.? 
192.168.123.108:0/4123206036' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:52:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='osd.3 [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/4078589474' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2499eecc-b6be-48ac-ba73-53ff8a0686a4"}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2499eecc-b6be-48ac-ba73-53ff8a0686a4"}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "2499eecc-b6be-48ac-ba73-53ff8a0686a4"}]': finished 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: osdmap e22: 5 total, 3 up, 5 in 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:16 vm05 ceph-mon[49713]: from='client.? 
192.168.123.108:0/4123206036' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:52:17.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: purged_snaps scrub starts 2026-03-10T08:52:17.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: purged_snaps scrub ok 2026-03-10T08:52:17.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: osd.3 [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] boot 2026-03-10T08:52:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: osdmap e23: 5 total, 4 up, 5 in 2026-03-10T08:52:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T08:52:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:17 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: purged_snaps scrub starts 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: purged_snaps scrub ok 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: osd.3 [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] boot 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 
ceph-mon[57559]: osdmap e23: 5 total, 4 up, 5 in 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T08:52:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:17 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:18 vm05 ceph-mon[49713]: pgmap v43: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T08:52:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:18 vm05 ceph-mon[49713]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T08:52:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:18 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:18 vm08 ceph-mon[57559]: pgmap v43: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T08:52:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:18 vm08 ceph-mon[57559]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T08:52:19.043 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:18 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:20.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:52:20 vm08 ceph-mon[57559]: pgmap v45: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 112 KiB/s, 0 objects/s recovering 2026-03-10T08:52:20.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:20 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T08:52:20.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:20 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:20.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:20 vm08 ceph-mon[57559]: Deploying daemon osd.4 on vm08 2026-03-10T08:52:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:20 vm05 ceph-mon[49713]: pgmap v45: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 112 KiB/s, 0 objects/s recovering 2026-03-10T08:52:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T08:52:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:20 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:20 vm05 ceph-mon[49713]: Deploying daemon osd.4 on vm08 2026-03-10T08:52:22.128 INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 4 on host 'vm08' 2026-03-10T08:52:22.128 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.127+0000 7fac4dffb700 1 -- 192.168.123.108:0/3320315060 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fac50108c90 con 0x7fac3806c600 2026-03-10T08:52:22.131 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fac3806c600 msgr2=0x7fac3806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fac3806c600 0x7fac3806eac0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fac40005950 tx=0x7fac400058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 msgr2=0x7fac50198f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac50198f50 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fac44006010 tx=0x7fac4400b920 comp rx=0 tx=0).stop 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 shutdown_connections 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fac3806c600 0x7fac3806eac0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac50103140 0x7fac50198a10 unknown :-1 s=CLOSED pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 --2- 192.168.123.108:0/3320315060 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac50104340 0x7fac50198f50 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 >> 192.168.123.108:0/3320315060 conn(0x7fac500fe6c0 msgr2=0x7fac50107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 shutdown_connections 2026-03-10T08:52:22.131 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:22.129+0000 7fac56d40700 1 -- 192.168.123.108:0/3320315060 wait complete. 2026-03-10T08:52:22.179 DEBUG:teuthology.orchestra.run.vm08:osd.4> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.4.service 2026-03-10T08:52:22.180 INFO:tasks.cephadm:Deploying osd.5 on vm08 with /dev/vdc... 
2026-03-10T08:52:22.180 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- lvm zap /dev/vdc 2026-03-10T08:52:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 79 KiB/s, 0 objects/s recovering 2026-03-10T08:52:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:22.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:22 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.360 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:22.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: pgmap v46: 1 pgs: 1 
active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 79 KiB/s, 0 objects/s recovering 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:22 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:22.913 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:52:22.925 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph orch daemon add osd vm08:/dev/vdc 2026-03-10T08:52:23.110 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 -- 192.168.123.108:0/3649303787 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8410a700 msgr2=0x7f3d8410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/3649303787 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8410a700 0x7f3d8410cb90 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c01c320 tx=0x7f3d7c01c630 comp rx=0 tx=0).stop 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 -- 192.168.123.108:0/3649303787 shutdown_connections 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/3649303787 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8410a700 0x7f3d8410cb90 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/3649303787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 0x7f3d8410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 -- 192.168.123.108:0/3649303787 >> 192.168.123.108:0/3649303787 conn(0x7f3d8406dae0 msgr2=0x7f3d8406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 -- 192.168.123.108:0/3649303787 shutdown_connections 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.414+0000 7f3d89bb3700 1 -- 192.168.123.108:0/3649303787 wait complete. 
2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d89bb3700 1 Processor -- start 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d89bb3700 1 -- start start 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d89bb3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 0x7f3d8419cc10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d89bb3700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8419d150 0x7f3d841a2180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d89bb3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d8419d5d0 con 0x7f3d84107d90 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d89bb3700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d8419d740 con 0x7f3d8419d150 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d82ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8419d150 0x7f3d841a2180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d82ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8419d150 0x7f3d841a2180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.108:52246/0 (socket says 192.168.123.108:52246) 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d82ffd700 1 -- 192.168.123.108:0/542195105 learned_addr learned my addr 192.168.123.108:0/542195105 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:52:23.416 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.415+0000 7f3d837fe700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 0x7f3d8419cc10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:23.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.416+0000 7f3d837fe700 1 -- 192.168.123.108:0/542195105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8419d150 msgr2=0x7f3d841a2180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:23.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.416+0000 7f3d837fe700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8419d150 0x7f3d841a2180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:23.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.416+0000 7f3d837fe700 1 -- 192.168.123.108:0/542195105 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d7c01c060 con 0x7f3d84107d90 2026-03-10T08:52:23.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.416+0000 7f3d837fe700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 0x7f3d8419cc10 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f3d7400c8f0 tx=0x7f3d7400cc00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:52:23.417 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.416+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d7400e830 con 0x7f3d84107d90 2026-03-10T08:52:23.418 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.417+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d841a2720 con 0x7f3d84107d90 2026-03-10T08:52:23.418 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.417+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d841a2c40 con 0x7f3d84107d90 2026-03-10T08:52:23.418 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.417+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d7400ee70 con 0x7f3d84107d90 2026-03-10T08:52:23.418 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.417+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d74018610 con 0x7f3d84107d90 2026-03-10T08:52:23.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.418+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3d7400f880 con 0x7f3d84107d90 2026-03-10T08:52:23.419 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.419+0000 7f3d80ff9700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d6c06c530 0x7f3d6c06e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:23.420 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.419+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(25..25 src has 1..25) v4 ==== 3697+0+0 (secure 0 0 0) 0x7f3d7408b190 con 0x7f3d84107d90 2026-03-10T08:52:23.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.419+0000 7f3d82ffd700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d6c06c530 0x7f3d6c06e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:23.420 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.419+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d70005320 con 0x7f3d84107d90 2026-03-10T08:52:23.423 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.422+0000 7f3d82ffd700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d6c06c530 0x7f3d6c06e9f0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c007ef0 tx=0x7f3d7c007e80 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:23.424 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.423+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3d74059ac0 con 0x7f3d84107d90 2026-03-10T08:52:23.535 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:23.534+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f3d70000bf0 con 
0x7f3d6c06c530 2026-03-10T08:52:24.287 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:52:23 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[69269]: 2026-03-10T08:52:23.930+0000 7fbdfb274640 -1 osd.4 0 log_to_monitors true 2026-03-10T08:52:24.288 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 68 KiB/s, 0 objects/s recovering 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='client.14332 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: Detected new or changed devices on vm08 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 vm08 ceph-mon[57559]: from='osd.4 [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T08:52:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:24 
vm08 ceph-mon[57559]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 68 KiB/s, 0 objects/s recovering 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='client.14332 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: Detected new or changed devices on vm08 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", 
"who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='osd.4 [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T08:52:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:24 vm05 ceph-mon[49713]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush 
set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T08:52:25.289 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/453195646' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "52d5763a-8095-46f7-9dd8-2d20a4d53ab7"}]: dispatch 2026-03-10T08:52:25.289 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "52d5763a-8095-46f7-9dd8-2d20a4d53ab7"}]: dispatch 2026-03-10T08:52:25.289 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "52d5763a-8095-46f7-9dd8-2d20a4d53ab7"}]': finished 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: osdmap e26: 6 total, 4 up, 6 in 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='osd.4 [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 
vm05 ceph-mon[49713]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:25.293 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:25 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/4271664614' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:52:25.552 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:52:25 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[69269]: 2026-03-10T08:52:25.391+0000 7fbdf18da700 -1 osd.4 0 waiting for initial osdmap 2026-03-10T08:52:25.552 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:52:25 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[69269]: 2026-03-10T08:52:25.407+0000 7fbde9ec8700 -1 osd.4 27 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='client.? 192.168.123.108:0/453195646' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "52d5763a-8095-46f7-9dd8-2d20a4d53ab7"}]: dispatch 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "52d5763a-8095-46f7-9dd8-2d20a4d53ab7"}]: dispatch 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "52d5763a-8095-46f7-9dd8-2d20a4d53ab7"}]': finished 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: osdmap e26: 6 total, 4 up, 6 in 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='osd.4 [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:25 vm08 ceph-mon[57559]: from='client.? 
192.168.123.108:0/4271664614' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T08:52:26.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:26 vm05 ceph-mon[49713]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 56 KiB/s, 0 objects/s recovering 2026-03-10T08:52:26.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:26 vm05 ceph-mon[49713]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T08:52:26.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:26 vm05 ceph-mon[49713]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T08:52:26.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:26.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:26.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:26 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:26 vm08 ceph-mon[57559]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 56 KiB/s, 0 objects/s recovering 2026-03-10T08:52:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:26 vm08 ceph-mon[57559]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T08:52:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:26 vm08 ceph-mon[57559]: osdmap e27: 6 total, 4 up, 6 in 
2026-03-10T08:52:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:26 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:26 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:26 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:27.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:27 vm05 ceph-mon[49713]: purged_snaps scrub starts 2026-03-10T08:52:27.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:27 vm05 ceph-mon[49713]: purged_snaps scrub ok 2026-03-10T08:52:27.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:27 vm05 ceph-mon[49713]: osd.4 [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] boot 2026-03-10T08:52:27.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:27 vm05 ceph-mon[49713]: osdmap e28: 6 total, 5 up, 6 in 2026-03-10T08:52:27.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:27 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:27.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:27 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:27.795 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:27 vm08 ceph-mon[57559]: purged_snaps scrub starts 2026-03-10T08:52:27.795 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:27 vm08 ceph-mon[57559]: purged_snaps scrub ok 2026-03-10T08:52:27.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:52:27 vm08 ceph-mon[57559]: osd.4 [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] boot 2026-03-10T08:52:27.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:27 vm08 ceph-mon[57559]: osdmap e28: 6 total, 5 up, 6 in 2026-03-10T08:52:27.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:27 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:52:27.796 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:27 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:28.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:28 vm05 ceph-mon[49713]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:28.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:28 vm05 ceph-mon[49713]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T08:52:28.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:28 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:28.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:28 vm08 ceph-mon[57559]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:28.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:28 vm08 ceph-mon[57559]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T08:52:28.759 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:28 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:29.616 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:29 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 
2026-03-10T08:52:29.616 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:29 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:29.616 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:29 vm08 ceph-mon[57559]: Deploying daemon osd.5 on vm08 2026-03-10T08:52:29.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:29 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T08:52:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:29 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:29 vm05 ceph-mon[49713]: Deploying daemon osd.5 on vm08 2026-03-10T08:52:30.548 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:30 vm08 ceph-mon[57559]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:30.548 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:30 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:30.548 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:30 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:30.548 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:30 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:30.548 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:30 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:52:30 vm05 ceph-mon[49713]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:30 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:30 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:30 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:30 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 5 on host 'vm08' 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.013+0000 7f3d80ff9700 1 -- 192.168.123.108:0/542195105 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f3d70000bf0 con 0x7f3d6c06c530 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d6c06c530 msgr2=0x7f3d6c06e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d6c06c530 0x7f3d6c06e9f0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f3d7c007ef0 tx=0x7f3d7c007e80 comp rx=0 tx=0).stop 2026-03-10T08:52:31.016 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 msgr2=0x7f3d8419cc10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 0x7f3d8419cc10 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f3d7400c8f0 tx=0x7f3d7400cc00 comp rx=0 tx=0).stop 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 shutdown_connections 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d6c06c530 0x7f3d6c06e9f0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d84107d90 0x7f3d8419cc10 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 --2- 192.168.123.108:0/542195105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d8419d150 0x7f3d841a2180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 >> 192.168.123.108:0/542195105 conn(0x7f3d8406dae0 msgr2=0x7f3d8406e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 shutdown_connections 2026-03-10T08:52:31.016 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:31.015+0000 7f3d89bb3700 1 -- 192.168.123.108:0/542195105 wait complete. 2026-03-10T08:52:31.064 DEBUG:teuthology.orchestra.run.vm08:osd.5> sudo journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.5.service 2026-03-10T08:52:31.065 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 2026-03-10T08:52:31.065 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd stat -f json 2026-03-10T08:52:31.222 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.482+0000 7f85b2f06700 1 -- 192.168.123.105:0/64591686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac103140 msgr2=0x7f85ac103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.482+0000 7f85b2f06700 1 --2- 192.168.123.105:0/64591686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac103140 0x7f85ac103560 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f859c009b00 tx=0x7f859c009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.483+0000 7f85b2f06700 1 -- 192.168.123.105:0/64591686 shutdown_connections 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.483+0000 7f85b2f06700 1 --2- 192.168.123.105:0/64591686 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac104340 0x7f85ac1047a0 
unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.483+0000 7f85b2f06700 1 --2- 192.168.123.105:0/64591686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac103140 0x7f85ac103560 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.483+0000 7f85b2f06700 1 -- 192.168.123.105:0/64591686 >> 192.168.123.105:0/64591686 conn(0x7f85ac0fe6c0 msgr2=0x7f85ac100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 -- 192.168.123.105:0/64591686 shutdown_connections 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 -- 192.168.123.105:0/64591686 wait complete. 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 Processor -- start 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 -- start start 2026-03-10T08:52:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 0x7f85ac19ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac104340 0x7f85ac19d3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f85ac19d9e0 con 0x7f85ac104340 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.484+0000 7f85b2f06700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85ac19db20 con 0x7f85ac103140 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85b0ca2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 0x7f85ac19ce80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85b0ca2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 0x7f85ac19ce80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:56562/0 (socket says 192.168.123.105:56562) 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85b0ca2700 1 -- 192.168.123.105:0/4055053066 learned_addr learned my addr 192.168.123.105:0/4055053066 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85b0ca2700 1 -- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac104340 msgr2=0x7f85ac19d3c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85abfff700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac104340 0x7f85ac19d3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:31.485 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85b0ca2700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac104340 0x7f85ac19d3c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.485+0000 7f85b0ca2700 1 -- 192.168.123.105:0/4055053066 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f859c0097e0 con 0x7f85ac103140 2026-03-10T08:52:31.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85b0ca2700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 0x7f85ac19ce80 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f859c000c00 tx=0x7f859c0056c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:31.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85abfff700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac104340 0x7f85ac19d3c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:52:31.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f859c01d070 con 0x7f85ac103140 2026-03-10T08:52:31.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85ac1a2570 con 0x7f85ac103140 2026-03-10T08:52:31.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85ac1a2a60 con 0x7f85ac103140 2026-03-10T08:52:31.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f859c00bc50 con 0x7f85ac103140 2026-03-10T08:52:31.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.486+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f859c00f8b0 con 0x7f85ac103140 2026-03-10T08:52:31.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.487+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f859c00fa10 con 0x7f85ac103140 2026-03-10T08:52:31.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.487+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8598005320 con 0x7f85ac103140 2026-03-10T08:52:31.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.488+0000 
7f85a9ffb700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f859406c4e0 0x7f859406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:31.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.488+0000 7f85abfff700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f859406c4e0 0x7f859406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:31.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.488+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f859c08cae0 con 0x7f85ac103140 2026-03-10T08:52:31.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.488+0000 7f85abfff700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f859406c4e0 0x7f859406e9a0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f85a0005fd0 tx=0x7f85a0005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.490+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f859c05b310 con 0x7f85ac103140 2026-03-10T08:52:31.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.604+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f85980059f0 con 0x7f85ac103140 2026-03-10T08:52:31.605 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.604+0000 7f85a9ffb700 1 -- 192.168.123.105:0/4055053066 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f859c05aea0 con 0x7f85ac103140 2026-03-10T08:52:31.606 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f859406c4e0 msgr2=0x7f859406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f859406c4e0 0x7f859406e9a0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f85a0005fd0 tx=0x7f85a0005dc0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 msgr2=0x7f85ac19ce80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 0x7f85ac19ce80 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f859c000c00 tx=0x7f859c0056c0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 shutdown_connections 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 --2- 192.168.123.105:0/4055053066 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f859406c4e0 0x7f859406e9a0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f85ac103140 0x7f85ac19ce80 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 --2- 192.168.123.105:0/4055053066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f85ac104340 0x7f85ac19d3c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:31.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 >> 192.168.123.105:0/4055053066 conn(0x7f85ac0fe6c0 msgr2=0x7f85ac107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:31.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 shutdown_connections 2026-03-10T08:52:31.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:31.608+0000 7f85b2f06700 1 -- 192.168.123.105:0/4055053066 wait complete. 
2026-03-10T08:52:31.659 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773132746,"num_in_osds":6,"osd_in_since":1773132744,"num_remapped_pgs":0} 2026-03-10T08:52:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:31 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:31 vm08 ceph-mon[57559]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:31 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/4055053066' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:31 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:31 vm05 ceph-mon[49713]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:31 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/4055053066' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:32.660 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd stat -f json 2026-03-10T08:52:32.803 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:52:32 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[75639]: 2026-03-10T08:52:32.432+0000 7f97023c1640 -1 osd.5 0 log_to_monitors true 2026-03-10T08:52:32.806 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.044+0000 7f52e837e700 1 -- 192.168.123.105:0/551493993 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 msgr2=0x7f52e0103d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.044+0000 7f52e837e700 1 --2- 192.168.123.105:0/551493993 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e0103d80 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7f52d0009b50 tx=0x7f52d0009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 -- 192.168.123.105:0/551493993 shutdown_connections 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 --2- 192.168.123.105:0/551493993 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e01066b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 --2- 192.168.123.105:0/551493993 
>> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e0103d80 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 -- 192.168.123.105:0/551493993 >> 192.168.123.105:0/551493993 conn(0x7f52e00fb380 msgr2=0x7f52e00fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 -- 192.168.123.105:0/551493993 shutdown_connections 2026-03-10T08:52:33.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 -- 192.168.123.105:0/551493993 wait complete. 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.045+0000 7f52e837e700 1 Processor -- start 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e837e700 1 -- start start 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e837e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e019ce20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e837e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e019d360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e837e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52e019d980 con 0x7f52e0101990 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e837e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f52e019dac0 con 0x7f52e01042c0 2026-03-10T08:52:33.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e5919700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e019d360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e5919700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e019d360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:56572/0 (socket says 192.168.123.105:56572) 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e5919700 1 -- 192.168.123.105:0/5528562 learned_addr learned my addr 192.168.123.105:0/5528562 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.046+0000 7f52e611a700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e019ce20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e5919700 1 -- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 msgr2=0x7f52e019ce20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e5919700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e019ce20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e5919700 1 -- 192.168.123.105:0/5528562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52d00097e0 con 0x7f52e01042c0 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e611a700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e019ce20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e5919700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e019d360 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f52dc00d8d0 tx=0x7f52dc00dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52dc009940 con 0x7f52e01042c0 2026-03-10T08:52:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52e01a2570 con 0x7f52e01042c0 2026-03-10T08:52:33.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52e01a2ac0 con 0x7f52e01042c0 2026-03-10T08:52:33.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 
v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f52dc010460 con 0x7f52e01042c0 2026-03-10T08:52:33.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.047+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52dc00f5d0 con 0x7f52e01042c0 2026-03-10T08:52:33.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.048+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52c4005320 con 0x7f52e01042c0 2026-03-10T08:52:33.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.049+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f52dc009aa0 con 0x7f52e01042c0 2026-03-10T08:52:33.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.049+0000 7f52d77fe700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f52cc06c4f0 0x7f52cc06e9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:33.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.049+0000 7f52e611a700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f52cc06c4f0 0x7f52cc06e9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:33.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.049+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f52dc08b200 con 0x7f52e01042c0 2026-03-10T08:52:33.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.050+0000 
7f52e611a700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f52cc06c4f0 0x7f52cc06e9b0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f52d000b5c0 tx=0x7f52d00058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:33.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.052+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f52dc0596f0 con 0x7f52e01042c0 2026-03-10T08:52:33.155 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.155+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f52c4005190 con 0x7f52e01042c0 2026-03-10T08:52:33.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.156+0000 7f52d77fe700 1 -- 192.168.123.105:0/5528562 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f52dc059510 con 0x7f52e01042c0 2026-03-10T08:52:33.156 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:33.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.158+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f52cc06c4f0 msgr2=0x7f52cc06e9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:33.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.158+0000 7f52e837e700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f52cc06c4f0 0x7f52cc06e9b0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f52d000b5c0 tx=0x7f52d00058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.159 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.158+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 msgr2=0x7f52e019d360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.158+0000 7f52e837e700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e019d360 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f52dc00d8d0 tx=0x7f52dc00dc90 comp rx=0 tx=0).stop 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 shutdown_connections 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f52cc06c4f0 0x7f52cc06e9b0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52e0101990 0x7f52e019ce20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 --2- 192.168.123.105:0/5528562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52e01042c0 0x7f52e019d360 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 >> 192.168.123.105:0/5528562 conn(0x7f52e00fb380 msgr2=0x7f52e00fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:33.159 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 shutdown_connections 2026-03-10T08:52:33.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:33.159+0000 7f52e837e700 1 -- 192.168.123.105:0/5528562 wait complete. 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: Detected new or changed devices on vm08 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 
192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='osd.5 [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T08:52:33.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:33 vm05 ceph-mon[49713]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T08:52:33.208 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773132746,"num_in_osds":6,"osd_in_since":1773132744,"num_remapped_pgs":0} 2026-03-10T08:52:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: Detected new or changed devices on vm08 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:52:33.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='osd.5 [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T08:52:33.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:33 vm08 ceph-mon[57559]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T08:52:34.209 
DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd stat -f json 2026-03-10T08:52:34.368 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:34.390 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/5528562' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:34.391 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T08:52:34.391 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T08:52:34.391 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:34.391 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: from='osd.5 [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:34.391 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:34.391 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:34 vm05 ceph-mon[49713]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:34.553 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:52:34 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[75639]: 2026-03-10T08:52:34.230+0000 7f96f7224700 -1 osd.5 0 waiting for initial osdmap 2026-03-10T08:52:34.553 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:52:34 vm08 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[75639]: 2026-03-10T08:52:34.237+0000 7f96f2818700 -1 osd.5 31 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/5528562' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: from='osd.5 [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T08:52:34.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:34 vm08 ceph-mon[57559]: pgmap v57: 1 pgs: 1 
active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.629+0000 7ff83ed60700 1 -- 192.168.123.105:0/889661018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8381043c0 msgr2=0x7ff8381067b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.629+0000 7ff83ed60700 1 --2- 192.168.123.105:0/889661018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8381043c0 0x7ff8381067b0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7ff82c009b50 tx=0x7ff82c009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.629+0000 7ff83ed60700 1 -- 192.168.123.105:0/889661018 shutdown_connections 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.629+0000 7ff83ed60700 1 --2- 192.168.123.105:0/889661018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8381043c0 0x7ff8381067b0 secure :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7ff82c009b50 tx=0x7ff82c009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.629+0000 7ff83ed60700 1 --2- 192.168.123.105:0/889661018 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff838101a90 0x7ff838103e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.629+0000 7ff83ed60700 1 -- 192.168.123.105:0/889661018 >> 192.168.123.105:0/889661018 conn(0x7ff8380fb3c0 msgr2=0x7ff8380fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:34.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.630+0000 7ff83ed60700 1 -- 192.168.123.105:0/889661018 shutdown_connections 2026-03-10T08:52:34.630 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.630+0000 7ff83ed60700 1 -- 192.168.123.105:0/889661018 wait complete. 2026-03-10T08:52:34.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83ed60700 1 Processor -- start 2026-03-10T08:52:34.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83ed60700 1 -- start start 2026-03-10T08:52:34.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83ed60700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 0x7ff838196a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:34.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83ed60700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff838196f60 0x7ff83819bfd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:34.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83ed60700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff838197470 con 0x7ff838101a90 2026-03-10T08:52:34.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83ed60700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8381975e0 con 0x7ff838196f60 2026-03-10T08:52:34.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83cafc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 0x7ff838196a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:34.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83cafc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 0x7ff838196a20 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:49802/0 (socket says 192.168.123.105:49802) 2026-03-10T08:52:34.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83cafc700 1 -- 192.168.123.105:0/3423514210 learned_addr learned my addr 192.168.123.105:0/3423514210 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:34.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff837fff700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff838196f60 0x7ff83819bfd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:34.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83cafc700 1 -- 192.168.123.105:0/3423514210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff838196f60 msgr2=0x7ff83819bfd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:34.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83cafc700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff838196f60 0x7ff83819bfd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:34.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.631+0000 7ff83cafc700 1 -- 192.168.123.105:0/3423514210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff82c0097e0 con 0x7ff838101a90 2026-03-10T08:52:34.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.632+0000 7ff83cafc700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 0x7ff838196a20 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto 
rx=0x7ff82800eb10 tx=0x7ff82800ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.632+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff82800cc40 con 0x7ff838101a90 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.632+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff8380ff880 con 0x7ff838101a90 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.632+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff8380ffdd0 con 0x7ff838101a90 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.632+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff82800cda0 con 0x7ff838101a90 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.632+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff828018810 con 0x7ff838101a90 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.633+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff828018a50 con 0x7ff838101a90 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.633+0000 7ff835ffb700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff82006c600 0x7ff82006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.634+0000 7ff837fff700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff82006c600 0x7ff82006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:34.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.634+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4166+0+0 (secure 0 0 0) 0x7ff828014070 con 0x7ff838101a90 2026-03-10T08:52:34.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.634+0000 7ff837fff700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff82006c600 0x7ff82006eac0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7ff82c0053b0 tx=0x7ff82c01e040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:34.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.635+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff824005320 con 0x7ff838101a90 2026-03-10T08:52:34.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.638+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff82805b490 con 0x7ff838101a90 2026-03-10T08:52:34.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.745+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 
0x7ff824005190 con 0x7ff838101a90 2026-03-10T08:52:34.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.746+0000 7ff835ffb700 1 -- 192.168.123.105:0/3423514210 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7ff82805b020 con 0x7ff838101a90 2026-03-10T08:52:34.746 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.748+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff82006c600 msgr2=0x7ff82006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.748+0000 7ff83ed60700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff82006c600 0x7ff82006eac0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7ff82c0053b0 tx=0x7ff82c01e040 comp rx=0 tx=0).stop 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 msgr2=0x7ff838196a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 0x7ff838196a20 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7ff82800eb10 tx=0x7ff82800ee20 comp rx=0 tx=0).stop 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 shutdown_connections 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 --2- 
192.168.123.105:0/3423514210 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff82006c600 0x7ff82006eac0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff838101a90 0x7ff838196a20 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 --2- 192.168.123.105:0/3423514210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff838196f60 0x7ff83819bfd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 >> 192.168.123.105:0/3423514210 conn(0x7ff8380fb3c0 msgr2=0x7ff8380fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 shutdown_connections 2026-03-10T08:52:34.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:34.749+0000 7ff83ed60700 1 -- 192.168.123.105:0/3423514210 wait complete. 
2026-03-10T08:52:34.814 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773132746,"num_in_osds":6,"osd_in_since":1773132744,"num_remapped_pgs":0} 2026-03-10T08:52:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:35 vm08 ceph-mon[57559]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T08:52:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:35 vm08 ceph-mon[57559]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T08:52:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:35 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:35 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:35 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/3423514210' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:35.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:35 vm05 ceph-mon[49713]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T08:52:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:35 vm05 ceph-mon[49713]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T08:52:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:35 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:35 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:35.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:35 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/3423514210' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:35.815 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd stat -f json 2026-03-10T08:52:35.958 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:36.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.192+0000 7fc089e3c700 1 -- 192.168.123.105:0/3685533320 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 msgr2=0x7fc084073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.192+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3685533320 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc084073c70 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7fc06c009b50 tx=0x7fc06c009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:36.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.192+0000 7fc0837fe700 1 --2- 192.168.123.105:0/3685533320 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc084074dc0 0x7fc084073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:52:36.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.193+0000 7fc089e3c700 1 -- 192.168.123.105:0/3685533320 shutdown_connections 2026-03-10T08:52:36.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.193+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3685533320 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc084073c70 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.193+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3685533320 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc084074dc0 0x7fc084073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.193+0000 7fc089e3c700 1 -- 192.168.123.105:0/3685533320 >> 192.168.123.105:0/3685533320 conn(0x7fc0840fc4d0 msgr2=0x7fc0840fe930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:36.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 -- 192.168.123.105:0/3685533320 shutdown_connections 2026-03-10T08:52:36.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 -- 192.168.123.105:0/3685533320 wait complete. 
2026-03-10T08:52:36.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 Processor -- start 2026-03-10T08:52:36.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 -- start start 2026-03-10T08:52:36.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc08419ce60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:36.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc084074dc0 0x7fc08419d3a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:36.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc08419d930 con 0x7fc0840737f0 2026-03-10T08:52:36.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc089e3c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc08419da70 con 0x7fc084074dc0 2026-03-10T08:52:36.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc0837fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc08419ce60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:36.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc0837fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc08419ce60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:49822/0 (socket says 192.168.123.105:49822) 2026-03-10T08:52:36.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.194+0000 7fc0837fe700 1 -- 192.168.123.105:0/3643655290 learned_addr learned my addr 192.168.123.105:0/3643655290 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:36.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc0837fe700 1 -- 192.168.123.105:0/3643655290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc084074dc0 msgr2=0x7fc08419d3a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc0837fe700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc084074dc0 0x7fc08419d3a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc0837fe700 1 -- 192.168.123.105:0/3643655290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc06c0097e0 con 0x7fc0840737f0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc0837fe700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc08419ce60 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7fc07400c960 tx=0x7fc07400cd20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc080ff9700 1 -- 192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc074007a50 con 0x7fc0840737f0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc080ff9700 1 -- 
192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc074007bb0 con 0x7fc0840737f0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc080ff9700 1 -- 192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0740186a0 con 0x7fc0840737f0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0841a2530 con 0x7fc0840737f0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.195+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0841a2a50 con 0x7fc0840737f0 2026-03-10T08:52:36.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.197+0000 7fc080ff9700 1 -- 192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc07401f030 con 0x7fc0840737f0 2026-03-10T08:52:36.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.197+0000 7fc080ff9700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc07006c600 0x7fc07006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:36.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.198+0000 7fc082ffd700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc07006c600 0x7fc07006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:36.198 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.198+0000 7fc080ff9700 1 -- 192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc07408b4c0 con 0x7fc0840737f0 2026-03-10T08:52:36.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.198+0000 7fc082ffd700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc07006c600 0x7fc07006eac0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fc06c009b20 tx=0x7fc06c005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:36.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.198+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc08404ea90 con 0x7fc0840737f0 2026-03-10T08:52:36.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.201+0000 7fc080ff9700 1 -- 192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc074059b90 con 0x7fc0840737f0 2026-03-10T08:52:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.311+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fc084066e80 con 0x7fc0840737f0 2026-03-10T08:52:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.312+0000 7fc080ff9700 1 -- 192.168.123.105:0/3643655290 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v33) v1 ==== 74+0+130 (secure 0 0 0) 0x7fc074059720 con 0x7fc0840737f0 2026-03-10T08:52:36.312 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:36.314 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.314+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc07006c600 msgr2=0x7fc07006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.314+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc07006c600 0x7fc07006eac0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fc06c009b20 tx=0x7fc06c005bc0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.314+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 msgr2=0x7fc08419ce60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.314+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc08419ce60 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7fc07400c960 tx=0x7fc07400cd20 comp rx=0 tx=0).stop 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.314+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 shutdown_connections 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.315+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc07006c600 0x7fc07006eac0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.315+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc0840737f0 0x7fc08419ce60 unknown :-1 s=CLOSED 
pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.315+0000 7fc089e3c700 1 --2- 192.168.123.105:0/3643655290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc084074dc0 0x7fc08419d3a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.315+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 >> 192.168.123.105:0/3643655290 conn(0x7fc0840fc4d0 msgr2=0x7fc0841027d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.315+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 shutdown_connections 2026-03-10T08:52:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.315+0000 7fc089e3c700 1 -- 192.168.123.105:0/3643655290 wait complete. 2026-03-10T08:52:36.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: purged_snaps scrub starts 2026-03-10T08:52:36.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: purged_snaps scrub ok 2026-03-10T08:52:36.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:36.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: osd.5 [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] boot 2026-03-10T08:52:36.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: osdmap e32: 6 total, 6 up, 6 in 2026-03-10T08:52:36.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:36.346 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:36 vm05 ceph-mon[49713]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:36.380 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"num_osds":6,"num_up_osds":6,"osd_up_since":1773132755,"num_in_osds":6,"osd_in_since":1773132744,"num_remapped_pgs":0} 2026-03-10T08:52:36.380 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd dump --format=json 2026-03-10T08:52:36.522 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: purged_snaps scrub starts 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: purged_snaps scrub ok 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: osd.5 [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] boot 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: osdmap e32: 6 total, 6 up, 6 in 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:52:36.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:36 vm08 ceph-mon[57559]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T08:52:36.770 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.769+0000 7f11033d9700 1 -- 192.168.123.105:0/3104627993 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc1033c0 msgr2=0x7f10fc1037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.769+0000 7f11033d9700 1 --2- 192.168.123.105:0/3104627993 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc1033c0 0x7f10fc1037a0 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f10ec009b00 tx=0x7f10ec009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.769+0000 7f11033d9700 1 -- 192.168.123.105:0/3104627993 shutdown_connections 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.769+0000 7f11033d9700 1 --2- 192.168.123.105:0/3104627993 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc103d70 0x7f10fc107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.769+0000 7f11033d9700 1 --2- 192.168.123.105:0/3104627993 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc1033c0 0x7f10fc1037a0 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.769+0000 7f11033d9700 1 -- 192.168.123.105:0/3104627993 >> 192.168.123.105:0/3104627993 conn(0x7f10fc0fec30 msgr2=0x7f10fc101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.770+0000 7f11033d9700 1 -- 192.168.123.105:0/3104627993 shutdown_connections 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.770+0000 7f11033d9700 1 -- 192.168.123.105:0/3104627993 wait complete. 
2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.770+0000 7f11033d9700 1 Processor -- start 2026-03-10T08:52:36.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.770+0000 7f11033d9700 1 -- start start 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f11033d9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc1033c0 0x7f10fc198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f11033d9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc103d70 0x7f10fc199440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f11033d9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10fc199b20 con 0x7f10fc103d70 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f11033d9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10fc19d8b0 con 0x7f10fc1033c0 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1101175700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc1033c0 0x7f10fc198f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1101175700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc1033c0 0x7f10fc198f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:56624/0 (socket says 192.168.123.105:56624) 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1101175700 1 -- 192.168.123.105:0/1839962289 learned_addr learned my addr 192.168.123.105:0/1839962289 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1100974700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc103d70 0x7f10fc199440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1101175700 1 -- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc103d70 msgr2=0x7f10fc199440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1101175700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc103d70 0x7f10fc199440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1101175700 1 -- 192.168.123.105:0/1839962289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10ec0097e0 con 0x7f10fc1033c0 2026-03-10T08:52:36.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.771+0000 7f1100974700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc103d70 0x7f10fc199440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T08:52:36.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.772+0000 7f1101175700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc1033c0 0x7f10fc198f00 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f10ec0094d0 tx=0x7f10ec0049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:36.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.772+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10ec01d070 con 0x7f10fc1033c0 2026-03-10T08:52:36.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.772+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10fc19db30 con 0x7f10fc1033c0 2026-03-10T08:52:36.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.772+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f10ec00bc50 con 0x7f10fc1033c0 2026-03-10T08:52:36.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.772+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10fc19e020 con 0x7f10fc1033c0 2026-03-10T08:52:36.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.773+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10ec00f780 con 0x7f10fc1033c0 2026-03-10T08:52:36.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.774+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f10ec00f9a0 con 
0x7f10fc1033c0 2026-03-10T08:52:36.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.774+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f10fc04ea90 con 0x7f10fc1033c0 2026-03-10T08:52:36.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.774+0000 7f10f27fc700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10e806c2e0 0x7f10e806e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:36.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.774+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f10ec08cab0 con 0x7f10fc1033c0 2026-03-10T08:52:36.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.774+0000 7f1100974700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10e806c2e0 0x7f10e806e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:36.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.775+0000 7f1100974700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10e806c2e0 0x7f10e806e7a0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f10fc19a520 tx=0x7f10f8008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:36.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.777+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f10ec058950 con 
0x7f10fc1033c0 2026-03-10T08:52:36.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.881+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f10fc19a2b0 con 0x7f10fc1033c0 2026-03-10T08:52:36.882 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.882+0000 7f10f27fc700 1 -- 192.168.123.105:0/1839962289 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11260 (secure 0 0 0) 0x7f10ec027090 con 0x7f10fc1033c0 2026-03-10T08:52:36.883 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:36.883 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","created":"2026-03-10T08:50:10.961037+0000","modified":"2026-03-10T08:52:36.235765+0000","last_up_change":"2026-03-10T08:52:35.229629+0000","last_in_change":"2026-03-10T08:52:24.382935+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T08:52:09.422580+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"
last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"0e25ea50-b19b-4e07-85f6-5d48c19d3a4f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6803","nonce":2315796084}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6805","nonce":2315796084}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6809","nonce":2315796084}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:680
6","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6807","nonce":2315796084}]},"public_addr":"192.168.123.105:6803/2315796084","cluster_addr":"192.168.123.105:6805/2315796084","heartbeat_back_addr":"192.168.123.105:6809/2315796084","heartbeat_front_addr":"192.168.123.105:6807/2315796084","state":["exists","up"]},{"osd":1,"uuid":"65d8a731-173e-4188-b03d-f0602d504870","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6811","nonce":305535590}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6813","nonce":305535590}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6817","nonce":305535590}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6815","nonce":305535590}]},"public_addr":"192.168.123.105:6811/305535590","cluster_addr":"192.168.123.105:6813/305535590","heartbeat_back_addr":"192.168.123.105:6817/305535590","heartbeat_front_addr":"192.168.123.105:6815/305535590","state":["exists","up"]},{"osd":2,"uuid":"3a3adfaf-6208-4836-b16d-7bbb2065933b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6819","nonce":3827519003}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6821","nonce":3827519003}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3827519003},{"type":"v1","addr":"192.1
68.123.105:6825","nonce":3827519003}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6823","nonce":3827519003}]},"public_addr":"192.168.123.105:6819/3827519003","cluster_addr":"192.168.123.105:6821/3827519003","heartbeat_back_addr":"192.168.123.105:6825/3827519003","heartbeat_front_addr":"192.168.123.105:6823/3827519003","state":["exists","up"]},{"osd":3,"uuid":"3e86065a-202f-4640-9f03-2490c913e09b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6801","nonce":1634067319}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6803","nonce":1634067319}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6807","nonce":1634067319}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6805","nonce":1634067319}]},"public_addr":"192.168.123.108:6801/1634067319","cluster_addr":"192.168.123.108:6803/1634067319","heartbeat_back_addr":"192.168.123.108:6807/1634067319","heartbeat_front_addr":"192.168.123.108:6805/1634067319","state":["exists","up"]},{"osd":4,"uuid":"2499eecc-b6be-48ac-ba73-53ff8a0686a4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6809","nonce":4181429086}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6811","nonce":4181429086}
]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6815","nonce":4181429086}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6813","nonce":4181429086}]},"public_addr":"192.168.123.108:6809/4181429086","cluster_addr":"192.168.123.108:6811/4181429086","heartbeat_back_addr":"192.168.123.108:6815/4181429086","heartbeat_front_addr":"192.168.123.108:6813/4181429086","state":["exists","up"]},{"osd":5,"uuid":"52d5763a-8095-46f7-9dd8-2d20a4d53ab7","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6817","nonce":236163747}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6819","nonce":236163747}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6823","nonce":236163747}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6821","nonce":236163747}]},"public_addr":"192.168.123.108:6817/236163747","cluster_addr":"192.168.123.108:6819/236163747","heartbeat_back_addr":"192.168.123.108:6823/236163747","heartbeat_front_addr":"192.168.123.108:6821/236163747","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:51:47.881910+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:51:57.340585+
0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:07.088267+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:15.174441+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:24.943469+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:33.433439+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/2745800648":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/840671361":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/302360737":"2026-03-11T08:50:37.635038+0000","192.168.123.105:0/2061557555":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/3374278469":"2026-03-11T08:50:37.635038+0000","192.168.123.105:6801/2":"2026-03-11T08:50:23.844460+0000","192.168.123.105:6800/2":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3434295633":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3706700996":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3674537463":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/1368648469":"2026-03-11T08:50:37.635038+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":
0,"stretch_mode_bucket":0}} 2026-03-10T08:52:36.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.885+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10e806c2e0 msgr2=0x7f10e806e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.885+0000 7f11033d9700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10e806c2e0 0x7f10e806e7a0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f10fc19a520 tx=0x7f10f8008040 comp rx=0 tx=0).stop 2026-03-10T08:52:36.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.885+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc1033c0 msgr2=0x7f10fc198f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:36.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.885+0000 7f11033d9700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10fc1033c0 0x7f10fc198f00 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f10ec0094d0 tx=0x7f10ec0049e0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.885+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 shutdown_connections 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.886+0000 7f11033d9700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f10e806c2e0 0x7f10e806e7a0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.886+0000 7f11033d9700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f10fc1033c0 0x7f10fc198f00 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.886+0000 7f11033d9700 1 --2- 192.168.123.105:0/1839962289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10fc103d70 0x7f10fc199440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.886+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 >> 192.168.123.105:0/1839962289 conn(0x7f10fc0fec30 msgr2=0x7f10fc100270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.886+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 shutdown_connections 2026-03-10T08:52:36.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:36.886+0000 7f11033d9700 1 -- 192.168.123.105:0/1839962289 wait complete. 
2026-03-10T08:52:36.953 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-10T08:52:09.422580+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '20', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-10T08:52:36.954 
DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd pool get .mgr pg_num 2026-03-10T08:52:37.109 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.372+0000 7f1719426700 1 -- 192.168.123.105:0/1659725844 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 msgr2=0x7f1714111850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.372+0000 7f1719426700 1 --2- 192.168.123.105:0/1659725844 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f1714111850 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f1704009b00 tx=0x7f1704009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 -- 192.168.123.105:0/1659725844 shutdown_connections 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 --2- 192.168.123.105:0/1659725844 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f1714111850 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 --2- 192.168.123.105:0/1659725844 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1714073130 0x7f1714073510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 -- 192.168.123.105:0/1659725844 >> 192.168.123.105:0/1659725844 conn(0x7f17140fc870 msgr2=0x7f17140fec90 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 -- 192.168.123.105:0/1659725844 shutdown_connections 2026-03-10T08:52:37.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 -- 192.168.123.105:0/1659725844 wait complete. 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.373+0000 7f1719426700 1 Processor -- start 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f1719426700 1 -- start start 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f1719426700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1714073130 0x7f171419d000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f1719426700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f171419d540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f1719426700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f171419dc20 con 0x7f1714073a50 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f1719426700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17141a19b0 con 0x7f1714073130 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f17127fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f171419d540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f17127fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f171419d540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:49874/0 (socket says 192.168.123.105:49874) 2026-03-10T08:52:37.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f17127fc700 1 -- 192.168.123.105:0/3816000324 learned_addr learned my addr 192.168.123.105:0/3816000324 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:37.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f1712ffd700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1714073130 0x7f171419d000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:37.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f17127fc700 1 -- 192.168.123.105:0/3816000324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1714073130 msgr2=0x7f171419d000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:37.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f17127fc700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1714073130 0x7f171419d000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.374+0000 7f17127fc700 1 -- 192.168.123.105:0/3816000324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17040097e0 con 0x7f1714073a50 2026-03-10T08:52:37.375 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.375+0000 7f17127fc700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f171419d540 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f1704009ad0 tx=0x7f17040052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:37.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.375+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f170401d070 con 0x7f1714073a50 2026-03-10T08:52:37.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.375+0000 7f1719426700 1 -- 192.168.123.105:0/3816000324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17141a1c90 con 0x7f1714073a50 2026-03-10T08:52:37.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.375+0000 7f1719426700 1 -- 192.168.123.105:0/3816000324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17141a21e0 con 0x7f1714073a50 2026-03-10T08:52:37.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.375+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f170400bc50 con 0x7f1714073a50 2026-03-10T08:52:37.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.375+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f170400f790 con 0x7f1714073a50 2026-03-10T08:52:37.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.377+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1704022470 con 0x7f1714073a50 
2026-03-10T08:52:37.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.377+0000 7f170bfff700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f170006c530 0x7f170006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:37.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.377+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f170408cc50 con 0x7f1714073a50 2026-03-10T08:52:37.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.378+0000 7f1712ffd700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f170006c530 0x7f170006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:37.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.377+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f171404f350 con 0x7f1714073a50 2026-03-10T08:52:37.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.378+0000 7f1712ffd700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f170006c530 0x7f170006e9f0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f16fc005fd0 tx=0x7f16fc00a560 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:37.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.381+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f170408ce60 con 0x7f1714073a50 
2026-03-10T08:52:37.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:37 vm05 ceph-mon[49713]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T08:52:37.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:37 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/3643655290' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:37.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:37 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1839962289' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T08:52:37.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.487+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7f171404ea90 con 0x7f1714073a50 2026-03-10T08:52:37.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.488+0000 7f170bfff700 1 -- 192.168.123.105:0/3816000324 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v33) v1 ==== 93+0+10 (secure 0 0 0) 0x7f1704027030 con 0x7f1714073a50 2026-03-10T08:52:37.488 INFO:teuthology.orchestra.run.vm05.stdout:pg_num: 1 2026-03-10T08:52:37.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.490+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f170006c530 msgr2=0x7f170006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:37.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.490+0000 7f1709ffb700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f170006c530 0x7f170006e9f0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f16fc005fd0 tx=0x7f16fc00a560 comp rx=0 tx=0).stop 2026-03-10T08:52:37.490 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.490+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 msgr2=0x7f171419d540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:37.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.490+0000 7f1709ffb700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f171419d540 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f1704009ad0 tx=0x7f17040052e0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 shutdown_connections 2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f170006c530 0x7f170006e9f0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1714073130 0x7f171419d000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 --2- 192.168.123.105:0/3816000324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1714073a50 0x7f171419d540 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 >> 192.168.123.105:0/3816000324 conn(0x7f17140fc870 msgr2=0x7f1714103360 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 shutdown_connections 2026-03-10T08:52:37.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.491+0000 7f1709ffb700 1 -- 192.168.123.105:0/3816000324 wait complete. 2026-03-10T08:52:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:37 vm08 ceph-mon[57559]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T08:52:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:37 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/3643655290' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T08:52:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:37 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/1839962289' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T08:52:37.555 INFO:tasks.cephadm:Setting up client nodes... 2026-03-10T08:52:37.555 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T08:52:37.699 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.922+0000 7f100f89b700 1 -- 192.168.123.105:0/2344097119 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008104350 msgr2=0x7f10081047b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.922+0000 7f100f89b700 1 --2- 192.168.123.105:0/2344097119 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008104350 0x7f10081047b0 
secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f1004009b50 tx=0x7f1004009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.922+0000 7f100f89b700 1 -- 192.168.123.105:0/2344097119 shutdown_connections 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.922+0000 7f100f89b700 1 --2- 192.168.123.105:0/2344097119 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008104350 0x7f10081047b0 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.922+0000 7f100f89b700 1 --2- 192.168.123.105:0/2344097119 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008103150 0x7f1008103570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.922+0000 7f100f89b700 1 -- 192.168.123.105:0/2344097119 >> 192.168.123.105:0/2344097119 conn(0x7f10080fe6d0 msgr2=0x7f1008100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 -- 192.168.123.105:0/2344097119 shutdown_connections 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 -- 192.168.123.105:0/2344097119 wait complete. 
2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 Processor -- start 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 -- start start 2026-03-10T08:52:37.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 0x7f1008198a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008104350 0x7f1008198f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10081995b0 con 0x7f1008103150 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.923+0000 7f100f89b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10081996f0 con 0x7f1008104350 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 0x7f1008198a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100ce36700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008104350 0x7f1008198f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 0x7f1008198a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:49898/0 (socket says 192.168.123.105:49898) 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 -- 192.168.123.105:0/2893161916 learned_addr learned my addr 192.168.123.105:0/2893161916 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 -- 192.168.123.105:0/2893161916 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008104350 msgr2=0x7f1008198f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008104350 0x7f1008198f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 -- 192.168.123.105:0/2893161916 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10040097e0 con 0x7f1008103150 2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100ce36700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008104350 0x7f1008198f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T08:52:37.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.924+0000 7f100d637700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 0x7f1008198a50 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f0ff800d900 tx=0x7f0ff800dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:37.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.925+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ff80041d0 con 0x7f1008103150 2026-03-10T08:52:37.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.925+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ff8004330 con 0x7f1008103150 2026-03-10T08:52:37.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.925+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ff8003d70 con 0x7f1008103150 2026-03-10T08:52:37.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.925+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f100819e1a0 con 0x7f1008103150 2026-03-10T08:52:37.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.925+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f100819e6f0 con 0x7f1008103150 2026-03-10T08:52:37.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.926+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0ff8009730 con 
0x7f1008103150 2026-03-10T08:52:37.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.926+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1008066e80 con 0x7f1008103150 2026-03-10T08:52:37.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.927+0000 7f0ffe7fc700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ff406c490 0x7f0ff406e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:37.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.927+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f0ff8021030 con 0x7f1008103150 2026-03-10T08:52:37.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.927+0000 7f100ce36700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ff406c490 0x7f0ff406e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:37.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.927+0000 7f100ce36700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ff406c490 0x7f0ff406e950 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1004006010 tx=0x7f1004005a90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:37.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:37.930+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0ff80be0e0 con 
0x7f1008103150 2026-03-10T08:52:38.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.076+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f100819e9d0 con 0x7f1008103150 2026-03-10T08:52:38.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.080+0000 7f0ffe7fc700 1 -- 192.168.123.105:0/2893161916 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7f0ff8059800 con 0x7f1008103150 2026-03-10T08:52:38.080 INFO:teuthology.orchestra.run.vm05.stdout:[client.0] 2026-03-10T08:52:38.080 INFO:teuthology.orchestra.run.vm05.stdout: key = AQDW269pTL2mBBAAxzQ7WhWl7P2VFLaY/FCbKA== 2026-03-10T08:52:38.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.082+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ff406c490 msgr2=0x7f0ff406e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.082+0000 7f100f89b700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ff406c490 0x7f0ff406e950 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1004006010 tx=0x7f1004005a90 comp rx=0 tx=0).stop 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.082+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 msgr2=0x7f1008198a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:38.083 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.082+0000 7f100f89b700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 0x7f1008198a50 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f0ff800d900 tx=0x7f0ff800dc10 comp rx=0 tx=0).stop 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 shutdown_connections 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ff406c490 0x7f0ff406e950 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1008103150 0x7f1008198a50 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 --2- 192.168.123.105:0/2893161916 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1008104350 0x7f1008198f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 >> 192.168.123.105:0/2893161916 conn(0x7f10080fe6d0 msgr2=0x7f1008107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 shutdown_connections 2026-03-10T08:52:38.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:38.083+0000 7f100f89b700 1 -- 192.168.123.105:0/2893161916 
wait complete. 2026-03-10T08:52:38.144 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:52:38.144 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-10T08:52:38.144 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-10T08:52:38.181 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T08:52:38.322 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm08/config 2026-03-10T08:52:38.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:38 vm05 ceph-mon[49713]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:38.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:38 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/3816000324' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T08:52:38.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:38 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2893161916' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T08:52:38.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:38 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/2893161916' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T08:52:38.468 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:38 vm08 ceph-mon[57559]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:38.468 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:38 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/3816000324' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T08:52:38.468 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:38 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2893161916' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T08:52:38.468 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:38 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/2893161916' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T08:52:38.562 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.561+0000 7fc291598700 1 -- 192.168.123.108:0/1753301138 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c104380 msgr2=0x7fc28c1047e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.561+0000 7fc291598700 1 --2- 192.168.123.108:0/1753301138 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c104380 0x7fc28c1047e0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fc27c009b00 tx=0x7fc27c009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 7fc291598700 1 -- 192.168.123.108:0/1753301138 shutdown_connections 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 7fc291598700 1 --2- 192.168.123.108:0/1753301138 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c104380 0x7fc28c1047e0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 7fc291598700 1 --2- 192.168.123.108:0/1753301138 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c103180 0x7fc28c1035a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 7fc291598700 1 -- 192.168.123.108:0/1753301138 >> 192.168.123.108:0/1753301138 conn(0x7fc28c0fe720 msgr2=0x7fc28c100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 
7fc291598700 1 -- 192.168.123.108:0/1753301138 shutdown_connections 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 7fc291598700 1 -- 192.168.123.108:0/1753301138 wait complete. 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.562+0000 7fc291598700 1 Processor -- start 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc291598700 1 -- start start 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc291598700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c103180 0x7fc28c198a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc291598700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 0x7fc28c198f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:38.563 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc291598700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc28c199580 con 0x7fc28c104380 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc291598700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc28c1996c0 con 0x7fc28c103180 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28a7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 0x7fc28c198f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28a7fc700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 0x7fc28c198f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:57220/0 (socket says 192.168.123.108:57220) 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28a7fc700 1 -- 192.168.123.108:0/1610132735 learned_addr learned my addr 192.168.123.108:0/1610132735 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28a7fc700 1 -- 192.168.123.108:0/1610132735 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c103180 msgr2=0x7fc28c198a20 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28affd700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c103180 0x7fc28c198a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28a7fc700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c103180 0x7fc28c198a20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.563+0000 7fc28a7fc700 1 -- 192.168.123.108:0/1610132735 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc27c0097e0 con 0x7fc28c104380 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc28affd700 1 --2- 192.168.123.108:0/1610132735 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c103180 0x7fc28c198a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc28a7fc700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 0x7fc28c198f60 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fc27c00b5c0 tx=0x7fc27c004a40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc27c01d070 con 0x7fc28c104380 2026-03-10T08:52:38.564 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc28c19e110 con 0x7fc28c104380 2026-03-10T08:52:38.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc27c00bc50 con 0x7fc28c104380 2026-03-10T08:52:38.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc27c00f740 con 0x7fc28c104380 2026-03-10T08:52:38.565 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.564+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc28c19e600 con 0x7fc28c104380 2026-03-10T08:52:38.566 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.565+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc28c066e80 con 0x7fc28c104380 2026-03-10T08:52:38.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.565+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc27c022a50 con 0x7fc28c104380 2026-03-10T08:52:38.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.565+0000 7fc283fff700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc27806c4e0 0x7fc27806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:38.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.566+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc27c08ce80 con 0x7fc28c104380 2026-03-10T08:52:38.566 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.566+0000 7fc28affd700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc27806c4e0 0x7fc27806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:38.567 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.566+0000 7fc28affd700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc27806c4e0 0x7fc27806e9a0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fc274005fd0 tx=0x7fc274005f00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:38.569 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.568+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc27c05b5f0 con 0x7fc28c104380 2026-03-10T08:52:38.714 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.713+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fc28c19e940 con 0x7fc28c104380 2026-03-10T08:52:38.718 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.717+0000 7fc283fff700 1 -- 192.168.123.108:0/1610132735 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7fc27c027020 con 0x7fc28c104380 2026-03-10T08:52:38.719 INFO:teuthology.orchestra.run.vm08.stdout:[client.1] 2026-03-10T08:52:38.719 INFO:teuthology.orchestra.run.vm08.stdout: key = AQDW269pwE6dKhAAHIPYpx8IgscuNYD2zzGqMA== 2026-03-10T08:52:38.721 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.720+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc27806c4e0 msgr2=0x7fc27806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:38.721 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.720+0000 7fc291598700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc27806c4e0 0x7fc27806e9a0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fc274005fd0 tx=0x7fc274005f00 comp rx=0 tx=0).stop 2026-03-10T08:52:38.721 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 msgr2=0x7fc28c198f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:38.721 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 0x7fc28c198f60 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fc27c00b5c0 tx=0x7fc27c004a40 comp rx=0 tx=0).stop 2026-03-10T08:52:38.721 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 shutdown_connections 2026-03-10T08:52:38.722 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc27806c4e0 0x7fc27806e9a0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.722 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc28c103180 0x7fc28c198a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.722 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 --2- 192.168.123.108:0/1610132735 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc28c104380 0x7fc28c198f60 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:38.722 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 >> 192.168.123.108:0/1610132735 conn(0x7fc28c0fe720 msgr2=0x7fc28c1075b0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:52:38.722 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 shutdown_connections 2026-03-10T08:52:38.722 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:52:38.721+0000 7fc291598700 1 -- 192.168.123.108:0/1610132735 wait complete. 2026-03-10T08:52:38.767 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T08:52:38.767 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-10T08:52:38.767 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-10T08:52:38.802 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-10T08:52:38.802 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-10T08:52:38.802 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mgr dump --format=json 2026-03-10T08:52:38.945 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.180+0000 7fe66eaba700 1 -- 192.168.123.105:0/3479761984 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 msgr2=0x7fe6681049e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.180+0000 7fe66eaba700 1 --2- 192.168.123.105:0/3479761984 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe6681049e0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fe65c009b00 tx=0x7fe65c009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.180+0000 7fe66eaba700 1 -- 192.168.123.105:0/3479761984 shutdown_connections 2026-03-10T08:52:39.181 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.180+0000 7fe66eaba700 1 --2- 192.168.123.105:0/3479761984 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe6681049e0 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.180+0000 7fe66eaba700 1 --2- 192.168.123.105:0/3479761984 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 0x7fe6681003c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.180+0000 7fe66eaba700 1 -- 192.168.123.105:0/3479761984 >> 192.168.123.105:0/3479761984 conn(0x7fe6680fb830 msgr2=0x7fe6680fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.181+0000 7fe66eaba700 1 -- 192.168.123.105:0/3479761984 shutdown_connections 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.181+0000 7fe66eaba700 1 -- 192.168.123.105:0/3479761984 wait complete. 
2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.181+0000 7fe66eaba700 1 Processor -- start 2026-03-10T08:52:39.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.181+0000 7fe66eaba700 1 -- start start 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe66eaba700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 0x7fe668198e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe66eaba700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe668199350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe66eaba700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe668199a30 con 0x7fe668100990 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe66eaba700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe66819d7c0 con 0x7fe6680fffe0 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe667fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe668199350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe667fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe668199350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:49926/0 (socket says 192.168.123.105:49926) 2026-03-10T08:52:39.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe667fff700 1 -- 192.168.123.105:0/1729650109 learned_addr learned my addr 192.168.123.105:0/1729650109 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe66c856700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 0x7fe668198e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe667fff700 1 -- 192.168.123.105:0/1729650109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 msgr2=0x7fe668198e10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe667fff700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 0x7fe668198e10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe667fff700 1 -- 192.168.123.105:0/1729650109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe65c0097e0 con 0x7fe668100990 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.182+0000 7fe66c856700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 0x7fe668198e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.183+0000 7fe667fff700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe668199350 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7fe65c009fd0 tx=0x7fe65c0049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.183+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe65c01d070 con 0x7fe668100990 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.183+0000 7fe66eaba700 1 -- 192.168.123.105:0/1729650109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe66819da40 con 0x7fe668100990 2026-03-10T08:52:39.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.183+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe65c00bc50 con 0x7fe668100990 2026-03-10T08:52:39.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.183+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe65c00f790 con 0x7fe668100990 2026-03-10T08:52:39.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.183+0000 7fe66eaba700 1 -- 192.168.123.105:0/1729650109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe66819df30 con 0x7fe668100990 2026-03-10T08:52:39.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.184+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fe6480052f0 con 0x7fe668100990 2026-03-10T08:52:39.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.187+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe65c022470 con 0x7fe668100990 2026-03-10T08:52:39.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.187+0000 7fe665ffb700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe65806c4e0 0x7fe65806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:39.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.187+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe65c08d630 con 0x7fe668100990 2026-03-10T08:52:39.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.188+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe65c08da10 con 0x7fe668100990 2026-03-10T08:52:39.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.188+0000 7fe66c856700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe65806c4e0 0x7fe65806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:39.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.188+0000 7fe66c856700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe65806c4e0 0x7fe65806e9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe654007950 tx=0x7fe654008040 comp rx=0 tx=0).ready entity=mgr.14223 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:39.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.321+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7fe648005160 con 0x7fe668100990 2026-03-10T08:52:39.324 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:39 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/1610132735' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T08:52:39.324 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:39 vm05 ceph-mon[49713]: from='client.? 192.168.123.108:0/1610132735' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T08:52:39.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.324+0000 7fe665ffb700 1 -- 192.168.123.105:0/1729650109 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v18) v1 ==== 74+0+173029 (secure 0 0 0) 0x7fe65c027090 con 0x7fe668100990 2026-03-10T08:52:39.326 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:39.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.330+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe65806c4e0 msgr2=0x7fe65806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.330+0000 7fe6537fe700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe65806c4e0 0x7fe65806e9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe654007950 
tx=0x7fe654008040 comp rx=0 tx=0).stop 2026-03-10T08:52:39.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.330+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 msgr2=0x7fe668199350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.330+0000 7fe6537fe700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe668199350 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7fe65c009fd0 tx=0x7fe65c0049e0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 shutdown_connections 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe65806c4e0 0x7fe65806e9a0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6680fffe0 0x7fe668198e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 --2- 192.168.123.105:0/1729650109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe668100990 0x7fe668199350 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 >> 192.168.123.105:0/1729650109 conn(0x7fe6680fb830 
msgr2=0x7fe6680fdc30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 shutdown_connections 2026-03-10T08:52:39.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.331+0000 7fe6537fe700 1 -- 192.168.123.105:0/1729650109 wait complete. 2026-03-10T08:52:39.376 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":18,"active_gid":14223,"active_name":"vm05.rxwgjc","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":2},{"type":"v1","addr":"192.168.123.105:6801","nonce":2}]},"active_addr":"192.168.123.105:6801/2","active_change":"2026-03-10T08:51:15.172556+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14250,"name":"vm08.rpongu","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health 
status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP 
port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD 
utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed
":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in 
Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:0.0.2","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are 
removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. 
Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],
"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_
value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"
GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type"
:"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allo
wed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"def
ault_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str
","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_a
llowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP 
server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. 
Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the 
tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:0.0.2","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with 
`--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). 
Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"defa
ult_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"def
ault_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str
","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_a
llowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.105:8443/","prometheus":"http://192.168.123.105:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":3749819219}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":4149772464}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":1622646269}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":3102676859}]}]} 2026-03-10T08:52:39.377 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 2026-03-10T08:52:39.377 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-10T08:52:39.377 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd dump --format=json 2026-03-10T08:52:39.514 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:39.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:39 vm08 ceph-mon[57559]: from='client.? 
192.168.123.108:0/1610132735' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T08:52:39.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:39 vm08 ceph-mon[57559]: from='client.? 192.168.123.108:0/1610132735' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.735+0000 7f2880387700 1 -- 192.168.123.105:0/3720962062 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 msgr2=0x7f2878107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.735+0000 7f2880387700 1 --2- 192.168.123.105:0/3720962062 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878107d40 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f2874009b50 tx=0x7f2874009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.735+0000 7f2880387700 1 -- 192.168.123.105:0/3720962062 shutdown_connections 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.735+0000 7f2880387700 1 --2- 192.168.123.105:0/3720962062 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878107d40 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.735+0000 7f2880387700 1 --2- 192.168.123.105:0/3720962062 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2878103340 0x7f2878103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.736 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.735+0000 7f2880387700 1 -- 192.168.123.105:0/3720962062 >> 192.168.123.105:0/3720962062 conn(0x7f28780feb90 msgr2=0x7f2878100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 -- 192.168.123.105:0/3720962062 shutdown_connections 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 -- 192.168.123.105:0/3720962062 wait complete. 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 Processor -- start 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 -- start start 2026-03-10T08:52:39.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2878103340 0x7f2878198e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878199390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2878199a70 con 0x7f2878103cf0 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878199390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:39.737 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.736+0000 7f2880387700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f287819d800 con 0x7f2878103340 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878199390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35064/0 (socket says 192.168.123.105:35064) 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 -- 192.168.123.105:0/3529002933 learned_addr learned my addr 192.168.123.105:0/3529002933 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287e123700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2878103340 0x7f2878198e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 -- 192.168.123.105:0/3529002933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2878103340 msgr2=0x7f2878198e50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2878103340 0x7f2878198e50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 -- 
192.168.123.105:0/3529002933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f28740097e0 con 0x7f2878103cf0 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f287d922700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878199390 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f2874004cb0 tx=0x7f2874005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:39.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f287401d070 con 0x7f2878103cf0 2026-03-10T08:52:39.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.737+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f287819da80 con 0x7f2878103cf0 2026-03-10T08:52:39.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.738+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2874022470 con 0x7f2878103cf0 2026-03-10T08:52:39.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.738+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f287400f650 con 0x7f2878103cf0 2026-03-10T08:52:39.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.738+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f287819df70 con 0x7f2878103cf0 2026-03-10T08:52:39.742 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.739+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f287810b740 con 0x7f2878103cf0 2026-03-10T08:52:39.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.739+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2874022a80 con 0x7f2878103cf0 2026-03-10T08:52:39.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.739+0000 7f286f7fe700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f286406c490 0x7f286406e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:39.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.740+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f287405b770 con 0x7f2878103cf0 2026-03-10T08:52:39.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.742+0000 7f287e123700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f286406c490 0x7f286406e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:39.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.742+0000 7f287e123700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f286406c490 0x7f286406e950 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f2868005950 tx=0x7f28680058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:39.742 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.742+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2874092050 con 0x7f2878103cf0 2026-03-10T08:52:39.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.844+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f287804ea90 con 0x7f2878103cf0 2026-03-10T08:52:39.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.846+0000 7f286f7fe700 1 -- 192.168.123.105:0/3529002933 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11260 (secure 0 0 0) 0x7f2874027090 con 0x7f2878103cf0 2026-03-10T08:52:39.846 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:39.846 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","created":"2026-03-10T08:50:10.961037+0000","modified":"2026-03-10T08:52:36.235765+0000","last_up_change":"2026-03-10T08:52:35.229629+0000","last_in_change":"2026-03-10T08:52:24.382935+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T08:52:09.422580+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"0e25ea50-b19b-4e07-85f6-5d48c19d3a4f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6803","nonce":2315796084}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6805","nonce":2315796084}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6809","nonce":2315796084}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6807","nonce":2315796084}]},"public_addr":"192.168.123.105:6803/2315796084","cluster_addr":"192.168.123.105:6805/2315796084","heartbeat_back_addr":"192.168.123.105:6809/2315796084","heartbeat_front_addr":"192.168.123.105:6807/2315796084","state":["exists","up"]},{"osd":1,"uuid":"65d8a731-173e-4188-b03d-f0602d504870","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6811","nonce":305535590}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.12
3.105:6812","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6813","nonce":305535590}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6817","nonce":305535590}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6815","nonce":305535590}]},"public_addr":"192.168.123.105:6811/305535590","cluster_addr":"192.168.123.105:6813/305535590","heartbeat_back_addr":"192.168.123.105:6817/305535590","heartbeat_front_addr":"192.168.123.105:6815/305535590","state":["exists","up"]},{"osd":2,"uuid":"3a3adfaf-6208-4836-b16d-7bbb2065933b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6819","nonce":3827519003}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6821","nonce":3827519003}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6825","nonce":3827519003}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6823","nonce":3827519003}]},"public_addr":"192.168.123.105:6819/3827519003","cluster_addr":"192.168.123.105:6821/3827519003","heartbeat_back_addr":"192.168.123.105:6825/3827519003","heartbeat_front_addr":"192.168.123.105:6823/3827519003","state":["exists","up"]},{"osd":3,"uuid":"3e86065a-202f-4640-9f03-2490c913e09b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":1634067319},{"type"
:"v1","addr":"192.168.123.108:6801","nonce":1634067319}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6803","nonce":1634067319}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6807","nonce":1634067319}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6805","nonce":1634067319}]},"public_addr":"192.168.123.108:6801/1634067319","cluster_addr":"192.168.123.108:6803/1634067319","heartbeat_back_addr":"192.168.123.108:6807/1634067319","heartbeat_front_addr":"192.168.123.108:6805/1634067319","state":["exists","up"]},{"osd":4,"uuid":"2499eecc-b6be-48ac-ba73-53ff8a0686a4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6809","nonce":4181429086}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6811","nonce":4181429086}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6815","nonce":4181429086}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6813","nonce":4181429086}]},"public_addr":"192.168.123.108:6809/4181429086","cluster_addr":"192.168.123.108:6811/4181429086","heartbeat_back_addr":"192.168.123.108:6815/4181429086","heartbeat_front_addr":"192.168.123.108:6813/4181429086","state":["exists","up"]},{"osd":5,"uuid":"52d5763a-8095-46f7-9dd8-2d20a4d53ab7","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6817","nonce":236163747}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6819","nonce":236163747}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6823","nonce":236163747}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6821","nonce":236163747}]},"public_addr":"192.168.123.108:6817/236163747","cluster_addr":"192.168.123.108:6819/236163747","heartbeat_back_addr":"192.168.123.108:6823/236163747","heartbeat_front_addr":"192.168.123.108:6821/236163747","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:51:47.881910+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:51:57.340585+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:07.088267+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:15.174441+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:24.943469+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-10T08:52:33.433439+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/2745800648":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/840671361":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/302360737":"2026-03-11T08:50:37.635038+0000","192.168.123.105:0/2061557555":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/3374278469":"2026-03-11T08:50:37.635038+0000","192.168.123.105:6801/2":"2026-03-11T08:50:23.844460+0000","192.168.123.105:6800/2":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3434295633":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3706700996":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3674537463":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/1368648469":"2026-03-11T08:50:37.635038+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T08:52:39.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f286406c490 msgr2=0x7f286406e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f286406c490 0x7f286406e950 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f2868005950 tx=0x7f28680058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.848 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 msgr2=0x7f2878199390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878199390 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f2874004cb0 tx=0x7f2874005dc0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 shutdown_connections 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f286406c490 0x7f286406e950 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2878103340 0x7f2878198e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 --2- 192.168.123.105:0/3529002933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2878103cf0 0x7f2878199390 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.848+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 >> 192.168.123.105:0/3529002933 conn(0x7f28780feb90 msgr2=0x7f2878100fa0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.849+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 shutdown_connections 2026-03-10T08:52:39.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:39.849+0000 7f2880387700 1 -- 192.168.123.105:0/3529002933 wait complete. 2026-03-10T08:52:39.905 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-10T08:52:39.906 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd dump --format=json 2026-03-10T08:52:40.049 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.277+0000 7fa3e74f8700 1 -- 192.168.123.105:0/1703076336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0073500 msgr2=0x7fa3e0073960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.277+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/1703076336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0073500 0x7fa3e0073960 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fa3dc009b80 tx=0x7fa3dc009e90 comp rx=0 tx=0).stop 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 -- 192.168.123.105:0/1703076336 shutdown_connections 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/1703076336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0073500 0x7fa3e0073960 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 --2- 
192.168.123.105:0/1703076336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa3e0074dd0 0x7fa3e0072fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 -- 192.168.123.105:0/1703076336 >> 192.168.123.105:0/1703076336 conn(0x7fa3e0078ed0 msgr2=0x7fa3e00792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 -- 192.168.123.105:0/1703076336 shutdown_connections 2026-03-10T08:52:40.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 -- 192.168.123.105:0/1703076336 wait complete. 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.278+0000 7fa3e74f8700 1 Processor -- start 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e74f8700 1 -- start start 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e74f8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa3e0073500 0x7fa3e019d2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e74f8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 0x7fa3e019d7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e74f8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3e019dec0 con 0x7fa3e0074dd0 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e74f8700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3e01a1c50 con 0x7fa3e0073500 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e4a93700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 0x7fa3e019d7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:40.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e4a93700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 0x7fa3e019d7e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35090/0 (socket says 192.168.123.105:35090) 2026-03-10T08:52:40.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e4a93700 1 -- 192.168.123.105:0/936637577 learned_addr learned my addr 192.168.123.105:0/936637577 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:40.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e4a93700 1 -- 192.168.123.105:0/936637577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa3e0073500 msgr2=0x7fa3e019d2a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:40.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e4a93700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa3e0073500 0x7fa3e019d2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.279+0000 7fa3e4a93700 1 -- 192.168.123.105:0/936637577 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa3d0009710 con 
0x7fa3e0074dd0 2026-03-10T08:52:40.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.280+0000 7fa3e4a93700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 0x7fa3e019d7e0 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7fa3dc005950 tx=0x7fa3dc004e80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:40.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.280+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3dc01d070 con 0x7fa3e0074dd0 2026-03-10T08:52:40.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.280+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa3dc022470 con 0x7fa3e0074dd0 2026-03-10T08:52:40.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.280+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3dc00f670 con 0x7fa3e0074dd0 2026-03-10T08:52:40.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.280+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa3dc0097e0 con 0x7fa3e0074dd0 2026-03-10T08:52:40.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.280+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa3e01a2230 con 0x7fa3e0074dd0 2026-03-10T08:52:40.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.281+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fa3e004ea90 con 0x7fa3e0074dd0 2026-03-10T08:52:40.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.282+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fa3dc022ae0 con 0x7fa3e0074dd0 2026-03-10T08:52:40.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.282+0000 7fa3d67fc700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa3cc06c490 0x7fa3cc06e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:40.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.282+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa3dc08caf0 con 0x7fa3e0074dd0 2026-03-10T08:52:40.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.285+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa3dc057890 con 0x7fa3e0074dd0 2026-03-10T08:52:40.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.285+0000 7fa3e5294700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa3cc06c490 0x7fa3cc06e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:40.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.285+0000 7fa3e5294700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa3cc06c490 0x7fa3cc06e950 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fa3d0005d90 tx=0x7fa3d0005ce0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:40.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:40 vm05 ceph-mon[49713]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:40.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:40 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1729650109' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T08:52:40.384 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:40 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/3529002933' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T08:52:40.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.383+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fa3e0066e80 con 0x7fa3e0074dd0 2026-03-10T08:52:40.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.386+0000 7fa3d67fc700 1 -- 192.168.123.105:0/936637577 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11260 (secure 0 0 0) 0x7fa3dc027020 con 0x7fa3e0074dd0 2026-03-10T08:52:40.386 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:40.386 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"fsid":"16587ed2-1c5e-11f1-90f6-35051361a039","created":"2026-03-10T08:50:10.961037+0000","modified":"2026-03-10T08:52:36.235765+0000","last_up_change":"2026-03-10T08:52:35.229629+0000","last_in_change":"2026-03-10T08:52:24.382935+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T08:52:09.422580+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"0e25ea50-b19b-4e07-85f6-5d48c19d3a4f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6803","nonce":2315796084}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6805","nonce":2315796084}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6809","nonce":2315796084}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":2315796084},{"type":"v1","addr":"192.168.123.105:6807","nonce":2315796084}]},"public_addr":"192.168.123.105:6803/2315796084","cluster_addr":"192.168.123.105:6805/2315796084","heartbeat_back_addr":"192.168.123.105:6809/2315796084","heartbeat_front_addr":"192.168.123.105:6807/2315796084","state":["exists","up"]},{"osd":1,"uuid":"65d8a731-173e-4188-b03d-f0602d504870","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6811","nonce":305535590}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.12
3.105:6812","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6813","nonce":305535590}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6817","nonce":305535590}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":305535590},{"type":"v1","addr":"192.168.123.105:6815","nonce":305535590}]},"public_addr":"192.168.123.105:6811/305535590","cluster_addr":"192.168.123.105:6813/305535590","heartbeat_back_addr":"192.168.123.105:6817/305535590","heartbeat_front_addr":"192.168.123.105:6815/305535590","state":["exists","up"]},{"osd":2,"uuid":"3a3adfaf-6208-4836-b16d-7bbb2065933b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6819","nonce":3827519003}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6821","nonce":3827519003}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6825","nonce":3827519003}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":3827519003},{"type":"v1","addr":"192.168.123.105:6823","nonce":3827519003}]},"public_addr":"192.168.123.105:6819/3827519003","cluster_addr":"192.168.123.105:6821/3827519003","heartbeat_back_addr":"192.168.123.105:6825/3827519003","heartbeat_front_addr":"192.168.123.105:6823/3827519003","state":["exists","up"]},{"osd":3,"uuid":"3e86065a-202f-4640-9f03-2490c913e09b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":1634067319},{"type"
:"v1","addr":"192.168.123.108:6801","nonce":1634067319}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6803","nonce":1634067319}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6807","nonce":1634067319}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":1634067319},{"type":"v1","addr":"192.168.123.108:6805","nonce":1634067319}]},"public_addr":"192.168.123.108:6801/1634067319","cluster_addr":"192.168.123.108:6803/1634067319","heartbeat_back_addr":"192.168.123.108:6807/1634067319","heartbeat_front_addr":"192.168.123.108:6805/1634067319","state":["exists","up"]},{"osd":4,"uuid":"2499eecc-b6be-48ac-ba73-53ff8a0686a4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6809","nonce":4181429086}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6811","nonce":4181429086}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6815","nonce":4181429086}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":4181429086},{"type":"v1","addr":"192.168.123.108:6813","nonce":4181429086}]},"public_addr":"192.168.123.108:6809/4181429086","cluster_addr":"192.168.123.108:6811/4181429086","heartbeat_back_addr":"192.168.123.108:6815/4181429086","heartbeat_front_addr":"192.168.123.108:6813/4181429086","state":["exists","up"]},{"osd":5,"uuid":"52d5763a-8095-46f7-9dd8-2d20a4d53ab7","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6817","nonce":236163747}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6819","nonce":236163747}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6823","nonce":236163747}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":236163747},{"type":"v1","addr":"192.168.123.108:6821","nonce":236163747}]},"public_addr":"192.168.123.108:6817/236163747","cluster_addr":"192.168.123.108:6819/236163747","heartbeat_back_addr":"192.168.123.108:6823/236163747","heartbeat_front_addr":"192.168.123.108:6821/236163747","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:51:47.881910+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:51:57.340585+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:07.088267+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:15.174441+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T08:52:24.943469+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-10T08:52:33.433439+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/2745800648":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/840671361":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/302360737":"2026-03-11T08:50:37.635038+0000","192.168.123.105:0/2061557555":"2026-03-11T08:51:15.172314+0000","192.168.123.105:0/3374278469":"2026-03-11T08:50:37.635038+0000","192.168.123.105:6801/2":"2026-03-11T08:50:23.844460+0000","192.168.123.105:6800/2":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3434295633":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3706700996":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/3674537463":"2026-03-11T08:50:23.844460+0000","192.168.123.105:0/1368648469":"2026-03-11T08:50:37.635038+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa3cc06c490 msgr2=0x7fa3cc06e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa3cc06c490 0x7fa3cc06e950 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fa3d0005d90 tx=0x7fa3d0005ce0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.388 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 msgr2=0x7fa3e019d7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 0x7fa3e019d7e0 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7fa3dc005950 tx=0x7fa3dc004e80 comp rx=0 tx=0).stop 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 shutdown_connections 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa3cc06c490 0x7fa3cc06e950 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa3e0073500 0x7fa3e019d2a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 --2- 192.168.123.105:0/936637577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa3e0074dd0 0x7fa3e019d7e0 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 >> 192.168.123.105:0/936637577 conn(0x7fa3e0078ed0 msgr2=0x7fa3e010fa80 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:52:40.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 shutdown_connections 2026-03-10T08:52:40.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:40.388+0000 7fa3e74f8700 1 -- 192.168.123.105:0/936637577 wait complete. 2026-03-10T08:52:40.448 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph tell osd.0 flush_pg_stats 2026-03-10T08:52:40.448 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph tell osd.1 flush_pg_stats 2026-03-10T08:52:40.449 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph tell osd.2 flush_pg_stats 2026-03-10T08:52:40.449 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph tell osd.3 flush_pg_stats 2026-03-10T08:52:40.449 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph tell osd.4 flush_pg_stats 2026-03-10T08:52:40.449 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph tell osd.5 flush_pg_stats 2026-03-10T08:52:40.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:40 vm08 ceph-mon[57559]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:40.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:40 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/1729650109' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T08:52:40.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:40 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/3529002933' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T08:52:40.816 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:40.870 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:40.901 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:40.991 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:41.069 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:41.119 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:41.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:41 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/936637577' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T08:52:41.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.505+0000 7ffa123a1700 1 -- 192.168.123.105:0/783950904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10d0f0 msgr2=0x7ffa0c10d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.505+0000 7ffa123a1700 1 --2- 192.168.123.105:0/783950904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10d0f0 0x7ffa0c10d570 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7ff9fc009b00 tx=0x7ff9fc009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.509+0000 7ffa123a1700 1 -- 192.168.123.105:0/783950904 shutdown_connections 2026-03-10T08:52:41.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.509+0000 7ffa123a1700 1 --2- 192.168.123.105:0/783950904 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10d0f0 0x7ffa0c10d570 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.509+0000 7ffa123a1700 1 --2- 192.168.123.105:0/783950904 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa0c10f340 0x7ffa0c10f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.509+0000 7ffa123a1700 1 -- 192.168.123.105:0/783950904 >> 192.168.123.105:0/783950904 conn(0x7ffa0c06ce20 msgr2=0x7ffa0c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.510+0000 7ffa123a1700 1 -- 192.168.123.105:0/783950904 shutdown_connections 2026-03-10T08:52:41.512 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.510+0000 7ffa123a1700 1 -- 192.168.123.105:0/783950904 wait complete. 2026-03-10T08:52:41.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.512+0000 7ffa123a1700 1 Processor -- start 2026-03-10T08:52:41.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.512+0000 7ffa123a1700 1 -- start start 2026-03-10T08:52:41.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.512+0000 7ffa123a1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 0x7ffa0c11bf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.512+0000 7ffa123a1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa0c116ee0 0x7ffa0c117360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.512+0000 7ffa123a1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa0c117930 con 0x7ffa0c10f340 2026-03-10T08:52:41.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.512+0000 7ffa123a1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa0c117aa0 con 0x7ffa0c116ee0 2026-03-10T08:52:41.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.513+0000 7ffa1139f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 0x7ffa0c11bf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.513+0000 7ffa1139f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 0x7ffa0c11bf30 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35108/0 (socket says 192.168.123.105:35108) 2026-03-10T08:52:41.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.513+0000 7ffa1139f700 1 -- 192.168.123.105:0/1816673128 learned_addr learned my addr 192.168.123.105:0/1816673128 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:41.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.514+0000 7ffa10b9e700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa0c116ee0 0x7ffa0c117360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.514+0000 7ffa1139f700 1 -- 192.168.123.105:0/1816673128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa0c116ee0 msgr2=0x7ffa0c117360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.514+0000 7ffa1139f700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa0c116ee0 0x7ffa0c117360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.514+0000 7ffa1139f700 1 -- 192.168.123.105:0/1816673128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9fc0097e0 con 0x7ffa0c10f340 2026-03-10T08:52:41.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.515+0000 7ffa1139f700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 0x7ffa0c11bf30 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto 
rx=0x7ffa0800d8d0 tx=0x7ffa0800dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.515+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa08009940 con 0x7ffa0c10f340 2026-03-10T08:52:41.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.516+0000 7ffa123a1700 1 -- 192.168.123.105:0/1816673128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffa0c117d30 con 0x7ffa0c10f340 2026-03-10T08:52:41.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.516+0000 7ffa123a1700 1 -- 192.168.123.105:0/1816673128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffa0c1b8680 con 0x7ffa0c10f340 2026-03-10T08:52:41.518 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.517+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ffa08010460 con 0x7ffa0c10f340 2026-03-10T08:52:41.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.519+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa0800f5d0 con 0x7ffa0c10f340 2026-03-10T08:52:41.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.519+0000 7ffa123a1700 1 -- 192.168.123.105:0/1816673128 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7ff9f0000ff0 con 0x7ffa0c10f340 2026-03-10T08:52:41.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.519+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ffa0800f790 con 
0x7ffa0c10f340 2026-03-10T08:52:41.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.520+0000 7ffa027fc700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff9f806c6d0 0x7ff9f806eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.520+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ffa0808c320 con 0x7ffa0c10f340 2026-03-10T08:52:41.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.520+0000 7ffa027fc700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] conn(0x7ff9f8072270 0x7ff9f8074690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.520+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 --> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7ff9f8074d40 con 0x7ff9f8072270 2026-03-10T08:52:41.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.522+0000 7ffa11ba0700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] conn(0x7ff9f8072270 0x7ff9f8074690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.527+0000 7ffa10b9e700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff9f806c6d0 0x7ff9f806eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.529+0000 7ffa11ba0700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] conn(0x7ff9f8072270 0x7ff9f8074690 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.530+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7ffa08056f60 con 0x7ffa0c10f340 2026-03-10T08:52:41.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.532+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== osd.0 v2:192.168.123.105:6802/2315796084 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7ff9f8074d40 con 0x7ff9f8072270 2026-03-10T08:52:41.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.539+0000 7ffa10b9e700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff9f806c6d0 0x7ff9f806eb90 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ff9fc000c00 tx=0x7ff9fc011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:41 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/936637577' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T08:52:41.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.576+0000 7f71dd5fb700 1 -- 192.168.123.105:0/1150916022 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00a4cd0 msgr2=0x7f71d00a50b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.576+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/1150916022 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00a4cd0 0x7f71d00a50b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f71cc009b00 tx=0x7f71cc009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.577+0000 7f71dd5fb700 1 -- 192.168.123.105:0/1150916022 shutdown_connections 2026-03-10T08:52:41.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.577+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/1150916022 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f71d00a55f0 0x7f71d00b7900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.577+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/1150916022 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00a4cd0 0x7f71d00a50b0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.577+0000 7f71dd5fb700 1 -- 192.168.123.105:0/1150916022 >> 192.168.123.105:0/1150916022 conn(0x7f71d001a720 msgr2=0x7f71d001ab30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.577+0000 7f71dd5fb700 1 -- 192.168.123.105:0/1150916022 shutdown_connections 2026-03-10T08:52:41.579 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.578+0000 7f71dd5fb700 1 -- 192.168.123.105:0/1150916022 wait complete. 2026-03-10T08:52:41.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.578+0000 7f71dd5fb700 1 Processor -- start 2026-03-10T08:52:41.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.579+0000 7f71dd5fb700 1 -- start start 2026-03-10T08:52:41.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.579+0000 7f71dd5fb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f71d00a55f0 0x7f71d00b5490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.579+0000 7f71dd5fb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 0x7f71d00b0910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.579+0000 7f71dd5fb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71d00b5b20 con 0x7f71d00a55f0 2026-03-10T08:52:41.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.579+0000 7f71dd5fb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f71d00b0e50 con 0x7f71d00b0490 2026-03-10T08:52:41.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d77fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 0x7f71d00b0910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d77fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 0x7f71d00b0910 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36076/0 (socket says 192.168.123.105:36076) 2026-03-10T08:52:41.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d77fe700 1 -- 192.168.123.105:0/3961483147 learned_addr learned my addr 192.168.123.105:0/3961483147 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:41.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d7fff700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f71d00a55f0 0x7f71d00b5490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d77fe700 1 -- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f71d00a55f0 msgr2=0x7f71d00b5490 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d77fe700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f71d00a55f0 0x7f71d00b5490 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.580+0000 7f71d77fe700 1 -- 192.168.123.105:0/3961483147 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f71cc0097e0 con 0x7f71d00b0490 2026-03-10T08:52:41.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.581+0000 7f71d77fe700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 0x7f71d00b0910 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto 
rx=0x7f71c400c8a0 tx=0x7f71c400cbb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.582+0000 7ffa123a1700 1 -- 192.168.123.105:0/1816673128 --> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7ff9f0002d70 con 0x7ff9f8072270 2026-03-10T08:52:41.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.583+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f71c40078c0 con 0x7f71d00b0490 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.583+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f71d00b1130 con 0x7f71d00b0490 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.583+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f71d0155f20 con 0x7f71d00b0490 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.584+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f71c400f450 con 0x7f71d00b0490 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.584+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f71c4018610 con 0x7f71d00b0490 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.585+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 
0x7f71c4018770 con 0x7f71d00b0490 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.585+0000 7ffa027fc700 1 -- 192.168.123.105:0/1816673128 <== osd.0 v2:192.168.123.105:6802/2315796084 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7ff9f0002d70 con 0x7ff9f8072270 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.586+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] conn(0x7ff9f8072270 msgr2=0x7ff9f8074690 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.586+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] conn(0x7ff9f8072270 0x7ff9f8074690 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.587+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff9f806c6d0 msgr2=0x7ff9f806eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.587+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff9f806c6d0 0x7ff9f806eb90 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ff9fc000c00 tx=0x7ff9fc011040 comp rx=0 tx=0).stop 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.587+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 msgr2=0x7ffa0c11bf30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.587+0000 7f71d57fa700 
1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f71c806c6e0 0x7f71c806eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.587+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 0x7ffa0c11bf30 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7ffa0800d8d0 tx=0x7ffa0800dc90 comp rx=0 tx=0).stop 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 shutdown_connections 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6802/2315796084,v1:192.168.123.105:6803/2315796084] conn(0x7ff9f8072270 0x7ff9f8074690 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff9f806c6d0 0x7ff9f806eb90 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa0c10f340 0x7ffa0c11bf30 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 --2- 192.168.123.105:0/1816673128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa0c116ee0 0x7ffa0c117360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 >> 192.168.123.105:0/1816673128 conn(0x7ffa0c06ce20 msgr2=0x7ffa0c10ae50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 shutdown_connections 2026-03-10T08:52:41.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.590+0000 7ff9f7fff700 1 -- 192.168.123.105:0/1816673128 wait complete. 2026-03-10T08:52:41.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.593+0000 7f71d7fff700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f71c806c6e0 0x7f71c806eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.594+0000 7f71d7fff700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f71c806c6e0 0x7f71c806eba0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f71cc009fd0 tx=0x7f71cc019040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.594+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f71c408c650 con 0x7f71d00b0490 2026-03-10T08:52:41.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.594+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] conn(0x7f71bc001610 0x7f71bc003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T08:52:41.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.595+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 --> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f71bc006c00 con 0x7f71bc001610 2026-03-10T08:52:41.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.597+0000 7f71dcdfa700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] conn(0x7f71bc001610 0x7f71bc003ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.610+0000 7f71dcdfa700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] conn(0x7f71bc001610 0x7f71bc003ad0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.614+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== osd.3 v2:192.168.123.108:6800/1634067319 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f71bc006c00 con 0x7f71bc001610 2026-03-10T08:52:41.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.627+0000 7f4da59df700 1 -- 192.168.123.105:0/3749044982 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da010f340 msgr2=0x7f4da010f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.627+0000 7f4da59df700 1 --2- 192.168.123.105:0/3749044982 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da010f340 0x7f4da010f720 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f4d90009a60 
tx=0x7f4d90009d70 comp rx=0 tx=0).stop 2026-03-10T08:52:41.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.628+0000 7f4da59df700 1 -- 192.168.123.105:0/3749044982 shutdown_connections 2026-03-10T08:52:41.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.628+0000 7f4da59df700 1 --2- 192.168.123.105:0/3749044982 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4da010d0f0 0x7f4da010d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.628+0000 7f4da59df700 1 --2- 192.168.123.105:0/3749044982 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da010f340 0x7f4da010f720 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.628+0000 7f4da59df700 1 -- 192.168.123.105:0/3749044982 >> 192.168.123.105:0/3749044982 conn(0x7f4da006ce20 msgr2=0x7f4da006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.628+0000 7f4da59df700 1 -- 192.168.123.105:0/3749044982 shutdown_connections 2026-03-10T08:52:41.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.628+0000 7f4da59df700 1 -- 192.168.123.105:0/3749044982 wait complete. 
2026-03-10T08:52:41.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.635+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 --> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f71bc005c80 con 0x7f71bc001610 2026-03-10T08:52:41.636 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.635+0000 7f71d57fa700 1 -- 192.168.123.105:0/3961483147 <== osd.3 v2:192.168.123.108:6800/1634067319 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f71bc005c80 con 0x7f71bc001610 2026-03-10T08:52:41.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.638+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] conn(0x7f71bc001610 msgr2=0x7f71bc003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.638+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] conn(0x7f71bc001610 0x7f71bc003ad0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f71c806c6e0 msgr2=0x7f71c806eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f71c806c6e0 0x7f71c806eba0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f71cc009fd0 tx=0x7f71cc019040 comp rx=0 tx=0).stop 2026-03-10T08:52:41.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 -- 
192.168.123.105:0/3961483147 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 msgr2=0x7f71d00b0910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 0x7f71d00b0910 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f71c400c8a0 tx=0x7f71c400cbb0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 shutdown_connections 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:6800/1634067319,v1:192.168.123.108:6801/1634067319] conn(0x7f71bc001610 0x7f71bc003ad0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f71c806c6e0 0x7f71c806eba0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f71d00a55f0 0x7f71d00b5490 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 --2- 192.168.123.105:0/3961483147 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f71d00b0490 0x7f71d00b0910 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 >> 192.168.123.105:0/3961483147 conn(0x7f71d001a720 msgr2=0x7f71d00a3cc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 shutdown_connections 2026-03-10T08:52:41.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.639+0000 7f71dd5fb700 1 -- 192.168.123.105:0/3961483147 wait complete. 2026-03-10T08:52:41.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.629+0000 7f4da59df700 1 Processor -- start 2026-03-10T08:52:41.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.643+0000 7f4da59df700 1 -- start start 2026-03-10T08:52:41.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.643+0000 7f4da59df700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4da010d0f0 0x7f4da011bed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.643+0000 7f4da59df700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 0x7f4da0117350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.643+0000 7f4da59df700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4da0117890 con 0x7f4da010d0f0 2026-03-10T08:52:41.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.643+0000 7f4da59df700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4da0117a00 con 0x7f4da0116ed0 2026-03-10T08:52:41.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.644+0000 7f4d9ffff700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 0x7f4da0117350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.644+0000 7f4d9ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 0x7f4da0117350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36104/0 (socket says 192.168.123.105:36104) 2026-03-10T08:52:41.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.644+0000 7f4d9ffff700 1 -- 192.168.123.105:0/2750854791 learned_addr learned my addr 192.168.123.105:0/2750854791 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:41.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.645+0000 7f4da49dd700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4da010d0f0 0x7f4da011bed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.645+0000 7f4d9ffff700 1 -- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4da010d0f0 msgr2=0x7f4da011bed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.645+0000 7f4d9ffff700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4da010d0f0 0x7f4da011bed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.645+0000 7f4d9ffff700 1 -- 
192.168.123.105:0/2750854791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d90009710 con 0x7f4da0116ed0 2026-03-10T08:52:41.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.645+0000 7f4d9ffff700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 0x7f4da0117350 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4d9400ea30 tx=0x7f4d9400edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.645+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d9400cc40 con 0x7f4da0116ed0 2026-03-10T08:52:41.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.655+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4da0117ce0 con 0x7f4da0116ed0 2026-03-10T08:52:41.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.655+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4da01b86e0 con 0x7f4da0116ed0 2026-03-10T08:52:41.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.656+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d9400cda0 con 0x7f4da0116ed0 2026-03-10T08:52:41.658 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.657+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d94010430 con 0x7f4da0116ed0 2026-03-10T08:52:41.664 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.663+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f4d94004830 con 0x7f4da0116ed0 2026-03-10T08:52:41.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.664+0000 7f4d9dffb700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4d8806c530 0x7f4d8806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.664+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4d94014070 con 0x7f4da0116ed0 2026-03-10T08:52:41.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.666+0000 7f4da49dd700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4d8806c530 0x7f4d8806e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.701 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.700+0000 7f4da49dd700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4d8806c530 0x7f4d8806e9f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f4d9000b5c0 tx=0x7f4d90011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.702 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.700+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] conn(0x7f4d8c001610 0x7f4d8c003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.703 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.700+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 --> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f4d8c006c00 con 0x7f4d8c001610 2026-03-10T08:52:41.703 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.701+0000 7f4da51de700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] conn(0x7f4d8c001610 0x7f4d8c003ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.702+0000 7f4da51de700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] conn(0x7f4d8c001610 0x7f4d8c003ad0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.702+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== osd.1 v2:192.168.123.105:6810/305535590 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f4d8c006c00 con 0x7f4d8c001610 2026-03-10T08:52:41.742 INFO:teuthology.orchestra.run.vm05.stdout:98784247815 2026-03-10T08:52:41.742 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.3 2026-03-10T08:52:41.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.775+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 --> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f4d8c005ce0 con 0x7f4d8c001610 2026-03-10T08:52:41.797 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.795+0000 7f4d9dffb700 1 -- 192.168.123.105:0/2750854791 <== osd.1 v2:192.168.123.105:6810/305535590 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f4d8c005ce0 con 0x7f4d8c001610 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.802+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] conn(0x7f4d8c001610 msgr2=0x7f4d8c003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.802+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] conn(0x7f4d8c001610 0x7f4d8c003ad0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.803+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4d8806c530 msgr2=0x7f4d8806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.803+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4d8806c530 0x7f4d8806e9f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f4d9000b5c0 tx=0x7f4d90011040 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 msgr2=0x7f4da0117350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 0x7f4da0117350 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4d9400ea30 tx=0x7f4d9400edf0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 shutdown_connections 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6810/305535590,v1:192.168.123.105:6811/305535590] conn(0x7f4d8c001610 0x7f4d8c003ad0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4d8806c530 0x7f4d8806e9f0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4da010d0f0 0x7f4da011bed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 --2- 192.168.123.105:0/2750854791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4da0116ed0 0x7f4da0117350 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 >> 192.168.123.105:0/2750854791 conn(0x7f4da006ce20 msgr2=0x7f4da0070000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 
7f4da59df700 1 -- 192.168.123.105:0/2750854791 shutdown_connections 2026-03-10T08:52:41.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.804+0000 7f4da59df700 1 -- 192.168.123.105:0/2750854791 wait complete. 2026-03-10T08:52:41.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.838+0000 7f7af3b2d700 1 -- 192.168.123.105:0/2987125311 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec10f660 msgr2=0x7f7aec107d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.838+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/2987125311 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec10f660 0x7f7aec107d90 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f7ae8009b00 tx=0x7f7ae8009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.839+0000 7f7af3b2d700 1 -- 192.168.123.105:0/2987125311 shutdown_connections 2026-03-10T08:52:41.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.839+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/2987125311 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7aec1082d0 0x7f7aec108750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.839+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/2987125311 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec10f660 0x7f7aec107d90 secure :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f7ae8009b00 tx=0x7f7ae8009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.839+0000 7f7af3b2d700 1 -- 192.168.123.105:0/2987125311 >> 192.168.123.105:0/2987125311 conn(0x7f7aec06d0f0 msgr2=0x7f7aec06d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.841 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.840+0000 7f7af3b2d700 1 -- 192.168.123.105:0/2987125311 shutdown_connections 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.840+0000 7f7af3b2d700 1 -- 192.168.123.105:0/2987125311 wait complete. 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.841+0000 7f7af3b2d700 1 Processor -- start 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af3b2d700 1 -- start start 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af3b2d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7aec1082d0 0x7f7aec115190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af3b2d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 0x7f7aec115b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af3b2d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7aec11a270 con 0x7f7aec1156d0 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af3b2d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7aec11a3e0 con 0x7f7aec1082d0 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af10c8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 0x7f7aec115b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.844 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af10c8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 0x7f7aec115b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35156/0 (socket says 192.168.123.105:35156) 2026-03-10T08:52:41.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.842+0000 7f7af10c8700 1 -- 192.168.123.105:0/3792148123 learned_addr learned my addr 192.168.123.105:0/3792148123 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.845+0000 7f7af10c8700 1 -- 192.168.123.105:0/3792148123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7aec1082d0 msgr2=0x7f7aec115190 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:52:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.845+0000 7f7af10c8700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7aec1082d0 0x7f7aec115190 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.845+0000 7f7af10c8700 1 -- 192.168.123.105:0/3792148123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7ae80097e0 con 0x7f7aec1156d0 2026-03-10T08:52:41.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.845+0000 7f7af10c8700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 0x7f7aec115b50 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f7adc00d8d0 tx=0x7f7adc00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.859 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.858+0000 7f7baf123700 1 -- 192.168.123.105:0/1077675418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 msgr2=0x7f7ba810d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.852+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7adc009940 con 0x7f7aec1156d0 2026-03-10T08:52:41.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.852+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7aec1b86f0 con 0x7f7aec1156d0 2026-03-10T08:52:41.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.852+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7aec1b8bf0 con 0x7f7aec1156d0 2026-03-10T08:52:41.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.852+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7adc010460 con 0x7f7aec1156d0 2026-03-10T08:52:41.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.852+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7adc00f5f0 con 0x7f7aec1156d0 2026-03-10T08:52:41.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.861+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f7ad0000ff0 con 0x7f7aec1156d0 2026-03-10T08:52:41.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.858+0000 
7f7baf123700 1 --2- 192.168.123.105:0/1077675418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba810d570 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f7b98009b00 tx=0x7f7b98009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.867+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7adc009aa0 con 0x7f7aec1156d0 2026-03-10T08:52:41.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 -- 192.168.123.105:0/1077675418 shutdown_connections 2026-03-10T08:52:41.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 --2- 192.168.123.105:0/1077675418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba810d570 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 --2- 192.168.123.105:0/1077675418 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba810f340 0x7f7ba810f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 -- 192.168.123.105:0/1077675418 >> 192.168.123.105:0/1077675418 conn(0x7f7ba806ce20 msgr2=0x7f7ba806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 -- 192.168.123.105:0/1077675418 shutdown_connections 2026-03-10T08:52:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.867+0000 7f7ae2ffd700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7ad806c530 0x7f7ad806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.867+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7adc08b0d0 con 0x7f7aec1156d0 2026-03-10T08:52:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 -- 192.168.123.105:0/1077675418 wait complete. 2026-03-10T08:52:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.869+0000 7f7baf123700 1 Processor -- start 2026-03-10T08:52:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7baf123700 1 -- start start 2026-03-10T08:52:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7ae2ffd700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] conn(0x7f7ad80720d0 0x7f7ad80744f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 --> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7ad8074ba0 con 0x7f7ad80720d0 2026-03-10T08:52:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7baf123700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba811c010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7baf123700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba810f340 0x7f7ba8117010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.873 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7baf123700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ba81176f0 con 0x7f7ba810d0f0 2026-03-10T08:52:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7baf123700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ba8117830 con 0x7f7ba810f340 2026-03-10T08:52:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.873+0000 7f7af20ca700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] conn(0x7f7ad80720d0 0x7f7ad80744f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7bae121700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba811c010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7bae121700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba811c010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35160/0 (socket says 192.168.123.105:35160) 2026-03-10T08:52:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7bae121700 1 -- 192.168.123.105:0/1420625944 learned_addr learned my addr 192.168.123.105:0/1420625944 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7bae121700 1 -- 192.168.123.105:0/1420625944 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba810f340 msgr2=0x7f7ba8117010 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7bae121700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba810f340 0x7f7ba8117010 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.871+0000 7f7bae121700 1 -- 192.168.123.105:0/1420625944 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b980097e0 con 0x7f7ba810d0f0 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.872+0000 7f7bae121700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba811c010 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f7ba400dc40 tx=0x7f7ba400be10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.872+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ba40099a0 con 0x7f7ba810d0f0 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.872+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7ba4010460 con 0x7f7ba810d0f0 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.873+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ba400f6f0 con 0x7f7ba810d0f0 2026-03-10T08:52:41.875 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.874+0000 7f7af18c9700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7ad806c530 0x7f7ad806e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.875+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ba8117b10 con 0x7f7ba810d0f0 2026-03-10T08:52:41.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.875+0000 7f7af20ca700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] conn(0x7f7ad80720d0 0x7f7ad80744f0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.876+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== osd.2 v2:192.168.123.105:6818/3827519003 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f7ad8074ba0 con 0x7f7ad80720d0 2026-03-10T08:52:41.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.875+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ba81b8400 con 0x7f7ba810d0f0 2026-03-10T08:52:41.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.878+0000 7f7af18c9700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7ad806c530 0x7f7ad806e9f0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7ae8006010 tx=0x7f7ae8005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:52:41.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.878+0000 7f7b9d7fa700 1 -- 192.168.123.105:0/1420625944 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f7ba811cff0 con 0x7f7ba810d0f0 2026-03-10T08:52:41.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.878+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7ba4009b00 con 0x7f7ba810d0f0 2026-03-10T08:52:41.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.878+0000 7f7b9f7fe700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b9406c330 0x7f7b9406e7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.878+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7ba408b0d0 con 0x7f7ba810d0f0 2026-03-10T08:52:41.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.890+0000 7f7bad920700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b9406c330 0x7f7b9406e7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.890+0000 7f7b9f7fe700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] conn(0x7f7b94071d60 0x7f7b94074180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.890+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 
--> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7b94074830 con 0x7f7b94071d60 2026-03-10T08:52:41.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.890+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f7ba4055c90 con 0x7f7ba810d0f0 2026-03-10T08:52:41.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.891+0000 7f7bae922700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] conn(0x7f7b94071d60 0x7f7b94074180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.891+0000 7f7bae922700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] conn(0x7f7b94071d60 0x7f7b94074180 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.892+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== osd.4 v2:192.168.123.108:6808/4181429086 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f7b94074830 con 0x7f7b94071d60 2026-03-10T08:52:41.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.894+0000 7f7bad920700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b9406c330 0x7f7b9406e7f0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7b98005e50 tx=0x7f7b98005de0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.898 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.878+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f7adc0597a0 con 0x7f7aec1156d0 2026-03-10T08:52:41.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.908+0000 7f7b9d7fa700 1 -- 192.168.123.105:0/1420625944 --> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f7ba804f2e0 con 0x7f7b94071d60 2026-03-10T08:52:41.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.916+0000 7f7b9f7fe700 1 -- 192.168.123.105:0/1420625944 <== osd.4 v2:192.168.123.108:6808/4181429086 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f7ba804f2e0 con 0x7f7b94071d60 2026-03-10T08:52:41.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.917+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] conn(0x7f7b94071d60 msgr2=0x7f7b94074180 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.917+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] conn(0x7f7b94071d60 0x7f7b94074180 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.918+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b9406c330 msgr2=0x7f7b9406e7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.918+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f7b9406c330 0x7f7b9406e7f0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7b98005e50 tx=0x7f7b98005de0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.918+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 msgr2=0x7f7ba811c010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.918+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba811c010 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f7ba400dc40 tx=0x7f7ba400be10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.919+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 shutdown_connections 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.919+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:6808/4181429086,v1:192.168.123.108:6809/4181429086] conn(0x7f7b94071d60 0x7f7b94074180 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.919+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b9406c330 0x7f7b9406e7f0 secure :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7b98005e50 tx=0x7f7b98005de0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.919+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba810d0f0 0x7f7ba811c010 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.919+0000 7f7baf123700 1 --2- 192.168.123.105:0/1420625944 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba810f340 0x7f7ba8117010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.919+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 >> 192.168.123.105:0/1420625944 conn(0x7f7ba806ce20 msgr2=0x7f7ba8070510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.925+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 shutdown_connections 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.925+0000 7f7baf123700 1 -- 192.168.123.105:0/1420625944 wait complete. 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.926+0000 7f22197cf700 1 -- 192.168.123.105:0/3429129259 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 msgr2=0x7f221410f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.926+0000 7f22197cf700 1 --2- 192.168.123.105:0/3429129259 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f221410f720 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f2204009b00 tx=0x7f2204009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.927+0000 7f22197cf700 1 -- 192.168.123.105:0/3429129259 shutdown_connections 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.927+0000 7f22197cf700 1 --2- 192.168.123.105:0/3429129259 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 0x7f221410d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.927+0000 7f22197cf700 1 --2- 192.168.123.105:0/3429129259 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f221410f720 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.927+0000 7f22197cf700 1 -- 192.168.123.105:0/3429129259 >> 192.168.123.105:0/3429129259 conn(0x7f221406ce20 msgr2=0x7f221406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.925+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 --> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f7ad0002ca0 con 0x7f7ad80720d0 2026-03-10T08:52:41.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.928+0000 7f22197cf700 1 -- 192.168.123.105:0/3429129259 shutdown_connections 2026-03-10T08:52:41.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.929+0000 7f7ae2ffd700 1 -- 192.168.123.105:0/3792148123 <== osd.2 v2:192.168.123.105:6818/3827519003 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f7ad0002ca0 con 0x7f7ad80720d0 2026-03-10T08:52:41.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.928+0000 7f22197cf700 1 -- 192.168.123.105:0/3429129259 wait complete. 
2026-03-10T08:52:41.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.928+0000 7f22197cf700 1 Processor -- start 2026-03-10T08:52:41.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f22197cf700 1 -- start start 2026-03-10T08:52:41.931 INFO:teuthology.orchestra.run.vm05.stdout:38654705676 2026-03-10T08:52:41.931 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.0 2026-03-10T08:52:41.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] conn(0x7f7ad80720d0 msgr2=0x7f7ad80744f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] conn(0x7f7ad80720d0 0x7f7ad80744f0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f22197cf700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 0x7f2214118090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f22197cf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f22141185d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f22197cf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f221411c220 con 0x7f221410f340 
2026-03-10T08:52:41.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.930+0000 7f22197cf700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f221411c390 con 0x7f221410d0f0 2026-03-10T08:52:41.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f22137fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f22141185d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f22137fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f22141185d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35176/0 (socket says 192.168.123.105:35176) 2026-03-10T08:52:41.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f22137fe700 1 -- 192.168.123.105:0/608708543 learned_addr learned my addr 192.168.123.105:0/608708543 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7ad806c530 msgr2=0x7f7ad806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7ad806c530 0x7f7ad806e9f0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7ae8006010 tx=0x7f7ae8005bc0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f7af3b2d700 1 -- 
192.168.123.105:0/3792148123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 msgr2=0x7f7aec115b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 0x7f7aec115b50 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f7adc00d8d0 tx=0x7f7adc00dc90 comp rx=0 tx=0).stop 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 shutdown_connections 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6818/3827519003,v1:192.168.123.105:6819/3827519003] conn(0x7f7ad80720d0 0x7f7ad80744f0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7ad806c530 0x7f7ad806e9f0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7aec1082d0 0x7f7aec115190 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f7af3b2d700 1 --2- 192.168.123.105:0/3792148123 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7aec1156d0 0x7f7aec115b50 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 >> 192.168.123.105:0/3792148123 conn(0x7f7aec06d0f0 msgr2=0x7f7aec070f60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.935+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 shutdown_connections 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.932+0000 7f22137fe700 1 -- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 msgr2=0x7f2214118090 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.933+0000 7f2213fff700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 0x7f2214118090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.935+0000 7f22137fe700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 0x7f2214118090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:41.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.935+0000 7f22137fe700 1 -- 192.168.123.105:0/608708543 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22040097e0 con 0x7f221410f340 2026-03-10T08:52:41.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.941+0000 7f7af3b2d700 1 -- 192.168.123.105:0/3792148123 wait complete. 
2026-03-10T08:52:41.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.958+0000 7f2213fff700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 0x7f2214118090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:52:41.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.958+0000 7f22137fe700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f22141185d0 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f220800eb10 tx=0x7f220800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.963+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f220800cca0 con 0x7f221410f340 2026-03-10T08:52:41.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.963+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f220800ce00 con 0x7f221410f340 2026-03-10T08:52:41.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.963+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2208018910 con 0x7f221410f340 2026-03-10T08:52:41.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.964+0000 7f22197cf700 1 -- 192.168.123.105:0/608708543 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2214118bd0 con 0x7f221410f340 2026-03-10T08:52:41.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.964+0000 7f22197cf700 1 -- 192.168.123.105:0/608708543 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22141b84b0 con 0x7f221410f340 2026-03-10T08:52:41.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.965+0000 7f22197cf700 1 -- 192.168.123.105:0/608708543 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f22141107c0 con 0x7f221410f340 2026-03-10T08:52:41.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.965+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2208018a70 con 0x7f221410f340 2026-03-10T08:52:41.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.966+0000 7f22117fa700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f21fc06c600 0x7f21fc06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.966+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f2208014070 con 0x7f221410f340 2026-03-10T08:52:41.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.966+0000 7f22117fa700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] conn(0x7f21fc0721a0 0x7f21fc0745c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:41.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.966+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 --> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f21fc074c70 con 0x7f21fc0721a0 2026-03-10T08:52:41.970 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.968+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f22080568a0 con 0x7f221410f340 2026-03-10T08:52:41.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.969+0000 7f2218fce700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] conn(0x7f21fc0721a0 0x7f21fc0745c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.969+0000 7f2213fff700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f21fc06c600 0x7f21fc06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:41.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.971+0000 7f2218fce700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] conn(0x7f21fc0721a0 0x7f21fc0745c0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.971+0000 7f2213fff700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f21fc06c600 0x7f21fc06eac0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f2204006010 tx=0x7f2204005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:41.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:41.976+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== osd.5 
v2:192.168.123.108:6816/236163747 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f21fc074c70 con 0x7f21fc0721a0 2026-03-10T08:52:42.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.008+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 --> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f221404f2e0 con 0x7f21fc0721a0 2026-03-10T08:52:42.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.015+0000 7f22117fa700 1 -- 192.168.123.105:0/608708543 <== osd.5 v2:192.168.123.108:6816/236163747 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f221404f2e0 con 0x7f21fc0721a0 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.017+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] conn(0x7f21fc0721a0 msgr2=0x7f21fc0745c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.017+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] conn(0x7f21fc0721a0 0x7f21fc0745c0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.017+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f21fc06c600 msgr2=0x7f21fc06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.017+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f21fc06c600 0x7f21fc06eac0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f2204006010 tx=0x7f2204005c00 comp rx=0 tx=0).stop 
2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.017+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 msgr2=0x7f22141185d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.017+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f22141185d0 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f220800eb10 tx=0x7f220800eed0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 shutdown_connections 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:6816/236163747,v1:192.168.123.108:6817/236163747] conn(0x7f21fc0721a0 0x7f21fc0745c0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f21fc06c600 0x7f21fc06eac0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f221410d0f0 0x7f2214118090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 --2- 192.168.123.105:0/608708543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f221410f340 0x7f22141185d0 
unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 >> 192.168.123.105:0/608708543 conn(0x7f221406ce20 msgr2=0x7f2214070470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 shutdown_connections 2026-03-10T08:52:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.018+0000 7f21faffd700 1 -- 192.168.123.105:0/608708543 wait complete. 2026-03-10T08:52:42.026 INFO:teuthology.orchestra.run.vm05.stdout:55834574858 2026-03-10T08:52:42.026 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.1 2026-03-10T08:52:42.052 INFO:teuthology.orchestra.run.vm05.stdout:73014444040 2026-03-10T08:52:42.052 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.2 2026-03-10T08:52:42.070 INFO:teuthology.orchestra.run.vm05.stdout:120259084293 2026-03-10T08:52:42.071 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.4 2026-03-10T08:52:42.093 INFO:teuthology.orchestra.run.vm05.stdout:137438953475 2026-03-10T08:52:42.093 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.5 2026-03-10T08:52:42.161 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:42.452 
INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:42.461 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:42.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:42 vm05 ceph-mon[49713]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:42.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:42 vm08 ceph-mon[57559]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.658+0000 7f8ed1a21700 1 -- 192.168.123.105:0/3584786421 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 msgr2=0x7f8ec40b7900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.658+0000 7f8ed1a21700 1 --2- 192.168.123.105:0/3584786421 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b7900 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f8eb8009b00 tx=0x7f8eb8009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.658+0000 7f8ed1a21700 1 -- 192.168.123.105:0/3584786421 shutdown_connections 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.658+0000 7f8ed1a21700 1 --2- 192.168.123.105:0/3584786421 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b7900 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.658+0000 7f8ed1a21700 1 --2- 192.168.123.105:0/3584786421 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ec40a4cd0 
0x7f8ec40a50b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.658+0000 7f8ed1a21700 1 -- 192.168.123.105:0/3584786421 >> 192.168.123.105:0/3584786421 conn(0x7f8ec401a720 msgr2=0x7f8ec401ab30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:42.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.659+0000 7f8ed1a21700 1 -- 192.168.123.105:0/3584786421 shutdown_connections 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.659+0000 7f8ed1a21700 1 -- 192.168.123.105:0/3584786421 wait complete. 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.659+0000 7f8ed1a21700 1 Processor -- start 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ed1a21700 1 -- start start 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ed1a21700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ec40a4cd0 0x7f8ec40b3a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ed1a21700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b3f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ed1a21700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ec40b7bc0 con 0x7f8ec40a55f0 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ed1a21700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ec40b44b0 con 0x7f8ec40a4cd0 2026-03-10T08:52:42.660 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ed0a1f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ec40a4cd0 0x7f8ec40b3a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ecbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b3f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ecbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b3f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35190/0 (socket says 192.168.123.105:35190) 2026-03-10T08:52:42.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.660+0000 7f8ecbfff700 1 -- 192.168.123.105:0/2657121992 learned_addr learned my addr 192.168.123.105:0/2657121992 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:42.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ecbfff700 1 -- 192.168.123.105:0/2657121992 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ec40a4cd0 msgr2=0x7f8ec40b3a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ecbfff700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ec40a4cd0 0x7f8ec40b3a30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.661 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ecbfff700 1 -- 192.168.123.105:0/2657121992 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8eb80097e0 con 0x7f8ec40a55f0 2026-03-10T08:52:42.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ecbfff700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b3f70 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f8eb800b5c0 tx=0x7f8eb80048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:42.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8eb801d070 con 0x7f8ec40a55f0 2026-03-10T08:52:42.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ed1a21700 1 -- 192.168.123.105:0/2657121992 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ec40b4790 con 0x7f8ec40a55f0 2026-03-10T08:52:42.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ed1a21700 1 -- 192.168.123.105:0/2657121992 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ec4155be0 con 0x7f8ec40a55f0 2026-03-10T08:52:42.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.661+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8eb8022470 con 0x7f8ec40a55f0 2026-03-10T08:52:42.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.662+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8eb800f650 con 
0x7f8ec40a55f0 2026-03-10T08:52:42.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.662+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8eb800ba40 con 0x7f8ec40a55f0 2026-03-10T08:52:42.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.663+0000 7f8ec9ffb700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8ebc06c3a0 0x7f8ebc06e860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:42.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.663+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f8eb808c850 con 0x7f8ec40a55f0 2026-03-10T08:52:42.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.663+0000 7f8ed0a1f700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8ebc06c3a0 0x7f8ebc06e860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:42.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.663+0000 7f8ed0a1f700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8ebc06c3a0 0x7f8ebc06e860 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f8ec0005950 tx=0x7f8ec000b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:42.668 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.664+0000 7f8ed1a21700 1 -- 192.168.123.105:0/2657121992 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8eb0005320 con 0x7f8ec40a55f0 2026-03-10T08:52:42.668 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.668+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8eb80574c0 con 0x7f8ec40a55f0 2026-03-10T08:52:42.721 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:42.754 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:42.766 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:42.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.820+0000 7f8ed1a21700 1 -- 192.168.123.105:0/2657121992 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f8eb00059f0 con 0x7f8ec40a55f0 2026-03-10T08:52:42.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.826+0000 7f8ec9ffb700 1 -- 192.168.123.105:0/2657121992 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f8eb8027030 con 0x7f8ec40a55f0 2026-03-10T08:52:42.827 INFO:teuthology.orchestra.run.vm05.stdout:98784247813 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.831+0000 7f8eb77fe700 1 -- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8ebc06c3a0 msgr2=0x7f8ebc06e860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.831+0000 7f8eb77fe700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8ebc06c3a0 0x7f8ebc06e860 secure :-1 s=READY pgs=74 
cs=0 l=1 rev1=1 crypto rx=0x7f8ec0005950 tx=0x7f8ec000b410 comp rx=0 tx=0).stop 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.831+0000 7f8eb77fe700 1 -- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 msgr2=0x7f8ec40b3f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.831+0000 7f8eb77fe700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b3f70 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f8eb800b5c0 tx=0x7f8eb80048c0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.832+0000 7f8eb77fe700 1 -- 192.168.123.105:0/2657121992 shutdown_connections 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.832+0000 7f8eb77fe700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8ebc06c3a0 0x7f8ebc06e860 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.832+0000 7f8eb77fe700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ec40a4cd0 0x7f8ec40b3a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.832+0000 7f8eb77fe700 1 --2- 192.168.123.105:0/2657121992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ec40a55f0 0x7f8ec40b3f70 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:42.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.832+0000 7f8eb77fe700 1 -- 192.168.123.105:0/2657121992 >> 
192.168.123.105:0/2657121992 conn(0x7f8ec401a720 msgr2=0x7f8ec40a26b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:42.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.832+0000 7f8eb77fe700 1 -- 192.168.123.105:0/2657121992 shutdown_connections 2026-03-10T08:52:42.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:42.833+0000 7f8eb77fe700 1 -- 192.168.123.105:0/2657121992 wait complete. 2026-03-10T08:52:42.896 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247813 for osd.3 2026-03-10T08:52:43.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.134+0000 7ff404afb700 1 -- 192.168.123.105:0/1042953236 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff40010f660 msgr2=0x7ff400107d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.134+0000 7ff404afb700 1 --2- 192.168.123.105:0/1042953236 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff40010f660 0x7ff400107d90 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7ff3f0009b00 tx=0x7ff3f0009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:43.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 -- 192.168.123.105:0/1042953236 shutdown_connections 2026-03-10T08:52:43.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 --2- 192.168.123.105:0/1042953236 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4001082d0 0x7ff400108750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 --2- 192.168.123.105:0/1042953236 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff40010f660 0x7ff400107d90 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:52:43.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 -- 192.168.123.105:0/1042953236 >> 192.168.123.105:0/1042953236 conn(0x7ff40006d0f0 msgr2=0x7ff40006d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 -- 192.168.123.105:0/1042953236 shutdown_connections 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 -- 192.168.123.105:0/1042953236 wait complete. 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.139+0000 7ff404afb700 1 Processor -- start 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff404afb700 1 -- start start 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff404afb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 0x7ff4001ab830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff404afb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff40010f660 0x7ff4001abd70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff404afb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4001ac400 con 0x7ff4001082d0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff404afb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4001a58b0 con 0x7ff40010f660 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fe59c700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 0x7ff4001ab830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fe59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 0x7ff4001ab830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35204/0 (socket says 192.168.123.105:35204) 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fe59c700 1 -- 192.168.123.105:0/309465112 learned_addr learned my addr 192.168.123.105:0/309465112 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fdd9b700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff40010f660 0x7ff4001abd70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fe59c700 1 -- 192.168.123.105:0/309465112 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff40010f660 msgr2=0x7ff4001abd70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fe59c700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff40010f660 0x7ff4001abd70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.140+0000 7ff3fe59c700 1 -- 
192.168.123.105:0/309465112 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3f00097e0 con 0x7ff4001082d0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.141+0000 7ff3fe59c700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 0x7ff4001ab830 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7ff3f000ba90 tx=0x7ff3f000bb70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.142+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3f001d070 con 0x7ff4001082d0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.142+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff3f000f460 con 0x7ff4001082d0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.142+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3f0021690 con 0x7ff4001082d0 2026-03-10T08:52:43.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.142+0000 7ff404afb700 1 -- 192.168.123.105:0/309465112 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff4001a5ae0 con 0x7ff4001082d0 2026-03-10T08:52:43.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.142+0000 7ff404afb700 1 -- 192.168.123.105:0/309465112 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff4001a5fd0 con 0x7ff4001082d0 2026-03-10T08:52:43.148 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.144+0000 7ff404afb700 1 -- 192.168.123.105:0/309465112 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3e0005320 con 0x7ff4001082d0 2026-03-10T08:52:43.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.144+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff3f002b430 con 0x7ff4001082d0 2026-03-10T08:52:43.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.144+0000 7ff3ef7fe700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff3e806c5b0 0x7ff3e806ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.144+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff3f008d7e0 con 0x7ff4001082d0 2026-03-10T08:52:43.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.147+0000 7ff3fdd9b700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff3e806c5b0 0x7ff3e806ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.147+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff3f005beb0 con 0x7ff4001082d0 2026-03-10T08:52:43.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.147+0000 7ff3fdd9b700 1 --2- 192.168.123.105:0/309465112 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff3e806c5b0 0x7ff3e806ea70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7ff3f4005950 tx=0x7ff3f40058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.217+0000 7fb93a11a700 1 -- 192.168.123.105:0/1447234514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb934107ff0 msgr2=0x7fb9341083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.217+0000 7fb93a11a700 1 --2- 192.168.123.105:0/1447234514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb934107ff0 0x7fb9341083d0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fb92c00b3a0 tx=0x7fb92c00b6b0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.217+0000 7fb93a11a700 1 -- 192.168.123.105:0/1447234514 shutdown_connections 2026-03-10T08:52:43.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.217+0000 7fb93a11a700 1 --2- 192.168.123.105:0/1447234514 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb9341089a0 0x7fb93410be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.217+0000 7fb93a11a700 1 --2- 192.168.123.105:0/1447234514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb934107ff0 0x7fb9341083d0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.217+0000 7fb93a11a700 1 -- 192.168.123.105:0/1447234514 >> 192.168.123.105:0/1447234514 conn(0x7fb93406ce20 msgr2=0x7fb93406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.222 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 -- 192.168.123.105:0/1447234514 shutdown_connections 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 -- 192.168.123.105:0/1447234514 wait complete. 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 Processor -- start 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 -- start start 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9341089a0 0x7fb93407cf20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb93407d460 0x7fb93407d8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb934081aa0 con 0x7fb9341089a0 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.218+0000 7fb93a11a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb934081c10 con 0x7fb93407d460 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb932ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb93407d460 0x7fb93407d8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.222 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb932ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb93407d460 0x7fb93407d8e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36236/0 (socket says 192.168.123.105:36236) 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb932ffd700 1 -- 192.168.123.105:0/3622137274 learned_addr learned my addr 192.168.123.105:0/3622137274 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb9337fe700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9341089a0 0x7fb93407cf20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb9337fe700 1 -- 192.168.123.105:0/3622137274 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb93407d460 msgr2=0x7fb93407d8e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb9337fe700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb93407d460 0x7fb93407d8e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb9337fe700 1 -- 192.168.123.105:0/3622137274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb92c00b050 con 0x7fb9341089a0 2026-03-10T08:52:43.227 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.219+0000 7fb9337fe700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9341089a0 0x7fb93407cf20 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7fb92c00b3a0 tx=0x7fb92c00bce0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.225+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb92c00e050 con 0x7fb9341089a0 2026-03-10T08:52:43.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.225+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb934081e90 con 0x7fb9341089a0 2026-03-10T08:52:43.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.225+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb934082380 con 0x7fb9341089a0 2026-03-10T08:52:43.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.227+0000 7fb91a7fc700 1 -- 192.168.123.105:0/3622137274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb93404f2e0 con 0x7fb9341089a0 2026-03-10T08:52:43.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.230+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb92c007c00 con 0x7fb9341089a0 2026-03-10T08:52:43.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.231+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7fb92c01c930 con 0x7fb9341089a0 2026-03-10T08:52:43.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.231+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb92c01cb50 con 0x7fb9341089a0 2026-03-10T08:52:43.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.233+0000 7fb930ff9700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb91c06c6d0 0x7fb91c06eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.234+0000 7fb932ffd700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb91c06c6d0 0x7fb91c06eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.234+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb92c091580 con 0x7fb9341089a0 2026-03-10T08:52:43.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.234+0000 7fb932ffd700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb91c06c6d0 0x7fb91c06eb90 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fb934083fa0 tx=0x7fb92400c040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.234+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb92c093ac0 con 0x7fb9341089a0 
2026-03-10T08:52:43.276 INFO:teuthology.orchestra.run.vm05.stdout:38654705675 2026-03-10T08:52:43.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.271+0000 7ff404afb700 1 -- 192.168.123.105:0/309465112 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7ff3e0005190 con 0x7ff4001082d0 2026-03-10T08:52:43.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.272+0000 7ff3ef7fe700 1 -- 192.168.123.105:0/309465112 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7ff3f005ba40 con 0x7ff4001082d0 2026-03-10T08:52:43.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.307+0000 7ff3ed7fa700 1 -- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff3e806c5b0 msgr2=0x7ff3e806ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.307+0000 7ff3ed7fa700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff3e806c5b0 0x7ff3e806ea70 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7ff3f4005950 tx=0x7ff3f40058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.307+0000 7ff3ed7fa700 1 -- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 msgr2=0x7ff4001ab830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.307+0000 7ff3ed7fa700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 0x7ff4001ab830 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7ff3f000ba90 tx=0x7ff3f000bb70 comp rx=0 tx=0).stop 2026-03-10T08:52:43.312 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 -- 192.168.123.105:0/309465112 shutdown_connections 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff3e806c5b0 0x7ff3e806ea70 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4001082d0 0x7ff4001ab830 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 --2- 192.168.123.105:0/309465112 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff40010f660 0x7ff4001abd70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 -- 192.168.123.105:0/309465112 >> 192.168.123.105:0/309465112 conn(0x7ff40006d0f0 msgr2=0x7ff40010d510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 -- 192.168.123.105:0/309465112 shutdown_connections 2026-03-10T08:52:43.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.309+0000 7ff3ed7fa700 1 -- 192.168.123.105:0/309465112 wait complete. 2026-03-10T08:52:43.351 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:43 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/2657121992' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T08:52:43.351 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:43 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/309465112' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T08:52:43.422 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705676 got 38654705675 for osd.0 2026-03-10T08:52:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.491+0000 7f1cba783700 1 -- 192.168.123.105:0/732230070 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb4103a50 msgr2=0x7f1cb4107aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.491+0000 7f1cba783700 1 --2- 192.168.123.105:0/732230070 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb4103a50 0x7f1cb4107aa0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f1ca8009b50 tx=0x7f1ca8009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.491+0000 7f1cba783700 1 -- 192.168.123.105:0/732230070 shutdown_connections 2026-03-10T08:52:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.491+0000 7f1cba783700 1 --2- 192.168.123.105:0/732230070 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb4103a50 0x7f1cb4107aa0 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.491+0000 7f1cba783700 1 --2- 192.168.123.105:0/732230070 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb41030a0 0x7f1cb4103480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.491+0000 7f1cba783700 1 -- 
192.168.123.105:0/732230070 >> 192.168.123.105:0/732230070 conn(0x7f1cb40fe930 msgr2=0x7f1cb4100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.493+0000 7f1cba783700 1 -- 192.168.123.105:0/732230070 shutdown_connections 2026-03-10T08:52:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.493+0000 7f1cba783700 1 -- 192.168.123.105:0/732230070 wait complete. 2026-03-10T08:52:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.493+0000 7f1cba783700 1 Processor -- start 2026-03-10T08:52:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.493+0000 7f1cba783700 1 -- start start 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cba783700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb41030a0 0x7f1cb419ea10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cba783700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 0x7f1cb419ef50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cba783700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1cb419f5e0 con 0x7f1cb41030a0 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cba783700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1cb4198a90 con 0x7f1cb4103a50 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb8f80700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 0x7f1cb419ef50 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb8f80700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 0x7f1cb419ef50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36262/0 (socket says 192.168.123.105:36262) 2026-03-10T08:52:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb8f80700 1 -- 192.168.123.105:0/922580028 learned_addr learned my addr 192.168.123.105:0/922580028 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb9781700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb41030a0 0x7f1cb419ea10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb8f80700 1 -- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb41030a0 msgr2=0x7f1cb419ea10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb8f80700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb41030a0 0x7f1cb419ea10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.494+0000 7f1cb8f80700 1 -- 192.168.123.105:0/922580028 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f1ca80097e0 con 0x7f1cb4103a50 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.495+0000 7f1cb8f80700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 0x7f1cb419ef50 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f1ca80048f0 tx=0x7f1ca80049d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.495+0000 7f1cb9781700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb41030a0 0x7f1cb419ea10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.495+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ca801d070 con 0x7f1cb4103a50 2026-03-10T08:52:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.495+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1ca800bc30 con 0x7f1cb4103a50 2026-03-10T08:52:43.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.495+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ca800f670 con 0x7f1cb4103a50 2026-03-10T08:52:43.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.497+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1cb4198d10 con 0x7f1cb4103a50 2026-03-10T08:52:43.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.497+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1cb41991d0 con 0x7f1cb4103a50 2026-03-10T08:52:43.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.498+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1cb410b4a0 con 0x7f1cb4103a50 2026-03-10T08:52:43.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.498+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1ca8022470 con 0x7f1cb4103a50 2026-03-10T08:52:43.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.498+0000 7f1ca67fc700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1ca006c2e0 0x7f1ca006e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.498+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1ca808cac0 con 0x7f1cb4103a50 2026-03-10T08:52:43.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.499+0000 7f1cb9781700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1ca006c2e0 0x7f1ca006e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.502+0000 7f1cb9781700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1ca006c2e0 0x7f1ca006e7a0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f1cb0005fd0 tx=0x7f1cb0005ee0 comp rx=0 tx=0).ready 
entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.503+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1ca808ee60 con 0x7f1cb4103a50 2026-03-10T08:52:43.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.527+0000 7f6d340f2700 1 -- 192.168.123.105:0/2250970773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103cd0 msgr2=0x7f6d2c107d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.527+0000 7f6d340f2700 1 --2- 192.168.123.105:0/2250970773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c107d20 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f6d28009b00 tx=0x7f6d28009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:43.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.532+0000 7f6d340f2700 1 -- 192.168.123.105:0/2250970773 shutdown_connections 2026-03-10T08:52:43.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.532+0000 7f6d340f2700 1 --2- 192.168.123.105:0/2250970773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c107d20 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.532+0000 7f6d340f2700 1 --2- 192.168.123.105:0/2250970773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103320 0x7f6d2c103700 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.532+0000 7f6d340f2700 1 -- 192.168.123.105:0/2250970773 >> 192.168.123.105:0/2250970773 
conn(0x7f6d2c0feb90 msgr2=0x7f6d2c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.532+0000 7f6d340f2700 1 -- 192.168.123.105:0/2250970773 shutdown_connections 2026-03-10T08:52:43.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.532+0000 7f6d340f2700 1 -- 192.168.123.105:0/2250970773 wait complete. 2026-03-10T08:52:43.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.535+0000 7f6d340f2700 1 Processor -- start 2026-03-10T08:52:43.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d340f2700 1 -- start start 2026-03-10T08:52:43.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d340f2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103320 0x7f6d2c198e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d340f2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c199360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d340f2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d2c199a40 con 0x7f6d2c103cd0 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d340f2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d2c19d7d0 con 0x7f6d2c103320 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d3168d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c199360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d3168d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c199360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35264/0 (socket says 192.168.123.105:35264) 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.540+0000 7f6d3168d700 1 -- 192.168.123.105:0/1349080978 learned_addr learned my addr 192.168.123.105:0/1349080978 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.541+0000 7f6d3168d700 1 -- 192.168.123.105:0/1349080978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103320 msgr2=0x7f6d2c198e20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.541+0000 7f6d3168d700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103320 0x7f6d2c198e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.541+0000 7f6d3168d700 1 -- 192.168.123.105:0/1349080978 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d280097e0 con 0x7f6d2c103cd0 2026-03-10T08:52:43.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.541+0000 7f6d3168d700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c199360 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f6d2800b5c0 tx=0x7f6d280052a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.541+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d2801d070 con 0x7f6d2c103cd0 2026-03-10T08:52:43.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.541+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6d28004500 con 0x7f6d2c103cd0 2026-03-10T08:52:43.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.542+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d28022470 con 0x7f6d2c103cd0 2026-03-10T08:52:43.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.542+0000 7f6d340f2700 1 -- 192.168.123.105:0/1349080978 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d2c19da50 con 0x7f6d2c103cd0 2026-03-10T08:52:43.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.545+0000 7f6d340f2700 1 -- 192.168.123.105:0/1349080978 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d2c19de60 con 0x7f6d2c103cd0 2026-03-10T08:52:43.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.543+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f6d2800bc30 con 0x7f6d2c103cd0 2026-03-10T08:52:43.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.546+0000 7f6d340f2700 1 -- 192.168.123.105:0/1349080978 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d2c10b670 con 0x7f6d2c103cd0 2026-03-10T08:52:43.548 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.546+0000 7f6d22ffd700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d1806c290 0x7f6d1806e750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.546+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f6d2808e7f0 con 0x7f6d2c103cd0 2026-03-10T08:52:43.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.549+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f6d28092050 con 0x7f6d2c103cd0 2026-03-10T08:52:43.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.549+0000 7f6d31e8e700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d1806c290 0x7f6d1806e750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.549+0000 7f6d31e8e700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d1806c290 0x7f6d1806e750 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f6d1c0097b0 tx=0x7f6d1c006d20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:43 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/2657121992' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T08:52:43.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:43 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/309465112' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T08:52:43.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.559+0000 7fb91a7fc700 1 -- 192.168.123.105:0/3622137274 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7fb93404ea90 con 0x7fb9341089a0 2026-03-10T08:52:43.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.560+0000 7fb930ff9700 1 -- 192.168.123.105:0/3622137274 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7fb92c05c1c0 con 0x7fb9341089a0 2026-03-10T08:52:43.560 INFO:teuthology.orchestra.run.vm05.stdout:120259084293 2026-03-10T08:52:43.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb91c06c6d0 msgr2=0x7fb91c06eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb91c06c6d0 0x7fb91c06eb90 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fb934083fa0 tx=0x7fb92400c040 comp rx=0 tx=0).stop 2026-03-10T08:52:43.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9341089a0 msgr2=0x7fb93407cf20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.564 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9341089a0 0x7fb93407cf20 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7fb92c00b3a0 tx=0x7fb92c00bce0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 shutdown_connections 2026-03-10T08:52:43.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb91c06c6d0 0x7fb91c06eb90 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9341089a0 0x7fb93407cf20 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 --2- 192.168.123.105:0/3622137274 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb93407d460 0x7fb93407d8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.564+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 >> 192.168.123.105:0/3622137274 conn(0x7fb93406ce20 msgr2=0x7fb934070600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.565+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 shutdown_connections 2026-03-10T08:52:43.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.565+0000 7fb93a11a700 1 -- 192.168.123.105:0/3622137274 
wait complete. 2026-03-10T08:52:43.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.596+0000 7f7b3cd64700 1 -- 192.168.123.105:0/3817433842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b38107ff0 msgr2=0x7f7b381083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.596+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/3817433842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b38107ff0 0x7f7b381083d0 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f7b30009230 tx=0x7f7b30009260 comp rx=0 tx=0).stop 2026-03-10T08:52:43.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.597+0000 7f7b3cd64700 1 -- 192.168.123.105:0/3817433842 shutdown_connections 2026-03-10T08:52:43.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.597+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/3817433842 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b381089a0 0x7f7b3810be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.597+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/3817433842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b38107ff0 0x7f7b381083d0 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.597+0000 7f7b3cd64700 1 -- 192.168.123.105:0/3817433842 >> 192.168.123.105:0/3817433842 conn(0x7f7b3806ce20 msgr2=0x7f7b3806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.597+0000 7f7b3cd64700 1 -- 192.168.123.105:0/3817433842 shutdown_connections 2026-03-10T08:52:43.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.597+0000 7f7b3cd64700 1 -- 
192.168.123.105:0/3817433842 wait complete. 2026-03-10T08:52:43.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b3cd64700 1 Processor -- start 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b3cd64700 1 -- start start 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b3cd64700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b381089a0 0x7f7b3807cf20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b3cd64700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 0x7f7b3807d8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b3cd64700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b38081aa0 con 0x7f7b3807d460 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b3cd64700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b38081c10 con 0x7f7b381089a0 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b35d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 0x7f7b3807d8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b35d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 0x7f7b3807d8e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35284/0 (socket says 192.168.123.105:35284) 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.598+0000 7f7b35d9b700 1 -- 192.168.123.105:0/2405637549 learned_addr learned my addr 192.168.123.105:0/2405637549 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.602+0000 7f7b35d9b700 1 -- 192.168.123.105:0/2405637549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b381089a0 msgr2=0x7f7b3807cf20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.602+0000 7f7b35d9b700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b381089a0 0x7f7b3807cf20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.602+0000 7f7b35d9b700 1 -- 192.168.123.105:0/2405637549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b30008ee0 con 0x7f7b3807d460 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.602+0000 7f7b35d9b700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 0x7f7b3807d8e0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f7b28011fc0 tx=0x7f7b2800f330 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.603+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b2800fd00 con 0x7f7b3807d460 2026-03-10T08:52:43.606 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.604+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7b38081ef0 con 0x7f7b3807d460 2026-03-10T08:52:43.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.604+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7b38082440 con 0x7f7b3807d460 2026-03-10T08:52:43.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.607+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7b28013040 con 0x7f7b3807d460 2026-03-10T08:52:43.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.607+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b280186d0 con 0x7f7b3807d460 2026-03-10T08:52:43.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.609+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7b28018870 con 0x7f7b3807d460 2026-03-10T08:52:43.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.609+0000 7f7b277fe700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b2006c6d0 0x7f7b2006eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:43.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.610+0000 7f7b3659c700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b2006c6d0 0x7f7b2006eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T08:52:43.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.610+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7b2808fde0 con 0x7f7b3807d460 2026-03-10T08:52:43.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.610+0000 7f7b3659c700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b2006c6d0 0x7f7b2006eb90 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f7b3000bfd0 tx=0x7f7b3000bf60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:43.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.613+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7b18005320 con 0x7f7b3807d460 2026-03-10T08:52:43.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.623+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7b2805aad0 con 0x7f7b3807d460 2026-03-10T08:52:43.679 INFO:tasks.cephadm.ceph_manager.ceph:need seq 120259084293 got 120259084293 for osd.4 2026-03-10T08:52:43.679 DEBUG:teuthology.parallel:result is None 2026-03-10T08:52:43.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.690+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7f1cb404f2e0 con 0x7f1cb4103a50 2026-03-10T08:52:43.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.690+0000 7f1ca67fc700 1 -- 192.168.123.105:0/922580028 <== mon.1 v2:192.168.123.108:3300/0 7 ==== 
mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f1ca8027020 con 0x7f1cb4103a50 2026-03-10T08:52:43.691 INFO:teuthology.orchestra.run.vm05.stdout:73014444040 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.692+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1ca006c2e0 msgr2=0x7f1ca006e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.692+0000 7f1cba783700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1ca006c2e0 0x7f1ca006e7a0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f1cb0005fd0 tx=0x7f1cb0005ee0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 msgr2=0x7f1cb419ef50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 0x7f1cb419ef50 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f1ca80048f0 tx=0x7f1ca80049d0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 shutdown_connections 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1ca006c2e0 0x7f1ca006e7a0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.693 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1cb41030a0 0x7f1cb419ea10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 --2- 192.168.123.105:0/922580028 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1cb4103a50 0x7f1cb419ef50 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 >> 192.168.123.105:0/922580028 conn(0x7f1cb40fe930 msgr2=0x7f1cb4100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 shutdown_connections 2026-03-10T08:52:43.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.693+0000 7f1cba783700 1 -- 192.168.123.105:0/922580028 wait complete. 
2026-03-10T08:52:43.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.739+0000 7f6d340f2700 1 -- 192.168.123.105:0/1349080978 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f6d2c04ea90 con 0x7f6d2c103cd0 2026-03-10T08:52:43.739 INFO:teuthology.orchestra.run.vm05.stdout:137438953475 2026-03-10T08:52:43.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.739+0000 7f6d22ffd700 1 -- 192.168.123.105:0/1349080978 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f6d28092020 con 0x7f6d2c103cd0 2026-03-10T08:52:43.745 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444040 got 73014444040 for osd.2 2026-03-10T08:52:43.745 DEBUG:teuthology.parallel:result is None 2026-03-10T08:52:43.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.748+0000 7f6d20ff9700 1 -- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d1806c290 msgr2=0x7f6d1806e750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.748+0000 7f6d20ff9700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d1806c290 0x7f6d1806e750 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f6d1c0097b0 tx=0x7f6d1c006d20 comp rx=0 tx=0).stop 2026-03-10T08:52:43.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.748+0000 7f6d20ff9700 1 -- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 msgr2=0x7f6d2c199360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.748+0000 7f6d20ff9700 1 --2- 192.168.123.105:0/1349080978 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c199360 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f6d2800b5c0 tx=0x7f6d280052a0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.748+0000 7f6d20ff9700 1 -- 192.168.123.105:0/1349080978 shutdown_connections 2026-03-10T08:52:43.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.749+0000 7f6d20ff9700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d1806c290 0x7f6d1806e750 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.749+0000 7f6d20ff9700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d2c103320 0x7f6d2c198e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.749+0000 7f6d20ff9700 1 --2- 192.168.123.105:0/1349080978 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d2c103cd0 0x7f6d2c199360 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.749+0000 7f6d20ff9700 1 -- 192.168.123.105:0/1349080978 >> 192.168.123.105:0/1349080978 conn(0x7f6d2c0feb90 msgr2=0x7f6d2c100f60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.749+0000 7f6d20ff9700 1 -- 192.168.123.105:0/1349080978 shutdown_connections 2026-03-10T08:52:43.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.749+0000 7f6d20ff9700 1 -- 192.168.123.105:0/1349080978 wait complete. 
2026-03-10T08:52:43.793 INFO:tasks.cephadm.ceph_manager.ceph:need seq 137438953475 got 137438953475 for osd.5 2026-03-10T08:52:43.794 DEBUG:teuthology.parallel:result is None 2026-03-10T08:52:43.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.810+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f7b18005190 con 0x7f7b3807d460 2026-03-10T08:52:43.810 INFO:teuthology.orchestra.run.vm05.stdout:55834574858 2026-03-10T08:52:43.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.810+0000 7f7b277fe700 1 -- 192.168.123.105:0/2405637549 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f7b2801e070 con 0x7f7b3807d460 2026-03-10T08:52:43.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.812+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b2006c6d0 msgr2=0x7f7b2006eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.812+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b2006c6d0 0x7f7b2006eb90 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f7b3000bfd0 tx=0x7f7b3000bf60 comp rx=0 tx=0).stop 2026-03-10T08:52:43.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.812+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 msgr2=0x7f7b3807d8e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:43.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.812+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/2405637549 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 0x7f7b3807d8e0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f7b28011fc0 tx=0x7f7b2800f330 comp rx=0 tx=0).stop 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.812+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 shutdown_connections 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.813+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7b2006c6d0 0x7f7b2006eb90 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.813+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7b381089a0 0x7f7b3807cf20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.813+0000 7f7b3cd64700 1 --2- 192.168.123.105:0/2405637549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7b3807d460 0x7f7b3807d8e0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.813+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 >> 192.168.123.105:0/2405637549 conn(0x7f7b3806ce20 msgr2=0x7f7b38070600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.813+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 shutdown_connections 2026-03-10T08:52:43.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:43.813+0000 7f7b3cd64700 1 -- 192.168.123.105:0/2405637549 wait complete. 
2026-03-10T08:52:43.872 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574858 got 55834574858 for osd.1 2026-03-10T08:52:43.872 DEBUG:teuthology.parallel:result is None 2026-03-10T08:52:43.897 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.3 2026-03-10T08:52:44.049 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.276+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/1223688766 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 msgr2=0x7fbfdc105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.276+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/1223688766 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc105b50 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7fbfcc009b00 tx=0x7fbfcc009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.277+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/1223688766 shutdown_connections 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.277+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/1223688766 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc105b50 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.277+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/1223688766 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.278 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.277+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/1223688766 >> 192.168.123.105:0/1223688766 conn(0x7fbfdc075960 msgr2=0x7fbfdc075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/1223688766 shutdown_connections 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/1223688766 wait complete. 2026-03-10T08:52:44.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 Processor -- start 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 -- start start 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc19d350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc19d890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfdc19df70 con 0x7fbfdc0690e0 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.278+0000 7fbfe1aa7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfdc1a1d00 con 0x7fbfdc068730 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc19d350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc19d350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36316/0 (socket says 192.168.123.105:36316) 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 -- 192.168.123.105:0/70988631 learned_addr learned my addr 192.168.123.105:0/70988631 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdaffd700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc19d890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 -- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 msgr2=0x7fbfdc19d890 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc19d890 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 -- 
192.168.123.105:0/70988631 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbfcc0097e0 con 0x7fbfdc068730 2026-03-10T08:52:44.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdb7fe700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc19d350 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fbfc400b700 tx=0x7fbfc400bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:44.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfdaffd700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc19d890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T08:52:44.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.279+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbfc4010840 con 0x7fbfdc068730 2026-03-10T08:52:44.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.280+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbfdc1a1fe0 con 0x7fbfdc068730 2026-03-10T08:52:44.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.280+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbfdc1a2500 con 0x7fbfdc068730 2026-03-10T08:52:44.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.280+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbfc4010e80 con 0x7fbfdc068730 
2026-03-10T08:52:44.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.280+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbfc400d590 con 0x7fbfdc068730 2026-03-10T08:52:44.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.281+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fbfc40109a0 con 0x7fbfdc068730 2026-03-10T08:52:44.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.281+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbfbc005320 con 0x7fbfdc068730 2026-03-10T08:52:44.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.281+0000 7fbfd8ff9700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbfc806c490 0x7fbfc806e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:44.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.281+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fbfc408aee0 con 0x7fbfdc068730 2026-03-10T08:52:44.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.282+0000 7fbfdaffd700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbfc806c490 0x7fbfc806e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:44.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.282+0000 7fbfdaffd700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7fbfc806c490 0x7fbfc806e950 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fbfcc009ad0 tx=0x7fbfcc005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:44.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.284+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbfc4055a20 con 0x7fbfdc068730 2026-03-10T08:52:44.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:44 vm05 ceph-mon[49713]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:44.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:44 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/3622137274' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T08:52:44.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:44 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/922580028' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T08:52:44.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:44 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1349080978' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T08:52:44.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:44 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/2405637549' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T08:52:44.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.388+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7fbfbc005190 con 0x7fbfdc068730 2026-03-10T08:52:44.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.389+0000 7fbfd8ff9700 1 -- 192.168.123.105:0/70988631 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fbfc4059040 con 0x7fbfdc068730 2026-03-10T08:52:44.390 INFO:teuthology.orchestra.run.vm05.stdout:98784247815 2026-03-10T08:52:44.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.392+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbfc806c490 msgr2=0x7fbfc806e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.392+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbfc806c490 0x7fbfc806e950 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fbfcc009ad0 tx=0x7fbfcc005fb0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.392+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 msgr2=0x7fbfdc19d350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.392+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc19d350 secure :-1 s=READY pgs=42 cs=0 l=1 
rev1=1 crypto rx=0x7fbfc400b700 tx=0x7fbfc400bac0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 shutdown_connections 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbfc806c490 0x7fbfc806e950 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbfdc068730 0x7fbfdc19d350 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 --2- 192.168.123.105:0/70988631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbfdc0690e0 0x7fbfdc19d890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 >> 192.168.123.105:0/70988631 conn(0x7fbfdc075960 msgr2=0x7fbfdc0feb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 shutdown_connections 2026-03-10T08:52:44.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.393+0000 7fbfe1aa7700 1 -- 192.168.123.105:0/70988631 wait complete. 
2026-03-10T08:52:44.424 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd last-stat-seq osd.0 2026-03-10T08:52:44.465 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247815 for osd.3 2026-03-10T08:52:44.465 DEBUG:teuthology.parallel:result is None 2026-03-10T08:52:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:44 vm08 ceph-mon[57559]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:44 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/3622137274' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T08:52:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:44 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/922580028' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T08:52:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:44 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/1349080978' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T08:52:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:44 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/2405637549' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T08:52:44.575 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.814+0000 7fe8e03ee700 1 -- 192.168.123.105:0/672908256 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8102db0 msgr2=0x7fe8d8103190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.814+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/672908256 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8102db0 0x7fe8d8103190 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fe8c8009b50 tx=0x7fe8c8009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.814+0000 7fe8e03ee700 1 -- 192.168.123.105:0/672908256 shutdown_connections 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.814+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/672908256 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8069180 0x7fe8d8069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.814+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/672908256 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8102db0 0x7fe8d8103190 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.814+0000 7fe8e03ee700 1 -- 192.168.123.105:0/672908256 >> 192.168.123.105:0/672908256 conn(0x7fe8d8076b70 msgr2=0x7fe8d8076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:44.815 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.815+0000 7fe8e03ee700 1 -- 192.168.123.105:0/672908256 shutdown_connections 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.815+0000 7fe8e03ee700 1 -- 192.168.123.105:0/672908256 wait complete. 2026-03-10T08:52:44.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.815+0000 7fe8e03ee700 1 Processor -- start 2026-03-10T08:52:44.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.815+0000 7fe8e03ee700 1 -- start start 2026-03-10T08:52:44.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.815+0000 7fe8e03ee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8069180 0x7fe8d8113b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:44.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.815+0000 7fe8e03ee700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 0x7fe8d81140c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:44.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8e03ee700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8d810fd80 con 0x7fe8d8069180 2026-03-10T08:52:44.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8e03ee700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8d810fef0 con 0x7fe8d8102db0 2026-03-10T08:52:44.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 0x7fe8d81140c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:44.816 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 0x7fe8d81140c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36338/0 (socket says 192.168.123.105:36338) 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 -- 192.168.123.105:0/2204444963 learned_addr learned my addr 192.168.123.105:0/2204444963 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 -- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8069180 msgr2=0x7fe8d8113b80 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8069180 0x7fe8d8113b80 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 -- 192.168.123.105:0/2204444963 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8c80097e0 con 0x7fe8d8102db0 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8dd989700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 0x7fe8d81140c0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fe8d400eb10 tx=0x7fe8d400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:44.817 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.816+0000 7fe8cf7fe700 1 -- 192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe8d400cca0 con 0x7fe8d8102db0 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.817+0000 7fe8cf7fe700 1 -- 192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe8d400ce00 con 0x7fe8d8102db0 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.817+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe8d8110150 con 0x7fe8d8102db0 2026-03-10T08:52:44.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.817+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe8d81106a0 con 0x7fe8d8102db0 2026-03-10T08:52:44.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.817+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe8d804ea90 con 0x7fe8d8102db0 2026-03-10T08:52:44.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.817+0000 7fe8cf7fe700 1 -- 192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe8d40105e0 con 0x7fe8d8102db0 2026-03-10T08:52:44.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.818+0000 7fe8cf7fe700 1 -- 192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe8d40107d0 con 0x7fe8d8102db0 2026-03-10T08:52:44.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.819+0000 7fe8cf7fe700 1 --2- 
192.168.123.105:0/2204444963 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe8c406c4f0 0x7fe8c406e9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:44.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.819+0000 7fe8cf7fe700 1 -- 192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe8d4014070 con 0x7fe8d8102db0 2026-03-10T08:52:44.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.819+0000 7fe8de18a700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe8c406c4f0 0x7fe8c406e9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:44.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.819+0000 7fe8de18a700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe8c406c4f0 0x7fe8c406e9b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fe8c8000c00 tx=0x7fe8c8005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:44.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.820+0000 7fe8cf7fe700 1 -- 192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe8d4056c10 con 0x7fe8d8102db0 2026-03-10T08:52:44.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.922+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7fe8d81102e0 con 0x7fe8d8102db0 2026-03-10T08:52:44.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.923+0000 7fe8cf7fe700 1 -- 
192.168.123.105:0/2204444963 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fe8d81102e0 con 0x7fe8d8102db0 2026-03-10T08:52:44.924 INFO:teuthology.orchestra.run.vm05.stdout:38654705676 2026-03-10T08:52:44.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.926+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe8c406c4f0 msgr2=0x7fe8c406e9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.926+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe8c406c4f0 0x7fe8c406e9b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fe8c8000c00 tx=0x7fe8c8005fb0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.927+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 msgr2=0x7fe8d81140c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:44.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.927+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 0x7fe8d81140c0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fe8d400eb10 tx=0x7fe8d400eed0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.927+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 shutdown_connections 2026-03-10T08:52:44.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.927+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe8c406c4f0 0x7fe8c406e9b0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.927+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8d8069180 0x7fe8d8113b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.927+0000 7fe8e03ee700 1 --2- 192.168.123.105:0/2204444963 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe8d8102db0 0x7fe8d81140c0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:44.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.928+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 >> 192.168.123.105:0/2204444963 conn(0x7fe8d8076b70 msgr2=0x7fe8d80fde50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:44.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.928+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 shutdown_connections 2026-03-10T08:52:44.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:44.928+0000 7fe8e03ee700 1 -- 192.168.123.105:0/2204444963 wait complete. 
2026-03-10T08:52:44.993 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705676 got 38654705676 for osd.0 2026-03-10T08:52:44.993 DEBUG:teuthology.parallel:result is None 2026-03-10T08:52:44.993 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-10T08:52:44.993 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph pg dump --format=json 2026-03-10T08:52:45.129 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:45.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.348+0000 7f86a9a18700 1 -- 192.168.123.105:0/2681665777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 msgr2=0x7f86a4107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.348+0000 7f86a9a18700 1 --2- 192.168.123.105:0/2681665777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a4107dc0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f8694009b00 tx=0x7f8694009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.349+0000 7f86a9a18700 1 -- 192.168.123.105:0/2681665777 shutdown_connections 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.349+0000 7f86a9a18700 1 --2- 192.168.123.105:0/2681665777 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a4107dc0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.349+0000 7f86a9a18700 1 --2- 192.168.123.105:0/2681665777 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 0x7f86a41037a0 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.349+0000 7f86a9a18700 1 -- 192.168.123.105:0/2681665777 >> 192.168.123.105:0/2681665777 conn(0x7f86a40fec30 msgr2=0x7f86a4101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.349+0000 7f86a9a18700 1 -- 192.168.123.105:0/2681665777 shutdown_connections 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 -- 192.168.123.105:0/2681665777 wait complete. 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 Processor -- start 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 -- start start 2026-03-10T08:52:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 0x7f86a4198e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a41993c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86a4199aa0 con 0x7f86a4103d70 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.350+0000 7f86a9a18700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86a419d830 con 0x7f86a41033c0 2026-03-10T08:52:45.351 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a41993c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a41993c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35326/0 (socket says 192.168.123.105:35326) 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 -- 192.168.123.105:0/344116804 learned_addr learned my addr 192.168.123.105:0/344116804 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a2ffd700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 0x7f86a4198e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 -- 192.168.123.105:0/344116804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 msgr2=0x7f86a4198e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 0x7f86a4198e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.351 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 -- 192.168.123.105:0/344116804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f86940097e0 con 0x7f86a4103d70 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a2ffd700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 0x7f86a4198e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:52:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a27fc700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a41993c0 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f8694004930 tx=0x7f8694004a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:45.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f869401d070 con 0x7f86a4103d70 2026-03-10T08:52:45.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f869400bc50 con 0x7f86a4103d70 2026-03-10T08:52:45.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f869400f830 con 0x7f86a4103d70 2026-03-10T08:52:45.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f86a419dab0 con 0x7f86a4103d70 2026-03-10T08:52:45.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.351+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f86a419dfa0 con 0x7f86a4103d70 2026-03-10T08:52:45.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.352+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f86a404ea90 con 0x7f86a4103d70 2026-03-10T08:52:45.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.353+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f869400f990 con 0x7f86a4103d70 2026-03-10T08:52:45.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.354+0000 7f86a8a16700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f869006c5b0 0x7f869006ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:45.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.354+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f869408d6a0 con 0x7f86a4103d70 2026-03-10T08:52:45.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.356+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f86940582e0 con 0x7f86a4103d70 2026-03-10T08:52:45.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.356+0000 7f86a2ffd700 1 --2- 
192.168.123.105:0/344116804 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f869006c5b0 0x7f869006ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:45.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.356+0000 7f86a2ffd700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f869006c5b0 0x7f869006ea70 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f868c005fd0 tx=0x7f868c005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:45.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.457+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f86a419e280 con 0x7f869006c5b0 2026-03-10T08:52:45.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:45 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/70988631' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T08:52:45.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:45 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/2204444963' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T08:52:45.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.458+0000 7f86a8a16700 1 -- 192.168.123.105:0/344116804 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19094 (secure 0 0 0) 0x7f86a419e280 con 0x7f869006c5b0 2026-03-10T08:52:45.460 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:45.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.462+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f869006c5b0 msgr2=0x7f869006ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.462+0000 7f86a9a18700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f869006c5b0 0x7f869006ea70 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f868c005fd0 tx=0x7f868c005dc0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.462+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 msgr2=0x7f86a41993c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.462+0000 7f86a9a18700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a41993c0 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f8694004930 tx=0x7f8694004a10 comp rx=0 tx=0).stop 2026-03-10T08:52:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.462+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 shutdown_connections 2026-03-10T08:52:45.463 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.463+0000 7f86a9a18700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f869006c5b0 0x7f869006ea70 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.463+0000 7f86a9a18700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86a41033c0 0x7f86a4198e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.463+0000 7f86a9a18700 1 --2- 192.168.123.105:0/344116804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86a4103d70 0x7f86a41993c0 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.463+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 >> 192.168.123.105:0/344116804 conn(0x7f86a40fec30 msgr2=0x7f86a41001d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.463+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 shutdown_connections 2026-03-10T08:52:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.463+0000 7f86a9a18700 1 -- 192.168.123.105:0/344116804 wait complete. 
2026-03-10T08:52:45.464 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-10T08:52:45.524 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":66,"stamp":"2026-03-10T08:52:45.238001+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163640,"kb_used_data":3080,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640904,"statfs":{"total":128823853056,"available":128656285696,"internally_reserved":0,"allocated":3153920,"data_stored":2044515,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0
,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001499"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":138,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-10T08:52:35.235106+0000","last_change":"2026-03-10T08:52:26.394079+0000","last_active":"2026-03-10T08:52:35.235106+0000","last_peered":"2026-03-10T08:52:35.235106+0000","last_clean":"2026-03-10T08:52:35.235106+0000","last_became_active":"2026-03-10T08:52:26.393899+0000","last_became_peered":"2026-03-10T08:52:26.393899+0000","last_unstale":"2026-03-10T08:52:35.235106+0000","last_undegraded":"2026-03-10T08:52:35.235106+0000","last_fullsiz
ed":"2026-03-10T08:52:35.235106+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T08:52:09.423405+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T08:52:09.423405+0000","last_clean_scrub_stamp":"2026-03-10T08:52:09.423405+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T15:57:58.709097+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_ob
ject_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":111158,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61099999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48499999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.73899999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.752}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.73199999999999998}]}]},{"osd":4,"up_from":28,"seq":120259084293,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":111158,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.25}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.313}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.28199999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42099999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.997}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27496,"kb_used_data":736,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939928,"statfs":{"total":21470642176,"available":21442486272,"internally_reserved":0,"allocated":753664,"data_stored":570165,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66600000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67400000000000004}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60199999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32900000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48799999999999999}]}]},{"osd":2,"up_from":17,"seq":73014444041,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":111158,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47699999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34399999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48599999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39100000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38300000000000001}]}]},{"osd":0,"up_from":9,"seq":38654705676,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570438,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59399999999999997}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.30199999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60599999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64600000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61499999999999999}]}]},{"osd":1,"up_from":13,"seq":55834574859,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570438,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.434}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.379}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46999999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45300000000000001}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-10T08:52:45.524 
DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph pg dump --format=json 2026-03-10T08:52:45.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:45 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/70988631' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T08:52:45.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:45 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2204444963' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T08:52:45.656 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.864+0000 7f549f63b700 1 -- 192.168.123.105:0/347156556 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5498101a80 msgr2=0x7f5498105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.864+0000 7f549f63b700 1 --2- 192.168.123.105:0/347156556 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5498101a80 0x7f5498105ad0 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f5494009b50 tx=0x7f5494009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 -- 192.168.123.105:0/347156556 shutdown_connections 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 --2- 192.168.123.105:0/347156556 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5498101a80 0x7f5498105ad0 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.865 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 --2- 192.168.123.105:0/347156556 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f54981010d0 0x7f54981014b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 -- 192.168.123.105:0/347156556 >> 192.168.123.105:0/347156556 conn(0x7f54980fc920 msgr2=0x7f54980fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 -- 192.168.123.105:0/347156556 shutdown_connections 2026-03-10T08:52:45.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 -- 192.168.123.105:0/347156556 wait complete. 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.865+0000 7f549f63b700 1 Processor -- start 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549f63b700 1 -- start start 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549f63b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 0x7f5498073090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549f63b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5498101a80 0x7f54980735d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549f63b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5498073b10 con 0x7f54981010d0 2026-03-10T08:52:45.866 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549f63b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5498073c50 con 0x7f5498101a80 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549d3d7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 0x7f5498073090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549d3d7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 0x7f5498073090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35348/0 (socket says 192.168.123.105:35348) 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549d3d7700 1 -- 192.168.123.105:0/1798061116 learned_addr learned my addr 192.168.123.105:0/1798061116 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:45.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.866+0000 7f549cbd6700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5498101a80 0x7f54980735d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:45.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f549d3d7700 1 -- 192.168.123.105:0/1798061116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5498101a80 msgr2=0x7f54980735d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f549d3d7700 1 --2- 
192.168.123.105:0/1798061116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5498101a80 0x7f54980735d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f549d3d7700 1 -- 192.168.123.105:0/1798061116 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54940097e0 con 0x7f54981010d0 2026-03-10T08:52:45.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f549d3d7700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 0x7f5498073090 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f548800eb10 tx=0x7f548800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f548800cca0 con 0x7f54981010d0 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f548800ce00 con 0x7f54981010d0 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5488018a30 con 0x7f54981010d0 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f54981a2d80 con 0x7f54981010d0 2026-03-10T08:52:45.869 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.867+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f54981a3190 con 0x7f54981010d0 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.868+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5488018b90 con 0x7f54981010d0 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.869+0000 7f548e7fc700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f548406c600 0x7f548406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.869+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5488014070 con 0x7f54981010d0 2026-03-10T08:52:45.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.869+0000 7f549cbd6700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f548406c600 0x7f548406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:45.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.869+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f54981094d0 con 0x7f54981010d0 2026-03-10T08:52:45.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.872+0000 7f549cbd6700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f548406c600 0x7f548406eac0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f5494005950 tx=0x7f54940058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:45.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.872+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5488057a90 con 0x7f54981010d0 2026-03-10T08:52:45.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.971+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f54980745c0 con 0x7f548406c600 2026-03-10T08:52:45.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.974+0000 7f548e7fc700 1 -- 192.168.123.105:0/1798061116 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19094 (secure 0 0 0) 0x7f54980745c0 con 0x7f548406c600 2026-03-10T08:52:45.974 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f548406c600 msgr2=0x7f548406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f548406c600 0x7f548406eac0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f5494005950 tx=0x7f54940058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 -- 
192.168.123.105:0/1798061116 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 msgr2=0x7f5498073090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 0x7f5498073090 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f548800eb10 tx=0x7f548800eed0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 shutdown_connections 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f548406c600 0x7f548406eac0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f54981010d0 0x7f5498073090 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 --2- 192.168.123.105:0/1798061116 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5498101a80 0x7f54980735d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:45.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.976+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 >> 192.168.123.105:0/1798061116 conn(0x7f54980fc920 msgr2=0x7f54980fecd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:45.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.977+0000 
7f549f63b700 1 -- 192.168.123.105:0/1798061116 shutdown_connections 2026-03-10T08:52:45.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:45.977+0000 7f549f63b700 1 -- 192.168.123.105:0/1798061116 wait complete. 2026-03-10T08:52:45.977 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-10T08:52:46.037 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":66,"stamp":"2026-03-10T08:52:45.238001+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163640,"kb_used_data":3080,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640904,"statfs":{"total":128823853056,"available":128656285696,"internally_reserved":0,"allo
cated":3153920,"data_stored":2044515,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.001499"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":138,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-10T08:52:35.235106+0000","last_change":"2026-03-10T08:52:26.394079+0000","last_active":"2026-03-10T08:52:35.235106+0000","last_peered":"2026-03-10T08:52:35.235106+0000","last_clean":"2026-03-10T08:52:35.235106+0000"
,"last_became_active":"2026-03-10T08:52:26.393899+0000","last_became_peered":"2026-03-10T08:52:26.393899+0000","last_unstale":"2026-03-10T08:52:35.235106+0000","last_undegraded":"2026-03-10T08:52:35.235106+0000","last_fullsized":"2026-03-10T08:52:35.235106+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T08:52:09.423405+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T08:52:09.423405+0000","last_clean_scrub_stamp":"2026-03-10T08:52:09.423405+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T15:57:58.709097+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_n
o_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":111158,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_lat
ency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61099999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48499999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.73899999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.752}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.73199999999999998}]}]},{"osd":4,"up_from":28,"seq":120259084293,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":111158,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu 
Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.25}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.313}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.28199999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42099999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.997}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27496,"kb_used_data":736,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939928,"statfs":{"total":21470642176,"available":21442486272,"internally_reserved":0,"allocated":753664,"data_stored":570165,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66600000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67400000000000004}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60199999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32900000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48799999999999999}]}]},{"osd":2,"up_from":17,"seq":73014444041,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":111158,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47699999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34399999999999997}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48599999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39100000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38300000000000001}]}]},{"osd":0,"up_from":9,"seq":38654705676,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570438,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59399999999999997}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.30199999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60599999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64600000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61499999999999999}]}]},{"osd":1,"up_from":13,"seq":55834574859,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570438,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.434}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.379}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.46999999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45300000000000001}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-10T08:52:46.038 
INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-10T08:52:46.038 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-10T08:52:46.038 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-10T08:52:46.038 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph health --format=json 2026-03-10T08:52:46.169 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:46.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 -- 192.168.123.105:0/211729445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 msgr2=0x7f3ea8107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:46.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 --2- 192.168.123.105:0/211729445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8107d40 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f3e98009b50 tx=0x7f3e98009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:46.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 -- 192.168.123.105:0/211729445 shutdown_connections 2026-03-10T08:52:46.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 --2- 192.168.123.105:0/211729445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8107d40 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 --2- 192.168.123.105:0/211729445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ea8103340 0x7f3ea8103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 -- 192.168.123.105:0/211729445 >> 192.168.123.105:0/211729445 conn(0x7f3ea80feb90 msgr2=0x7f3ea8100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 -- 192.168.123.105:0/211729445 shutdown_connections 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.387+0000 7f3eae282700 1 -- 192.168.123.105:0/211729445 wait complete. 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3eae282700 1 Processor -- start 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3eae282700 1 -- start start 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3eae282700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ea8103340 0x7f3ea8198e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:46.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3eae282700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8199390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3eae282700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ea8199a70 con 0x7f3ea8103cf0 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3eae282700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ea819d800 con 0x7f3ea8103340 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3ea77fe700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8199390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3ea77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8199390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35374/0 (socket says 192.168.123.105:35374) 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.388+0000 7f3ea77fe700 1 -- 192.168.123.105:0/2567836443 learned_addr learned my addr 192.168.123.105:0/2567836443 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea77fe700 1 -- 192.168.123.105:0/2567836443 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ea8103340 msgr2=0x7f3ea8198e50 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea77fe700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ea8103340 0x7f3ea8198e50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea77fe700 1 -- 192.168.123.105:0/2567836443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e980097e0 con 0x7f3ea8103cf0 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea77fe700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 
0x7f3ea8199390 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f3e98004cb0 tx=0x7f3e98005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:46.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e9801d070 con 0x7f3ea8103cf0 2026-03-10T08:52:46.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3e98022470 con 0x7f3ea8103cf0 2026-03-10T08:52:46.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e9800f780 con 0x7f3ea8103cf0 2026-03-10T08:52:46.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3ea819da80 con 0x7f3ea8103cf0 2026-03-10T08:52:46.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.389+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3ea819df70 con 0x7f3ea8103cf0 2026-03-10T08:52:46.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.390+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3ea810b740 con 0x7f3ea8103cf0 2026-03-10T08:52:46.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.391+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3e98022a60 con 0x7f3ea8103cf0 2026-03-10T08:52:46.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.391+0000 7f3ea57fa700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e9406c4e0 0x7f3e9406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:46.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.391+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f3e9808cb20 con 0x7f3ea8103cf0 2026-03-10T08:52:46.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.393+0000 7f3ea7fff700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e9406c4e0 0x7f3e9406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:46.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.393+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3e98057800 con 0x7f3ea8103cf0 2026-03-10T08:52:46.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.394+0000 7f3ea7fff700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e9406c4e0 0x7f3e9406e9a0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3e9000ba10 tx=0x7f3e9000b3f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:46.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:46 vm05 ceph-mon[49713]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB 
avail 2026-03-10T08:52:46.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:46 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:46.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:46 vm05 ceph-mon[49713]: from='client.14436 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T08:52:46.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.518+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7f3ea819a1b0 con 0x7f3ea8103cf0 2026-03-10T08:52:46.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.521+0000 7f3ea57fa700 1 -- 192.168.123.105:0/2567836443 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7f3e98027080 con 0x7f3ea8103cf0 2026-03-10T08:52:46.521 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:46.521 INFO:teuthology.orchestra.run.vm05.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-10T08:52:46.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e9406c4e0 msgr2=0x7f3e9406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:46.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e9406c4e0 0x7f3e9406e9a0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3e9000ba10 tx=0x7f3e9000b3f0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 
7f3eae282700 1 -- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 msgr2=0x7f3ea8199390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8199390 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f3e98004cb0 tx=0x7f3e98005dc0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 shutdown_connections 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e9406c4e0 0x7f3e9406e9a0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ea8103340 0x7f3ea8198e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 --2- 192.168.123.105:0/2567836443 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ea8103cf0 0x7f3ea8199390 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.523+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 >> 192.168.123.105:0/2567836443 conn(0x7f3ea80feb90 msgr2=0x7f3ea8100fa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:46.524 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.524+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 shutdown_connections 2026-03-10T08:52:46.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.524+0000 7f3eae282700 1 -- 192.168.123.105:0/2567836443 wait complete. 2026-03-10T08:52:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:46 vm08 ceph-mon[57559]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:46 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:52:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:46 vm08 ceph-mon[57559]: from='client.14436 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T08:52:46.563 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-10T08:52:46.563 INFO:tasks.cephadm:Setup complete, yielding 2026-03-10T08:52:46.563 INFO:teuthology.run_tasks:Running task print... 2026-03-10T08:52:46.565 INFO:teuthology.task.print:**** done end installing v18.2.1 cephadm ... 2026-03-10T08:52:46.565 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T08:52:46.567 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:46.567 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-10T08:52:46.701 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:46.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.922+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/1275188881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8102060 msgr2=0x7f4eb81024e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.922+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/1275188881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8102060 0x7f4eb81024e0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f4eac009b00 tx=0x7f4eac009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.923+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/1275188881 shutdown_connections 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.923+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/1275188881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8102060 0x7f4eb81024e0 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.923+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/1275188881 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4eb8100f00 0x7f4eb8101320 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.923+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/1275188881 >> 192.168.123.105:0/1275188881 conn(0x7f4eb80fc460 msgr2=0x7f4eb80fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.923+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/1275188881 shutdown_connections 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.923+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/1275188881 wait complete. 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebf1dc700 1 Processor -- start 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebf1dc700 1 -- start start 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebf1dc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 0x7f4eb8194670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebf1dc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4eb8102060 0x7f4eb8194bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebf1dc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4eb81951d0 con 0x7f4eb8100f00 2026-03-10T08:52:46.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebf1dc700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4eb8195310 con 0x7f4eb8102060 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebcf78700 1 
--2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 0x7f4eb8194670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4eb7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4eb8102060 0x7f4eb8194bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebcf78700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 0x7f4eb8194670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35404/0 (socket says 192.168.123.105:35404) 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.924+0000 7f4ebcf78700 1 -- 192.168.123.105:0/3766465032 learned_addr learned my addr 192.168.123.105:0/3766465032 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4ebcf78700 1 -- 192.168.123.105:0/3766465032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4eb8102060 msgr2=0x7f4eb8194bb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4ebcf78700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4eb8102060 0x7f4eb8194bb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4ebcf78700 1 -- 192.168.123.105:0/3766465032 
--> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4eac0097e0 con 0x7f4eb8100f00 2026-03-10T08:52:46.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4ebcf78700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 0x7f4eb8194670 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f4ea800ba70 tx=0x7f4ea800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:46.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4eb5ffb700 1 -- 192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ea800c700 con 0x7f4eb8100f00 2026-03-10T08:52:46.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4eb5ffb700 1 -- 192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4ea800cd40 con 0x7f4eb8100f00 2026-03-10T08:52:46.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4eb5ffb700 1 -- 192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ea8012340 con 0x7f4eb8100f00 2026-03-10T08:52:46.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4eb8199dc0 con 0x7f4eb8100f00 2026-03-10T08:52:46.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.925+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4eb819a2c0 con 0x7f4eb8100f00 2026-03-10T08:52:46.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.927+0000 7f4eb5ffb700 1 -- 
192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f4ea80124e0 con 0x7f4eb8100f00 2026-03-10T08:52:46.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.927+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4eb8066e80 con 0x7f4eb8100f00 2026-03-10T08:52:46.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.928+0000 7f4eb5ffb700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ea00708f0 0x7f4ea0072db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:46.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.928+0000 7f4eb5ffb700 1 -- 192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4ea808a420 con 0x7f4eb8100f00 2026-03-10T08:52:46.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.929+0000 7f4eb7fff700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ea00708f0 0x7f4ea0072db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:46.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.930+0000 7f4eb5ffb700 1 -- 192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4ea8059c70 con 0x7f4eb8100f00 2026-03-10T08:52:46.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:46.930+0000 7f4eb7fff700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ea00708f0 0x7f4ea0072db0 secure :-1 s=READY 
pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f4eac00b5c0 tx=0x7f4eac005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:47.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.034+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7f4eb819a910 con 0x7f4eb8100f00 2026-03-10T08:52:47.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.041+0000 7f4eb5ffb700 1 -- 192.168.123.105:0/3766465032 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7f4ea8059800 con 0x7f4eb8100f00 2026-03-10T08:52:47.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.044+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ea00708f0 msgr2=0x7f4ea0072db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:47.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.044+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ea00708f0 0x7f4ea0072db0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f4eac00b5c0 tx=0x7f4eac005fb0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.044+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 msgr2=0x7f4eb8194670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:47.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.044+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 0x7f4eb8194670 secure :-1 s=READY
pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f4ea800ba70 tx=0x7f4ea800bd80 comp rx=0 tx=0).stop 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 shutdown_connections 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4ea00708f0 0x7f4ea0072db0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4eb8100f00 0x7f4eb8194670 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 --2- 192.168.123.105:0/3766465032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4eb8102060 0x7f4eb8194bb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 >> 192.168.123.105:0/3766465032 conn(0x7f4eb80fc460 msgr2=0x7f4eb8105320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 shutdown_connections 2026-03-10T08:52:47.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.045+0000 7f4ebf1dc700 1 -- 192.168.123.105:0/3766465032 wait complete. 2026-03-10T08:52:47.110 INFO:teuthology.run_tasks:Running task print... 2026-03-10T08:52:47.112 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 
2026-03-10T08:52:47.112 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T08:52:47.114 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:47.114 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph orch status' 2026-03-10T08:52:47.260 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:47.296 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='client.14440 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.491+0000 7fe2ba32d700 1 -- 192.168.123.105:0/2437190273 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 msgr2=0x7fe2b41066b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.491+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/2437190273 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b41066b0 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7fe2a4009b50 tx=0x7fe2a4009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.492+0000 7fe2ba32d700 1 -- 192.168.123.105:0/2437190273 shutdown_connections 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.492+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/2437190273 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b41066b0 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.493 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.492+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/2437190273 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe2b4101990 0x7fe2b4103d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.492+0000 7fe2ba32d700 1 -- 192.168.123.105:0/2437190273 >> 192.168.123.105:0/2437190273 conn(0x7fe2b40fb360 msgr2=0x7fe2b40fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.492+0000 7fe2ba32d700 1 -- 192.168.123.105:0/2437190273 shutdown_connections 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.492+0000 7fe2ba32d700 1 -- 192.168.123.105:0/2437190273 wait complete. 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.493+0000 7fe2ba32d700 1 Processor -- start 2026-03-10T08:52:47.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.493+0000 7fe2ba32d700 1 -- start start 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.493+0000 7fe2ba32d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe2b4101990 0x7fe2b41012f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.493+0000 7fe2ba32d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b40ff940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.493+0000 7fe2ba32d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2b40ffe80 con 0x7fe2b41042c0 2026-03-10T08:52:47.494 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.493+0000 7fe2ba32d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2b40fffc0 con 0x7fe2b4101990 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b40ff940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe2b4101990 0x7fe2b41012f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b40ff940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35410/0 (socket says 192.168.123.105:35410) 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 -- 192.168.123.105:0/459454187 learned_addr learned my addr 192.168.123.105:0/459454187 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 -- 192.168.123.105:0/459454187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe2b4101990 msgr2=0x7fe2b41012f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 --2- 192.168.123.105:0/459454187 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe2b4101990 0x7fe2b41012f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 -- 192.168.123.105:0/459454187 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2a40097e0 con 0x7fe2b41042c0 2026-03-10T08:52:47.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.494+0000 7fe2b37fe700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b40ff940 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fe2a4004ce0 tx=0x7fe2a4005f00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:47.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.495+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2a401d070 con 0x7fe2b41042c0 2026-03-10T08:52:47.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.495+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe2b4100240 con 0x7fe2b41042c0 2026-03-10T08:52:47.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.495+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe2a400bc00 con 0x7fe2b41042c0 2026-03-10T08:52:47.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.495+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2a400f830 con 0x7fe2b41042c0 2026-03-10T08:52:47.496 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.495+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe2b4100730 con 0x7fe2b41042c0 2026-03-10T08:52:47.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.496+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe2a4022b50 con 0x7fe2b41042c0 2026-03-10T08:52:47.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.497+0000 7fe2b17fa700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe2a006c490 0x7fe2a006e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:47.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.497+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe2a405b970 con 0x7fe2b41042c0 2026-03-10T08:52:47.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.497+0000 7fe2b3fff700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe2a006c490 0x7fe2a006e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:47.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.498+0000 7fe2b3fff700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe2a006c490 0x7fe2a006e950 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fe29c005950 tx=0x7fe29c0058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:47.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.498+0000 7fe2ba32d700 1 
-- 192.168.123.105:0/459454187 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe2b41909e0 con 0x7fe2b41042c0 2026-03-10T08:52:47.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.501+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe2a4092050 con 0x7fe2b41042c0 2026-03-10T08:52:47.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='client.14440 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T08:52:47.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2567836443' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T08:52:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/3766465032' entity='client.admin' 2026-03-10T08:52:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:47 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:47.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.612+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe2b40611d0 con 0x7fe2a006c490 2026-03-10T08:52:47.612 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2567836443' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T08:52:47.612 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/3766465032' entity='client.admin' 2026-03-10T08:52:47.612 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:47.612 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:47.612 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:47.612 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:47 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:47.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.615+0000 7fe2b17fa700 1 -- 192.168.123.105:0/459454187 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7fe2b40611d0 con 0x7fe2a006c490 2026-03-10T08:52:47.616 INFO:teuthology.orchestra.run.vm05.stdout:Backend: cephadm 2026-03-10T08:52:47.616 INFO:teuthology.orchestra.run.vm05.stdout:Available: Yes 2026-03-10T08:52:47.616 INFO:teuthology.orchestra.run.vm05.stdout:Paused: No 2026-03-10T08:52:47.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe2a006c490 msgr2=0x7fe2a006e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:47.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe2a006c490 0x7fe2a006e950 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 
crypto rx=0x7fe29c005950 tx=0x7fe29c0058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 msgr2=0x7fe2b40ff940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:47.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b40ff940 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fe2a4004ce0 tx=0x7fe2a4005f00 comp rx=0 tx=0).stop 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 shutdown_connections 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe2a006c490 0x7fe2a006e950 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe2b4101990 0x7fe2b41012f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 --2- 192.168.123.105:0/459454187 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2b41042c0 0x7fe2b40ff940 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.618+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 >> 192.168.123.105:0/459454187 
conn(0x7fe2b40fb360 msgr2=0x7fe2b40fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.619+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 shutdown_connections 2026-03-10T08:52:47.619 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:47.619+0000 7fe2ba32d700 1 -- 192.168.123.105:0/459454187 wait complete. 2026-03-10T08:52:47.664 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph orch ps' 2026-03-10T08:52:47.803 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.033+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3094446231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 msgr2=0x7f1e38073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.033+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3094446231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e38073220 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7f1e20009b50 tx=0x7f1e20009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3094446231 shutdown_connections 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3094446231 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e380737f0 0x7f1e38073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.034 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3094446231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e38073220 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3094446231 >> 192.168.123.105:0/3094446231 conn(0x7f1e380fc460 msgr2=0x7f1e380fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3094446231 shutdown_connections 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3094446231 wait complete. 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 Processor -- start 2026-03-10T08:52:48.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- start start 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e380737f0 0x7f1e3819ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e3819d3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e3819d9e0 con 0x7f1e38074dc0 2026-03-10T08:52:48.035 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.034+0000 7f1e3d26b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e3819db20 con 0x7f1e380737f0 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e3819d3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e3819d3c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35438/0 (socket says 192.168.123.105:35438) 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 -- 192.168.123.105:0/3905973563 learned_addr learned my addr 192.168.123.105:0/3905973563 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 -- 192.168.123.105:0/3905973563 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e380737f0 msgr2=0x7f1e3819ce80 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e380737f0 0x7f1e3819ce80 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 -- 192.168.123.105:0/3905973563 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1e200097e0 con 0x7f1e38074dc0 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e367fc700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e3819d3c0 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f1e2800da40 tx=0x7f1e2800de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:48.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e2ffff700 1 -- 192.168.123.105:0/3905973563 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e280041d0 con 0x7f1e38074dc0 2026-03-10T08:52:48.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1e381a25d0 con 0x7f1e38074dc0 2026-03-10T08:52:48.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1e381a2b20 con 0x7f1e38074dc0 2026-03-10T08:52:48.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e2ffff700 1 -- 192.168.123.105:0/3905973563 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1e28009c70 con 0x7f1e38074dc0 2026-03-10T08:52:48.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.035+0000 7f1e2ffff700 1 -- 192.168.123.105:0/3905973563 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e28003e40 con 0x7f1e38074dc0 2026-03-10T08:52:48.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.036+0000 7f1e2ffff700 1 -- 
192.168.123.105:0/3905973563 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1e28010460 con 0x7f1e38074dc0 2026-03-10T08:52:48.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.037+0000 7f1e2ffff700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e2406c4e0 0x7f1e2406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:48.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.037+0000 7f1e2ffff700 1 -- 192.168.123.105:0/3905973563 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1e28021030 con 0x7f1e38074dc0 2026-03-10T08:52:48.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.037+0000 7f1e36ffd700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e2406c4e0 0x7f1e2406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:48.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.037+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1e18005320 con 0x7f1e38074dc0 2026-03-10T08:52:48.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.040+0000 7f1e2ffff700 1 -- 192.168.123.105:0/3905973563 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1e2805a7a0 con 0x7f1e38074dc0 2026-03-10T08:52:48.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.040+0000 7f1e36ffd700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e2406c4e0 0x7f1e2406e9a0 secure :-1 s=READY 
pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f1e2000b5c0 tx=0x7f1e20005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:48.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.149+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f1e18000bf0 con 0x7f1e2406c4e0 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.156+0000 7f1e2ffff700 1 -- 192.168.123.105:0/3905973563 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2660 (secure 0 0 0) 0x7f1e18000bf0 con 0x7f1e2406c4e0 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (73s) 43s ago 116s 19.3M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (2m) 43s ago 2m 7826k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (87s) 17s ago 87s 7964k - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 43s ago 2m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (87s) 17s ago 86s 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:52:48.156 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (71s) 43s ago 104s 77.0M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (2m) 43s ago 2m 489M - 18.2.1 5be31c24972a 
6ec0cdb38171 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (83s) 17s ago 83s 448M - 18.2.1 5be31c24972a 9cd801f2f7a7 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 43s ago 2m 44.3M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (82s) 17s ago 81s 44.3M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (119s) 43s ago 119s 11.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (84s) 17s ago 84s 12.0M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (64s) 43s ago 63s 40.4M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (54s) 43s ago 54s 37.2M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (44s) 43s ago 44s 14.2M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (36s) 17s ago 36s 42.3M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (26s) 17s ago 26s 39.8M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (18s) 17s ago 18s 17.6M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:52:48.157 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (66s) 43s ago 98s 34.4M - 2.43.0 a07b618ecd1d e84b76e5c1c0 2026-03-10T08:52:48.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 -- 
192.168.123.105:0/3905973563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e2406c4e0 msgr2=0x7f1e2406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e2406c4e0 0x7f1e2406e9a0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f1e2000b5c0 tx=0x7f1e20005fb0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 msgr2=0x7f1e3819d3c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e3819d3c0 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f1e2800da40 tx=0x7f1e2800de00 comp rx=0 tx=0).stop 2026-03-10T08:52:48.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 shutdown_connections 2026-03-10T08:52:48.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e2406c4e0 0x7f1e2406e9a0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e380737f0 0x7f1e3819ce80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.159 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 --2- 192.168.123.105:0/3905973563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e38074dc0 0x7f1e3819d3c0 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.158+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 >> 192.168.123.105:0/3905973563 conn(0x7f1e380fc460 msgr2=0x7f1e381028d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:48.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.159+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 shutdown_connections 2026-03-10T08:52:48.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.159+0000 7f1e3d26b700 1 -- 192.168.123.105:0/3905973563 wait complete. 2026-03-10T08:52:48.216 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph orch ls' 2026-03-10T08:52:48.356 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:48 vm05 ceph-mon[49713]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:48 vm05 ceph-mon[49713]: from='client.14452 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:48 vm08 ceph-mon[57559]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:48.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:48 vm08 ceph-mon[57559]: 
from='client.14452 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.604+0000 7f59e7afc700 1 -- 192.168.123.105:0/3848104748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e01009d0 msgr2=0x7f59e0100df0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.604+0000 7f59e7afc700 1 --2- 192.168.123.105:0/3848104748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e01009d0 0x7f59e0100df0 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f59d000ac10 tx=0x7f59d000af20 comp rx=0 tx=0).stop 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.605+0000 7f59e7afc700 1 -- 192.168.123.105:0/3848104748 shutdown_connections 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.605+0000 7f59e7afc700 1 --2- 192.168.123.105:0/3848104748 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e00ff4d0 0x7f59e00ff950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.605+0000 7f59e7afc700 1 --2- 192.168.123.105:0/3848104748 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e01009d0 0x7f59e0100df0 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.605+0000 7f59e7afc700 1 -- 192.168.123.105:0/3848104748 >> 192.168.123.105:0/3848104748 conn(0x7f59e00fa6d0 msgr2=0x7f59e00fcb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.605+0000 7f59e7afc700 1 -- 192.168.123.105:0/3848104748 shutdown_connections 2026-03-10T08:52:48.606 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.605+0000 7f59e7afc700 1 -- 192.168.123.105:0/3848104748 wait complete. 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e7afc700 1 Processor -- start 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e7afc700 1 -- start start 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e7afc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 0x7f59e0194650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e7afc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e01009d0 0x7f59e0194b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e7afc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f59e01951b0 con 0x7f59e00ff4d0 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e7afc700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f59e01952f0 con 0x7f59e01009d0 2026-03-10T08:52:48.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e5898700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 0x7f59e0194650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e5898700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 0x7f59e0194650 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:35460/0 (socket says 192.168.123.105:35460) 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e5898700 1 -- 192.168.123.105:0/1424941073 learned_addr learned my addr 192.168.123.105:0/1424941073 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e5898700 1 -- 192.168.123.105:0/1424941073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e01009d0 msgr2=0x7f59e0194b90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.606+0000 7f59e5097700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e01009d0 0x7f59e0194b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e5898700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e01009d0 0x7f59e0194b90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e5898700 1 -- 192.168.123.105:0/1424941073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f59d000a8f0 con 0x7f59e00ff4d0 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e5097700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e01009d0 0x7f59e0194b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:52:48.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e5898700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 0x7f59e0194650 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f59d0005950 tx=0x7f59d0004e80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f59d0005560 con 0x7f59e00ff4d0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f59d000ea10 con 0x7f59e00ff4d0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f59d0016c40 con 0x7f59e00ff4d0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f59e0199d40 con 0x7f59e00ff4d0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.607+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f59e019a2b0 con 0x7f59e00ff4d0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.609+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mon.0 v2:192.168.123.105:3300/0 4 ==== 
mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f59d0028400 con 0x7f59e00ff4d0 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.609+0000 7f59d6ffd700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59cc0709d0 0x7f59cc072e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:48.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.609+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f59d0012070 con 0x7f59e00ff4d0 2026-03-10T08:52:48.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.609+0000 7f59e5097700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59cc0709d0 0x7f59cc072e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:48.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.610+0000 7f59e5097700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59cc0709d0 0x7f59cc072e90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f59dc007900 tx=0x7f59dc008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:48.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.610+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f59e0066e80 con 0x7f59e00ff4d0 2026-03-10T08:52:48.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.613+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f59d005cdf0 con 0x7f59e00ff4d0 2026-03-10T08:52:48.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.722+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f59e019a590 con 0x7f59cc0709d0 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.725+0000 7f59d6ffd700 1 -- 192.168.123.105:0/1424941073 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7f59e019a590 con 0x7f59cc0709d0 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager ?:9093,9094 1/1 43s ago 2m count:1 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter 2/2 43s ago 2m * 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:crash 2/2 43s ago 2m * 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:grafana ?:3000 1/1 43s ago 2m count:1 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:mgr 2/2 43s ago 2m count:2 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:mon 2/2 43s ago 2m vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08;count:2 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter ?:9100 2/2 43s ago 2m * 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:osd 6 43s ago - 2026-03-10T08:52:48.725 INFO:teuthology.orchestra.run.vm05.stdout:prometheus ?:9095 1/1 43s ago 2m count:1 2026-03-10T08:52:48.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59cc0709d0 msgr2=0x7f59cc072e90 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59cc0709d0 0x7f59cc072e90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f59dc007900 tx=0x7f59dc008040 comp rx=0 tx=0).stop 2026-03-10T08:52:48.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 msgr2=0x7f59e0194650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:48.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 0x7f59e0194650 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f59d0005950 tx=0x7f59d0004e80 comp rx=0 tx=0).stop 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 shutdown_connections 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59cc0709d0 0x7f59cc072e90 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 --2- 192.168.123.105:0/1424941073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59e00ff4d0 0x7f59e0194650 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.727+0000 7f59e7afc700 1 --2- 192.168.123.105:0/1424941073 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f59e01009d0 0x7f59e0194b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.728+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 >> 192.168.123.105:0/1424941073 conn(0x7f59e00fa6d0 msgr2=0x7f59e0103a30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.728+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 shutdown_connections 2026-03-10T08:52:48.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:48.728+0000 7f59e7afc700 1 -- 192.168.123.105:0/1424941073 wait complete. 2026-03-10T08:52:48.787 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph orch host ls' 2026-03-10T08:52:48.927 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.158+0000 7f8e978a2700 1 -- 192.168.123.105:0/3956938288 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90104320 msgr2=0x7f8e90104780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.158+0000 7f8e978a2700 1 --2- 192.168.123.105:0/3956938288 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90104320 0x7f8e90104780 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7f8e8c009b50 tx=0x7f8e8c009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 -- 192.168.123.105:0/3956938288 shutdown_connections 2026-03-10T08:52:49.159 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 --2- 192.168.123.105:0/3956938288 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90104320 0x7f8e90104780 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 --2- 192.168.123.105:0/3956938288 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90103120 0x7f8e90103540 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 -- 192.168.123.105:0/3956938288 >> 192.168.123.105:0/3956938288 conn(0x7f8e900fe6c0 msgr2=0x7f8e90100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 -- 192.168.123.105:0/3956938288 shutdown_connections 2026-03-10T08:52:49.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 -- 192.168.123.105:0/3956938288 wait complete. 
2026-03-10T08:52:49.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.159+0000 7f8e978a2700 1 Processor -- start 2026-03-10T08:52:49.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e978a2700 1 -- start start 2026-03-10T08:52:49.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e978a2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 0x7f8e9010fbe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:49.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e978a2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 0x7f8e90113180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:49.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e978a2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e90110630 con 0x7f8e90103120 2026-03-10T08:52:49.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e978a2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e901107a0 con 0x7f8e90110120 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e9563e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 0x7f8e9010fbe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e9563e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 0x7f8e9010fbe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:35472/0 (socket says 192.168.123.105:35472) 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.160+0000 7f8e94e3d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 0x7f8e90113180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.161+0000 7f8e94e3d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 0x7f8e90113180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36472/0 (socket says 192.168.123.105:36472) 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.161+0000 7f8e94e3d700 1 -- 192.168.123.105:0/1596197150 learned_addr learned my addr 192.168.123.105:0/1596197150 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.161+0000 7f8e94e3d700 1 -- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 msgr2=0x7f8e9010fbe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.161+0000 7f8e94e3d700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 0x7f8e9010fbe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.161+0000 7f8e94e3d700 1 -- 192.168.123.105:0/1596197150 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8e8c0097e0 con 0x7f8e90110120 2026-03-10T08:52:49.162 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.162+0000 7f8e9563e700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 0x7f8e9010fbe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T08:52:49.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.162+0000 7f8e94e3d700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 0x7f8e90113180 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f8e8c006010 tx=0x7f8e8c005710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:49.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.162+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e8c01d070 con 0x7f8e90110120 2026-03-10T08:52:49.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.162+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e901136c0 con 0x7f8e90110120 2026-03-10T08:52:49.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.162+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e90113b50 con 0x7f8e90110120 2026-03-10T08:52:49.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.163+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8e8c004500 con 0x7f8e90110120 2026-03-10T08:52:49.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.163+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 
0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e8c00f460 con 0x7f8e90110120 2026-03-10T08:52:49.164 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.164+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8e74005320 con 0x7f8e90110120 2026-03-10T08:52:49.164 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.164+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8e8c00bc30 con 0x7f8e90110120 2026-03-10T08:52:49.164 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.164+0000 7f8e867fc700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e7c06c550 0x7f8e7c06ea10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:49.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.165+0000 7f8e9563e700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e7c06c550 0x7f8e7c06ea10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:49.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.165+0000 7f8e9563e700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e7c06c550 0x7f8e7c06ea10 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f8e80006fd0 tx=0x7f8e80009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:49.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.165+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f8e8c08ca80 con 
0x7f8e90110120 2026-03-10T08:52:49.167 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.167+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8e8c05b0d0 con 0x7f8e90110120 2026-03-10T08:52:49.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.272+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f8e74000bf0 con 0x7f8e7c06c550 2026-03-10T08:52:49.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.275+0000 7f8e867fc700 1 -- 192.168.123.105:0/1596197150 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f8e74000bf0 con 0x7f8e7c06c550 2026-03-10T08:52:49.275 INFO:teuthology.orchestra.run.vm05.stdout:HOST ADDR LABELS STATUS 2026-03-10T08:52:49.275 INFO:teuthology.orchestra.run.vm05.stdout:vm05 192.168.123.105 2026-03-10T08:52:49.275 INFO:teuthology.orchestra.run.vm05.stdout:vm08 192.168.123.108 2026-03-10T08:52:49.275 INFO:teuthology.orchestra.run.vm05.stdout:2 hosts in cluster 2026-03-10T08:52:49.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e7c06c550 msgr2=0x7f8e7c06ea10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e7c06c550 0x7f8e7c06ea10 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f8e80006fd0 tx=0x7f8e80009380 comp rx=0 tx=0).stop 2026-03-10T08:52:49.277 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 msgr2=0x7f8e90113180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 0x7f8e90113180 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f8e8c006010 tx=0x7f8e8c005710 comp rx=0 tx=0).stop 2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 shutdown_connections 2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e7c06c550 0x7f8e7c06ea10 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e90103120 0x7f8e9010fbe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.277+0000 7f8e978a2700 1 --2- 192.168.123.105:0/1596197150 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e90110120 0x7f8e90113180 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.278+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 >> 192.168.123.105:0/1596197150 conn(0x7f8e900fe6c0 msgr2=0x7f8e90107550 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.278+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 shutdown_connections 2026-03-10T08:52:49.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.278+0000 7f8e978a2700 1 -- 192.168.123.105:0/1596197150 wait complete. 2026-03-10T08:52:49.322 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph orch device ls' 2026-03-10T08:52:49.465 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:49.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:49 vm05 ceph-mon[49713]: from='client.14456 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:49.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:49 vm05 ceph-mon[49713]: from='client.14460 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.733+0000 7f51efc05700 1 -- 192.168.123.105:0/4135799817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8104350 msgr2=0x7f51e81047b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.733+0000 7f51efc05700 1 --2- 192.168.123.105:0/4135799817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8104350 0x7f51e81047b0 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f51e4009b50 tx=0x7f51e4009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 -- 192.168.123.105:0/4135799817 shutdown_connections 2026-03-10T08:52:49.734 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 --2- 192.168.123.105:0/4135799817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8104350 0x7f51e81047b0 unknown :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 --2- 192.168.123.105:0/4135799817 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51e8103150 0x7f51e8103570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 -- 192.168.123.105:0/4135799817 >> 192.168.123.105:0/4135799817 conn(0x7f51e80fe6d0 msgr2=0x7f51e8100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 -- 192.168.123.105:0/4135799817 shutdown_connections 2026-03-10T08:52:49.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 -- 192.168.123.105:0/4135799817 wait complete. 
2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 Processor -- start 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.734+0000 7f51efc05700 1 -- start start 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51efc05700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 0x7f51e8198a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51efc05700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51e8104350 0x7f51e8198f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51efc05700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51e81995b0 con 0x7f51e8103150 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51efc05700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51e81996f0 con 0x7f51e8104350 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51ed9a1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 0x7f51e8198a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51ed9a1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 0x7f51e8198a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:59036/0 (socket says 192.168.123.105:59036) 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51ed9a1700 1 -- 192.168.123.105:0/362407860 learned_addr learned my addr 192.168.123.105:0/362407860 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51ed9a1700 1 -- 192.168.123.105:0/362407860 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51e8104350 msgr2=0x7f51e8198f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51ed9a1700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51e8104350 0x7f51e8198f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.735+0000 7f51ed9a1700 1 -- 192.168.123.105:0/362407860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f51e40097e0 con 0x7f51e8103150 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.736+0000 7f51ed9a1700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 0x7f51e8198a50 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f51d8009fd0 tx=0x7f51d800eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.736+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f51d8009980 con 0x7f51e8103150 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.736+0000 7f51efc05700 1 -- 
192.168.123.105:0/362407860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f51e819e1a0 con 0x7f51e8103150 2026-03-10T08:52:49.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.736+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f51e819e6f0 con 0x7f51e8103150 2026-03-10T08:52:49.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.736+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f51d8004500 con 0x7f51e8103150 2026-03-10T08:52:49.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.736+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f51d8010450 con 0x7f51e8103150 2026-03-10T08:52:49.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.738+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f51d80106f0 con 0x7f51e8103150 2026-03-10T08:52:49.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.738+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f51cc005320 con 0x7f51e8103150 2026-03-10T08:52:49.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.740+0000 7f51deffd700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f51d406c5b0 0x7f51d406ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:49.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.740+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f51d8014070 con 0x7f51e8103150 2026-03-10T08:52:49.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.740+0000 7f51ed1a0700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f51d406c5b0 0x7f51d406ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:49.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.741+0000 7f51ed1a0700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f51d406c5b0 0x7f51d406ea70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f51e4009b20 tx=0x7f51e4005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:49.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.741+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f51d8059f90 con 0x7f51e8103150 2026-03-10T08:52:49.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:49 vm08 ceph-mon[57559]: from='client.14456 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:49.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:49 vm08 ceph-mon[57559]: from='client.14460 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:49.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.848+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f51cc000bf0 con 0x7f51d406c5b0 2026-03-10T08:52:49.853 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.852+0000 7f51deffd700 1 -- 192.168.123.105:0/362407860 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1278 (secure 0 0 0) 0x7f51cc000bf0 con 0x7f51d406c5b0 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vdb hdd DWNBRSTVMM05001 20.0G Yes 43s ago 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vdc hdd DWNBRSTVMM05002 20.0G No 43s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vdd hdd DWNBRSTVMM05003 20.0G No 43s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vde hdd DWNBRSTVMM05004 20.0G No 43s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vdb hdd DWNBRSTVMM08001 20.0G Yes 17s ago 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vdc hdd DWNBRSTVMM08002 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vdd hdd DWNBRSTVMM08003 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T08:52:49.853 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vde hdd DWNBRSTVMM08004 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.854+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f51d406c5b0 msgr2=0x7f51d406ea70 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.854+0000 7f51efc05700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f51d406c5b0 0x7f51d406ea70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f51e4009b20 tx=0x7f51e4005fb0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 msgr2=0x7f51e8198a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 0x7f51e8198a50 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f51d8009fd0 tx=0x7f51d800eea0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 shutdown_connections 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f51d406c5b0 0x7f51d406ea70 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 --2- 192.168.123.105:0/362407860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51e8103150 0x7f51e8198a50 unknown :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 --2- 192.168.123.105:0/362407860 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51e8104350 0x7f51e8198f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 >> 192.168.123.105:0/362407860 conn(0x7f51e80fe6d0 msgr2=0x7f51e8107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 shutdown_connections 2026-03-10T08:52:49.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:49.855+0000 7f51efc05700 1 -- 192.168.123.105:0/362407860 wait complete. 2026-03-10T08:52:49.896 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T08:52:49.898 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:49.898 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-10T08:52:50.059 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.325+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4147719865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c074dc0 msgr2=0x7f1d5c073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.325+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4147719865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c074dc0 0x7f1d5c073220 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f1d48009b50 tx=0x7f1d48009e60 comp rx=0 
tx=0).stop 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.326+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4147719865 shutdown_connections 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.326+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4147719865 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c0737f0 0x7f1d5c073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.326+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4147719865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c074dc0 0x7f1d5c073220 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.326+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4147719865 >> 192.168.123.105:0/4147719865 conn(0x7f1d5c0fc370 msgr2=0x7f1d5c0fe7d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.326+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4147719865 shutdown_connections 2026-03-10T08:52:50.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.326+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4147719865 wait complete. 
2026-03-10T08:52:50.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d62bf6700 1 Processor -- start 2026-03-10T08:52:50.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d62bf6700 1 -- start start 2026-03-10T08:52:50.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d62bf6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c0737f0 0x7f1d5c198b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:50.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d62bf6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 0x7f1d5c19e120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:50.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d62bf6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d5c1995c0 con 0x7f1d5c0737f0 2026-03-10T08:52:50.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d62bf6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d5c199730 con 0x7f1d5c1990b0 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d5bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 0x7f1d5c19e120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d5bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 0x7f1d5c19e120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:34166/0 (socket says 192.168.123.105:34166) 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.327+0000 7f1d5bfff700 1 -- 192.168.123.105:0/4130088105 learned_addr learned my addr 192.168.123.105:0/4130088105 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d60992700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c0737f0 0x7f1d5c198b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d5bfff700 1 -- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c0737f0 msgr2=0x7f1d5c198b70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d5bfff700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c0737f0 0x7f1d5c198b70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d5bfff700 1 -- 192.168.123.105:0/4130088105 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1d480097e0 con 0x7f1d5c1990b0 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d60992700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c0737f0 0x7f1d5c198b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d5bfff700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 0x7f1d5c19e120 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f1d5000d900 tx=0x7f1d5000dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d500041d0 con 0x7f1d5c1990b0 2026-03-10T08:52:50.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1d5c19e6c0 con 0x7f1d5c1990b0 2026-03-10T08:52:50.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1d50004330 con 0x7f1d5c1990b0 2026-03-10T08:52:50.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d50003d40 con 0x7f1d5c1990b0 2026-03-10T08:52:50.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.328+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1d5c19ec10 con 0x7f1d5c1990b0 2026-03-10T08:52:50.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.329+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f1d5c04ea90 con 0x7f1d5c1990b0 2026-03-10T08:52:50.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.330+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1d50009730 con 0x7f1d5c1990b0 2026-03-10T08:52:50.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.330+0000 7f1d59ffb700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1d4c06c4e0 0x7f1d4c06e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:50.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.330+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1d5008ab00 con 0x7f1d5c1990b0 2026-03-10T08:52:50.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.330+0000 7f1d60992700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1d4c06c4e0 0x7f1d4c06e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:50.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.330+0000 7f1d60992700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1d4c06c4e0 0x7f1d4c06e9a0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f1d48005340 tx=0x7f1d480058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:50.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.332+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 
0 0) 0x7f1d500591d0 con 0x7f1d5c1990b0 2026-03-10T08:52:50.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:50 vm05 ceph-mon[49713]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:50.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:50 vm05 ceph-mon[49713]: from='client.24275 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:50.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:50.447+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f1d5c1082f0 con 0x7f1d4c06c4e0 2026-03-10T08:52:50.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:50 vm08 ceph-mon[57559]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:50.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:50 vm08 ceph-mon[57559]: from='client.24275 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:51.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:51 vm05 ceph-mon[49713]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:51.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:51 vm05 ceph-mon[49713]: from='client.24281 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:51.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:51 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T08:52:51.802 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:51 vm08 ceph-mon[57559]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:51.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:51 vm08 ceph-mon[57559]: from='client.24281 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:52:51.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:51 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T08:52:52.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.382+0000 7f1d59ffb700 1 -- 192.168.123.105:0/4130088105 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f1d5c1082f0 con 0x7f1d4c06c4e0 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.384+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1d4c06c4e0 msgr2=0x7f1d4c06e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.384+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1d4c06c4e0 0x7f1d4c06e9a0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f1d48005340 tx=0x7f1d480058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.384+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 msgr2=0x7f1d5c19e120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:52.385 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.384+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 0x7f1d5c19e120 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f1d5000d900 tx=0x7f1d5000dcc0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 shutdown_connections 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1d4c06c4e0 0x7f1d4c06e9a0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1d5c0737f0 0x7f1d5c198b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 --2- 192.168.123.105:0/4130088105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1d5c1990b0 0x7f1d5c19e120 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 >> 192.168.123.105:0/4130088105 conn(0x7f1d5c0fc370 msgr2=0x7f1d5c106bd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 shutdown_connections 2026-03-10T08:52:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.385+0000 7f1d62bf6700 1 -- 192.168.123.105:0/4130088105 wait 
complete. 2026-03-10T08:52:52.430 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs dump' 2026-03-10T08:52:52.582 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:52.609 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:52 vm05 ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[49709]: 2026-03-10T08:52:52.345+0000 7f33541f8700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T08:52:52.610 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:52 vm05 ceph-mon[49713]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:52.610 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T08:52:52.610 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:52 vm05 ceph-mon[49713]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T08:52:52.610 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:52 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T08:52:52.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:52 vm08 ceph-mon[57559]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:52.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T08:52:52.803 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:52 vm08 ceph-mon[57559]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T08:52:52.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:52 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T08:52:52.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 -- 192.168.123.105:0/58987490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9694072b20 msgr2=0x7f9694072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:52.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 --2- 192.168.123.105:0/58987490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9694072b20 0x7f9694072f40 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7f968c00b600 tx=0x7f968c00b910 comp rx=0 tx=0).stop 2026-03-10T08:52:52.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 -- 192.168.123.105:0/58987490 shutdown_connections 2026-03-10T08:52:52.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 --2- 192.168.123.105:0/58987490 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 0x7f9694077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 --2- 192.168.123.105:0/58987490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9694072b20 0x7f9694072f40 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 -- 192.168.123.105:0/58987490 >> 192.168.123.105:0/58987490 conn(0x7f969406daa0 
msgr2=0x7f969406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 -- 192.168.123.105:0/58987490 shutdown_connections 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 -- 192.168.123.105:0/58987490 wait complete. 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.847+0000 7f969a5cf700 1 Processor -- start 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.848+0000 7f969a5cf700 1 -- start start 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.848+0000 7f969a5cf700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 0x7f9694083180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.848+0000 7f969a5cf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96940836c0 0x7f96941b3240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.848+0000 7f969a5cf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9694083b40 con 0x7f96940836c0 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.848+0000 7f969a5cf700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9694083cb0 con 0x7f9694075a10 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.855+0000 7f9698dcc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96940836c0 0x7f96941b3240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.859+0000 7f9698dcc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96940836c0 0x7f96941b3240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59058/0 (socket says 192.168.123.105:59058) 2026-03-10T08:52:52.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.859+0000 7f9698dcc700 1 -- 192.168.123.105:0/3931770003 learned_addr learned my addr 192.168.123.105:0/3931770003 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:52.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.862+0000 7f96995cd700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 0x7f9694083180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:52.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.866+0000 7f96995cd700 1 -- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96940836c0 msgr2=0x7f96941b3240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:52.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.866+0000 7f96995cd700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96940836c0 0x7f96941b3240 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:52.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.866+0000 7f96995cd700 1 -- 192.168.123.105:0/3931770003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f968c00b050 con 0x7f9694075a10 2026-03-10T08:52:52.867 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.867+0000 7f96995cd700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 0x7f9694083180 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f968c000f80 tx=0x7f968c0077f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:52.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.867+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f968c00e030 con 0x7f9694075a10 2026-03-10T08:52:52.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.867+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96941b3780 con 0x7f9694075a10 2026-03-10T08:52:52.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.867+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96941b3ca0 con 0x7f9694075a10 2026-03-10T08:52:52.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.867+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f968c007e90 con 0x7f9694075a10 2026-03-10T08:52:52.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.867+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f968c004440 con 0x7f9694075a10 2026-03-10T08:52:52.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.869+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f968c023070 con 0x7f9694075a10 
2026-03-10T08:52:52.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.871+0000 7f968a7fc700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f968006c530 0x7f968006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:52.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.871+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(36..36 src has 1..36) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f968c0918a0 con 0x7f9694075a10 2026-03-10T08:52:52.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.871+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9678005320 con 0x7f9694075a10 2026-03-10T08:52:52.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.875+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f968c05fbb0 con 0x7f9694075a10 2026-03-10T08:52:52.883 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.883+0000 7f9698dcc700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f968006c530 0x7f968006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:52.883 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:52.883+0000 7f9698dcc700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f968006c530 0x7f968006e9f0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f9690009f10 tx=0x7f9690009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:52:53.038 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.036+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9678006200 con 0x7f9694075a10 2026-03-10T08:52:53.038 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.038+0000 7f968a7fc700 1 -- 192.168.123.105:0/3931770003 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7f968c017020 con 0x7f9694075a10 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:e2 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:52:53.039 INFO:teuthology.orchestra.run.vm05.stdout:epoch 2 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:52:52.346311+0000 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:root 0 
2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 0 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:in 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:up {} 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 0 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:53.040 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:53.044 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f968006c530 msgr2=0x7f968006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f968006c530 0x7f968006e9f0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f9690009f10 tx=0x7f9690009450 comp rx=0 tx=0).stop 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 msgr2=0x7f9694083180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 0x7f9694083180 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f968c000f80 tx=0x7f968c0077f0 comp rx=0 tx=0).stop 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 shutdown_connections 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f968006c530 0x7f968006e9f0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9694075a10 0x7f9694083180 unknown :-1 s=CLOSED pgs=46 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 --2- 192.168.123.105:0/3931770003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96940836c0 0x7f96941b3240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 >> 192.168.123.105:0/3931770003 conn(0x7f969406daa0 msgr2=0x7f969406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 shutdown_connections 2026-03-10T08:52:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.044+0000 7f969a5cf700 1 -- 192.168.123.105:0/3931770003 wait complete. 2026-03-10T08:52:53.045 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 2 2026-03-10T08:52:53.230 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T08:52:53.233 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:53.233 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs set cephfs max_mds 1' 2026-03-10T08:52:53.430 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': 
finished 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: fsmap cephfs:0 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: Saving service mds.cephfs spec with placement count:4 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:53.458 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:53.459 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: Deploying daemon mds.cephfs.vm05.bxdvbu on vm05 2026-03-10T08:52:53.459 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:53 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/3931770003' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.690+0000 7fd8fff0c700 1 -- 192.168.123.105:0/4077636087 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8104380 msgr2=0x7fd8f81047e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.690+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/4077636087 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8104380 0x7fd8f81047e0 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7fd8f4009b50 tx=0x7fd8f4009e60 comp rx=0 tx=0).stop 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.691+0000 7fd8fff0c700 1 -- 192.168.123.105:0/4077636087 shutdown_connections 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.691+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/4077636087 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8104380 0x7fd8f81047e0 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.691+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/4077636087 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8f8103180 0x7fd8f81035a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.691+0000 7fd8fff0c700 1 -- 192.168.123.105:0/4077636087 >> 192.168.123.105:0/4077636087 conn(0x7fd8f80fe720 msgr2=0x7fd8f8100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.691+0000 7fd8fff0c700 1 -- 192.168.123.105:0/4077636087 shutdown_connections 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.691+0000 7fd8fff0c700 1 -- 192.168.123.105:0/4077636087 wait complete. 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fff0c700 1 Processor -- start 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fff0c700 1 -- start start 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fff0c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 0x7fd8f81989e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fff0c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8f8104380 0x7fd8f8198f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fff0c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8f8199540 con 0x7fd8f8103180 2026-03-10T08:52:53.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fff0c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8f8199680 con 0x7fd8f8104380 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fdca8700 1 
--2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 0x7fd8f81989e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fdca8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 0x7fd8f81989e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59092/0 (socket says 192.168.123.105:59092) 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.692+0000 7fd8fdca8700 1 -- 192.168.123.105:0/573135389 learned_addr learned my addr 192.168.123.105:0/573135389 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fdca8700 1 -- 192.168.123.105:0/573135389 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8f8104380 msgr2=0x7fd8f8198f20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fd4a7700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8f8104380 0x7fd8f8198f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fdca8700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8f8104380 0x7fd8f8198f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fdca8700 1 -- 
192.168.123.105:0/573135389 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8f40097e0 con 0x7fd8f8103180 2026-03-10T08:52:53.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fdca8700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 0x7fd8f81989e0 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7fd8e8009fd0 tx=0x7fd8e800eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8e8009980 con 0x7fd8f8103180 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd8e8004500 con 0x7fd8f8103180 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8e8010450 con 0x7fd8f8103180 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8f819e130 con 0x7fd8f8103180 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.693+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8f80754b0 con 0x7fd8f8103180 2026-03-10T08:52:53.695 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.694+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd8e8010630 con 0x7fd8f8103180 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.695+0000 7fd8eeffd700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd8e406c600 0x7fd8e406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.695+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd8e8014070 con 0x7fd8f8103180 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.695+0000 7fd8fd4a7700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd8e406c600 0x7fd8e406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:53.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.695+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd8f8066e80 con 0x7fd8f8103180 2026-03-10T08:52:53.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.698+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd8e805aff0 con 0x7fd8f8103180 2026-03-10T08:52:53.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.698+0000 7fd8fd4a7700 1 --2- 192.168.123.105:0/573135389 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd8e406c600 0x7fd8e406eac0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fd8f4000c00 tx=0x7fd8f4005660 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: fsmap cephfs:0 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: Saving service mds.cephfs spec with 
placement count:4 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:53.743 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:53.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:52:53 vm08 ceph-mon[57559]: Deploying daemon mds.cephfs.vm05.bxdvbu on vm05 2026-03-10T08:52:53.744 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:53 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/3931770003' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:52:53.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:53.826+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"} v 0) v1 -- 0x7fd8f8075cf0 con 0x7fd8f8103180 2026-03-10T08:52:54.372 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: pgmap v73: 65 pgs: 12 creating+peering, 52 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:54.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.374+0000 7fd8eeffd700 1 -- 192.168.123.105:0/573135389 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7fd8e805ab80 con 0x7fd8f8103180 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd8e406c600 msgr2=0x7fd8e406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd8e406c600 0x7fd8e406eac0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fd8f4000c00 tx=0x7fd8f4005660 comp rx=0 tx=0).stop 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 msgr2=0x7fd8f81989e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 0x7fd8f81989e0 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7fd8e8009fd0 tx=0x7fd8e800eea0 comp rx=0 tx=0).stop 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 shutdown_connections 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd8e406c600 0x7fd8e406eac0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8f8103180 0x7fd8f81989e0 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 --2- 192.168.123.105:0/573135389 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8f8104380 0x7fd8f8198f20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 >> 192.168.123.105:0/573135389 conn(0x7fd8f80fe720 msgr2=0x7fd8f81075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:54.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 -- 
192.168.123.105:0/573135389 shutdown_connections 2026-03-10T08:52:54.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:54.376+0000 7fd8fff0c700 1 -- 192.168.123.105:0/573135389 wait complete. 2026-03-10T08:52:54.453 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T08:52:54.456 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:54.456 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs set cephfs allow_standby_replay false' 2026-03-10T08:52:54.622 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: pgmap v73: 65 pgs: 12 creating+peering, 52 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 
192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: Deploying daemon mds.cephfs.vm08.xfzrbx on vm08 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/573135389' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", 
"allow"]}]': finished 2026-03-10T08:52:54.623 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:54 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:54.628 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:52:54 vm08 ceph-mon[57559]: Deploying daemon mds.cephfs.vm08.xfzrbx on vm08 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/573135389' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:54 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:55.155 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.154+0000 7f89c2a0a700 1 -- 192.168.123.105:0/3889556128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 msgr2=0x7f89bc10a1c0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:55.155 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.154+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/3889556128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc10a1c0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f89b400b210 tx=0x7f89b400b520 comp rx=0 tx=0).stop 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.154+0000 7f89c2a0a700 1 -- 192.168.123.105:0/3889556128 shutdown_connections 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.154+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/3889556128 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89bc10a700 0x7f89bc10cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.154+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/3889556128 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc10a1c0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.154+0000 7f89c2a0a700 1 -- 192.168.123.105:0/3889556128 >> 192.168.123.105:0/3889556128 conn(0x7f89bc06dae0 msgr2=0x7f89bc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.155+0000 7f89c2a0a700 1 -- 192.168.123.105:0/3889556128 shutdown_connections 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.155+0000 7f89c2a0a700 1 -- 192.168.123.105:0/3889556128 wait complete. 
2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.155+0000 7f89c2a0a700 1 Processor -- start 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.155+0000 7f89c2a0a700 1 -- start start 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89c2a0a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc1a53b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89c2a0a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89bc10a700 0x7f89bc1a58f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89c2a0a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89bc1a5f10 con 0x7f89bc10a700 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89c2a0a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89bc1a6050 con 0x7f89bc107d90 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc1a53b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc1a53b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:34252/0 (socket says 192.168.123.105:34252) 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 -- 192.168.123.105:0/2203940864 learned_addr learned my addr 192.168.123.105:0/2203940864 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 -- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89bc10a700 msgr2=0x7f89bc1a58f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89bc10a700 0x7f89bc1a58f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 -- 192.168.123.105:0/2203940864 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89b4009e30 con 0x7f89bc107d90 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89bbfff700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc1a53b0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f89b4000f80 tx=0x7f89b4003ce0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89b400e070 con 0x7f89bc107d90 2026-03-10T08:52:55.156 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89c2a0a700 1 -- 
192.168.123.105:0/2203940864 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89bc1aaaa0 con 0x7f89bc107d90 2026-03-10T08:52:55.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89bc1aaff0 con 0x7f89bc107d90 2026-03-10T08:52:55.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f89b4007c90 con 0x7f89bc107d90 2026-03-10T08:52:55.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.156+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89b4012b60 con 0x7f89bc107d90 2026-03-10T08:52:55.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.158+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f89b4019040 con 0x7f89bc107d90 2026-03-10T08:52:55.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.158+0000 7f89b97fa700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89a406c530 0x7f89a406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:55.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.158+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f89b408cc90 con 0x7f89bc107d90 2026-03-10T08:52:55.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.158+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f89a8005320 con 0x7f89bc107d90 2026-03-10T08:52:55.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.160+0000 7f89bb7fe700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89a406c530 0x7f89a406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:55.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.160+0000 7f89bb7fe700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89a406c530 0x7f89a406e9f0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f89b0005950 tx=0x7f89b00058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:55.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.161+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f89b405afa0 con 0x7f89bc107d90 2026-03-10T08:52:55.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.314+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"} v 0) v1 -- 0x7f89a8005f70 con 0x7f89bc107d90 2026-03-10T08:52:55.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.391+0000 7f89b97fa700 1 -- 192.168.123.105:0/2203940864 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]=0 v5) v1 ==== 122+0+0 (secure 0 0 0) 0x7f89b40290c0 con 0x7f89bc107d90 2026-03-10T08:52:55.394 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89a406c530 msgr2=0x7f89a406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89a406c530 0x7f89a406e9f0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f89b0005950 tx=0x7f89b00058e0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 msgr2=0x7f89bc1a53b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc1a53b0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f89b4000f80 tx=0x7f89b4003ce0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 shutdown_connections 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89a406c530 0x7f89a406e9f0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89bc107d90 0x7f89bc1a53b0 unknown :-1 s=CLOSED pgs=51 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 --2- 192.168.123.105:0/2203940864 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89bc10a700 0x7f89bc1a58f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 >> 192.168.123.105:0/2203940864 conn(0x7f89bc06dae0 msgr2=0x7f89bc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 shutdown_connections 2026-03-10T08:52:55.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.394+0000 7f89c2a0a700 1 -- 192.168.123.105:0/2203940864 wait complete. 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: Deploying daemon mds.cephfs.vm05.slhztf on vm05 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:boot 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/573135389' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: mds.? 
[v2:192.168.123.108:6824/1416297612,v1:192.168.123.108:6825/1416297612] up:boot 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: daemon mds.cephfs.vm08.xfzrbx assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T08:52:55.432 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: Cluster is now healthy 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: fsmap cephfs:0 2 up:standby 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:creating} 1 up:standby 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: daemon mds.cephfs.vm08.xfzrbx is now active in filesystem cephfs as rank 0 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2203940864' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T08:52:55.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:55 vm05 ceph-mon[49713]: from='client.? ' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T08:52:55.463 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T08:52:55.465 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:55.466 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs set cephfs inline_data false' 2026-03-10T08:52:55.613 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:55.624 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: Deploying daemon mds.cephfs.vm05.slhztf on vm05 2026-03-10T08:52:55.624 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T08:52:55.624 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:boot 2026-03-10T08:52:55.624 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/573135389' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: mds.? 
[v2:192.168.123.108:6824/1416297612,v1:192.168.123.108:6825/1416297612] up:boot 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: daemon mds.cephfs.vm08.xfzrbx assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: Cluster is now healthy 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: fsmap cephfs:0 2 up:standby 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:creating} 1 up:standby 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: daemon mds.cephfs.vm08.xfzrbx is now active in filesystem cephfs as rank 0 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2203940864' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T08:52:55.625 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:55 vm08 ceph-mon[57559]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T08:52:55.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.857+0000 7fd018aba700 1 -- 192.168.123.105:0/3611819842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 msgr2=0x7fd0141047b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:55.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.857+0000 7fd018aba700 1 --2- 192.168.123.105:0/3611819842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd0141047b0 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7fd004009b00 tx=0x7fd004009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.858+0000 7fd018aba700 1 -- 192.168.123.105:0/3611819842 shutdown_connections 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.858+0000 7fd018aba700 1 --2- 192.168.123.105:0/3611819842 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd0141047b0 unknown :-1 s=CLOSED pgs=256 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.858+0000 7fd018aba700 1 --2- 192.168.123.105:0/3611819842 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd014103150 0x7fd014103570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.858+0000 7fd018aba700 1 -- 192.168.123.105:0/3611819842 >> 192.168.123.105:0/3611819842 conn(0x7fd0140fe6d0 msgr2=0x7fd014100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.858+0000 7fd018aba700 1 -- 192.168.123.105:0/3611819842 shutdown_connections 
2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.858+0000 7fd018aba700 1 -- 192.168.123.105:0/3611819842 wait complete. 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd018aba700 1 Processor -- start 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd018aba700 1 -- start start 2026-03-10T08:52:55.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd018aba700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd014103150 0x7fd014198a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd018aba700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd014198fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd018aba700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0141995d0 con 0x7fd014104350 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd018aba700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd014199710 con 0x7fd014103150 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd011d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd014198fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd011d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 
0x7fd014198fb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59120/0 (socket says 192.168.123.105:59120) 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd011d9b700 1 -- 192.168.123.105:0/2915338198 learned_addr learned my addr 192.168.123.105:0/2915338198 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd011d9b700 1 -- 192.168.123.105:0/2915338198 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd014103150 msgr2=0x7fd014198a70 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd011d9b700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd014103150 0x7fd014198a70 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.859+0000 7fd011d9b700 1 -- 192.168.123.105:0/2915338198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0040097e0 con 0x7fd014104350 2026-03-10T08:52:55.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.860+0000 7fd011d9b700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd014198fb0 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7fd004009ad0 tx=0x7fd004005070 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:55.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.860+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7fd00401d070 con 0x7fd014104350 2026-03-10T08:52:55.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.860+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd00400bc50 con 0x7fd014104350 2026-03-10T08:52:55.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.860+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd00400f7f0 con 0x7fd014104350 2026-03-10T08:52:55.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.860+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd01419e160 con 0x7fd014104350 2026-03-10T08:52:55.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.860+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd01419e650 con 0x7fd014104350 2026-03-10T08:52:55.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.861+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd014066e80 con 0x7fd014104350 2026-03-10T08:52:55.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.863+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd00400f950 con 0x7fd014104350 2026-03-10T08:52:55.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.864+0000 7fd00b7fe700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd00006c600 0x7fd00006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T08:52:55.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.864+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd00408db70 con 0x7fd014104350 2026-03-10T08:52:55.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.864+0000 7fd01259c700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd00006c600 0x7fd00006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:55.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.864+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd00405c090 con 0x7fd014104350 2026-03-10T08:52:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.864+0000 7fd01259c700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd00006c600 0x7fd00006eac0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fd0141041b0 tx=0x7fcffc006cb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:55.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:55.990+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"} v 0) v1 -- 0x7fd01419e930 con 0x7fd014104350 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: pgmap v76: 65 pgs: 12 creating+peering, 31 unknown, 22 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:56.395 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: Deploying daemon mds.cephfs.vm08.ssijow on vm08 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.108:6824/1416297612,v1:192.168.123.108:6825/1416297612] up:active 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:boot 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 2 up:standby 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 2 up:standby 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/2915338198' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:56.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:56 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.398+0000 7fd00b7fe700 1 -- 192.168.123.105:0/2915338198 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]=0 inline data disabled v7) v1 ==== 133+0+0 (secure 0 0 0) 0x7fd00405bc20 con 0x7fd014104350 2026-03-10T08:52:56.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd00006c600 msgr2=0x7fd00006eac0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:56.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd00006c600 0x7fd00006eac0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fd0141041b0 tx=0x7fcffc006cb0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 msgr2=0x7fd014198fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:56.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd014198fb0 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7fd004009ad0 tx=0x7fd004005070 comp rx=0 tx=0).stop 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 shutdown_connections 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd00006c600 0x7fd00006eac0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 --2- 192.168.123.105:0/2915338198 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd014103150 0x7fd014198a70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 --2- 192.168.123.105:0/2915338198 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd014104350 0x7fd014198fb0 unknown :-1 s=CLOSED pgs=257 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 >> 192.168.123.105:0/2915338198 conn(0x7fd0140fe6d0 msgr2=0x7fd014107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 shutdown_connections 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.401+0000 7fd018aba700 1 -- 192.168.123.105:0/2915338198 wait complete. 2026-03-10T08:52:56.402 INFO:teuthology.orchestra.run.vm05.stderr:inline data disabled 2026-03-10T08:52:56.459 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T08:52:56.462 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:52:56.462 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs dump' 2026-03-10T08:52:56.663 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: pgmap v76: 65 pgs: 12 creating+peering, 31 unknown, 22 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: Deploying daemon mds.cephfs.vm08.ssijow on vm08 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: mds.? 
[v2:192.168.123.108:6824/1416297612,v1:192.168.123.108:6825/1416297612] up:active 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: mds.? [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:boot 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 2 up:standby 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 2 up:standby 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/2915338198' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:56.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:56 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:56.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.969+0000 7fb777d6c700 1 -- 192.168.123.105:0/3429977057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 msgr2=0x7fb77010cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:56.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.969+0000 7fb777d6c700 1 --2- 192.168.123.105:0/3429977057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb77010cb90 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb76c009a60 tx=0x7fb76c009d70 comp rx=0 
tx=0).stop 2026-03-10T08:52:56.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.973+0000 7fb777d6c700 1 -- 192.168.123.105:0/3429977057 shutdown_connections 2026-03-10T08:52:56.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.973+0000 7fb777d6c700 1 --2- 192.168.123.105:0/3429977057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb77010cb90 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.973+0000 7fb777d6c700 1 --2- 192.168.123.105:0/3429977057 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb770107d90 0x7fb77010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.973+0000 7fb777d6c700 1 -- 192.168.123.105:0/3429977057 >> 192.168.123.105:0/3429977057 conn(0x7fb77006dda0 msgr2=0x7fb770070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:56.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 -- 192.168.123.105:0/3429977057 shutdown_connections 2026-03-10T08:52:56.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 -- 192.168.123.105:0/3429977057 wait complete. 
2026-03-10T08:52:56.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 Processor -- start 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 -- start start 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb770107d90 0x7fb7701adde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb7701ae320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7701ae940 con 0x7fb770107d90 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.975+0000 7fb777d6c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7701aea80 con 0x7fb77010a700 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb7701ae320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb7701ae320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:34296/0 (socket says 192.168.123.105:34296) 2026-03-10T08:52:56.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 -- 192.168.123.105:0/4170363900 learned_addr learned my addr 192.168.123.105:0/4170363900 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775b08700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb770107d90 0x7fb7701adde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 -- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb770107d90 msgr2=0x7fb7701adde0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb770107d90 0x7fb7701adde0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 -- 192.168.123.105:0/4170363900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb76c009710 con 0x7fb77010a700 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb775307700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb7701ae320 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb76c00f690 tx=0x7fb76c00f6c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb76c01d070 con 0x7fb77010a700 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb777d6c700 1 -- 192.168.123.105:0/4170363900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7701b34d0 con 0x7fb77010a700 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb777d6c700 1 -- 192.168.123.105:0/4170363900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb7701b39c0 con 0x7fb77010a700 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.976+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb76c00fcd0 con 0x7fb77010a700 2026-03-10T08:52:56.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.977+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb76c0175e0 con 0x7fb77010a700 2026-03-10T08:52:56.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.978+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb76c022470 con 0x7fb77010a700 2026-03-10T08:52:56.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.978+0000 7fb766ffd700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb75c06c2e0 0x7fb75c06e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:56.978 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.978+0000 7fb775b08700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb75c06c2e0 0x7fb75c06e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:56.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.978+0000 7fb775b08700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb75c06c2e0 0x7fb75c06e7a0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fb760005950 tx=0x7fb76000b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:56.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.978+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb76c08ccd0 con 0x7fb77010a700 2026-03-10T08:52:56.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.979+0000 7fb777d6c700 1 -- 192.168.123.105:0/4170363900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb754005320 con 0x7fb77010a700 2026-03-10T08:52:56.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:56.982+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb76c05b090 con 0x7fb77010a700 2026-03-10T08:52:57.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.125+0000 7fb777d6c700 1 -- 192.168.123.105:0/4170363900 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb754006200 con 0x7fb77010a700 2026-03-10T08:52:57.127 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.126+0000 7fb766ffd700 1 -- 192.168.123.105:0/4170363900 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 7 v7) v1 ==== 75+0+1772 (secure 0 0 0) 0x7fb76c027020 con 0x7fb77010a700 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:e7 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:epoch 7 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:52:56.395786+0000 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:52:57.128 
INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 0 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24295} 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{0:24295} state up:active seq 2 addr [v2:192.168.123.108:6824/1416297612,v1:192.168.123.108:6825/1416297612] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:52:57.128 
INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{-1:24289} state up:standby seq 1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:52:57.128 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 -- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb75c06c2e0 msgr2=0x7fb75c06e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb75c06c2e0 0x7fb75c06e7a0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fb760005950 tx=0x7fb76000b410 comp rx=0 tx=0).stop 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 -- 192.168.123.105:0/4170363900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 msgr2=0x7fb7701ae320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb7701ae320 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb76c00f690 tx=0x7fb76c00f6c0 comp rx=0 
tx=0).stop 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 -- 192.168.123.105:0/4170363900 shutdown_connections 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb75c06c2e0 0x7fb75c06e7a0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb770107d90 0x7fb7701adde0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 --2- 192.168.123.105:0/4170363900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb77010a700 0x7fb7701ae320 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 -- 192.168.123.105:0/4170363900 >> 192.168.123.105:0/4170363900 conn(0x7fb77006dda0 msgr2=0x7fb77010c150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 -- 192.168.123.105:0/4170363900 shutdown_connections 2026-03-10T08:52:57.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.130+0000 7fb764ff9700 1 -- 192.168.123.105:0/4170363900 wait complete. 
2026-03-10T08:52:57.132 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 7 2026-03-10T08:52:57.174 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-10T08:52:57.347 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/2915338198' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:boot 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 3 up:standby 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:57.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:57 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/4170363900' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:52:57.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.634+0000 7f3f612fa700 1 -- 192.168.123.105:0/1600896217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c104060 msgr2=0x7f3f5c1044e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:57.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.634+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1600896217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c104060 0x7f3f5c1044e0 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f3f50009b00 tx=0x7f3f50009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:57.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.635+0000 7f3f612fa700 1 -- 192.168.123.105:0/1600896217 shutdown_connections 2026-03-10T08:52:57.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.635+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1600896217 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c104060 0x7f3f5c1044e0 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.635+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1600896217 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c102e70 0x7f3f5c103290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.635+0000 7f3f612fa700 1 -- 192.168.123.105:0/1600896217 >> 192.168.123.105:0/1600896217 conn(0x7f3f5c0fe440 msgr2=0x7f3f5c1008a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:57.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.637+0000 7f3f612fa700 1 -- 192.168.123.105:0/1600896217 shutdown_connections 2026-03-10T08:52:57.637 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.637+0000 7f3f612fa700 1 -- 192.168.123.105:0/1600896217 wait complete. 2026-03-10T08:52:57.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.637+0000 7f3f612fa700 1 Processor -- start 2026-03-10T08:52:57.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.637+0000 7f3f612fa700 1 -- start start 2026-03-10T08:52:57.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f612fa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 0x7f3f5c06eaf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:57.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f612fa700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c104060 0x7f3f5c06f030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:57.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f612fa700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f5c06f650 con 0x7f3f5c102e70 2026-03-10T08:52:57.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f612fa700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f5c06f790 con 0x7f3f5c104060 2026-03-10T08:52:57.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f5bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 0x7f3f5c06eaf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:57.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f5bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 0x7f3f5c06eaf0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59150/0 (socket says 192.168.123.105:59150) 2026-03-10T08:52:57.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f5bfff700 1 -- 192.168.123.105:0/1197990023 learned_addr learned my addr 192.168.123.105:0/1197990023 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:57.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.638+0000 7f3f5b7fe700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c104060 0x7f3f5c06f030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:57.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.639+0000 7f3f5b7fe700 1 -- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 msgr2=0x7f3f5c06eaf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:57.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.640+0000 7f3f5b7fe700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 0x7f3f5c06eaf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.640+0000 7f3f5b7fe700 1 -- 192.168.123.105:0/1197990023 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3f500097e0 con 0x7f3f5c104060 2026-03-10T08:52:57.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.640+0000 7f3f5bfff700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 0x7f3f5c06eaf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T08:52:57.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.640+0000 7f3f5b7fe700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c104060 0x7f3f5c06f030 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f3f50005fd0 tx=0x7f3f5000bb50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:57.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.642+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f5001d070 con 0x7f3f5c104060 2026-03-10T08:52:57.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.642+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3f5c0721d0 con 0x7f3f5c104060 2026-03-10T08:52:57.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.642+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3f5c0726c0 con 0x7f3f5c104060 2026-03-10T08:52:57.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.642+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3f5000fb70 con 0x7f3f5c104060 2026-03-10T08:52:57.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.642+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f50021c20 con 0x7f3f5c104060 2026-03-10T08:52:57.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.643+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 4 ==== 
mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3f5002b430 con 0x7f3f5c104060 2026-03-10T08:52:57.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.644+0000 7f3f597fa700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3f4406c600 0x7f3f4406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:57.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.644+0000 7f3f5bfff700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3f4406c600 0x7f3f4406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:57.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.644+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f3f50021e40 con 0x7f3f5c104060 2026-03-10T08:52:57.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.644+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3f5c04ea90 con 0x7f3f5c104060 2026-03-10T08:52:57.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.644+0000 7f3f5bfff700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3f4406c600 0x7f3f4406eac0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f3f4c0098f0 tx=0x7f3f4c008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:57.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.648+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3f50026020 con 0x7f3f5c104060 2026-03-10T08:52:57.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.794+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f3f5c1ab830 con 0x7f3f5c104060 2026-03-10T08:52:57.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.795+0000 7f3f597fa700 1 -- 192.168.123.105:0/1197990023 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 7 v7) v1 ==== 93+0+4749 (secure 0 0 0) 0x7f3f5005b510 con 0x7f3f5c104060 2026-03-10T08:52:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2915338198' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-10T08:52:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: mds.? 
[v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:boot 2026-03-10T08:52:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 3 up:standby 2026-03-10T08:52:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:52:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:57.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:57 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/4170363900' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3f4406c600 msgr2=0x7f3f4406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3f4406c600 0x7f3f4406eac0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f3f4c0098f0 tx=0x7f3f4c008040 comp rx=0 tx=0).stop 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c104060 msgr2=0x7f3f5c06f030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:57.812 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c104060 0x7f3f5c06f030 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f3f50005fd0 tx=0x7f3f5000bb50 comp rx=0 tx=0).stop 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 shutdown_connections 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3f4406c600 0x7f3f4406eac0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f5c102e70 0x7f3f5c06eaf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 --2- 192.168.123.105:0/1197990023 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f5c104060 0x7f3f5c06f030 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.803+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 >> 192.168.123.105:0/1197990023 conn(0x7f3f5c0fe440 msgr2=0x7f3f5c100820 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.804+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 shutdown_connections 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:57.804+0000 7f3f612fa700 1 -- 192.168.123.105:0/1197990023 wait 
complete. 2026-03-10T08:52:57.812 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 7 2026-03-10T08:52:57.818 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:52:57.856 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-10T08:52:58.035 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:52:58.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.297+0000 7fe511a9e700 1 -- 192.168.123.105:0/2630339820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 msgr2=0x7fe50c1066c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:58.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.297+0000 7fe511a9e700 1 --2- 192.168.123.105:0/2630339820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c1066c0 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7fe4f4009b00 tx=0x7fe4f4009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:58.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.298+0000 7fe511a9e700 1 -- 192.168.123.105:0/2630339820 shutdown_connections 2026-03-10T08:52:58.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.298+0000 7fe511a9e700 1 --2- 192.168.123.105:0/2630339820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c1066c0 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.298+0000 7fe511a9e700 1 --2- 192.168.123.105:0/2630339820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe50c0feaf0 
0x7fe50c0fef10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.298+0000 7fe511a9e700 1 -- 192.168.123.105:0/2630339820 >> 192.168.123.105:0/2630339820 conn(0x7fe50c0fa6d0 msgr2=0x7fe50c0fcb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:58.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.303+0000 7fe511a9e700 1 -- 192.168.123.105:0/2630339820 shutdown_connections 2026-03-10T08:52:58.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.303+0000 7fe511a9e700 1 -- 192.168.123.105:0/2630339820 wait complete. 2026-03-10T08:52:58.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.304+0000 7fe511a9e700 1 Processor -- start 2026-03-10T08:52:58.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.305+0000 7fe511a9e700 1 -- start start 2026-03-10T08:52:58.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.305+0000 7fe511a9e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe50c0feaf0 0x7fe50c071d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:58.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.305+0000 7fe511a9e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c0722a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:58.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.305+0000 7fe511a9e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe50c0728e0 con 0x7fe50c0ff450 2026-03-10T08:52:58.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.305+0000 7fe511a9e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe50c1a2300 con 0x7fe50c0feaf0 2026-03-10T08:52:58.306 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.306+0000 7fe50affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c0722a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:58.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.306+0000 7fe50affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c0722a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59174/0 (socket says 192.168.123.105:59174) 2026-03-10T08:52:58.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.306+0000 7fe50affd700 1 -- 192.168.123.105:0/228195560 learned_addr learned my addr 192.168.123.105:0/228195560 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:58.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.306+0000 7fe50affd700 1 -- 192.168.123.105:0/228195560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe50c0feaf0 msgr2=0x7fe50c071d20 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:52:58.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.306+0000 7fe50affd700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe50c0feaf0 0x7fe50c071d20 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.306+0000 7fe50affd700 1 -- 192.168.123.105:0/228195560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe4f40097e0 con 0x7fe50c0ff450 2026-03-10T08:52:58.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.309+0000 7fe50affd700 1 --2- 
192.168.123.105:0/228195560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c0722a0 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7fe4f4004a00 tx=0x7fe4f4004ae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:58.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.309+0000 7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4f401d070 con 0x7fe50c0ff450 2026-03-10T08:52:58.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.309+0000 7fe511a9e700 1 -- 192.168.123.105:0/228195560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe50c1a2500 con 0x7fe50c0ff450 2026-03-10T08:52:58.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.309+0000 7fe511a9e700 1 -- 192.168.123.105:0/228195560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe50c1a2a40 con 0x7fe50c0ff450 2026-03-10T08:52:58.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.310+0000 7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe4f400bd10 con 0x7fe50c0ff450 2026-03-10T08:52:58.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.310+0000 7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4f400f810 con 0x7fe50c0ff450 2026-03-10T08:52:58.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.311+0000 7fe511a9e700 1 -- 192.168.123.105:0/228195560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe50c06bf50 con 0x7fe50c0ff450 2026-03-10T08:52:58.314 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.313+0000 7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe4f4022470 con 0x7fe50c0ff450 2026-03-10T08:52:58.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.314+0000 7fe508ff9700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4f8070a10 0x7fe4f8072ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:58.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.314+0000 7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe4f408d480 con 0x7fe50c0ff450 2026-03-10T08:52:58.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.315+0000 7fe50b7fe700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4f8070a10 0x7fe4f8072ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:58.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.316+0000 7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe4f405b9e0 con 0x7fe50c0ff450 2026-03-10T08:52:58.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.316+0000 7fe50b7fe700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4f8070a10 0x7fe4f8072ed0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fe50c19e270 tx=0x7fe4fc006cb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:58.419 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: pgmap v77: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s wr, 4 op/s 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/1197990023' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:58.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:58 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:58.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.501+0000 7fe511a9e700 1 -- 192.168.123.105:0/228195560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7fe50c066e80 con 0x7fe50c0ff450 2026-03-10T08:52:58.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.502+0000 
7fe508ff9700 1 -- 192.168.123.105:0/228195560 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v9) v1 ==== 78+0+83 (secure 0 0 0) 0x7fe4f4031070 con 0x7fe50c0ff450 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.507+0000 7fe5027fc700 1 -- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4f8070a10 msgr2=0x7fe4f8072ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.507+0000 7fe5027fc700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4f8070a10 0x7fe4f8072ed0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fe50c19e270 tx=0x7fe4fc006cb0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.507+0000 7fe5027fc700 1 -- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 msgr2=0x7fe50c0722a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.507+0000 7fe5027fc700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c0722a0 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7fe4f4004a00 tx=0x7fe4f4004ae0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 -- 192.168.123.105:0/228195560 shutdown_connections 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4f8070a10 0x7fe4f8072ed0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe50c0feaf0 0x7fe50c071d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 --2- 192.168.123.105:0/228195560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe50c0ff450 0x7fe50c0722a0 secure :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7fe4f4004a00 tx=0x7fe4f4004ae0 comp rx=0 tx=0).stop 2026-03-10T08:52:58.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 -- 192.168.123.105:0/228195560 >> 192.168.123.105:0/228195560 conn(0x7fe50c0fa6d0 msgr2=0x7fe50c104f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:58.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 -- 192.168.123.105:0/228195560 shutdown_connections 2026-03-10T08:52:58.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:58.508+0000 7fe5027fc700 1 -- 192.168.123.105:0/228195560 wait complete. 2026-03-10T08:52:58.518 INFO:teuthology.orchestra.run.vm05.stdout:false 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: pgmap v77: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s wr, 4 op/s 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/1197990023' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:58.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:58 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:standby 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: Dropping low affinity active daemon mds.cephfs.vm08.xfzrbx in favor of higher affinity standby. 
2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: Replacing daemon mds.cephfs.vm08.xfzrbx as rank 0 with standby daemon mds.cephfs.vm05.bxdvbu 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 3 up:standby 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:replay} 2 up:standby 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/228195560' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:59.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:52:59 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:59.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.586+0000 7ff2f25e8700 1 -- 192.168.123.105:0/1233634788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 msgr2=0x7ff2ec1081b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:59.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.586+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/1233634788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec1081b0 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7ff2e0009b00 
tx=0x7ff2e0009e10 comp rx=0 tx=0).stop 2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.587+0000 7ff2f25e8700 1 -- 192.168.123.105:0/1233634788 shutdown_connections 2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.587+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/1233634788 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 0x7ff2ec10f0d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.587+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/1233634788 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec1081b0 unknown :-1 s=CLOSED pgs=261 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.587+0000 7ff2f25e8700 1 -- 192.168.123.105:0/1233634788 >> 192.168.123.105:0/1233634788 conn(0x7ff2ec06dda0 msgr2=0x7ff2ec070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.589+0000 7ff2f25e8700 1 -- 192.168.123.105:0/1233634788 shutdown_connections 2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.589+0000 7ff2f25e8700 1 -- 192.168.123.105:0/1233634788 wait complete. 
2026-03-10T08:52:59.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.589+0000 7ff2f25e8700 1 Processor -- start 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.589+0000 7ff2f25e8700 1 -- start start 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2f25e8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec10aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2f25e8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 0x7ff2ec109060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2f25e8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2ec1095a0 con 0x7ff2ec107d90 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2f25e8700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2ec1096e0 con 0x7ff2ec1086f0 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2ebfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec10aa10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2ebfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec10aa10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:59182/0 (socket says 192.168.123.105:59182) 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2ebfff700 1 -- 192.168.123.105:0/2032659565 learned_addr learned my addr 192.168.123.105:0/2032659565 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2eb7fe700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 0x7ff2ec109060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:59.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2ebfff700 1 -- 192.168.123.105:0/2032659565 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 msgr2=0x7ff2ec109060 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:59.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2ebfff700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 0x7ff2ec109060 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2ebfff700 1 -- 192.168.123.105:0/2032659565 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2e00097e0 con 0x7ff2ec107d90 2026-03-10T08:52:59.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.590+0000 7ff2eb7fe700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 0x7ff2ec109060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:52:59.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2ebfff700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec10aa10 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7ff2e0005230 tx=0x7ff2e00056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:59.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2e001d070 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff2e000bc50 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2e000f800 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2ec109960 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2ec109e50 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.591+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7ff2ec1991c0 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.594+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff2e0022ae0 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.595+0000 7ff2e97fa700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2d406c600 0x7ff2d406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.595+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ff2e008d990 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.595+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff2e005bcc0 con 0x7ff2ec107d90 2026-03-10T08:52:59.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.595+0000 7ff2eb7fe700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2d406c600 0x7ff2d406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:52:59.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.596+0000 7ff2eb7fe700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2d406c600 0x7ff2d406eac0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7ff2e400d390 tx=0x7ff2e4009040 comp rx=0 tx=0).ready entity=mgr.14223 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:52:59.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.751+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7ff2ec066e80 con 0x7ff2ec107d90 2026-03-10T08:52:59.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.751+0000 7ff2e97fa700 1 -- 192.168.123.105:0/2032659565 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v10) v1 ==== 78+0+83 (secure 0 0 0) 0x7ff2e0027090 con 0x7ff2ec107d90 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.753+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2d406c600 msgr2=0x7ff2d406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.753+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2d406c600 0x7ff2d406eac0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7ff2e400d390 tx=0x7ff2e4009040 comp rx=0 tx=0).stop 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.753+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 msgr2=0x7ff2ec10aa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.753+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec10aa10 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7ff2e0005230 tx=0x7ff2e00056c0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.756 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 shutdown_connections 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2d406c600 0x7ff2d406eac0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2ec107d90 0x7ff2ec10aa10 unknown :-1 s=CLOSED pgs=262 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 --2- 192.168.123.105:0/2032659565 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2ec1086f0 0x7ff2ec109060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 >> 192.168.123.105:0/2032659565 conn(0x7ff2ec06dda0 msgr2=0x7ff2ec10d950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 shutdown_connections 2026-03-10T08:52:59.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:52:59.754+0000 7ff2f25e8700 1 -- 192.168.123.105:0/2032659565 wait complete. 2026-03-10T08:52:59.763 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: mds.? 
[v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:standby 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: Dropping low affinity active daemon mds.cephfs.vm08.xfzrbx in favor of higher affinity standby. 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: Replacing daemon mds.cephfs.vm08.xfzrbx as rank 0 with standby daemon mds.cephfs.vm05.bxdvbu 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm08.xfzrbx=up:active} 3 up:standby 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:replay} 2 up:standby 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/228195560' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:59.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:52:59 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:52:59.824 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 
2026-03-10T08:52:59.827 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 2026-03-10T08:52:59.968 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.204+0000 7f894ba4d700 1 -- 192.168.123.105:0/4160518195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 msgr2=0x7f8944107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.204+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4160518195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944107d40 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7f8940009b00 tx=0x7f8940009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.205+0000 7f894ba4d700 1 -- 192.168.123.105:0/4160518195 shutdown_connections 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.205+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4160518195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944107d40 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.205+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4160518195 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.205+0000 7f894ba4d700 1 -- 192.168.123.105:0/4160518195 >> 192.168.123.105:0/4160518195 conn(0x7f89440febd0 
msgr2=0x7f8944100ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.205+0000 7f894ba4d700 1 -- 192.168.123.105:0/4160518195 shutdown_connections 2026-03-10T08:53:00.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.205+0000 7f894ba4d700 1 -- 192.168.123.105:0/4160518195 wait complete. 2026-03-10T08:53:00.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f894ba4d700 1 Processor -- start 2026-03-10T08:53:00.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f894ba4d700 1 -- start start 2026-03-10T08:53:00.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f894ba4d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:00.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f894ba4d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f894ba4d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8944199a00 con 0x7f8944103cf0 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f894ba4d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f894419d790 con 0x7f8944103340 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f89497e9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f89497e9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34778/0 (socket says 192.168.123.105:34778) 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.206+0000 7f89497e9700 1 -- 192.168.123.105:0/4074191477 learned_addr learned my addr 192.168.123.105:0/4074191477 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f8948fe8700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944199320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f89497e9700 1 -- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 msgr2=0x7f8944199320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:00.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f89497e9700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944199320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f89497e9700 1 -- 192.168.123.105:0/4074191477 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89400097e0 con 0x7f8944103340 
2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f89497e9700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944198de0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f893400eb10 tx=0x7f893400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f893400cca0 con 0x7f8944103340 2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f893400ce00 con 0x7f8944103340 2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f8948fe8700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f894419da70 con 0x7f8944103340 2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89340189c0 con 0x7f8944103340 2026-03-10T08:53:00.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.207+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f894419dfc0 con 0x7f8944103340 2026-03-10T08:53:00.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.209+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8934018b20 con 0x7f8944103340 2026-03-10T08:53:00.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.209+0000 7f893a7fc700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f893006c2e0 0x7f893006e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:00.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.209+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f8934014070 con 0x7f8944103340 2026-03-10T08:53:00.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.209+0000 7f8948fe8700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f893006c2e0 0x7f893006e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:00.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.210+0000 7f8948fe8700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f893006c2e0 0x7f893006e7a0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f8940005310 tx=0x7f8940005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:00.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.210+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f894404ea90 con 0x7f8944103340 2026-03-10T08:53:00.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.213+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f89340568f0 con 0x7f8944103340 2026-03-10T08:53:00.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.335+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f894419a140 con 0x7f8944103340 2026-03-10T08:53:00.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.336+0000 7f893a7fc700 1 -- 192.168.123.105:0/4074191477 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 10 v10) v1 ==== 94+0+4754 (secure 0 0 0) 0x7f8934059f10 con 0x7f8944103340 2026-03-10T08:53:00.336 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:00.337 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":10,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable 
ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14488,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":7},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":10,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T08:52:59.415527+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24289},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24289":{"gid":24289,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":9,"state":"up:reconnect","state_seq":3,"addr":"192.168.123.105:6827/2466638752","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2466638752},{"type":"v1","addr":"192.168.123.105:6827","nonce":2466638752}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-10T08:53:00.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.339+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f893006c2e0 msgr2=0x7f893006e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:00.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.339+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f893006c2e0 0x7f893006e7a0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f8940005310 tx=0x7f8940005fb0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.339+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 msgr2=0x7f8944198de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:00.341 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.339+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944198de0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f893400eb10 tx=0x7f893400eed0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.340+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 shutdown_connections 2026-03-10T08:53:00.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.340+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f893006c2e0 0x7f893006e7a0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.340+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8944103340 0x7f8944198de0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.340+0000 7f894ba4d700 1 --2- 192.168.123.105:0/4074191477 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8944103cf0 0x7f8944199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.340+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 >> 192.168.123.105:0/4074191477 conn(0x7f89440febd0 msgr2=0x7f8944100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:00.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.341+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 shutdown_connections 2026-03-10T08:53:00.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.341+0000 7f894ba4d700 1 -- 192.168.123.105:0/4074191477 wait 
complete. 2026-03-10T08:53:00.342 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 10 2026-03-10T08:53:00.391 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 10, 'max_mds': 1, 'flags': 18} 2026-03-10T08:53:00.391 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-10T08:53:00.400 INFO:tasks.ceph_fuse:Running ceph_fuse task... 2026-03-10T08:53:00.400 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-10T08:53:00.400 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-10T08:53:00.400 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T08:53:00.400 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:00.400 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-10T08:53:00.400 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T08:53:00.400 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:00.400 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:00.400 DEBUG:teuthology.orchestra.run.vm08:> ip netns list 2026-03-10T08:53:00.415 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:00.415 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link delete ceph-brx 2026-03-10T08:53:00.482 INFO:teuthology.orchestra.run.vm08.stderr:Cannot find device "ceph-brx" 2026-03-10T08:53:00.483 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T08:53:00.483 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:00.483 DEBUG:teuthology.orchestra.run.vm05:> ip netns list 2026-03-10T08:53:00.498 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:00.498 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link delete ceph-brx 2026-03-10T08:53:00.567 INFO:teuthology.orchestra.run.vm05.stderr:Cannot find device "ceph-brx" 2026-03-10T08:53:00.568 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T08:53:00.568 INFO:tasks.ceph_fuse:Mounting 
ceph-fuse clients... 2026-03-10T08:53:00.568 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T08:53:00.568 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs ls 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: pgmap v79: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 341 B/s rd, 2.3 KiB/s wr, 7 op/s 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] up:boot 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:reconnect 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: mds.? 
[v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:standby 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:reconnect} 3 up:standby 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='client.? 
192.168.123.105:0/2032659565' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:00.628 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:00 vm05 ceph-mon[49713]: from='client.? 192.168.123.105:0/4074191477' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T08:53:00.741 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:00.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: pgmap v79: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 341 B/s rd, 2.3 KiB/s wr, 7 op/s 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: mds.? [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] up:boot 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:reconnect 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: mds.? 
[v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:standby 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:reconnect} 3 up:standby 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='client.? 
192.168.123.105:0/2032659565' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:00 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/4074191477' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.973+0000 7ff081c50700 1 -- 192.168.123.105:0/2326074170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 msgr2=0x7ff07c073960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.973+0000 7ff081c50700 1 --2- 192.168.123.105:0/2326074170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c073960 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7ff06c009b30 tx=0x7ff06c009e40 comp rx=0 tx=0).stop 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 -- 192.168.123.105:0/2326074170 shutdown_connections 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 --2- 192.168.123.105:0/2326074170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c073960 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 --2- 
192.168.123.105:0/2326074170 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff07c074dd0 0x7ff07c072fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 -- 192.168.123.105:0/2326074170 >> 192.168.123.105:0/2326074170 conn(0x7ff07c078ed0 msgr2=0x7ff07c0792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 -- 192.168.123.105:0/2326074170 shutdown_connections 2026-03-10T08:53:00.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 -- 192.168.123.105:0/2326074170 wait complete. 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.974+0000 7ff081c50700 1 Processor -- start 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff081c50700 1 -- start start 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff081c50700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c19d2f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff081c50700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff07c074dd0 0x7ff07c19d830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff081c50700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff07c19df10 con 0x7ff07c073500 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff081c50700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff07c1a1ca0 con 0x7ff07c074dd0 2026-03-10T08:53:00.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c19d2f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c19d2f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33384/0 (socket says 192.168.123.105:33384) 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07b7fe700 1 -- 192.168.123.105:0/2354906044 learned_addr learned my addr 192.168.123.105:0/2354906044 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07affd700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff07c074dd0 0x7ff07c19d830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07b7fe700 1 -- 192.168.123.105:0/2354906044 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff07c074dd0 msgr2=0x7ff07c19d830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07b7fe700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7ff07c074dd0 0x7ff07c19d830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.975+0000 7ff07b7fe700 1 -- 192.168.123.105:0/2354906044 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff06c0097e0 con 0x7ff07c073500 2026-03-10T08:53:00.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.976+0000 7ff07b7fe700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c19d2f0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7ff06400b700 tx=0x7ff06400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:00.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.976+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff064010840 con 0x7ff07c073500 2026-03-10T08:53:00.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.976+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff064010e80 con 0x7ff07c073500 2026-03-10T08:53:00.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.976+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff06400d590 con 0x7ff07c073500 2026-03-10T08:53:00.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.976+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff07c1a1f80 con 0x7ff07c073500 2026-03-10T08:53:00.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.976+0000 7ff081c50700 1 -- 
192.168.123.105:0/2354906044 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff07c1a24d0 con 0x7ff07c073500 2026-03-10T08:53:00.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.977+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff07c04ea90 con 0x7ff07c073500 2026-03-10T08:53:00.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.977+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff06400f3e0 con 0x7ff07c073500 2026-03-10T08:53:00.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.978+0000 7ff078ff9700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff06806c490 0x7ff06806e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:00.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.978+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ff06408b2f0 con 0x7ff07c073500 2026-03-10T08:53:00.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.978+0000 7ff07affd700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff06806c490 0x7ff06806e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:00.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.979+0000 7ff07affd700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff06806c490 0x7ff06806e950 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7ff07c19e910 
tx=0x7ff06c005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:00.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:00.980+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff064055a90 con 0x7ff07c073500 2026-03-10T08:53:01.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.105+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7ff07c1a27b0 con 0x7ff07c073500 2026-03-10T08:53:01.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.106+0000 7ff078ff9700 1 -- 192.168.123.105:0/2354906044 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v11) v1 ==== 53+0+83 (secure 0 0 0) 0x7ff0640590b0 con 0x7ff07c073500 2026-03-10T08:53:01.106 INFO:teuthology.orchestra.run.vm05.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T08:53:01.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff06806c490 msgr2=0x7ff06806e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:01.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff06806c490 0x7ff06806e950 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7ff07c19e910 tx=0x7ff06c005fb0 comp rx=0 tx=0).stop 2026-03-10T08:53:01.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7ff07c073500 msgr2=0x7ff07c19d2f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:01.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c19d2f0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7ff06400b700 tx=0x7ff06400bac0 comp rx=0 tx=0).stop 2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 shutdown_connections 2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff06806c490 0x7ff06806e950 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff07c073500 0x7ff07c19d2f0 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 --2- 192.168.123.105:0/2354906044 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff07c074dd0 0x7ff07c19d830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 >> 192.168.123.105:0/2354906044 conn(0x7ff07c078ed0 msgr2=0x7ff07c10f950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 shutdown_connections 
2026-03-10T08:53:01.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01.109+0000 7ff081c50700 1 -- 192.168.123.105:0/2354906044 wait complete. 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm05.local 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T08:53:01.150 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-10T08:53:01.150 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:01.150 DEBUG:teuthology.orchestra.run.vm05:> ip addr 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: inet6 ::1/128 scope host 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T08:53:01.206 
INFO:teuthology.orchestra.run.vm05.stdout: link/ether 52:55:00:00:00:05 brd ff:ff:ff:ff:ff:ff 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: altname enp0s3 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: altname ens3 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: inet 192.168.123.105/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft 3150sec preferred_lft 3150sec 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: inet6 fe80::5055:ff:fe00:5/64 scope link noprefixroute 2026-03-10T08:53:01.206 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-10T08:53:01.206 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T08:53:01.206 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:01.206 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T08:53:01.206 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link add name ceph-brx type bridge 2026-03-10T08:53:01.206 DEBUG:teuthology.orchestra.run.vm05:> sudo ip addr flush dev ceph-brx 2026-03-10T08:53:01.206 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set ceph-brx up 2026-03-10T08:53:01.207 DEBUG:teuthology.orchestra.run.vm05:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T08:53:01.207 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T08:53:01.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:01.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T08:53:01.350 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:01.350 DEBUG:teuthology.orchestra.run.vm05:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T08:53:01.419 INFO:teuthology.orchestra.run.vm05.stdout:1 2026-03-10T08:53:01.420 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:01.420 DEBUG:teuthology.orchestra.run.vm05:> ip r 2026-03-10T08:53:01.475 INFO:teuthology.orchestra.run.vm05.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.105 metric 100 2026-03-10T08:53:01.475 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.105 metric 100 2026-03-10T08:53:01.475 INFO:teuthology.orchestra.run.vm05.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T08:53:01.475 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:01.475 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T08:53:01.475 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T08:53:01.475 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T08:53:01.475 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T08:53:01.475 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T08:53:01.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:01.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T08:53:01.609 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:01.609 DEBUG:teuthology.orchestra.run.vm05:> ip netns list 2026-03-10T08:53:01.664 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:01.664 DEBUG:teuthology.orchestra.run.vm05:> ip netns list-id 2026-03-10T08:53:01.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:01 vm05.local ceph-mon[49713]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:rejoin 2026-03-10T08:53:01.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:01 vm05.local ceph-mon[49713]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:standby 2026-03-10T08:53:01.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:01 vm05.local ceph-mon[49713]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:rejoin} 3 up:standby 2026-03-10T08:53:01.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:01 vm05.local ceph-mon[49713]: daemon mds.cephfs.vm05.bxdvbu is now active in filesystem cephfs as rank 0 2026-03-10T08:53:01.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:01 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/2354906044' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T08:53:01.719 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:01.719 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T08:53:01.719 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T08:53:01.719 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-10T08:53:01.719 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T08:53:01.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:01.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:01 vm08 ceph-mon[57559]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:rejoin 2026-03-10T08:53:01.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:01 vm08 ceph-mon[57559]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:standby 2026-03-10T08:53:01.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:01 vm08 ceph-mon[57559]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:rejoin} 3 up:standby 2026-03-10T08:53:01.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:01 vm08 ceph-mon[57559]: daemon mds.cephfs.vm05.bxdvbu is now active in filesystem cephfs as rank 0 2026-03-10T08:53:01.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:01 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2354906044' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T08:53:01.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
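The namespace name `ceph-ns--home-ubuntu-cephtest-mnt.0` created above is evidently derived from the mountpoint path. A one-line sketch of that naming convention, inferred from the log (the helper name `netns_name` is hypothetical):

```python
def netns_name(mountpoint: str) -> str:
    """Derive the per-mount netns name seen in the log: slashes in the
    mountpoint become dashes under a 'ceph-ns-' prefix (inferred convention)."""
    return "ceph-ns-" + mountpoint.replace("/", "-")
```

The leading slash of the mountpoint is why the name contains a double dash after the prefix.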
2026-03-10T08:53:01.815 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-10T08:53:01.815 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T08:53:01.887 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:01.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:01 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
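The veth inside the namespace gets `192.168.144.1/20` with broadcast `192.168.159.255`, and the default route points at the bridge end `192.168.159.254`. The /20 arithmetic behind those values can be checked with the standard-library `ipaddress` module:

```python
import ipaddress

# Address the harness assigns inside the namespace (from the log).
veth = ipaddress.ip_interface("192.168.144.1/20")
# Gateway on the ceph-brx side (from the log's default route).
gateway = ipaddress.ip_address("192.168.159.254")

# A /20 spans 4096 addresses: 192.168.144.0 .. 192.168.159.255.
print(veth.network)                    # 192.168.144.0/20
print(veth.network.broadcast_address)  # 192.168.159.255
print(gateway in veth.network)         # True
```

This confirms the `brd 192.168.159.255` argument and the gateway both lie in the same /20 as the namespace address.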
2026-03-10T08:53:01.949 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:01.949 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T08:53:01.949 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set brx.0 up 2026-03-10T08:53:01.949 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T08:53:01.949 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T08:53:02.022 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:02 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:02.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:02 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T08:53:02.049 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-10T08:53:02.049 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T08:53:02.049 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:02.103 INFO:teuthology.orchestra.run.vm05.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-10T08:53:02.103 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T08:53:02.104 DEBUG:teuthology.orchestra.run.vm05:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:02.157 DEBUG:teuthology.orchestra.run.vm05:> sudo modprobe fuse 2026-03-10T08:53:02.221 DEBUG:teuthology.orchestra.run.vm05:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/proc 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/sys 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/dev 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/security 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/dev/shm 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/dev/pts 
2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/run 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/cgroup 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/pstore 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/bpf 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/config 2026-03-10T08:53:02.275 INFO:teuthology.orchestra.run.vm05.stdout:/ 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/selinux 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/dev/mqueue 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/debug 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/dev/hugepages 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/tracing 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/fuse/connections 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/1000 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/2d8a1946d4a2c78fe0ecabb8fb5d856a11fac7b448ec4890abcf5012d618a819/merged 2026-03-10T08:53:02.276 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/73b3f7a3eb4ee52a8bc915529f2069ad801dba337dd50818aaa8269c93d98bde/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/0 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/a6a7d19a1c2760b8edb84c8c45f3939beffa2c2d37e590b45438ce5df30f7fd9/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/d9767265ccdce4e12e626b04da44410573061ce2a96978bb0138833307068521/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/ae2241e21cfe8ca1de9cc8edacfdd44dd9ec229d79aea9ebb0db4139ad31806f/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/068f8c6c8a06d6376cb78f1d0751100726e0b76e62e74d2e450ca7aa611a28ab/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/d8acadf16539df303ee82d13b7e55863367bea3bda522ecaf8d515bd3c59d2d6/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/1b582c5103abc7ec9e6e3ea6fa63e5eee318edcbfe83dd5744bf5d06547720ef/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/3a9994ad96d81078f4da4f278b756afd70a8db3838afd6d382c26e770340f10f/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/cb32959c116ef3dd6e4d9faa8a2da6ac3e955adeaea37a9efe82100e0df53671/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/afe12ae9654d52278015525dcab49c0ee79464ccfb226774de9571835846e23e/merged 2026-03-10T08:53:02.276 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/6a3e04130a6e4cb97ab1ea4e6c9ad2a0948624fdf02434a7a537aa4ce41b6467/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/dc3085acb829d3c4abc3b018033967e45d4cdd78914a928081f969e2bdb3cacb/merged 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T08:53:02.276 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:02.276 DEBUG:teuthology.orchestra.run.vm05:> ls /sys/fs/fuse/connections 2026-03-10T08:53:02.330 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T08:53:02.330 DEBUG:teuthology.orchestra.run.vm05:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-10T08:53:02.371 DEBUG:teuthology.orchestra.run.vm05:> sudo modprobe fuse 2026-03-10T08:53:02.400 DEBUG:teuthology.orchestra.run.vm05:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T08:53:02.440 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:2026-03-10T08:53:02.440+0000 7f7c3e5d2480 -1 init, newargv = 0x55d245d282a0 newargc=15 2026-03-10T08:53:02.440 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:ceph-fuse[96456]: starting ceph client 2026-03-10T08:53:02.449 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:ceph-fuse[96456]: starting fuse 2026-03-10T08:53:02.460 INFO:teuthology.orchestra.run.vm05.stdout:/proc 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys 2026-03-10T08:53:02.461 
INFO:teuthology.orchestra.run.vm05.stdout:/dev 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/security 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/dev/shm 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/dev/pts 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/cgroup 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/pstore 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/bpf 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/config 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/ 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/selinux 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/dev/mqueue 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/debug 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/dev/hugepages 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/tracing 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/fuse/connections 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/1000 2026-03-10T08:53:02.461 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/2d8a1946d4a2c78fe0ecabb8fb5d856a11fac7b448ec4890abcf5012d618a819/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/73b3f7a3eb4ee52a8bc915529f2069ad801dba337dd50818aaa8269c93d98bde/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/0 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/a6a7d19a1c2760b8edb84c8c45f3939beffa2c2d37e590b45438ce5df30f7fd9/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/d9767265ccdce4e12e626b04da44410573061ce2a96978bb0138833307068521/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/ae2241e21cfe8ca1de9cc8edacfdd44dd9ec229d79aea9ebb0db4139ad31806f/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/068f8c6c8a06d6376cb78f1d0751100726e0b76e62e74d2e450ca7aa611a28ab/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/d8acadf16539df303ee82d13b7e55863367bea3bda522ecaf8d515bd3c59d2d6/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/1b582c5103abc7ec9e6e3ea6fa63e5eee318edcbfe83dd5744bf5d06547720ef/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/3a9994ad96d81078f4da4f278b756afd70a8db3838afd6d382c26e770340f10f/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/cb32959c116ef3dd6e4d9faa8a2da6ac3e955adeaea37a9efe82100e0df53671/merged 
2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/afe12ae9654d52278015525dcab49c0ee79464ccfb226774de9571835846e23e/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/6a3e04130a6e4cb97ab1ea4e6c9ad2a0948624fdf02434a7a537aa4ce41b6467/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/dc3085acb829d3c4abc3b018033967e45d4cdd78914a928081f969e2bdb3cacb/merged 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T08:53:02.461 INFO:teuthology.orchestra.run.vm05.stdout:/home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:02.462 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:02.462 DEBUG:teuthology.orchestra.run.vm05:> ls /sys/fs/fuse/connections 2026-03-10T08:53:02.517 INFO:teuthology.orchestra.run.vm05.stdout:93 2026-03-10T08:53:02.517 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [93] 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> sudo stdin-killer -- python3 -c ' 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> import glob 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> import re 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> import os 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> import subprocess 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> def _find_admin_socket(client_name): 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> asok_path = "/var/run/ceph/ceph-client.0.*.asok" 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> files = 
glob.glob(asok_path) 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> mountpoint = "/home/ubuntu/cephtest/mnt.0" 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> # Given a non-glob path, it better be there 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> if "*" not in asok_path: 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> assert(len(files) == 1) 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> return files[0] 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> for f in files: 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> contents = proc_f.read() 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> if mountpoint in contents: 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> return f 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> print(_find_admin_socket("client.0")) 2026-03-10T08:53:02.517 DEBUG:teuthology.orchestra.run.vm05:> ' 2026-03-10T08:53:02.581 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:02 vm05.local ceph-mon[49713]: pgmap v80: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 520 B/s rd, 1.8 KiB/s wr, 6 op/s 2026-03-10T08:53:02.581 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:02 vm05.local ceph-mon[49713]: Health check cleared: 
FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T08:53:02.581 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:02 vm05.local ceph-mon[49713]: Cluster is now healthy 2026-03-10T08:53:02.581 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:02 vm05.local ceph-mon[49713]: mds.? [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:active 2026-03-10T08:53:02.581 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:02 vm05.local ceph-mon[49713]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T08:53:02.617 INFO:teuthology.orchestra.run.vm05.stdout:/var/run/ceph/ceph-client.0.96456.asok 2026-03-10T08:53:02.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:02 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T08:53:02.624 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.96456.asok 2026-03-10T08:53:02.624 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:02.624 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.96456.asok status 2026-03-10T08:53:02.728 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:53:02.728 INFO:teuthology.orchestra.run.vm05.stdout: "metadata": { 2026-03-10T08:53:02.728 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_sha1": "7fe91d5d5842e04be3b4f514d6dd990c54b29c76", 2026-03-10T08:53:02.728 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)", 2026-03-10T08:53:02.728 INFO:teuthology.orchestra.run.vm05.stdout: "entity_id": "0", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "hostname": "vm05.local", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.0", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "pid": "96456", 2026-03-10T08:53:02.729 
INFO:teuthology.orchestra.run.vm05.stdout: "root": "/" 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "dentry_count": 0, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "dentry_pinned_count": 0, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "id": 24325, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "inst": { 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "name": { 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "type": "client", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "num": 24325 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "addr": { 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "type": "v1", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "addr": "192.168.123.105:0", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "nonce": 1319997093 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "addr": { 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "type": "v1", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "addr": "192.168.123.105:0", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "nonce": 1319997093 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "inst_str": "client.24325 192.168.123.105:0/1319997093", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "addr_str": "192.168.123.105:0/1319997093", 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "inode_count": 1, 
2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "mds_epoch": 12, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "osd_epoch": 39, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "osd_epoch_barrier": 0, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "blocklisted": false, 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout: "fs_name": "cephfs" 2026-03-10T08:53:02.729 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:53:02.734 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T08:53:02.734 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs ls 2026-03-10T08:53:02.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:02 vm08 ceph-mon[57559]: pgmap v80: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 520 B/s rd, 1.8 KiB/s wr, 6 op/s 2026-03-10T08:53:02.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:02 vm08 ceph-mon[57559]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T08:53:02.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:02 vm08 ceph-mon[57559]: Cluster is now healthy 2026-03-10T08:53:02.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:02 vm08 ceph-mon[57559]: mds.? 
[v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] up:active 2026-03-10T08:53:02.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:02 vm08 ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T08:53:02.870 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.107+0000 7f589d152700 1 -- 192.168.123.105:0/153519007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 msgr2=0x7f5898103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.107+0000 7f589d152700 1 --2- 192.168.123.105:0/153519007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f5898103720 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f5880009b50 tx=0x7f5880009e60 comp rx=0 tx=0).stop 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 -- 192.168.123.105:0/153519007 shutdown_connections 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 --2- 192.168.123.105:0/153519007 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5898103cf0 0x7f5898107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 --2- 192.168.123.105:0/153519007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f5898103720 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 -- 192.168.123.105:0/153519007 >> 
192.168.123.105:0/153519007 conn(0x7f58980feb90 msgr2=0x7f5898100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 -- 192.168.123.105:0/153519007 shutdown_connections 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 -- 192.168.123.105:0/153519007 wait complete. 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 Processor -- start 2026-03-10T08:53:03.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 -- start start 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.108+0000 7f589d152700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f58980752a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f589d152700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5898103cf0 0x7f58980757e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f589d152700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5898079430 con 0x7f5898103340 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f589d152700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5898075d20 con 0x7f5898103cf0 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f589659c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5898103cf0 0x7f58980757e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f58980752a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f58980752a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33408/0 (socket says 192.168.123.105:33408) 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 -- 192.168.123.105:0/2606324953 learned_addr learned my addr 192.168.123.105:0/2606324953 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 -- 192.168.123.105:0/2606324953 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5898103cf0 msgr2=0x7f58980757e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5898103cf0 0x7f58980757e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:03.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 -- 192.168.123.105:0/2606324953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58800097e0 con 0x7f5898103340 2026-03-10T08:53:03.109 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.109+0000 7f5896d9d700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f58980752a0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f5880006010 tx=0x7f5880004f70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:03.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.110+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f588001d070 con 0x7f5898103340 2026-03-10T08:53:03.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.110+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5880022470 con 0x7f5898103340 2026-03-10T08:53:03.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.110+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f588000f670 con 0x7f5898103340 2026-03-10T08:53:03.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.110+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5898075fa0 con 0x7f5898103340 2026-03-10T08:53:03.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.110+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58981a7630 con 0x7f5898103340 2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.111+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f58800225e0 con 0x7f5898103340 
2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.111+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f589810b6e0 con 0x7f5898103340 2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.111+0000 7f588ffff700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f588406c490 0x7f588406e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.111+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f588008cdd0 con 0x7f5898103340 2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.112+0000 7f589659c700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f588406c490 0x7f588406e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.114+0000 7f589659c700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f588406c490 0x7f588406e950 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f5898076a90 tx=0x7f588800b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:03.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.114+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5880057620 con 
0x7f5898103340 2026-03-10T08:53:03.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.238+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f589804ea90 con 0x7f5898103340 2026-03-10T08:53:03.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.239+0000 7f588ffff700 1 -- 192.168.123.105:0/2606324953 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v12) v1 ==== 53+0+83 (secure 0 0 0) 0x7f588005ac40 con 0x7f5898103340 2026-03-10T08:53:03.239 INFO:teuthology.orchestra.run.vm05.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T08:53:03.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f588406c490 msgr2=0x7f588406e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:03.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f588406c490 0x7f588406e950 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f5898076a90 tx=0x7f588800b410 comp rx=0 tx=0).stop 2026-03-10T08:53:03.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 msgr2=0x7f58980752a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:03.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f58980752a0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f5880006010 tx=0x7f5880004f70 
comp rx=0 tx=0).stop 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 shutdown_connections 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f588406c490 0x7f588406e950 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5898103340 0x7f58980752a0 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 --2- 192.168.123.105:0/2606324953 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5898103cf0 0x7f58980757e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.241+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 >> 192.168.123.105:0/2606324953 conn(0x7f58980feb90 msgr2=0x7f5898100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.242+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 shutdown_connections 2026-03-10T08:53:03.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:03.242+0000 7f589d152700 1 -- 192.168.123.105:0/2606324953 wait complete. 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:Mounting Ceph FS. 
Following are details of mount; remember "None" represents Python type None - 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm08.local 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T08:53:03.297 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T08:53:03.298 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-10T08:53:03.298 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:03.298 DEBUG:teuthology.orchestra.run.vm08:> ip addr 2026-03-10T08:53:03.312 INFO:teuthology.orchestra.run.vm08.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft forever preferred_lft forever 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: inet6 ::1/128 scope host 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft forever preferred_lft forever 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: link/ether 52:55:00:00:00:08 brd ff:ff:ff:ff:ff:ff 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: altname enp0s3 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: altname ens3 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: inet 192.168.123.108/24 brd 
192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft 3173sec preferred_lft 3173sec 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: inet6 fe80::5055:ff:fe00:8/64 scope link noprefixroute 2026-03-10T08:53:03.313 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft forever preferred_lft forever 2026-03-10T08:53:03.313 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link add name ceph-brx type bridge 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> sudo ip addr flush dev ceph-brx 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set ceph-brx up 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T08:53:03.313 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T08:53:03.388 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:03.459 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T08:53:03.463 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:03.463 DEBUG:teuthology.orchestra.run.vm08:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T08:53:03.532 INFO:teuthology.orchestra.run.vm08.stdout:1 2026-03-10T08:53:03.533 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:03.533 DEBUG:teuthology.orchestra.run.vm08:> ip r 2026-03-10T08:53:03.587 INFO:teuthology.orchestra.run.vm08.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.108 metric 100 2026-03-10T08:53:03.587 INFO:teuthology.orchestra.run.vm08.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.108 metric 100 2026-03-10T08:53:03.587 INFO:teuthology.orchestra.run.vm08.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T08:53:03.587 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:03.587 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T08:53:03.587 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T08:53:03.587 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T08:53:03.587 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T08:53:03.587 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T08:53:03.659 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:03.666 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:03 vm08 ceph-mon[57559]: from='client.? 192.168.123.105:0/2606324953' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T08:53:03.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:03 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/2606324953' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T08:53:03.714 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T08:53:03.717 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:03.717 DEBUG:teuthology.orchestra.run.vm08:> ip netns list 2026-03-10T08:53:03.772 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:03.772 DEBUG:teuthology.orchestra.run.vm08:> ip netns list-id 2026-03-10T08:53:03.826 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:03.826 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T08:53:03.826 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T08:53:03.826 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-10T08:53:03.826 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T08:53:03.898 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:03.919 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T08:53:03.923 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-10T08:53:03.923 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T08:53:03.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:03 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:04.053 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:04 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T08:53:04.054 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T08:53:04.054 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T08:53:04.054 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set brx.0 up 2026-03-10T08:53:04.054 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T08:53:04.054 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T08:53:04.126 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:04 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T08:53:04.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:04 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T08:53:04.154 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-10T08:53:04.154 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T08:53:04.154 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:04.209 INFO:teuthology.orchestra.run.vm08.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-10T08:53:04.209 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T08:53:04.209 DEBUG:teuthology.orchestra.run.vm08:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:04.262 DEBUG:teuthology.orchestra.run.vm08:> sudo modprobe fuse 2026-03-10T08:53:04.325 DEBUG:teuthology.orchestra.run.vm08:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/proc 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/dev 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/security 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/dev/shm 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/dev/pts 
2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/cgroup 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/pstore 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/bpf 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/config 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/ 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/selinux 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/dev/hugepages 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/dev/mqueue 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/debug 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/tracing 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/fuse/connections 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/1000 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/0 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay 2026-03-10T08:53:04.380 
INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/1edc3c439565a2c6275ccaec302949d9d708e2f803879c1b6245522c9872ed4f/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/48bbcb4f6cd206a2c2553ffe59e4fd3d14db6c3c87474042c904277084310228/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/f2405424f00bd5f56e50d16a3a4ab109e4e1d1def856643bdfb87f00dc07bf10/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/1c8fac14741f1f5fe0aae976c4cedd9365d47949bf6fa0068347850464392405/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/dd0eb7b95da24c87efb84afb9f79c1fb07b87b3df3df546b8cfb8224f9a12622/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/82d536e6e6c35377abeacdb043ea23717c34946d255da6b711a6894f3daffed0/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/0e77178b4445bcbf017cce2aed6e3eba13d8c0af3e48d84f90dd4422732f24f0/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/92e7c2a42d6e79da4760ea5392456e84dfc1d4553c4f03d05a15dbbe62d77ce4/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/9466922107ac6dacafd0e1f6d9fc8fdbbe8a6568977570ecbb195527fe182789/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/be819d008a9d97209672449efa50328a9c5c31c7272eedc0cdeffb2532e5b00a/merged 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns 2026-03-10T08:53:04.380 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T08:53:04.380 
INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T08:53:04.381 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.381 DEBUG:teuthology.orchestra.run.vm08:> ls /sys/fs/fuse/connections 2026-03-10T08:53:04.435 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T08:53:04.435 DEBUG:teuthology.orchestra.run.vm08:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-10T08:53:04.476 DEBUG:teuthology.orchestra.run.vm08:> sudo modprobe fuse 2026-03-10T08:53:04.505 DEBUG:teuthology.orchestra.run.vm08:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T08:53:04.550 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:2026-03-10T08:53:04.549+0000 7fc146e96480 -1 init, newargv = 0x55e94e900150 newargc=15 2026-03-10T08:53:04.550 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:ceph-fuse[81923]: starting ceph client 2026-03-10T08:53:04.558 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:ceph-fuse[81923]: starting fuse 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/proc 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/dev 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/security 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/dev/shm 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/dev/pts 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/cgroup 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/pstore 2026-03-10T08:53:04.572 
INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/bpf 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/config 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/ 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/selinux 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/dev/hugepages 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/dev/mqueue 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/debug 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/tracing 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/fuse/connections 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/1000 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/0 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/1edc3c439565a2c6275ccaec302949d9d708e2f803879c1b6245522c9872ed4f/merged 2026-03-10T08:53:04.572 
INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/48bbcb4f6cd206a2c2553ffe59e4fd3d14db6c3c87474042c904277084310228/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/f2405424f00bd5f56e50d16a3a4ab109e4e1d1def856643bdfb87f00dc07bf10/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/1c8fac14741f1f5fe0aae976c4cedd9365d47949bf6fa0068347850464392405/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/dd0eb7b95da24c87efb84afb9f79c1fb07b87b3df3df546b8cfb8224f9a12622/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/82d536e6e6c35377abeacdb043ea23717c34946d255da6b711a6894f3daffed0/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/0e77178b4445bcbf017cce2aed6e3eba13d8c0af3e48d84f90dd4422732f24f0/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/92e7c2a42d6e79da4760ea5392456e84dfc1d4553c4f03d05a15dbbe62d77ce4/merged 2026-03-10T08:53:04.572 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/9466922107ac6dacafd0e1f6d9fc8fdbbe8a6568977570ecbb195527fe182789/merged 2026-03-10T08:53:04.573 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/be819d008a9d97209672449efa50328a9c5c31c7272eedc0cdeffb2532e5b00a/merged 2026-03-10T08:53:04.573 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns 2026-03-10T08:53:04.573 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T08:53:04.573 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T08:53:04.573 INFO:teuthology.orchestra.run.vm08.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:04.573 
INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.573 DEBUG:teuthology.orchestra.run.vm08:> ls /sys/fs/fuse/connections 2026-03-10T08:53:04.627 INFO:teuthology.orchestra.run.vm08.stdout:90 2026-03-10T08:53:04.627 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> sudo stdin-killer -- python3 -c ' 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> import glob 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> import re 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> import os 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> import subprocess 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> def _find_admin_socket(client_name): 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> asok_path = "/var/run/ceph/ceph-client.1.*.asok" 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> files = glob.glob(asok_path) 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> mountpoint = "/home/ubuntu/cephtest/mnt.1" 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> # Given a non-glob path, it better be there 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> if "*" not in asok_path: 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> assert(len(files) == 1) 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> return files[0] 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> for f in files: 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T08:53:04.627 DEBUG:teuthology.orchestra.run.vm08:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T08:53:04.628 
DEBUG:teuthology.orchestra.run.vm08:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> contents = proc_f.read() 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> if mountpoint in contents: 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> return f 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> print(_find_admin_socket("client.1")) 2026-03-10T08:53:04.628 DEBUG:teuthology.orchestra.run.vm08:> ' 2026-03-10T08:53:04.686 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:04 vm08.local ceph-mon[57559]: pgmap v81: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.2 KiB/s rd, 1.6 KiB/s wr, 8 op/s 2026-03-10T08:53:04.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:04 vm05.local ceph-mon[49713]: pgmap v81: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.2 KiB/s rd, 1.6 KiB/s wr, 8 op/s 2026-03-10T08:53:04.718 INFO:teuthology.orchestra.run.vm08.stdout:/var/run/ceph/ceph-client.1.81923.asok 2026-03-10T08:53:04.720 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T08:53:04 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T08:53:04.724 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.81923.asok 2026-03-10T08:53:04.725 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.725 DEBUG:teuthology.orchestra.run.vm08:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.81923.asok status 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout:{ 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "metadata": { 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_sha1": "7fe91d5d5842e04be3b4f514d6dd990c54b29c76", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "entity_id": "1", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "hostname": "vm08.local", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "pid": "81923", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "root": "/" 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "dentry_count": 0, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "dentry_pinned_count": 0, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "id": 24333, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "inst": { 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "name": { 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "type": "client", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "num": 24333 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T08:53:04.826 
INFO:teuthology.orchestra.run.vm08.stdout: "addr": { 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "type": "v1", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "addr": "192.168.144.1:0", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "nonce": 2430695904 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: } 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "addr": { 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "type": "v1", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "addr": "192.168.144.1:0", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "nonce": 2430695904 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "inst_str": "client.24333 192.168.144.1:0/2430695904", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "addr_str": "192.168.144.1:0/2430695904", 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "inode_count": 1, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "mds_epoch": 12, 2026-03-10T08:53:04.826 INFO:teuthology.orchestra.run.vm08.stdout: "osd_epoch": 39, 2026-03-10T08:53:04.827 INFO:teuthology.orchestra.run.vm08.stdout: "osd_epoch_barrier": 0, 2026-03-10T08:53:04.827 INFO:teuthology.orchestra.run.vm08.stdout: "blocklisted": false, 2026-03-10T08:53:04.827 INFO:teuthology.orchestra.run.vm08.stdout: "fs_name": "cephfs" 2026-03-10T08:53:04.827 INFO:teuthology.orchestra.run.vm08.stdout:} 2026-03-10T08:53:04.833 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.833 DEBUG:teuthology.orchestra.run.vm05:> stat --file-system '--printf=%T 2026-03-10T08:53:04.833 DEBUG:teuthology.orchestra.run.vm05:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:04.851 
INFO:teuthology.orchestra.run.vm05.stdout:fuseblk 2026-03-10T08:53:04.852 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:04.852 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.852 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:04.921 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.922 DEBUG:teuthology.orchestra.run.vm08:> stat --file-system '--printf=%T 2026-03-10T08:53:04.922 DEBUG:teuthology.orchestra.run.vm08:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:04.937 INFO:teuthology.orchestra.run.vm08.stdout:fuseblk 2026-03-10T08:53:04.937 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:04.937 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T08:53:04.937 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:05.002 INFO:teuthology.run_tasks:Running task print... 2026-03-10T08:53:05.005 INFO:teuthology.task.print:**** done client 2026-03-10T08:53:05.005 INFO:teuthology.run_tasks:Running task parallel... 2026-03-10T08:53:05.007 INFO:teuthology.task.parallel:starting parallel... 2026-03-10T08:53:05.007 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T08:53:05.008 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
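The fuse_mount task above verifies each client twice: it queries the ceph-fuse admin socket with `status` and checks that `stat --file-system --printf=%T` reports `fuseblk`. A minimal sketch of the same health check in Python, using an abbreviated copy of the JSON captured above (the helper function is ours for illustration, not part of teuthology):

```python
import json

# Abbreviated `status` output from the client.1 admin socket, as logged above.
status_json = """
{
  "metadata": {
    "mount_point": "/home/ubuntu/cephtest/mnt.1",
    "root": "/"
  },
  "id": 24333,
  "blocklisted": false,
  "fs_name": "cephfs"
}
"""

def mount_looks_healthy(raw: str, expected_mount: str) -> bool:
    """Sketch of the checks performed here: the client reports the
    expected mount point and has not been blocklisted by the cluster."""
    status = json.loads(raw)
    return (status["metadata"]["mount_point"] == expected_mount
            and not status["blocklisted"])

print(mount_looks_healthy(status_json, "/home/ubuntu/cephtest/mnt.1"))
```

The filesystem-type probe is the cheap second check: a healthy ceph-fuse mount shows up as `fuseblk`, which is what both vm05 and vm08 report above.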
2026-03-10T08:53:05.008 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:53:05.008 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs false || true' 2026-03-10T08:53:05.008 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T08:53:05.008 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-10T08:53:05.009 INFO:tasks.workunit:Pulling workunits from ref 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T08:53:05.010 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-10T08:53:05.010 INFO:tasks.workunit:timeout=3h 2026-03-10T08:53:05.010 INFO:tasks.workunit:cleanup=True 2026-03-10T08:53:05.010 DEBUG:teuthology.orchestra.run.vm05:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout:Device: 5dh/93d Inode: 1 Links: 2 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 08:52:54.389780653 +0000 2026-03-10T08:53:05.032 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 08:53:04.920190729 +0000 2026-03-10T08:53:05.032 
INFO:teuthology.orchestra.run.vm05.stdout: Birth: - 2026-03-10T08:53:05.032 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-10T08:53:05.032 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-10T08:53:05.114 DEBUG:teuthology.orchestra.run.vm08:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 08:53:05.108885723 +0000 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 08:53:05.108885723 +0000 2026-03-10T08:53:05.131 INFO:teuthology.orchestra.run.vm08.stdout: Birth: - 2026-03-10T08:53:05.131 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-10T08:53:05.132 DEBUG:teuthology.orchestra.run.vm08:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-10T08:53:05.176 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:05.199 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 
75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T08:53:05.199 DEBUG:teuthology.orchestra.run.vm08:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T08:53:05.240 INFO:tasks.workunit.client.0.vm05.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-10T08:53:05.255 INFO:tasks.workunit.client.1.vm08.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 2026-03-10T08:53:05.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.398+0000 7ff37f29c700 1 -- 192.168.123.105:0/3745531784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 msgr2=0x7ff3780ffc20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:05.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.398+0000 7ff37f29c700 1 --2- 192.168.123.105:0/3745531784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3780ffc20 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7ff374009b00 tx=0x7ff374009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:05.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.399+0000 7ff37f29c700 1 -- 192.168.123.105:0/3745531784 shutdown_connections 2026-03-10T08:53:05.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.399+0000 7ff37f29c700 1 --2- 192.168.123.105:0/3745531784 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff378100160 0x7ff3781005e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.399+0000 7ff37f29c700 1 --2- 192.168.123.105:0/3745531784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3780ffc20 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T08:53:05.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.399+0000 7ff37f29c700 1 -- 192.168.123.105:0/3745531784 >> 192.168.123.105:0/3745531784 conn(0x7ff3780fb380 msgr2=0x7ff3780fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.399+0000 7ff37f29c700 1 -- 192.168.123.105:0/3745531784 shutdown_connections 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.399+0000 7ff37f29c700 1 -- 192.168.123.105:0/3745531784 wait complete. 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37f29c700 1 Processor -- start 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37f29c700 1 -- start start 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37f29c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3781989d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37f29c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff378100160 0x7ff378198f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37f29c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff378199530 con 0x7ff3780ff800 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37f29c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff378199670 con 0x7ff378100160 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37d038700 1 
--2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3781989d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37d038700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3781989d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33424/0 (socket says 192.168.123.105:33424) 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37d038700 1 -- 192.168.123.105:0/2473515364 learned_addr learned my addr 192.168.123.105:0/2473515364 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:05.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37c837700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff378100160 0x7ff378198f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37d038700 1 -- 192.168.123.105:0/2473515364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff378100160 msgr2=0x7ff378198f10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37d038700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff378100160 0x7ff378198f10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.400+0000 7ff37d038700 1 -- 
192.168.123.105:0/2473515364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3740097e0 con 0x7ff3780ff800 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.401+0000 7ff37d038700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3781989d0 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7ff374005850 tx=0x7ff374004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.401+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff37401d070 con 0x7ff3780ff800 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.401+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff37819e0c0 con 0x7ff3780ff800 2026-03-10T08:53:05.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.401+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff37819e5b0 con 0x7ff3780ff800 2026-03-10T08:53:05.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.401+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff374022470 con 0x7ff3780ff800 2026-03-10T08:53:05.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.401+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff37400f670 con 0x7ff3780ff800 2026-03-10T08:53:05.405 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.402+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff35c005320 con 0x7ff3780ff800 2026-03-10T08:53:05.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.404+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff37400baa0 con 0x7ff3780ff800 2026-03-10T08:53:05.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.404+0000 7ff36e7fc700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff36406c600 0x7ff36406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:05.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.404+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ff37408d9b0 con 0x7ff3780ff800 2026-03-10T08:53:05.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.405+0000 7ff37c837700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff36406c600 0x7ff36406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:05.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.405+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff37405bce0 con 0x7ff3780ff800 2026-03-10T08:53:05.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.405+0000 7ff37c837700 1 --2- 192.168.123.105:0/2473515364 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff36406c600 0x7ff36406eac0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7ff368005950 tx=0x7ff36800b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:05.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.509+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7ff35c005cc0 con 0x7ff3780ff800 2026-03-10T08:53:05.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.518+0000 7ff36e7fc700 1 -- 192.168.123.105:0/2473515364 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v16)=0 v16) v1 ==== 126+0+0 (secure 0 0 0) 0x7ff3740270c0 con 0x7ff3780ff800 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.520+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff36406c600 msgr2=0x7ff36406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.520+0000 7ff37f29c700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff36406c600 0x7ff36406eac0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7ff368005950 tx=0x7ff36800b410 comp rx=0 tx=0).stop 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.520+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 msgr2=0x7ff3781989d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.520+0000 7ff37f29c700 1 --2- 192.168.123.105:0/2473515364 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3781989d0 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7ff374005850 tx=0x7ff374004990 comp rx=0 tx=0).stop 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 shutdown_connections 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff36406c600 0x7ff36406eac0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3780ff800 0x7ff3781989d0 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 --2- 192.168.123.105:0/2473515364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff378100160 0x7ff378198f10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 >> 192.168.123.105:0/2473515364 conn(0x7ff3780fb380 msgr2=0x7ff378107e30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 shutdown_connections 2026-03-10T08:53:05.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.521+0000 7ff37f29c700 1 -- 192.168.123.105:0/2473515364 wait complete. 2026-03-10T08:53:05.569 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
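Each cephadm.shell step in this sequence wraps a plain `ceph` command in a `cephadm shell` invocation that pins the container image, config, keyring, and cluster fsid. A sketch of assembling that command line (the helper name is ours; the values are the ones used in this run):

```python
import shlex

def cephadm_shell_cmd(image: str, fsid: str, inner: str) -> str:
    """Build the `cephadm shell` wrapper command the cephadm.shell
    task runs on the host carrying the admin keyring."""
    argv = [
        "sudo", "/home/ubuntu/cephtest/cephadm",
        "--image", image,
        "shell",
        "-c", "/etc/ceph/ceph.conf",
        "-k", "/etc/ceph/ceph.client.admin.keyring",
        "--fsid", fsid,
        "--", "bash", "-c", inner,
    ]
    # Quote each argument so the inner `bash -c` payload survives the shell.
    return " ".join(shlex.quote(a) for a in argv)

cmd = cephadm_shell_cmd(
    "quay.io/ceph/ceph:v18.2.1",
    "16587ed2-1c5e-11f1-90f6-35051361a039",
    "ceph config set mgr mgr/orchestrator/fail_fs false || true",
)
print(cmd)
```

The `|| true` suffix in the fail_fs step above makes the command a no-op on releases where the `mgr/orchestrator/fail_fs` option does not exist yet, which is why the task tolerates a non-zero exit from the inner `ceph config set`.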
2026-03-10T08:53:05.569 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:53:05.569 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T08:53:05.752 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.984+0000 7f3e630f6700 1 -- 192.168.123.105:0/2054856473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 msgr2=0x7f3e5c106700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.984+0000 7f3e630f6700 1 --2- 192.168.123.105:0/2054856473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c106700 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f3e50009b00 tx=0x7f3e50009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.984+0000 7f3e630f6700 1 -- 192.168.123.105:0/2054856473 shutdown_connections 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.984+0000 7f3e630f6700 1 --2- 192.168.123.105:0/2054856473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c106700 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.984+0000 7f3e630f6700 1 --2- 192.168.123.105:0/2054856473 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 0x7f3e5c0fef50 unknown :-1 
s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.984+0000 7f3e630f6700 1 -- 192.168.123.105:0/2054856473 >> 192.168.123.105:0/2054856473 conn(0x7f3e5c0fa6d0 msgr2=0x7f3e5c0fcb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.985+0000 7f3e630f6700 1 -- 192.168.123.105:0/2054856473 shutdown_connections 2026-03-10T08:53:05.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.985+0000 7f3e630f6700 1 -- 192.168.123.105:0/2054856473 wait complete. 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.985+0000 7f3e630f6700 1 Processor -- start 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e630f6700 1 -- start start 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e630f6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 0x7f3e5c1968a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e630f6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c196de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e630f6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e5c197400 con 0x7f3e5c0ff520 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e630f6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e5c197540 con 0x7f3e5c0feb30 2026-03-10T08:53:05.986 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e5bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c196de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e60e92700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 0x7f3e5c1968a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e60e92700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 0x7f3e5c1968a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34846/0 (socket says 192.168.123.105:34846) 2026-03-10T08:53:05.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e60e92700 1 -- 192.168.123.105:0/24985476 learned_addr learned my addr 192.168.123.105:0/24985476 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:05.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e5bfff700 1 -- 192.168.123.105:0/24985476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 msgr2=0x7f3e5c1968a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:05.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e5bfff700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 0x7f3e5c1968a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:05.987 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.986+0000 7f3e5bfff700 1 -- 192.168.123.105:0/24985476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e500097e0 con 0x7f3e5c0ff520 2026-03-10T08:53:05.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.987+0000 7f3e5bfff700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c196de0 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f3e50009ad0 tx=0x7f3e5000b890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:05.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.987+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e5001d070 con 0x7f3e5c0ff520 2026-03-10T08:53:05.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.987+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3e5c100610 con 0x7f3e5c0ff520 2026-03-10T08:53:05.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.988+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3e50004500 con 0x7f3e5c0ff520 2026-03-10T08:53:05.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.988+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e5000f460 con 0x7f3e5c0ff520 2026-03-10T08:53:05.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.988+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3e5c100aa0 con 0x7f3e5c0ff520 
2026-03-10T08:53:05.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.989+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3e5000f5c0 con 0x7f3e5c0ff520 2026-03-10T08:53:05.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.989+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3e5c190a70 con 0x7f3e5c0ff520 2026-03-10T08:53:05.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.989+0000 7f3e59ffb700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e4406c480 0x7f3e4406e940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:05.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.990+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f3e5000bc60 con 0x7f3e5c0ff520 2026-03-10T08:53:05.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.992+0000 7f3e60e92700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e4406c480 0x7f3e4406e940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:05.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.992+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3e5005bad0 con 0x7f3e5c0ff520 2026-03-10T08:53:05.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:05.992+0000 7f3e60e92700 1 --2- 192.168.123.105:0/24985476 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e4406c480 0x7f3e4406e940 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f3e4c009cd0 tx=0x7f3e4c009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:06.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.098+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f3e5c02cc70 con 0x7f3e5c0ff520 2026-03-10T08:53:06.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.101+0000 7f3e59ffb700 1 -- 192.168.123.105:0/24985476 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v16)=0 v16) v1 ==== 155+0+0 (secure 0 0 0) 0x7f3e5005b660 con 0x7f3e5c0ff520 2026-03-10T08:53:06.102 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e4406c480 msgr2=0x7f3e4406e940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:06.102 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e4406c480 0x7f3e4406e940 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f3e4c009cd0 tx=0x7f3e4c009400 comp rx=0 tx=0).stop 2026-03-10T08:53:06.102 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 msgr2=0x7f3e5c196de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:06.102 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 --2- 
192.168.123.105:0/24985476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c196de0 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7f3e50009ad0 tx=0x7f3e5000b890 comp rx=0 tx=0).stop 2026-03-10T08:53:06.102 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 shutdown_connections 2026-03-10T08:53:06.102 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3e4406c480 0x7f3e4406e940 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e5c0feb30 0x7f3e5c1968a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 --2- 192.168.123.105:0/24985476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e5c0ff520 0x7f3e5c196de0 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 >> 192.168.123.105:0/24985476 conn(0x7f3e5c0fa6d0 msgr2=0x7f3e5c104f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:06.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 shutdown_connections 2026-03-10T08:53:06.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.102+0000 7f3e630f6700 1 -- 192.168.123.105:0/24985476 wait complete. 
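The config overrides applied by these cephadm.shell steps all follow the same `ceph config set <section> <option> <value>` shape (the global_id reclaim ones add `--force` so the mon accepts the setting unconditionally). A small sketch that pulls the section, option, and value back out of the exact commands issued in this run; the parser is ours for illustration:

```python
import re

# The three config-set commands issued by the cephadm.shell steps in this run.
commands = [
    "ceph config set mgr mgr/orchestrator/fail_fs false || true",
    "ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force",
    "ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force",
]

def parse_config_set(cmd: str):
    """Extract (section, option, value) from a `ceph config set` command,
    ignoring any trailing `--force` or `|| true`."""
    m = re.match(r"ceph config set (\S+) (\S+) (\S+)", cmd)
    return m.groups() if m else None

for c in commands:
    print(parse_config_set(c))
```

Silencing both `mon_warn_on_insecure_global_id_reclaim` options keeps the Reef monitors from raising AUTH_INSECURE_GLOBAL_ID_RECLAIM health warnings while older clients are still attached during the upgrade workload.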
2026-03-10T08:53:06.163 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T08:53:06.309 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.547+0000 7f1eb0859700 1 -- 192.168.123.105:0/1127565850 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea81042c0 msgr2=0x7f1ea81066b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.547+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1127565850 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea81042c0 0x7f1ea81066b0 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7f1e9c009b00 tx=0x7f1e9c009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 -- 192.168.123.105:0/1127565850 shutdown_connections 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1127565850 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea81042c0 0x7f1ea81066b0 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1127565850 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1ea8101990 0x7f1ea8103d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.548 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 -- 192.168.123.105:0/1127565850 >> 192.168.123.105:0/1127565850 conn(0x7f1ea80fb380 msgr2=0x7f1ea80fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 -- 192.168.123.105:0/1127565850 shutdown_connections 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 -- 192.168.123.105:0/1127565850 wait complete. 2026-03-10T08:53:06.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 Processor -- start 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.548+0000 7f1eb0859700 1 -- start start 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eb0859700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 0x7f1ea819cde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eb0859700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1ea81042c0 0x7f1ea819d320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eb0859700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ea819d8b0 con 0x7f1ea8101990 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eb0859700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ea819d9f0 con 0x7f1ea81042c0 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 0x7f1ea819cde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 0x7f1ea819cde0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33462/0 (socket says 192.168.123.105:33462) 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 -- 192.168.123.105:0/1239381575 learned_addr learned my addr 192.168.123.105:0/1239381575 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eaddf4700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1ea81042c0 0x7f1ea819d320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 -- 192.168.123.105:0/1239381575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1ea81042c0 msgr2=0x7f1ea819d320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1ea81042c0 0x7f1ea819d320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 -- 
192.168.123.105:0/1239381575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1e9c0097e0 con 0x7f1ea8101990 2026-03-10T08:53:06.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eae5f5700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 0x7f1ea819cde0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f1ea000cc60 tx=0x7f1ea00074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:06.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ea0007af0 con 0x7f1ea8101990 2026-03-10T08:53:06.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.549+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1ea81a24b0 con 0x7f1ea8101990 2026-03-10T08:53:06.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.551+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1ea0007c50 con 0x7f1ea8101990 2026-03-10T08:53:06.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.551+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1ea81a29d0 con 0x7f1ea8101990 2026-03-10T08:53:06.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.551+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ea0018730 con 0x7f1ea8101990 2026-03-10T08:53:06.552 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.551+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1ea0018970 con 0x7f1ea8101990 2026-03-10T08:53:06.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.552+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1ea80fcf70 con 0x7f1ea8101990 2026-03-10T08:53:06.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.552+0000 7f1e9b7fe700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e9406c480 0x7f1e9406e940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:06.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.552+0000 7f1eaddf4700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e9406c480 0x7f1e9406e940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:06.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.552+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f1ea008be40 con 0x7f1ea8101990 2026-03-10T08:53:06.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.554+0000 7f1eaddf4700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e9406c480 0x7f1e9406e940 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f1e9c000c00 tx=0x7f1e9c0053a0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:06.555 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.555+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1ea0091050 con 0x7f1ea8101990 2026-03-10T08:53:06.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:06 vm05.local ceph-mon[49713]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.4 KiB/s rd, 1.4 KiB/s wr, 10 op/s 2026-03-10T08:53:06.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:06 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/2473515364' entity='client.admin' 2026-03-10T08:53:06.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:06 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:53:06.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:06 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:53:06.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:06 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:53:06.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:06 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:06.658 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.657+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f1ea8066e80 con 0x7f1ea8101990 2026-03-10T08:53:06.660 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.660+0000 7f1e9b7fe700 1 -- 192.168.123.105:0/1239381575 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v16) v1 ==== 163+0+0 (secure 0 0 0) 0x7f1ea005a1d0 con 0x7f1ea8101990 2026-03-10T08:53:06.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e9406c480 msgr2=0x7f1e9406e940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:06.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e9406c480 0x7f1e9406e940 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f1e9c000c00 tx=0x7f1e9c0053a0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 msgr2=0x7f1ea819cde0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:06.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 0x7f1ea819cde0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f1ea000cc60 tx=0x7f1ea00074a0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 shutdown_connections 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1e9406c480 
0x7f1e9406e940 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1ea8101990 0x7f1ea819cde0 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.662+0000 7f1eb0859700 1 --2- 192.168.123.105:0/1239381575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1ea81042c0 0x7f1ea819d320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.663+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 >> 192.168.123.105:0/1239381575 conn(0x7f1ea80fb380 msgr2=0x7f1ea8100010 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.663+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 shutdown_connections 2026-03-10T08:53:06.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:06.663+0000 7f1eb0859700 1 -- 192.168.123.105:0/1239381575 wait complete. 
2026-03-10T08:53:06.723 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T08:53:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:06 vm08.local ceph-mon[57559]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.4 KiB/s rd, 1.4 KiB/s wr, 10 op/s 2026-03-10T08:53:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:06 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/2473515364' entity='client.admin' 2026-03-10T08:53:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:06 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:53:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:06 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:53:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:06 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:53:06.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:06 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:06.861 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:07.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.105+0000 7f0ed8c93700 1 -- 192.168.123.105:0/3386780492 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f0ed4100f00 msgr2=0x7f0ed4101320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.105+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/3386780492 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed4101320 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7f0ebc009b00 tx=0x7f0ebc009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.106+0000 7f0ed8c93700 1 -- 192.168.123.105:0/3386780492 shutdown_connections 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.106+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/3386780492 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed41024e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.106+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/3386780492 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed4101320 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.106+0000 7f0ed8c93700 1 -- 192.168.123.105:0/3386780492 >> 192.168.123.105:0/3386780492 conn(0x7f0ed40fc460 msgr2=0x7f0ed40fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.107+0000 7f0ed8c93700 1 -- 192.168.123.105:0/3386780492 shutdown_connections 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.107+0000 7f0ed8c93700 1 -- 192.168.123.105:0/3386780492 wait complete. 
2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.107+0000 7f0ed8c93700 1 Processor -- start 2026-03-10T08:53:07.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.107+0000 7f0ed8c93700 1 -- start start 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.107+0000 7f0ed8c93700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed41945c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.107+0000 7f0ed8c93700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed4194b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed8c93700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ed4195090 con 0x7f0ed4100f00 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed8c93700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ed41951d0 con 0x7f0ed4102060 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed1d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed4194b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed41945c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed1d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed4194b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34884/0 (socket says 192.168.123.105:34884) 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed1d9b700 1 -- 192.168.123.105:0/1132654436 learned_addr learned my addr 192.168.123.105:0/1132654436 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed259c700 1 -- 192.168.123.105:0/1132654436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 msgr2=0x7f0ed4194b00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed259c700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed4194b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed259c700 1 -- 192.168.123.105:0/1132654436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0ebc0097e0 con 0x7f0ed4100f00 2026-03-10T08:53:07.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.108+0000 7f0ed1d9b700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed4194b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T08:53:07.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.109+0000 7f0ed259c700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed41945c0 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f0ebc000c00 tx=0x7f0ebc004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.109+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ebc01d070 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.109+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ebc00bc50 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.109+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ebc00f700 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.109+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ed4199c30 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.109+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ed419a0f0 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.110+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0ebc00f860 con 
0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.110+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ed4066e80 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.113+0000 7f0ecb7fe700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ec0070940 0x7f0ec0072e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.113+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f0ebc08df60 con 0x7f0ed4100f00 2026-03-10T08:53:07.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.113+0000 7f0ed1d9b700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ec0070940 0x7f0ec0072e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:07.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.114+0000 7f0ed1d9b700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ec0070940 0x7f0ec0072e00 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f0ec400f520 tx=0x7f0ec4005f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:07.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.114+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0ebc022a50 
con 0x7f0ed4100f00 2026-03-10T08:53:07.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.221+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f0ed419a340 con 0x7f0ed4100f00 2026-03-10T08:53:07.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.222+0000 7f0ecb7fe700 1 -- 192.168.123.105:0/1132654436 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v16) v1 ==== 135+0+0 (secure 0 0 0) 0x7f0ebc05c2c0 con 0x7f0ed4100f00 2026-03-10T08:53:07.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ec0070940 msgr2=0x7f0ec0072e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ec0070940 0x7f0ec0072e00 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f0ec400f520 tx=0x7f0ec4005f90 comp rx=0 tx=0).stop 2026-03-10T08:53:07.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 msgr2=0x7f0ed41945c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed41945c0 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f0ebc000c00 tx=0x7f0ebc004c30 comp rx=0 tx=0).stop 2026-03-10T08:53:07.224 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 shutdown_connections 2026-03-10T08:53:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ec0070940 0x7f0ec0072e00 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ed4100f00 0x7f0ed41945c0 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 --2- 192.168.123.105:0/1132654436 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ed4102060 0x7f0ed4194b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 >> 192.168.123.105:0/1132654436 conn(0x7f0ed40fc460 msgr2=0x7f0ed4105320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.224+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 shutdown_connections 2026-03-10T08:53:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.225+0000 7f0ed8c93700 1 -- 192.168.123.105:0/1132654436 wait complete. 
2026-03-10T08:53:07.284 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-10T08:53:07.432 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:07.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.714+0000 7f735532e700 1 -- 192.168.123.105:0/1811288254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 msgr2=0x7f73501081b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.714+0000 7f735532e700 1 --2- 192.168.123.105:0/1811288254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f73501081b0 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f7344009b50 tx=0x7f7344009e60 comp rx=0 tx=0).stop 2026-03-10T08:53:07.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.715+0000 7f735532e700 1 -- 192.168.123.105:0/1811288254 shutdown_connections 2026-03-10T08:53:07.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.715+0000 7f735532e700 1 --2- 192.168.123.105:0/1811288254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735010f0d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.715+0000 7f735532e700 1 --2- 192.168.123.105:0/1811288254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f73501081b0 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.715 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.715+0000 7f735532e700 1 -- 192.168.123.105:0/1811288254 >> 192.168.123.105:0/1811288254 conn(0x7f735006dda0 msgr2=0x7f7350070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:07.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.715+0000 7f735532e700 1 -- 192.168.123.105:0/1811288254 shutdown_connections 2026-03-10T08:53:07.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.715+0000 7f735532e700 1 -- 192.168.123.105:0/1811288254 wait complete. 2026-03-10T08:53:07.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f735532e700 1 Processor -- start 2026-03-10T08:53:07.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f735532e700 1 -- start start 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f735532e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f735019cdc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f735532e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735019d300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f735532e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f735019d920 con 0x7f7350107d90 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f735532e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f735019da60 con 0x7f73501086f0 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f734e7fc700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735019d300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f734e7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735019d300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34906/0 (socket says 192.168.123.105:34906) 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f734e7fc700 1 -- 192.168.123.105:0/3092571109 learned_addr learned my addr 192.168.123.105:0/3092571109 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f734effd700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f735019cdc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:07.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.716+0000 7f734e7fc700 1 -- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 msgr2=0x7f735019cdc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.718+0000 7f734e7fc700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f735019cdc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.718+0000 7f734e7fc700 1 -- 
192.168.123.105:0/3092571109 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73440097e0 con 0x7f73501086f0 2026-03-10T08:53:07.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.718+0000 7f734effd700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f735019cdc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T08:53:07.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.718+0000 7f734e7fc700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735019d300 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f734800c390 tx=0x7f734800c6a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:07.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.720+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f734800e030 con 0x7f73501086f0 2026-03-10T08:53:07.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.720+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f734800f040 con 0x7f73501086f0 2026-03-10T08:53:07.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.720+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7348014650 con 0x7f73501086f0 2026-03-10T08:53:07.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.720+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73501a2510 con 0x7f73501086f0 
2026-03-10T08:53:07.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.720+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f73501a29e0 con 0x7f73501086f0 2026-03-10T08:53:07.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.721+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7350196fb0 con 0x7f73501086f0 2026-03-10T08:53:07.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.721+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7348009110 con 0x7f73501086f0 2026-03-10T08:53:07.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.722+0000 7f7337fff700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7338070790 0x7f7338072c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:07.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.722+0000 7f734effd700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7338070790 0x7f7338072c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:07.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.722+0000 7f734effd700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7338070790 0x7f7338072c50 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f734400b5c0 tx=0x7f7344005fd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:07.723 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.722+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f734808b590 con 0x7f73501086f0 2026-03-10T08:53:07.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.724+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7348059630 con 0x7f73501086f0 2026-03-10T08:53:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.846+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f73500611d0 con 0x7f7338070790 2026-03-10T08:53:07.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.854+0000 7f7337fff700 1 -- 192.168.123.105:0/3092571109 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f73500611d0 con 0x7f7338070790 2026-03-10T08:53:07.854 INFO:teuthology.orchestra.run.vm05.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:53:07.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7338070790 msgr2=0x7f7338072c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7338070790 0x7f7338072c50 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto 
rx=0x7f734400b5c0 tx=0x7f7344005fd0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 msgr2=0x7f735019d300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735019d300 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f734800c390 tx=0x7f734800c6a0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 shutdown_connections 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7338070790 0x7f7338072c50 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.856+0000 7f735532e700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7350107d90 0x7f735019cdc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.857+0000 7f735532e700 1 --2- 192.168.123.105:0/3092571109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f73501086f0 0x7f735019d300 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.857+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 >> 192.168.123.105:0/3092571109 
conn(0x7f735006dda0 msgr2=0x7f735010d950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.857+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 shutdown_connections 2026-03-10T08:53:07.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:07.857+0000 7f735532e700 1 -- 192.168.123.105:0/3092571109 wait complete. 2026-03-10T08:53:07.908 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T08:53:07.908 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T08:53:07.908 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-10T08:53:08.100 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T08:53:08.139 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:08 vm05.local ceph-mon[49713]: pgmap v83: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1023 B/s wr, 10 op/s 2026-03-10T08:53:08.140 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:08 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:08.140 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:08 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:53:08.140 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:08 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:53:08.140 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:08 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:53:08.140 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:08 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.441+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/780147662 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8075a40 msgr2=0x7fe0d8077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/780147662 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8075a40 0x7fe0d8077ed0 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7fe0d000b780 tx=0x7fe0d000ba90 comp rx=0 tx=0).stop 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/780147662 shutdown_connections 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/780147662 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8075a40 0x7fe0d8077ed0 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/780147662 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d8072b50 0x7fe0d8072f70 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/780147662 >> 192.168.123.105:0/780147662 conn(0x7fe0d806dae0 msgr2=0x7fe0d806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/780147662 shutdown_connections 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.442+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/780147662 wait complete. 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0dd4ba700 1 Processor -- start 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0dd4ba700 1 -- start start 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0dd4ba700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8072b50 0x7fe0d8082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0dd4ba700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 0x7fe0d8083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0dd4ba700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0d812e700 con 0x7fe0d8072b50 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0dd4ba700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0d812e870 con 0x7fe0d80834a0 2026-03-10T08:53:08.444 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0d67fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 0x7fe0d8083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0d67fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 0x7fe0d8083920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34920/0 (socket says 192.168.123.105:34920) 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0d67fc700 1 -- 192.168.123.105:0/2086452058 learned_addr learned my addr 192.168.123.105:0/2086452058 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.443+0000 7fe0d6ffd700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8072b50 0x7fe0d8082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0d67fc700 1 -- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8072b50 msgr2=0x7fe0d8082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0d67fc700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8072b50 0x7fe0d8082f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.444 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0d67fc700 1 -- 192.168.123.105:0/2086452058 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0d000b050 con 0x7fe0d80834a0 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0d67fc700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 0x7fe0d8083920 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fe0d000b750 tx=0x7fe0d0009300 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:08.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0d0003bb0 con 0x7fe0d80834a0 2026-03-10T08:53:08.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0d812eaf0 con 0x7fe0d80834a0 2026-03-10T08:53:08.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.444+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0d812f040 con 0x7fe0d80834a0 2026-03-10T08:53:08.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.445+0000 7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe0d0009470 con 0x7fe0d80834a0 2026-03-10T08:53:08.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.445+0000 7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0d0004580 con 
0x7fe0d80834a0 2026-03-10T08:53:08.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.446+0000 7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe0d00046e0 con 0x7fe0d80834a0 2026-03-10T08:53:08.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.446+0000 7fe0bffff700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe0c006c600 0x7fe0c006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.447+0000 7fe0d6ffd700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe0c006c600 0x7fe0c006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.447+0000 7fe0d6ffd700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe0c006c600 0x7fe0c006eac0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fe0c8005950 tx=0x7fe0c800b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:08.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.447+0000 7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fe0d008cb40 con 0x7fe0d80834a0 2026-03-10T08:53:08.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.447+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe0c4005320 con 0x7fe0d80834a0 2026-03-10T08:53:08.450 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.450+0000 7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe0d005ae70 con 0x7fe0d80834a0 2026-03-10T08:53:08.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:08 vm08.local ceph-mon[57559]: pgmap v83: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1023 B/s wr, 10 op/s 2026-03-10T08:53:08.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:08 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:08.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:08 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:53:08.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:08 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:53:08.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:08 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:53:08.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:08 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:08.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.579+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe0c4000bf0 con 0x7fe0c006c600 2026-03-10T08:53:08.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.580+0000 
7fe0bffff700 1 -- 192.168.123.105:0/2086452058 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fe0c4000bf0 con 0x7fe0c006c600 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe0c006c600 msgr2=0x7fe0c006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe0c006c600 0x7fe0c006eac0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fe0c8005950 tx=0x7fe0c800b410 comp rx=0 tx=0).stop 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 msgr2=0x7fe0d8083920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 0x7fe0d8083920 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fe0d000b750 tx=0x7fe0d0009300 comp rx=0 tx=0).stop 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 shutdown_connections 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe0c006c600 0x7fe0c006eac0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.583 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0d8072b50 0x7fe0d8082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 --2- 192.168.123.105:0/2086452058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0d80834a0 0x7fe0d8083920 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 >> 192.168.123.105:0/2086452058 conn(0x7fe0d806dae0 msgr2=0x7fe0d806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:08.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 shutdown_connections 2026-03-10T08:53:08.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.583+0000 7fe0dd4ba700 1 -- 192.168.123.105:0/2086452058 wait complete. 
2026-03-10T08:53:08.596 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:53:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.670+0000 7f8458a71700 1 -- 192.168.123.105:0/4071641992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8454107d90 msgr2=0x7f845410a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.670+0000 7f8458a71700 1 --2- 192.168.123.105:0/4071641992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8454107d90 0x7f845410a1c0 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f8444009b00 tx=0x7f8444009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.670+0000 7f8458a71700 1 -- 192.168.123.105:0/4071641992 shutdown_connections 2026-03-10T08:53:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.670+0000 7f8458a71700 1 --2- 192.168.123.105:0/4071641992 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f845410a700 0x7f845410cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.670+0000 7f8458a71700 1 --2- 192.168.123.105:0/4071641992 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8454107d90 0x7f845410a1c0 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.670+0000 7f8458a71700 1 -- 192.168.123.105:0/4071641992 >> 192.168.123.105:0/4071641992 conn(0x7f845406daa0 msgr2=0x7f845406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:08.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 -- 192.168.123.105:0/4071641992 shutdown_connections 2026-03-10T08:53:08.672 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 -- 192.168.123.105:0/4071641992 wait complete. 2026-03-10T08:53:08.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 Processor -- start 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 -- start start 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8454107d90 0x7f8454116d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 0x7f8454117260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8454117880 con 0x7f845410a700 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8458a71700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84541b3390 con 0x7f8454107d90 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.672+0000 7f8452ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 0x7f8454117260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f8452ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 0x7f8454117260 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33534/0 (socket says 192.168.123.105:33534) 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f8452ffd700 1 -- 192.168.123.105:0/1967800173 learned_addr learned my addr 192.168.123.105:0/1967800173 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f84537fe700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8454107d90 0x7f8454116d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f8452ffd700 1 -- 192.168.123.105:0/1967800173 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8454107d90 msgr2=0x7f8454116d20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f8452ffd700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8454107d90 0x7f8454116d20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f8452ffd700 1 -- 192.168.123.105:0/1967800173 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84440097e0 con 0x7f845410a700 2026-03-10T08:53:08.674 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.673+0000 7f8452ffd700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 0x7f8454117260 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto 
rx=0x7f844c00c390 tx=0x7f844c00c750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:08.674 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.674+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f844c00e030 con 0x7f845410a700 2026-03-10T08:53:08.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.674+0000 7f8458a71700 1 -- 192.168.123.105:0/1967800173 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84541b3590 con 0x7f845410a700 2026-03-10T08:53:08.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.674+0000 7f8458a71700 1 -- 192.168.123.105:0/1967800173 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84541b3a90 con 0x7f845410a700 2026-03-10T08:53:08.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.674+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f844c00f040 con 0x7f845410a700 2026-03-10T08:53:08.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.674+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f844c0146c0 con 0x7f845410a700 2026-03-10T08:53:08.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.675+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f844c014900 con 0x7f845410a700 2026-03-10T08:53:08.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.676+0000 7f8450ff9700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f843c06c600 0x7f843c06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.676+0000 7f84537fe700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f843c06c600 0x7f843c06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.676+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f844c08c930 con 0x7f845410a700 2026-03-10T08:53:08.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.676+0000 7f84537fe700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f843c06c600 0x7f843c06eac0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f844400b5c0 tx=0x7f8444000bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:08.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.677+0000 7f8458a71700 1 -- 192.168.123.105:0/1967800173 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8440005320 con 0x7f845410a700 2026-03-10T08:53:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.679+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f844c0571d0 con 0x7f845410a700 2026-03-10T08:53:08.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.793+0000 7f8458a71700 1 -- 192.168.123.105:0/1967800173 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}) v1 -- 0x7f8440000bf0 con 0x7f843c06c600 2026-03-10T08:53:08.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.794+0000 7f8450ff9700 1 -- 192.168.123.105:0/1967800173 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f8440000bf0 con 0x7f843c06c600 2026-03-10T08:53:08.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.796+0000 7f843a7fc700 1 -- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f843c06c600 msgr2=0x7f843c06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.796+0000 7f843a7fc700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f843c06c600 0x7f843c06eac0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f844400b5c0 tx=0x7f8444000bc0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.796+0000 7f843a7fc700 1 -- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 msgr2=0x7f8454117260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.796+0000 7f843a7fc700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 0x7f8454117260 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f844c00c390 tx=0x7f844c00c750 comp rx=0 tx=0).stop 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 -- 192.168.123.105:0/1967800173 shutdown_connections 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f843c06c600 0x7f843c06eac0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8454107d90 0x7f8454116d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 --2- 192.168.123.105:0/1967800173 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f845410a700 0x7f8454117260 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 -- 192.168.123.105:0/1967800173 >> 192.168.123.105:0/1967800173 conn(0x7f845406daa0 msgr2=0x7f845406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 -- 192.168.123.105:0/1967800173 shutdown_connections 2026-03-10T08:53:08.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.797+0000 7f843a7fc700 1 -- 192.168.123.105:0/1967800173 wait complete. 
2026-03-10T08:53:08.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.868+0000 7f0efc635700 1 -- 192.168.123.105:0/2802929558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4075a40 msgr2=0x7f0ef4077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.868+0000 7f0efc635700 1 --2- 192.168.123.105:0/2802929558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4075a40 0x7f0ef4077ed0 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f0eec00b780 tx=0x7f0eec00ba90 comp rx=0 tx=0).stop 2026-03-10T08:53:08.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.868+0000 7f0efc635700 1 -- 192.168.123.105:0/2802929558 shutdown_connections 2026-03-10T08:53:08.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.868+0000 7f0efc635700 1 --2- 192.168.123.105:0/2802929558 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4075a40 0x7f0ef4077ed0 unknown :-1 s=CLOSED pgs=282 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.868+0000 7f0efc635700 1 --2- 192.168.123.105:0/2802929558 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ef4072b50 0x7f0ef4072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.868+0000 7f0efc635700 1 -- 192.168.123.105:0/2802929558 >> 192.168.123.105:0/2802929558 conn(0x7f0ef406dae0 msgr2=0x7f0ef406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 -- 192.168.123.105:0/2802929558 shutdown_connections 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 -- 192.168.123.105:0/2802929558 
wait complete. 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 Processor -- start 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 -- start start 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ef4072b50 0x7f0ef4083100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 0x7f0ef412e400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ef4083b80 con 0x7f0ef4083640 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efc635700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ef4083cf0 con 0x7f0ef4072b50 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0ef9bd0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 0x7f0ef412e400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0ef9bd0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 0x7f0ef412e400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:33542/0 (socket says 192.168.123.105:33542) 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0ef9bd0700 1 -- 192.168.123.105:0/2585594310 learned_addr learned my addr 192.168.123.105:0/2585594310 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0efa3d1700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ef4072b50 0x7f0ef4083100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0ef9bd0700 1 -- 192.168.123.105:0/2585594310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ef4072b50 msgr2=0x7f0ef4083100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0ef9bd0700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ef4072b50 0x7f0ef4083100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.869+0000 7f0ef9bd0700 1 -- 192.168.123.105:0/2585594310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0eec00b050 con 0x7f0ef4083640 2026-03-10T08:53:08.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.870+0000 7f0ef9bd0700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 0x7f0ef412e400 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f0eec00b750 tx=0x7f0eec0093b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T08:53:08.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.870+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0eec003bb0 con 0x7f0ef4083640 2026-03-10T08:53:08.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.870+0000 7f0efc635700 1 -- 192.168.123.105:0/2585594310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ef412ea00 con 0x7f0ef4083640 2026-03-10T08:53:08.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.870+0000 7f0efc635700 1 -- 192.168.123.105:0/2585594310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ef412ef00 con 0x7f0ef4083640 2026-03-10T08:53:08.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.871+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0eec003d10 con 0x7f0ef4083640 2026-03-10T08:53:08.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.871+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0eec0046f0 con 0x7f0ef4083640 2026-03-10T08:53:08.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.872+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0eec02b030 con 0x7f0ef4083640 2026-03-10T08:53:08.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.872+0000 7f0eeb7fe700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ee006c600 0x7f0ee006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:08.873 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.873+0000 7f0efa3d1700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ee006c600 0x7f0ee006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:08.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.873+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f0eec08ced0 con 0x7f0ef4083640 2026-03-10T08:53:08.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.871+0000 7f0efc635700 1 -- 192.168.123.105:0/2585594310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ed8005320 con 0x7f0ef4083640 2026-03-10T08:53:08.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.873+0000 7f0efa3d1700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ee006c600 0x7f0ee006eac0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f0ef00099e0 tx=0x7f0ef0008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:08.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.876+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0eec0576f0 con 0x7f0ef4083640 2026-03-10T08:53:08.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.988+0000 7f0efc635700 1 -- 192.168.123.105:0/2585594310 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f0ed8000bf0 con 0x7f0ee006c600 2026-03-10T08:53:08.994 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.994+0000 7f0eeb7fe700 1 -- 192.168.123.105:0/2585594310 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f0ed8000bf0 con 0x7f0ee006c600 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (94s) 9s ago 2m 21.4M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (2m) 9s ago 2m 8032k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (108s) 9s ago 108s 8308k - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 9s ago 2m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (107s) 9s ago 107s 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (92s) 9s ago 2m 80.8M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (15s) 9s ago 15s 16.7M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (13s) 9s ago 13s 13.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (12s) 9s ago 12s 16.1M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (14s) 9s ago 14s 11.0M - 18.2.1 5be31c24972a d58d1e2e6ff3 
2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (2m) 9s ago 2m 501M - 18.2.1 5be31c24972a 6ec0cdb38171 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (104s) 9s ago 104s 449M - 18.2.1 5be31c24972a 9cd801f2f7a7 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 9s ago 2m 50.0M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (102s) 9s ago 102s 47.9M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 9s ago 2m 12.3M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (105s) 9s ago 104s 12.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (84s) 9s ago 84s 48.5M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (74s) 9s ago 74s 46.9M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (65s) 9s ago 65s 48.1M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (57s) 9s ago 57s 44.3M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (47s) 9s ago 47s 43.5M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (39s) 9s ago 38s 45.8M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:53:08.995 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (87s) 9s ago 119s 36.9M - 2.43.0 a07b618ecd1d 
e84b76e5c1c0 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 -- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ee006c600 msgr2=0x7f0ee006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ee006c600 0x7f0ee006eac0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f0ef00099e0 tx=0x7f0ef0008040 comp rx=0 tx=0).stop 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 -- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 msgr2=0x7f0ef412e400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 0x7f0ef412e400 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f0eec00b750 tx=0x7f0eec0093b0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 -- 192.168.123.105:0/2585594310 shutdown_connections 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0ee006c600 0x7f0ee006eac0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ef4072b50 
0x7f0ef4083100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 --2- 192.168.123.105:0/2585594310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ef4083640 0x7f0ef412e400 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:08.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 -- 192.168.123.105:0/2585594310 >> 192.168.123.105:0/2585594310 conn(0x7f0ef406dae0 msgr2=0x7f0ef406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:08.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.998+0000 7f0ee97fa700 1 -- 192.168.123.105:0/2585594310 shutdown_connections 2026-03-10T08:53:08.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:08.999+0000 7f0ee97fa700 1 -- 192.168.123.105:0/2585594310 wait complete. 2026-03-10T08:53:09.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.065+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/874956677 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 msgr2=0x7fe4dc006d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.065+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/874956677 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc006d40 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7fe4e0009b10 tx=0x7fe4e0009e20 comp rx=0 tx=0).stop 2026-03-10T08:53:09.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.068+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/874956677 shutdown_connections 2026-03-10T08:53:09.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.068+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/874956677 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fe4dc007310 0x7fe4dc007790 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.068+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/874956677 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc006d40 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.068+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/874956677 >> 192.168.123.105:0/874956677 conn(0x7fe4dc091350 msgr2=0x7fe4dc0937b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.069+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/874956677 shutdown_connections 2026-03-10T08:53:09.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.070+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/874956677 wait complete. 
2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.071+0000 7fe4ea2ab700 1 Processor -- start 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4ea2ab700 1 -- start start 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4ea2ab700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe4dc007310 0x7fe4dc131ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4ea2ab700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc132220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4ea2ab700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe4dc132840 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4ea2ab700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe4dc137250 con 0x7fe4dc007310 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc132220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc132220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:33564/0 (socket says 192.168.123.105:33564) 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 -- 192.168.123.105:0/591046339 learned_addr learned my addr 192.168.123.105:0/591046339 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 -- 192.168.123.105:0/591046339 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe4dc007310 msgr2=0x7fe4dc131ce0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe4dc007310 0x7fe4dc131ce0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 -- 192.168.123.105:0/591046339 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe4e0009770 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4e8aa8700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc132220 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7fe4d400d900 tx=0x7fe4d400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4d40041d0 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.073+0000 7fe4ea2ab700 1 -- 
192.168.123.105:0/591046339 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe4dc148460 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.074+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe4dc148980 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.074+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe4d4004330 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.075+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4d4003da0 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.075+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe4d4010460 con 0x7fe4dc0088e0 2026-03-10T08:53:09.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.075+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe4c8005320 con 0x7fe4dc0088e0 2026-03-10T08:53:09.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.075+0000 7fe4da7fc700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4d006c420 0x7fe4d006e8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.076+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fe4d4021030 con 0x7fe4dc0088e0 2026-03-10T08:53:09.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.078+0000 7fe4e92a9700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4d006c420 0x7fe4d006e8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.079+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe4d4056fb0 con 0x7fe4dc0088e0 2026-03-10T08:53:09.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.081+0000 7fe4e92a9700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4d006c420 0x7fe4d006e8e0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fe4e000b5c0 tx=0x7fe4e000bf00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.226+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe4c8006200 con 0x7fe4dc0088e0 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.226+0000 7fe4da7fc700 1 -- 192.168.123.105:0/591046339 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fe4d405a5d0 con 0x7fe4dc0088e0 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:53:09.227 
INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:53:09.227 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4d006c420 msgr2=0x7fe4d006e8e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4d006c420 0x7fe4d006e8e0 secure :-1 s=READY 
pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fe4e000b5c0 tx=0x7fe4e000bf00 comp rx=0 tx=0).stop 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 msgr2=0x7fe4dc132220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc132220 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7fe4d400d900 tx=0x7fe4d400dcc0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 shutdown_connections 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe4d006c420 0x7fe4d006e8e0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe4dc007310 0x7fe4dc131ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 --2- 192.168.123.105:0/591046339 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe4dc0088e0 0x7fe4dc132220 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 >> 
192.168.123.105:0/591046339 conn(0x7fe4dc091350 msgr2=0x7fe4dc0977a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 shutdown_connections 2026-03-10T08:53:09.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.229+0000 7fe4ea2ab700 1 -- 192.168.123.105:0/591046339 wait complete. 2026-03-10T08:53:09.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.306+0000 7f996dceb700 1 -- 192.168.123.105:0/2632817267 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 msgr2=0x7f996810a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.306+0000 7f996dceb700 1 --2- 192.168.123.105:0/2632817267 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f996810a1c0 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f9958009b50 tx=0x7f9958009e60 comp rx=0 tx=0).stop 2026-03-10T08:53:09.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.308+0000 7f996dceb700 1 -- 192.168.123.105:0/2632817267 shutdown_connections 2026-03-10T08:53:09.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.308+0000 7f996dceb700 1 --2- 192.168.123.105:0/2632817267 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 0x7f996810cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.308+0000 7f996dceb700 1 --2- 192.168.123.105:0/2632817267 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f996810a1c0 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.308+0000 7f996dceb700 1 -- 
192.168.123.105:0/2632817267 >> 192.168.123.105:0/2632817267 conn(0x7f996806dda0 msgr2=0x7f9968070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.308+0000 7f996dceb700 1 -- 192.168.123.105:0/2632817267 shutdown_connections 2026-03-10T08:53:09.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 -- 192.168.123.105:0/2632817267 wait complete. 2026-03-10T08:53:09.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 Processor -- start 2026-03-10T08:53:09.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 -- start start 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f99681a55c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 0x7f99681a5b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99681a6120 con 0x7f9968107d90 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f996dceb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99681a6260 con 0x7f996810a700 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f99677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f99681a55c0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f99677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f99681a55c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33580/0 (socket says 192.168.123.105:33580) 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.309+0000 7f99677fe700 1 -- 192.168.123.105:0/1575624460 learned_addr learned my addr 192.168.123.105:0/1575624460 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f99677fe700 1 -- 192.168.123.105:0/1575624460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 msgr2=0x7f99681a5b00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f9966ffd700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 0x7f99681a5b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f99677fe700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 0x7f99681a5b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f99677fe700 1 -- 192.168.123.105:0/1575624460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99580097e0 con 0x7f9968107d90 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f9966ffd700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 0x7f99681a5b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:53:09.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f99677fe700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f99681a55c0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f9958006010 tx=0x7f9958004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f995801d070 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f99681aacb0 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.310+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99681ab1a0 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.311+0000 7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f995800bc50 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.311+0000 
7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f995800f780 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.311+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f996819f7c0 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.311+0000 7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f995800f9a0 con 0x7f9968107d90 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.312+0000 7f9964ff9700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f995406c600 0x7f995406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.312+0000 7f9966ffd700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f995406c600 0x7f995406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.312+0000 7f9966ffd700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f995406c600 0x7f995406eac0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f9950009e20 tx=0x7f9950009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.312+0000 7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f995808d0f0 con 0x7f9968107d90 2026-03-10T08:53:09.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.314+0000 7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f995805c310 con 0x7f9968107d90 2026-03-10T08:53:09.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:09 vm05.local ceph-mon[49713]: from='client.24345 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:09.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:09 vm05.local ceph-mon[49713]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:53:09.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:09 vm05.local ceph-mon[49713]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:53:09.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:09 vm05.local ceph-mon[49713]: from='client.24349 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:09.448 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:09 vm05.local ceph-mon[49713]: from='client.14554 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.446+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f99680619a0 con 0x7f9968107d90 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.447+0000 
7f9964ff9700 1 -- 192.168.123.105:0/1575624460 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 0) 0x7f995805bea0 con 0x7f9968107d90 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:09.450 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 
2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout: 
2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:09.451 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:09.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.453+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f995406c600 msgr2=0x7f995406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.453+0000 7f996dceb700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f995406c600 0x7f995406eac0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f9950009e20 tx=0x7f9950009450 comp rx=0 tx=0).stop 2026-03-10T08:53:09.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.453+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 msgr2=0x7f99681a55c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.453+0000 7f996dceb700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f99681a55c0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f9958006010 tx=0x7f9958004c30 comp rx=0 
tx=0).stop 2026-03-10T08:53:09.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.455+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 shutdown_connections 2026-03-10T08:53:09.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.455+0000 7f996dceb700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f995406c600 0x7f995406eac0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.455+0000 7f996dceb700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9968107d90 0x7f99681a55c0 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.455+0000 7f996dceb700 1 --2- 192.168.123.105:0/1575624460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f996810a700 0x7f99681a5b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.455+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 >> 192.168.123.105:0/1575624460 conn(0x7f996806dda0 msgr2=0x7f996810c180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.455+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 shutdown_connections 2026-03-10T08:53:09.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.456+0000 7f996dceb700 1 -- 192.168.123.105:0/1575624460 wait complete. 
2026-03-10T08:53:09.456 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:53:09.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.525+0000 7f321f59e700 1 -- 192.168.123.105:0/3983446664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3218095e20 msgr2=0x7f3218096240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.525+0000 7f321f59e700 1 --2- 192.168.123.105:0/3983446664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3218095e20 0x7f3218096240 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f3210009b10 tx=0x7f3210009e20 comp rx=0 tx=0).stop 2026-03-10T08:53:09.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.525+0000 7f321f59e700 1 -- 192.168.123.105:0/3983446664 shutdown_connections 2026-03-10T08:53:09.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.525+0000 7f321f59e700 1 --2- 192.168.123.105:0/3983446664 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 0x7f3218097400 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.525+0000 7f321f59e700 1 --2- 192.168.123.105:0/3983446664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3218095e20 0x7f3218096240 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.525+0000 7f321f59e700 1 -- 192.168.123.105:0/3983446664 >> 192.168.123.105:0/3983446664 conn(0x7f32180913a0 msgr2=0x7f3218093800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.526+0000 7f321f59e700 1 -- 192.168.123.105:0/3983446664 shutdown_connections 2026-03-10T08:53:09.526 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.526+0000 7f321f59e700 1 -- 192.168.123.105:0/3983446664 wait complete. 2026-03-10T08:53:09.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321f59e700 1 Processor -- start 2026-03-10T08:53:09.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321f59e700 1 -- start start 2026-03-10T08:53:09.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321f59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 0x7f32180a4830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321f59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f32180a4d70 0x7f32180a7dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321f59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32180a5280 con 0x7f3218096f80 2026-03-10T08:53:09.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321f59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32180a53f0 con 0x7f32180a4d70 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321e59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 0x7f32180a4830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321e59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 0x7f32180a4830 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:33582/0 (socket says 192.168.123.105:33582) 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321e59c700 1 -- 192.168.123.105:0/1474874006 learned_addr learned my addr 192.168.123.105:0/1474874006 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321dd9b700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f32180a4d70 0x7f32180a7dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321dd9b700 1 -- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 msgr2=0x7f32180a4830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321dd9b700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 0x7f32180a4830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.527+0000 7f321dd9b700 1 -- 192.168.123.105:0/1474874006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3210009770 con 0x7f32180a4d70 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.528+0000 7f321dd9b700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f32180a4d70 0x7f32180a7dd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto 
rx=0x7f321400b6d0 tx=0x7f321400ba90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.528+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3214011840 con 0x7f32180a4d70 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.528+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3214011e80 con 0x7f32180a4d70 2026-03-10T08:53:09.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.528+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f321400f550 con 0x7f32180a4d70 2026-03-10T08:53:09.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.528+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f32180a8370 con 0x7f32180a4d70 2026-03-10T08:53:09.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.528+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f32180a8890 con 0x7f32180a4d70 2026-03-10T08:53:09.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.529+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3218008770 con 0x7f32180a4d70 2026-03-10T08:53:09.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.529+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 
0x7f32140103e0 con 0x7f32180a4d70 2026-03-10T08:53:09.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.530+0000 7f320f7fe700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f320806c5f0 0x7f320806eab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.530+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f321408bac0 con 0x7f32180a4d70 2026-03-10T08:53:09.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.530+0000 7f321e59c700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f320806c5f0 0x7f320806eab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.531+0000 7f321e59c700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f320806c5f0 0x7f320806eab0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f321000b5c0 tx=0x7f3210011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.531+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3214059df0 con 0x7f32180a4d70 2026-03-10T08:53:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:09 vm08.local ceph-mon[57559]: from='client.24345 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": 
"quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:09 vm08.local ceph-mon[57559]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:53:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:09 vm08.local ceph-mon[57559]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T08:53:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:09 vm08.local ceph-mon[57559]: from='client.24349 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:09 vm08.local ceph-mon[57559]: from='client.14554 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:09.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.644+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f32180046d0 con 0x7f320806c5f0 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.645+0000 7f320f7fe700 1 -- 192.168.123.105:0/1474874006 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f32180046d0 con 0x7f320806c5f0 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on 
all hosts", 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "", 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:53:09.645 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:53:09.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f320806c5f0 msgr2=0x7f320806eab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f320806c5f0 0x7f320806eab0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f321000b5c0 tx=0x7f3210011040 comp rx=0 tx=0).stop 2026-03-10T08:53:09.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f32180a4d70 msgr2=0x7f32180a7dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f32180a4d70 0x7f32180a7dd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f321400b6d0 tx=0x7f321400ba90 comp rx=0 tx=0).stop 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 shutdown_connections 2026-03-10T08:53:09.648 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f320806c5f0 0x7f320806eab0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3218096f80 0x7f32180a4830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 --2- 192.168.123.105:0/1474874006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f32180a4d70 0x7f32180a7dd0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.647+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 >> 192.168.123.105:0/1474874006 conn(0x7f32180913a0 msgr2=0x7f3218093540 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.648+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 shutdown_connections 2026-03-10T08:53:09.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.648+0000 7f321f59e700 1 -- 192.168.123.105:0/1474874006 wait complete. 
2026-03-10T08:53:09.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 -- 192.168.123.105:0/4207047858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c072b20 msgr2=0x7fd18c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 --2- 192.168.123.105:0/4207047858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c072b20 0x7fd18c072f40 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fd188007ae0 tx=0x7fd188007df0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 -- 192.168.123.105:0/4207047858 shutdown_connections 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 --2- 192.168.123.105:0/4207047858 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c075a10 0x7fd18c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 --2- 192.168.123.105:0/4207047858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c072b20 0x7fd18c072f40 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 -- 192.168.123.105:0/4207047858 >> 192.168.123.105:0/4207047858 conn(0x7fd18c06daa0 msgr2=0x7fd18c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 -- 192.168.123.105:0/4207047858 shutdown_connections 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 -- 192.168.123.105:0/4207047858 
wait complete. 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 Processor -- start 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.718+0000 7fd192131700 1 -- start start 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd192131700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c075a10 0x7fd18c083010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd192131700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c083550 0x7fd18c1b30a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd192131700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd18c083a60 con 0x7fd18c075a10 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd192131700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd18c083bd0 con 0x7fd18c083550 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd19112f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c075a10 0x7fd18c083010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd19092e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c083550 0x7fd18c1b30a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd19092e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c083550 0x7fd18c1b30a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:51528/0 (socket says 192.168.123.105:51528) 2026-03-10T08:53:09.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.719+0000 7fd19092e700 1 -- 192.168.123.105:0/1932594773 learned_addr learned my addr 192.168.123.105:0/1932594773 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:09.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd19112f700 1 -- 192.168.123.105:0/1932594773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c083550 msgr2=0x7fd18c1b30a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd19112f700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c083550 0x7fd18c1b30a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd19112f700 1 -- 192.168.123.105:0/1932594773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd1880077c0 con 0x7fd18c075a10 2026-03-10T08:53:09.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd19112f700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c075a10 0x7fd18c083010 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7fd18800f780 tx=0x7fd188018db0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.720 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd1827fc700 1 -- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd18800f300 con 0x7fd18c075a10 2026-03-10T08:53:09.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd18c1b35e0 con 0x7fd18c075a10 2026-03-10T08:53:09.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.720+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd18c1b3b30 con 0x7fd18c075a10 2026-03-10T08:53:09.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.721+0000 7fd1827fc700 1 -- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd188017600 con 0x7fd18c075a10 2026-03-10T08:53:09.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.721+0000 7fd1827fc700 1 -- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd18801fa30 con 0x7fd18c075a10 2026-03-10T08:53:09.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.722+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd170005320 con 0x7fd18c075a10 2026-03-10T08:53:09.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.722+0000 7fd1827fc700 1 -- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd18801fb90 con 0x7fd18c075a10 2026-03-10T08:53:09.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.723+0000 7fd1827fc700 1 --2- 
192.168.123.105:0/1932594773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd17806c530 0x7fd17806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:09.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.723+0000 7fd19092e700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd17806c530 0x7fd17806e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:09.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.723+0000 7fd19092e700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd17806c530 0x7fd17806e9f0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fd18400a520 tx=0x7fd18400a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:09.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.723+0000 7fd1827fc700 1 -- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fd18808c360 con 0x7fd18c075a10 2026-03-10T08:53:09.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.726+0000 7fd1827fc700 1 -- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd18805a610 con 0x7fd18c075a10 2026-03-10T08:53:09.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.892+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fd170005190 con 0x7fd18c075a10 2026-03-10T08:53:09.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.892+0000 7fd1827fc700 1 
-- 192.168.123.105:0/1932594773 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fd18805a1a0 con 0x7fd18c075a10 2026-03-10T08:53:09.893 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:53:09.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.894+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd17806c530 msgr2=0x7fd17806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.895+0000 7fd192131700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd17806c530 0x7fd17806e9f0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fd18400a520 tx=0x7fd18400a040 comp rx=0 tx=0).stop 2026-03-10T08:53:09.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.895+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c075a10 msgr2=0x7fd18c083010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:09.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.895+0000 7fd192131700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c075a10 0x7fd18c083010 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7fd18800f780 tx=0x7fd188018db0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 shutdown_connections 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd17806c530 0x7fd17806e9f0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd18c075a10 0x7fd18c083010 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 --2- 192.168.123.105:0/1932594773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd18c083550 0x7fd18c1b30a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 >> 192.168.123.105:0/1932594773 conn(0x7fd18c06daa0 msgr2=0x7fd18c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 shutdown_connections 2026-03-10T08:53:09.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:09.896+0000 7fd192131700 1 -- 192.168.123.105:0/1932594773 wait complete. 2026-03-10T08:53:10.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:10 vm05.local ceph-mon[49713]: from='client.14558 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:10.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:10 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/591046339' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:53:10.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:10 vm05.local ceph-mon[49713]: pgmap v84: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 7.4 KiB/s rd, 1.0 KiB/s wr, 8 op/s 2026-03-10T08:53:10.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:10 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/1575624460' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:53:10.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:10 vm05.local ceph-mon[49713]: from='client.24361 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:10.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:10 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/1932594773' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:53:10.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:10 vm08.local ceph-mon[57559]: from='client.14558 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:10 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/591046339' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:53:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:10 vm08.local ceph-mon[57559]: pgmap v84: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 7.4 KiB/s rd, 1.0 KiB/s wr, 8 op/s 2026-03-10T08:53:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:10 vm08.local ceph-mon[57559]: from='client.? 
192.168.123.105:0/1575624460' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:53:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:10 vm08.local ceph-mon[57559]: from='client.24361 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:10 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/1932594773' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:53:12.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:12 vm08.local ceph-mon[57559]: pgmap v85: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.5 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-10T08:53:12.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:12 vm05.local ceph-mon[49713]: pgmap v85: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.5 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-10T08:53:14.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:14 vm05.local ceph-mon[49713]: pgmap v86: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.3 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-10T08:53:14.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:14 vm08.local ceph-mon[57559]: pgmap v86: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.3 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-10T08:53:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:16 vm08.local ceph-mon[57559]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 4.3 KiB/s rd, 853 B/s wr, 4 op/s 2026-03-10T08:53:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:16 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:16 vm08.local ceph-mon[57559]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:16.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:16 vm05.local ceph-mon[49713]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 4.3 KiB/s rd, 853 B/s wr, 4 op/s 2026-03-10T08:53:16.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:16 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:53:16.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:16 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:18.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:18 vm08.local ceph-mon[57559]: pgmap v88: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.3 KiB/s rd, 1.2 KiB/s wr, 1 op/s 2026-03-10T08:53:18.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:18 vm05.local ceph-mon[49713]: pgmap v88: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.3 KiB/s rd, 1.2 KiB/s wr, 1 op/s 2026-03-10T08:53:20.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:20 vm05.local ceph-mon[49713]: pgmap v89: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T08:53:20.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:20 vm08.local ceph-mon[57559]: pgmap v89: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T08:53:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:22 vm05.local ceph-mon[49713]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T08:53:22.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:22 vm08.local ceph-mon[57559]: 
pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T08:53:25.207 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:24 vm05.local ceph-mon[49713]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T08:53:25.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:24 vm08.local ceph-mon[57559]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T08:53:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:25 vm08.local ceph-mon[57559]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T08:53:26.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:25 vm05.local ceph-mon[49713]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T08:53:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:28 vm05.local ceph-mon[49713]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 341 B/s wr, 0 op/s 2026-03-10T08:53:28.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:28 vm08.local ceph-mon[57559]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 341 B/s wr, 0 op/s 2026-03-10T08:53:30.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:30 vm05.local ceph-mon[49713]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:30.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:30 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:30.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:30 vm08.local ceph-mon[57559]: pgmap v94: 65 pgs: 65 active+clean; 
460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:30.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:30 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:32.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:32 vm08.local ceph-mon[57559]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:32.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:32 vm05.local ceph-mon[49713]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:34.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:34 vm08.local ceph-mon[57559]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:34.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:34 vm05.local ceph-mon[49713]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:36.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:36 vm08.local ceph-mon[57559]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:36.935 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:36 vm05.local ceph-mon[49713]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:38 vm08.local ceph-mon[57559]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:38.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:38 vm05.local ceph-mon[49713]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:39.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.989+0000 7f8f328ec700 1 -- 
192.168.123.105:0/3063771557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 msgr2=0x7f8f240983e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:39.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.989+0000 7f8f328ec700 1 --2- 192.168.123.105:0/3063771557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f240983e0 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f8f28009b50 tx=0x7f8f28009e60 comp rx=0 tx=0).stop 2026-03-10T08:53:39.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.990+0000 7f8f328ec700 1 -- 192.168.123.105:0/3063771557 shutdown_connections 2026-03-10T08:53:39.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.990+0000 7f8f328ec700 1 --2- 192.168.123.105:0/3063771557 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 0x7f8f24099620 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:39.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.990+0000 7f8f328ec700 1 --2- 192.168.123.105:0/3063771557 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f240983e0 unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:39.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.990+0000 7f8f328ec700 1 -- 192.168.123.105:0/3063771557 >> 192.168.123.105:0/3063771557 conn(0x7f8f24093540 msgr2=0x7f8f240959a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:39.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.991+0000 7f8f328ec700 1 -- 192.168.123.105:0/3063771557 shutdown_connections 2026-03-10T08:53:39.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.991+0000 7f8f328ec700 1 -- 192.168.123.105:0/3063771557 wait complete. 
2026-03-10T08:53:39.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.992+0000 7f8f328ec700 1 Processor -- start 2026-03-10T08:53:39.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.992+0000 7f8f328ec700 1 -- start start 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f328ec700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f2412d870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f318ea700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f2412d870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f318ea700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f2412d870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52018/0 (socket says 192.168.123.105:52018) 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f318ea700 1 -- 192.168.123.105:0/2203595305 learned_addr learned my addr 192.168.123.105:0/2203595305 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f328ec700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 0x7f8f2412ddb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 
--> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f2412e3d0 con 0x7f8f24097fc0 2026-03-10T08:53:39.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f2412e510 con 0x7f8f240991c0 2026-03-10T08:53:39.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.993+0000 7f8f310e9700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 0x7f8f2412ddb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:39.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.994+0000 7f8f310e9700 1 -- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 msgr2=0x7f8f2412d870 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:39.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.995+0000 7f8f310e9700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f2412d870 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:39.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.995+0000 7f8f310e9700 1 -- 192.168.123.105:0/2203595305 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8f280097e0 con 0x7f8f240991c0 2026-03-10T08:53:39.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.995+0000 7f8f310e9700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 0x7f8f2412ddb0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f8f1c00d6d0 tx=0x7f8f1c00d9e0 comp rx=0 
tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:39.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.995+0000 7f8f318ea700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f2412d870 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T08:53:39.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.996+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f1c0041d0 con 0x7f8f240991c0 2026-03-10T08:53:39.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.996+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f24132fc0 con 0x7f8f240991c0 2026-03-10T08:53:39.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.996+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f241334e0 con 0x7f8f240991c0 2026-03-10T08:53:39.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.997+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8f1c004d10 con 0x7f8f240991c0 2026-03-10T08:53:39.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.997+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f1c00f690 con 0x7f8f240991c0 2026-03-10T08:53:39.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.997+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 
0x7f8f1c00f870 con 0x7f8f240991c0 2026-03-10T08:53:39.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.998+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8f24006120 con 0x7f8f240991c0 2026-03-10T08:53:39.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.998+0000 7f8f22ffd700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8f1806c530 0x7f8f1806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:39.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.999+0000 7f8f318ea700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8f1806c530 0x7f8f1806e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:39.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.999+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f8f1c08c200 con 0x7f8f240991c0 2026-03-10T08:53:39.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:39.999+0000 7f8f318ea700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8f1806c530 0x7f8f1806e9f0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f8f2800b5c0 tx=0x7f8f280058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.001+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 
0) 0x7f8f1c05a5f0 con 0x7f8f240991c0 2026-03-10T08:53:40.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.153+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8f24133810 con 0x7f8f1806c530 2026-03-10T08:53:40.154 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.154+0000 7f8f22ffd700 1 -- 192.168.123.105:0/2203595305 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f8f24133810 con 0x7f8f1806c530 2026-03-10T08:53:40.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.156+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8f1806c530 msgr2=0x7f8f1806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.156+0000 7f8f328ec700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8f1806c530 0x7f8f1806e9f0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f8f2800b5c0 tx=0x7f8f280058e0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.156+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 msgr2=0x7f8f2412ddb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.156+0000 7f8f328ec700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 0x7f8f2412ddb0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f8f1c00d6d0 tx=0x7f8f1c00d9e0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.157 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 shutdown_connections 2026-03-10T08:53:40.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8f1806c530 0x7f8f1806e9f0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8f24097fc0 0x7f8f2412d870 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 --2- 192.168.123.105:0/2203595305 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8f240991c0 0x7f8f2412ddb0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 >> 192.168.123.105:0/2203595305 conn(0x7f8f24093540 msgr2=0x7f8f2409c3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:40.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 shutdown_connections 2026-03-10T08:53:40.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.157+0000 7f8f328ec700 1 -- 192.168.123.105:0/2203595305 wait complete. 
2026-03-10T08:53:40.168 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:53:40.251 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.249+0000 7fd836305700 1 -- 192.168.123.105:0/2780064512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd828097fc0 msgr2=0x7fd8280983e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.249+0000 7fd836305700 1 --2- 192.168.123.105:0/2780064512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd828097fc0 0x7fd8280983e0 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7fd82c009ab0 tx=0x7fd82c009dc0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.251+0000 7fd836305700 1 -- 192.168.123.105:0/2780064512 shutdown_connections 2026-03-10T08:53:40.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.251+0000 7fd836305700 1 --2- 192.168.123.105:0/2780064512 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 0x7fd828099620 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.251+0000 7fd836305700 1 --2- 192.168.123.105:0/2780064512 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd828097fc0 0x7fd8280983e0 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.251+0000 7fd836305700 1 -- 192.168.123.105:0/2780064512 >> 192.168.123.105:0/2780064512 conn(0x7fd828093540 msgr2=0x7fd8280959a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:40.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.252+0000 7fd836305700 1 -- 192.168.123.105:0/2780064512 shutdown_connections 2026-03-10T08:53:40.253 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.253+0000 7fd836305700 1 -- 192.168.123.105:0/2780064512 wait complete. 2026-03-10T08:53:40.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.253+0000 7fd836305700 1 Processor -- start 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.254+0000 7fd836305700 1 -- start start 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.254+0000 7fd836305700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 0x7fd82800c5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.254+0000 7fd836305700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 0x7fd828009030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.254+0000 7fd836305700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd828009570 con 0x7fd82800cb10 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.254+0000 7fd836305700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8280096e0 con 0x7fd8280991c0 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.255+0000 7fd834b02700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 0x7fd828009030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.255+0000 7fd834b02700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 0x7fd828009030 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52048/0 (socket says 192.168.123.105:52048) 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.255+0000 7fd834b02700 1 -- 192.168.123.105:0/50467074 learned_addr learned my addr 192.168.123.105:0/50467074 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:40.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.255+0000 7fd834b02700 1 -- 192.168.123.105:0/50467074 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 msgr2=0x7fd82800c5d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:53:40.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.255+0000 7fd835303700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 0x7fd82800c5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.256+0000 7fd834b02700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 0x7fd82800c5d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.256+0000 7fd834b02700 1 -- 192.168.123.105:0/50467074 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd82c009710 con 0x7fd82800cb10 2026-03-10T08:53:40.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.256+0000 7fd835303700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 0x7fd82800c5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).send_auth_request state changed! 2026-03-10T08:53:40.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.256+0000 7fd834b02700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 0x7fd828009030 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7fd82000ea30 tx=0x7fd82000edf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.257+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd82000cbe0 con 0x7fd82800cb10 2026-03-10T08:53:40.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.257+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd82000cd40 con 0x7fd82800cb10 2026-03-10T08:53:40.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.258+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd828009940 con 0x7fd82800cb10 2026-03-10T08:53:40.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.258+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd82812d8e0 con 0x7fd82800cb10 2026-03-10T08:53:40.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.258+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd820010430 con 0x7fd82800cb10 2026-03-10T08:53:40.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.259+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 
0 0) 0x7fd820004710 con 0x7fd82800cb10 2026-03-10T08:53:40.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.259+0000 7fd8267fc700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd81c06c380 0x7fd81c06e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.259+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fd820014070 con 0x7fd82800cb10 2026-03-10T08:53:40.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.260+0000 7fd835303700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd81c06c380 0x7fd81c06e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.260+0000 7fd835303700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd81c06c380 0x7fd81c06e840 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fd82c003970 tx=0x7fd82c003a40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.260+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd828004470 con 0x7fd82800cb10 2026-03-10T08:53:40.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.264+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 
0x7fd820059fa0 con 0x7fd82800cb10 2026-03-10T08:53:40.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.418+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd828009ad0 con 0x7fd81c06c380 2026-03-10T08:53:40.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.419+0000 7fd8267fc700 1 -- 192.168.123.105:0/50467074 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fd828009ad0 con 0x7fd81c06c380 2026-03-10T08:53:40.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.422+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd81c06c380 msgr2=0x7fd81c06e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.422+0000 7fd836305700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd81c06c380 0x7fd81c06e840 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fd82c003970 tx=0x7fd82c003a40 comp rx=0 tx=0).stop 2026-03-10T08:53:40.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.422+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 msgr2=0x7fd828009030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.422+0000 7fd836305700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 0x7fd828009030 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7fd82000ea30 tx=0x7fd82000edf0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.423 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.423+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 shutdown_connections 2026-03-10T08:53:40.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.423+0000 7fd836305700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd81c06c380 0x7fd81c06e840 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.423+0000 7fd836305700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8280991c0 0x7fd82800c5d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.423+0000 7fd836305700 1 --2- 192.168.123.105:0/50467074 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd82800cb10 0x7fd828009030 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.423+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 >> 192.168.123.105:0/50467074 conn(0x7fd828093540 msgr2=0x7fd82809c3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:40.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.423+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 shutdown_connections 2026-03-10T08:53:40.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.424+0000 7fd836305700 1 -- 192.168.123.105:0/50467074 wait complete. 
2026-03-10T08:53:40.504 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:40 vm05.local ceph-mon[49713]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:40.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.502+0000 7fb97759e700 1 -- 192.168.123.105:0/2150441862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 msgr2=0x7fb978077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.502+0000 7fb97759e700 1 --2- 192.168.123.105:0/2150441862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978077ea0 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7fb97000b600 tx=0x7fb97000b910 comp rx=0 tx=0).stop 2026-03-10T08:53:40.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.504+0000 7fb97759e700 1 -- 192.168.123.105:0/2150441862 shutdown_connections 2026-03-10T08:53:40.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.504+0000 7fb97759e700 1 --2- 192.168.123.105:0/2150441862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978077ea0 unknown :-1 s=CLOSED pgs=293 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.504+0000 7fb97759e700 1 --2- 192.168.123.105:0/2150441862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb978072b20 0x7fb978072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.504+0000 7fb97759e700 1 -- 192.168.123.105:0/2150441862 >> 192.168.123.105:0/2150441862 conn(0x7fb97806daa0 msgr2=0x7fb97806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:40.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.505+0000 7fb97759e700 
1 -- 192.168.123.105:0/2150441862 shutdown_connections 2026-03-10T08:53:40.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.505+0000 7fb97759e700 1 -- 192.168.123.105:0/2150441862 wait complete. 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb97759e700 1 Processor -- start 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb97759e700 1 -- start start 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb97759e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb978072b20 0x7fb9780830c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb97759e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978083600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb97759e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb978083c20 con 0x7fb978075a10 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb97759e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb978083d60 con 0x7fb978072b20 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb975d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978083600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb975d9b700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978083600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52064/0 (socket says 192.168.123.105:52064) 2026-03-10T08:53:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.506+0000 7fb975d9b700 1 -- 192.168.123.105:0/371045737 learned_addr learned my addr 192.168.123.105:0/371045737 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.507+0000 7fb975d9b700 1 -- 192.168.123.105:0/371045737 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb978072b20 msgr2=0x7fb9780830c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:53:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.507+0000 7fb975d9b700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb978072b20 0x7fb9780830c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.507+0000 7fb975d9b700 1 -- 192.168.123.105:0/371045737 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb97000b050 con 0x7fb978075a10 2026-03-10T08:53:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.507+0000 7fb975d9b700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978083600 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7fb970003f30 tx=0x7fb970009150 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.509+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mon.0 
v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb97000e040 con 0x7fb978075a10 2026-03-10T08:53:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.509+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9781b3290 con 0x7fb978075a10 2026-03-10T08:53:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.509+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9781b37e0 con 0x7fb978075a10 2026-03-10T08:53:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.511+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb97807d310 con 0x7fb978075a10 2026-03-10T08:53:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.512+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb970012430 con 0x7fb978075a10 2026-03-10T08:53:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.512+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb97001c5d0 con 0x7fb978075a10 2026-03-10T08:53:40.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.513+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb97001c7f0 con 0x7fb978075a10 2026-03-10T08:53:40.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.513+0000 7fb9677fe700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb96006e8f0 0x7fb960070db0 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.513+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fb97008e1a0 con 0x7fb978075a10 2026-03-10T08:53:40.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.514+0000 7fb97659c700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb96006e8f0 0x7fb960070db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.516+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb97005c4d0 con 0x7fb978075a10 2026-03-10T08:53:40.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.517+0000 7fb97659c700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb96006e8f0 0x7fb960070db0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fb968009710 tx=0x7fb968006c60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.649+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fb9780611d0 con 0x7fb96006e8f0 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.655+0000 7fb9677fe700 1 -- 192.168.123.105:0/371045737 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 
0) 0x7fb9780611d0 con 0x7fb96006e8f0 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (2m) 40s ago 2m 21.4M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (2m) 40s ago 2m 8032k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (2m) 41s ago 2m 8308k - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 40s ago 2m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (2m) 41s ago 2m 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (2m) 40s ago 2m 80.8M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (47s) 40s ago 47s 16.7M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (45s) 40s ago 45s 13.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (44s) 41s ago 44s 16.1M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (46s) 41s ago 46s 11.0M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (3m) 40s ago 3m 501M - 18.2.1 5be31c24972a 6ec0cdb38171 2026-03-10T08:53:40.656 
INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (2m) 41s ago 2m 449M - 18.2.1 5be31c24972a 9cd801f2f7a7 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (3m) 40s ago 3m 50.0M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 41s ago 2m 47.9M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 40s ago 2m 12.3M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (2m) 41s ago 2m 12.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (116s) 40s ago 116s 48.5M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (106s) 40s ago 106s 46.9M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (97s) 40s ago 97s 48.1M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (88s) 41s ago 88s 44.3M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (79s) 41s ago 79s 43.5M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (70s) 41s ago 70s 45.8M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:53:40.656 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (119s) 40s ago 2m 36.9M - 2.43.0 a07b618ecd1d e84b76e5c1c0 2026-03-10T08:53:40.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb96006e8f0 msgr2=0x7fb960070db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb96006e8f0 0x7fb960070db0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fb968009710 tx=0x7fb968006c60 comp rx=0 tx=0).stop 2026-03-10T08:53:40.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 msgr2=0x7fb978083600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978083600 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7fb970003f30 tx=0x7fb970009150 comp rx=0 tx=0).stop 2026-03-10T08:53:40.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 shutdown_connections 2026-03-10T08:53:40.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb96006e8f0 0x7fb960070db0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb978072b20 0x7fb9780830c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.661 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 --2- 192.168.123.105:0/371045737 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb978075a10 0x7fb978083600 unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 >> 192.168.123.105:0/371045737 conn(0x7fb97806daa0 msgr2=0x7fb97806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:40.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 shutdown_connections 2026-03-10T08:53:40.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.660+0000 7fb97759e700 1 -- 192.168.123.105:0/371045737 wait complete. 2026-03-10T08:53:40.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.728+0000 7f9d44b62700 1 -- 192.168.123.105:0/1871322049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 msgr2=0x7f9d38098c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.728+0000 7f9d44b62700 1 --2- 192.168.123.105:0/1871322049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38098c40 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7f9d34009b50 tx=0x7f9d34009e60 comp rx=0 tx=0).stop 2026-03-10T08:53:40.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.728+0000 7f9d44b62700 1 -- 192.168.123.105:0/1871322049 shutdown_connections 2026-03-10T08:53:40.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.728+0000 7f9d44b62700 1 --2- 192.168.123.105:0/1871322049 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 0x7f9d3809b570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:53:40.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.728+0000 7f9d44b62700 1 --2- 192.168.123.105:0/1871322049 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38098c40 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.728+0000 7f9d44b62700 1 -- 192.168.123.105:0/1871322049 >> 192.168.123.105:0/1871322049 conn(0x7f9d38090240 msgr2=0x7f9d380926a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:40.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.729+0000 7f9d44b62700 1 -- 192.168.123.105:0/1871322049 shutdown_connections 2026-03-10T08:53:40.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.729+0000 7f9d44b62700 1 -- 192.168.123.105:0/1871322049 wait complete. 2026-03-10T08:53:40.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.729+0000 7f9d44b62700 1 Processor -- start 2026-03-10T08:53:40.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d44b62700 1 -- start start 2026-03-10T08:53:40.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d44b62700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38129470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d44b62700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 0x7f9d381299b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d44b62700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d38129fd0 con 0x7f9d38096850 2026-03-10T08:53:40.732 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d44b62700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d3812a110 con 0x7f9d38099180 2026-03-10T08:53:40.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d3f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38129470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d3effd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 0x7f9d381299b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.732 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:40 vm08.local ceph-mon[57559]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:40.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d3f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38129470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52072/0 (socket says 192.168.123.105:52072) 2026-03-10T08:53:40.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.731+0000 7f9d3f7fe700 1 -- 192.168.123.105:0/497535088 learned_addr learned my addr 192.168.123.105:0/497535088 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:40.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.732+0000 7f9d3effd700 1 -- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 msgr2=0x7f9d38129470 unknown :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.732+0000 7f9d3effd700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38129470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.732+0000 7f9d3effd700 1 -- 192.168.123.105:0/497535088 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d340097e0 con 0x7f9d38099180 2026-03-10T08:53:40.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.733+0000 7f9d3f7fe700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38129470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T08:53:40.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.733+0000 7f9d3effd700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 0x7f9d381299b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f9d2c00eab0 tx=0x7f9d2c00edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.734+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d2c00cb20 con 0x7f9d38099180 2026-03-10T08:53:40.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.734+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d2c00cc80 con 0x7f9d38099180 2026-03-10T08:53:40.734 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.734+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d2c018860 con 0x7f9d38099180 2026-03-10T08:53:40.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.735+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d3812ebc0 con 0x7f9d38099180 2026-03-10T08:53:40.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.735+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d3812f090 con 0x7f9d38099180 2026-03-10T08:53:40.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.735+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d38005390 con 0x7f9d38099180 2026-03-10T08:53:40.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.737+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f9d2c0189c0 con 0x7f9d38099180 2026-03-10T08:53:40.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.737+0000 7f9d3cff9700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9d300706f0 0x7f9d30072bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.737+0000 7f9d3f7fe700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9d300706f0 0x7f9d30072bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.738+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f9d2c014070 con 0x7f9d38099180 2026-03-10T08:53:40.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.738+0000 7f9d3f7fe700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9d300706f0 0x7f9d30072bb0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f9d34005420 tx=0x7f9d340058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:40.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.743+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9d2c05a4d0 con 0x7f9d38099180 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.912+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9d3812f340 con 0x7f9d38099180 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.916+0000 7f9d3cff9700 1 -- 192.168.123.105:0/497535088 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f9d2c05a060 con 0x7f9d38099180 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:53:40.916 
INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:53:40.916 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.918+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9d300706f0 msgr2=0x7f9d30072bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.918+0000 7f9d44b62700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9d300706f0 0x7f9d30072bb0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f9d34005420 tx=0x7f9d340058e0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.919 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.918+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 msgr2=0x7f9d381299b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.918+0000 7f9d44b62700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 0x7f9d381299b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f9d2c00eab0 tx=0x7f9d2c00edc0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 shutdown_connections 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9d300706f0 0x7f9d30072bb0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d38096850 0x7f9d38129470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 --2- 192.168.123.105:0/497535088 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d38099180 0x7f9d381299b0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 >> 192.168.123.105:0/497535088 conn(0x7f9d38090240 msgr2=0x7f9d38092630 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:53:40.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 shutdown_connections 2026-03-10T08:53:40.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.919+0000 7f9d44b62700 1 -- 192.168.123.105:0/497535088 wait complete. 2026-03-10T08:53:40.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.996+0000 7f8442e93700 1 -- 192.168.123.105:0/2464357269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 msgr2=0x7f84340a6260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:40.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.996+0000 7f8442e93700 1 --2- 192.168.123.105:0/2464357269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340a6260 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f842c009ab0 tx=0x7f842c009dc0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.996+0000 7f8442e93700 1 -- 192.168.123.105:0/2464357269 shutdown_connections 2026-03-10T08:53:40.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.996+0000 7f8442e93700 1 --2- 192.168.123.105:0/2464357269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340a6260 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.996+0000 7f8442e93700 1 --2- 192.168.123.105:0/2464357269 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340a50c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:40.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.996+0000 7f8442e93700 1 -- 192.168.123.105:0/2464357269 >> 192.168.123.105:0/2464357269 conn(0x7f84340a0160 msgr2=0x7f84340a25c0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:53:40.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.997+0000 7f8442e93700 1 -- 192.168.123.105:0/2464357269 shutdown_connections 2026-03-10T08:53:40.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.997+0000 7f8442e93700 1 -- 192.168.123.105:0/2464357269 wait complete. 2026-03-10T08:53:40.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8442e93700 1 Processor -- start 2026-03-10T08:53:40.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8442e93700 1 -- start start 2026-03-10T08:53:40.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8442e93700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340b3e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8441e91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340b3e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8442e93700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340b43d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8442e93700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84340b49f0 con 0x7f84340a5de0 2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8442e93700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8434150a70 con 0x7f84340a4ca0 
2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8441e91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340b3e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:35094/0 (socket says 192.168.123.105:35094) 2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.998+0000 7f8441e91700 1 -- 192.168.123.105:0/3581879589 learned_addr learned my addr 192.168.123.105:0/3581879589 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:40.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.999+0000 7f8441690700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340b43d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.999+0000 7f8441e91700 1 -- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 msgr2=0x7f84340b43d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.999+0000 7f8441e91700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340b43d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.999+0000 7f8441e91700 1 -- 192.168.123.105:0/3581879589 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8438009770 con 0x7f84340a4ca0 2026-03-10T08:53:41.000 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:40.999+0000 7f8441690700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340b43d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T08:53:41.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.000+0000 7f8441e91700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340b3e90 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f843800ebf0 tx=0x7f843800ef00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:41.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.000+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f843800ccf0 con 0x7f84340a4ca0 2026-03-10T08:53:41.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.000+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f842c009710 con 0x7f84340a4ca0 2026-03-10T08:53:41.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.000+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8434150fe0 con 0x7f84340a4ca0 2026-03-10T08:53:41.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.001+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f843800ce50 con 0x7f84340a4ca0 2026-03-10T08:53:41.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.001+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 
0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8438005340 con 0x7f84340a4ca0 2026-03-10T08:53:41.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.001+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8420005320 con 0x7f84340a4ca0 2026-03-10T08:53:41.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.003+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8438005540 con 0x7f84340a4ca0 2026-03-10T08:53:41.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.003+0000 7f8432ffd700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f842806c4e0 0x7f842806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.003+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f8438014070 con 0x7f84340a4ca0 2026-03-10T08:53:41.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.005+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8438059ed0 con 0x7f84340a4ca0 2026-03-10T08:53:41.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.005+0000 7f8441690700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f842806c4e0 0x7f842806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.007 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.006+0000 7f8441690700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f842806c4e0 0x7f842806e9a0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f842c005d50 tx=0x7f842c005ce0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:41.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.160+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8420006200 con 0x7f84340a4ca0 2026-03-10T08:53:41.162 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 
2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state 
up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:53:41.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.161+0000 7f8432ffd700 1 -- 192.168.123.105:0/3581879589 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 0) 0x7f8438059ed0 con 0x7f84340a4ca0 2026-03-10T08:53:41.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.163+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f842806c4e0 msgr2=0x7f842806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.163+0000 7f8442e93700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f842806c4e0 
0x7f842806e9a0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f842c005d50 tx=0x7f842c005ce0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.164+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 msgr2=0x7f84340b3e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.164+0000 7f8442e93700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340b3e90 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f843800ebf0 tx=0x7f843800ef00 comp rx=0 tx=0).stop 2026-03-10T08:53:41.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.167+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 shutdown_connections 2026-03-10T08:53:41.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.167+0000 7f8442e93700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f842806c4e0 0x7f842806e9a0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.167+0000 7f8442e93700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f84340a4ca0 0x7f84340b3e90 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.167+0000 7f8442e93700 1 --2- 192.168.123.105:0/3581879589 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84340a5de0 0x7f84340b43d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.167+0000 7f8442e93700 1 -- 
192.168.123.105:0/3581879589 >> 192.168.123.105:0/3581879589 conn(0x7f84340a0160 msgr2=0x7f84340a9010 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:41.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.168+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 shutdown_connections 2026-03-10T08:53:41.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.169+0000 7f8442e93700 1 -- 192.168.123.105:0/3581879589 wait complete. 2026-03-10T08:53:41.171 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.251+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/500135896 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 msgr2=0x7fe7b4073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.251+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/500135896 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 0x7fe7b4073220 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fe7a8009b00 tx=0x7fe7a8009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.251+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/500135896 shutdown_connections 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.251+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/500135896 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.251+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/500135896 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 0x7fe7b4073220 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.251+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/500135896 >> 192.168.123.105:0/500135896 conn(0x7fe7b40fc210 msgr2=0x7fe7b40fe670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.252+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/500135896 shutdown_connections 2026-03-10T08:53:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.252+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/500135896 wait complete. 2026-03-10T08:53:41.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.253+0000 7fe7b8f2a700 1 Processor -- start 2026-03-10T08:53:41.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.253+0000 7fe7b8f2a700 1 -- start start 2026-03-10T08:53:41.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.253+0000 7fe7b8f2a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4071d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.253+0000 7fe7b8f2a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 0x7fe7b4072260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.253+0000 7fe7b8f2a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7b40727f0 con 0x7fe7b40737f0 2026-03-10T08:53:41.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.253+0000 7fe7b8f2a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7b4072930 con 0x7fe7b4074dc0 2026-03-10T08:53:41.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b37fe700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4071d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b37fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4071d20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52110/0 (socket says 192.168.123.105:52110) 2026-03-10T08:53:41.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b37fe700 1 -- 192.168.123.105:0/676258931 learned_addr learned my addr 192.168.123.105:0/676258931 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:41.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b2ffd700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 0x7fe7b4072260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b37fe700 1 -- 192.168.123.105:0/676258931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 msgr2=0x7fe7b4072260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b37fe700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 0x7fe7b4072260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.254+0000 7fe7b37fe700 1 -- 
192.168.123.105:0/676258931 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7a80097e0 con 0x7fe7b40737f0 2026-03-10T08:53:41.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.255+0000 7fe7b37fe700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4071d20 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7fe7a8005230 tx=0x7fe7a8004a20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:41.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.255+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe7a801d070 con 0x7fe7b40737f0 2026-03-10T08:53:41.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.255+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe7a800bc50 con 0x7fe7b40737f0 2026-03-10T08:53:41.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.255+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe7a800f870 con 0x7fe7b40737f0 2026-03-10T08:53:41.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.255+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe7b41aa980 con 0x7fe7b40737f0 2026-03-10T08:53:41.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.256+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7b41aadf0 con 0x7fe7b40737f0 2026-03-10T08:53:41.256 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.256+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe7b404ea90 con 0x7fe7b40737f0 2026-03-10T08:53:41.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.260+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe7a800f9d0 con 0x7fe7b40737f0 2026-03-10T08:53:41.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.261+0000 7fe7b0ff9700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe79c06c290 0x7fe79c06e750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.261+0000 7fe7b2ffd700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe79c06c290 0x7fe79c06e750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.261+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fe7a808daa0 con 0x7fe7b40737f0 2026-03-10T08:53:41.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.261+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe7a80ba180 con 0x7fe7b40737f0 2026-03-10T08:53:41.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.261+0000 7fe7b2ffd700 1 --2- 192.168.123.105:0/676258931 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe79c06c290 0x7fe79c06e750 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fe7b4074af0 tx=0x7fe7ac009040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:41.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.404+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe7b4103d40 con 0x7fe79c06c290 2026-03-10T08:53:41.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.406+0000 7fe7b0ff9700 1 -- 192.168.123.105:0/676258931 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fe7b4103d40 con 0x7fe79c06c290 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "", 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:53:41.408 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:53:41.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.409+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7fe79c06c290 msgr2=0x7fe79c06e750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.409+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe79c06c290 0x7fe79c06e750 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fe7b4074af0 tx=0x7fe7ac009040 comp rx=0 tx=0).stop 2026-03-10T08:53:41.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.409+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 msgr2=0x7fe7b4071d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.409+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4071d20 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7fe7a8005230 tx=0x7fe7a8004a20 comp rx=0 tx=0).stop 2026-03-10T08:53:41.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.409+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 shutdown_connections 2026-03-10T08:53:41.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.410+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe79c06c290 0x7fe79c06e750 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.410+0000 7fe7b8f2a700 1 --2- 192.168.123.105:0/676258931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe7b40737f0 0x7fe7b4071d20 unknown :-1 s=CLOSED pgs=297 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.410+0000 7fe7b8f2a700 1 --2- 
192.168.123.105:0/676258931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe7b4074dc0 0x7fe7b4072260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.410+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 >> 192.168.123.105:0/676258931 conn(0x7fe7b40fc210 msgr2=0x7fe7b4102620 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:41.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.410+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 shutdown_connections 2026-03-10T08:53:41.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.412+0000 7fe7b8f2a700 1 -- 192.168.123.105:0/676258931 wait complete. 2026-03-10T08:53:41.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:41 vm05.local ceph-mon[49713]: from='client.24369 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:41 vm05.local ceph-mon[49713]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:41 vm05.local ceph-mon[49713]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:41 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/497535088' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:53:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:41 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/3581879589' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.499+0000 7f3a45bb9700 1 -- 192.168.123.105:0/646480418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a4010a700 msgr2=0x7f3a4010cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.499+0000 7f3a45bb9700 1 --2- 192.168.123.105:0/646480418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a4010a700 0x7f3a4010cb90 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f3a2c009b00 tx=0x7f3a2c009e10 comp rx=0 tx=0).stop 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.499+0000 7f3a45bb9700 1 -- 192.168.123.105:0/646480418 shutdown_connections 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.499+0000 7f3a45bb9700 1 --2- 192.168.123.105:0/646480418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a4010a700 0x7f3a4010cb90 unknown :-1 s=CLOSED pgs=298 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.499+0000 7f3a45bb9700 1 --2- 192.168.123.105:0/646480418 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a40107d90 0x7f3a4010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.499+0000 7f3a45bb9700 1 -- 192.168.123.105:0/646480418 >> 192.168.123.105:0/646480418 conn(0x7f3a4006daa0 msgr2=0x7f3a4006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:41.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 -- 192.168.123.105:0/646480418 shutdown_connections 2026-03-10T08:53:41.501 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 -- 192.168.123.105:0/646480418 wait complete. 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 Processor -- start 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 -- start start 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 0x7f3a40116a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a4010a700 0x7f3a40116f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a40117580 con 0x7f3a40107d90 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.500+0000 7f3a45bb9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a401b3090 con 0x7f3a4010a700 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a44bb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 0x7f3a40116a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a3ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a4010a700 0x7f3a40116f60 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a44bb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 0x7f3a40116a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52124/0 (socket says 192.168.123.105:52124) 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a44bb7700 1 -- 192.168.123.105:0/2103710503 learned_addr learned my addr 192.168.123.105:0/2103710503 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:53:41.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a44bb7700 1 -- 192.168.123.105:0/2103710503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a4010a700 msgr2=0x7f3a40116f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a44bb7700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a4010a700 0x7f3a40116f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.501+0000 7f3a44bb7700 1 -- 192.168.123.105:0/2103710503 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a2c0097e0 con 0x7f3a40107d90 2026-03-10T08:53:41.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.502+0000 7f3a44bb7700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 0x7f3a40116a20 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f3a3400eb10 
tx=0x7f3a3400ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:41.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.502+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a3400cc40 con 0x7f3a40107d90 2026-03-10T08:53:41.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.502+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3a3400cda0 con 0x7f3a40107d90 2026-03-10T08:53:41.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.502+0000 7f3a45bb9700 1 -- 192.168.123.105:0/2103710503 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a401b3290 con 0x7f3a40107d90 2026-03-10T08:53:41.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.502+0000 7f3a45bb9700 1 -- 192.168.123.105:0/2103710503 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a401b3760 con 0x7f3a40107d90 2026-03-10T08:53:41.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.503+0000 7f3a45bb9700 1 -- 192.168.123.105:0/2103710503 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a40110c60 con 0x7f3a40107d90 2026-03-10T08:53:41.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.502+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a34018810 con 0x7f3a40107d90 2026-03-10T08:53:41.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.506+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3a34018970 con 
0x7f3a40107d90 2026-03-10T08:53:41.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.506+0000 7f3a3dffb700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a3006c2e0 0x7f3a3006e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:53:41.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.506+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f3a34014070 con 0x7f3a40107d90 2026-03-10T08:53:41.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.507+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3a34056f40 con 0x7f3a40107d90 2026-03-10T08:53:41.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.507+0000 7f3a3ffff700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a3006c2e0 0x7f3a3006e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:53:41.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.507+0000 7f3a3ffff700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a3006c2e0 0x7f3a3006e7a0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f3a2c005270 tx=0x7f3a2c01a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:53:41.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.677+0000 7f3a45bb9700 1 -- 192.168.123.105:0/2103710503 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f3a4002ced0 
con 0x7f3a40107d90 2026-03-10T08:53:41.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.677+0000 7f3a3dffb700 1 -- 192.168.123.105:0/2103710503 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f3a3405a560 con 0x7f3a40107d90 2026-03-10T08:53:41.678 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:53:41.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 -- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a3006c2e0 msgr2=0x7f3a3006e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a3006c2e0 0x7f3a3006e7a0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f3a2c005270 tx=0x7f3a2c01a040 comp rx=0 tx=0).stop 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 -- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 msgr2=0x7f3a40116a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 0x7f3a40116a20 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f3a3400eb10 tx=0x7f3a3400ee20 comp rx=0 tx=0).stop 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 -- 192.168.123.105:0/2103710503 shutdown_connections 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 --2- 192.168.123.105:0/2103710503 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a3006c2e0 0x7f3a3006e7a0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a40107d90 0x7f3a40116a20 secure :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f3a3400eb10 tx=0x7f3a3400ee20 comp rx=0 tx=0).stop 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 --2- 192.168.123.105:0/2103710503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a4010a700 0x7f3a40116f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:53:41.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.680+0000 7f3a2b7fe700 1 -- 192.168.123.105:0/2103710503 >> 192.168.123.105:0/2103710503 conn(0x7f3a4006daa0 msgr2=0x7f3a4006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:53:41.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.681+0000 7f3a2b7fe700 1 -- 192.168.123.105:0/2103710503 shutdown_connections 2026-03-10T08:53:41.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:53:41.682+0000 7f3a2b7fe700 1 -- 192.168.123.105:0/2103710503 wait complete. 
2026-03-10T08:53:41.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:41 vm08.local ceph-mon[57559]: from='client.24369 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:41.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:41 vm08.local ceph-mon[57559]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:41.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:41 vm08.local ceph-mon[57559]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:41.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:41 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/497535088' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:53:41.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:41 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/3581879589' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:53:42.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:42 vm05.local ceph-mon[49713]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:42.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:42 vm05.local ceph-mon[49713]: from='client.14594 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:42.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:42 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/2103710503' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:53:42.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:42 vm08.local ceph-mon[57559]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:42 vm08.local ceph-mon[57559]: from='client.14594 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:53:42.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:42 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/2103710503' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:53:44.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:44 vm05.local ceph-mon[49713]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:44.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:44 vm08.local ceph-mon[57559]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:45.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:45 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:45.762 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:45 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:53:46.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:46 vm08.local ceph-mon[57559]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:46.935 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:46 vm05.local ceph-mon[49713]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB 
data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:48.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:48 vm08.local ceph-mon[57559]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:48.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:48 vm05.local ceph-mon[49713]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:50.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:50 vm05.local ceph-mon[49713]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:51.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:50 vm08.local ceph-mon[57559]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:52.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:52 vm05.local ceph-mon[49713]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:53.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:52 vm08.local ceph-mon[57559]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:55.402 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:55 vm05.local ceph-mon[49713]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:55 vm08.local ceph-mon[57559]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:56.090 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:56 vm08.local ceph-mon[57559]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:56.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:56 vm05.local ceph-mon[49713]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 
MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:58.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:53:58 vm05.local ceph-mon[49713]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:53:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:53:58 vm08.local ceph-mon[57559]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:00.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:00 vm05.local ceph-mon[49713]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:00.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:00 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:00.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:00 vm08.local ceph-mon[57559]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:00.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:00 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:02.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:02 vm05.local ceph-mon[49713]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:02.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:02 vm08.local ceph-mon[57559]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:04.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:04 vm05.local ceph-mon[49713]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:04.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:04 vm08.local 
ceph-mon[57559]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:06 vm05.local ceph-mon[49713]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:06 vm08.local ceph-mon[57559]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:06.947 INFO:tasks.workunit.client.1.vm08.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: git switch -c <new-branch-name> 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:Or undo this operation with: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: git switch - 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T08:54:06.948 INFO:tasks.workunit.client.1.vm08.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T08:54:06.953 DEBUG:teuthology.orchestra.run.vm08:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-10T08:54:07.011 INFO:tasks.workunit.client.1.vm08.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T08:54:07.013 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T08:54:07.013 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T08:54:07.054 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T08:54:07.086 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T08:54:07.114 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T08:54:07.115 
INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T08:54:07.115 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T08:54:07.142 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T08:54:07.145 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T08:54:07.145 DEBUG:teuthology.orchestra.run.vm08:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-10T08:54:07.202 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-10T08:54:07.203 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-10T08:54:07.203 DEBUG:teuthology.orchestra.run.vm08:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-10T08:54:07.265 INFO:tasks.workunit.client.1.vm08.stderr:+ mkdir -p fsstress 2026-03-10T08:54:07.266 INFO:tasks.workunit.client.1.vm08.stderr:+ pushd fsstress 2026-03-10T08:54:07.267 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T08:54:07.268 INFO:tasks.workunit.client.1.vm08.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T08:54:08.719 INFO:tasks.workunit.client.1.vm08.stderr:+ tar xzf ltp-full.tgz 2026-03-10T08:54:08.963 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:08 vm05.local ceph-mon[49713]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:09.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:08 vm08.local ceph-mon[57559]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T08:54:10.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:10 vm08.local ceph-mon[57559]: pgmap v114: 65 pgs: 65 active+clean; 4.4 MiB data, 160 MiB used, 120 GiB / 120 GiB avail; 341 KiB/s wr, 0 op/s 2026-03-10T08:54:10.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:10 vm05.local ceph-mon[49713]: pgmap v114: 65 pgs: 65 active+clean; 4.4 MiB data, 160 MiB used, 120 GiB / 120 GiB avail; 341 KiB/s wr, 0 op/s 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 -- 192.168.123.105:0/3471263741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 msgr2=0x7f7a1c10a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 --2- 192.168.123.105:0/3471263741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c10a1c0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7f7a18009b00 tx=0x7f7a18009e10 comp rx=0 tx=0).stop 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 -- 192.168.123.105:0/3471263741 shutdown_connections 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 --2- 192.168.123.105:0/3471263741 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 0x7f7a1c10cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:11.853 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 --2- 192.168.123.105:0/3471263741 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c10a1c0 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 -- 192.168.123.105:0/3471263741 >> 192.168.123.105:0/3471263741 conn(0x7f7a1c06dda0 msgr2=0x7f7a1c070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 -- 192.168.123.105:0/3471263741 shutdown_connections 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.851+0000 7f7a2391d700 1 -- 192.168.123.105:0/3471263741 wait complete. 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a2391d700 1 Processor -- start 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a2391d700 1 -- start start 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a2391d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c116a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a2391d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 0x7f7a1c116fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a2391d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7a1c1175f0 con 0x7f7a1c107d90 2026-03-10T08:54:11.853 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a2391d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7a1c117730 con 0x7f7a1c10a700 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a20eb8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 0x7f7a1c116fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a216b9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c116a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a216b9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c116a90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:49522/0 (socket says 192.168.123.105:49522) 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a216b9700 1 -- 192.168.123.105:0/3200894687 learned_addr learned my addr 192.168.123.105:0/3200894687 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a20eb8700 1 -- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 msgr2=0x7f7a1c116a90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.852+0000 7f7a20eb8700 1 --2- 192.168.123.105:0/3200894687 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c116a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:11.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.853+0000 7f7a20eb8700 1 -- 192.168.123.105:0/3200894687 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7a180097e0 con 0x7f7a1c10a700 2026-03-10T08:54:11.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.855+0000 7f7a20eb8700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 0x7f7a1c116fd0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7a0c00d8d0 tx=0x7f7a0c00dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:11.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.857+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7a0c00f840 con 0x7f7a1c10a700 2026-03-10T08:54:11.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.857+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7a0c00fe80 con 0x7f7a1c10a700 2026-03-10T08:54:11.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.857+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7a0c00e5c0 con 0x7f7a1c10a700 2026-03-10T08:54:11.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.857+0000 7f7a2391d700 1 -- 192.168.123.105:0/3200894687 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7a1c1b3540 con 0x7f7a1c10a700 2026-03-10T08:54:11.857 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.857+0000 7f7a2391d700 1 -- 192.168.123.105:0/3200894687 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7a1c1b3a60 con 0x7f7a1c10a700 2026-03-10T08:54:11.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.857+0000 7f7a2391d700 1 -- 192.168.123.105:0/3200894687 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7a1c110c60 con 0x7f7a1c10a700 2026-03-10T08:54:11.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.858+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7a0c00f9a0 con 0x7f7a1c10a700 2026-03-10T08:54:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.858+0000 7f7a127fc700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7a0806c2e0 0x7f7a0806e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.859+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f7a0c08b750 con 0x7f7a1c10a700 2026-03-10T08:54:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.859+0000 7f7a216b9700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7a0806c2e0 0x7f7a0806e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.859+0000 7f7a216b9700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f7a0806c2e0 0x7f7a0806e7a0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f7a1800b5c0 tx=0x7f7a18005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:11.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:11.861+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7a0c059a80 con 0x7f7a1c10a700 2026-03-10T08:54:12.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.060+0000 7f7a2391d700 1 -- 192.168.123.105:0/3200894687 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7a1c0611d0 con 0x7f7a0806c2e0 2026-03-10T08:54:12.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.067+0000 7f7a127fc700 1 -- 192.168.123.105:0/3200894687 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f7a1c0611d0 con 0x7f7a0806c2e0 2026-03-10T08:54:12.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.071+0000 7f7a07fff700 1 -- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7a0806c2e0 msgr2=0x7f7a0806e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.071+0000 7f7a07fff700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7a0806c2e0 0x7f7a0806e7a0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f7a1800b5c0 tx=0x7f7a18005fb0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.071+0000 7f7a07fff700 1 -- 192.168.123.105:0/3200894687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 
msgr2=0x7f7a1c116fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.071+0000 7f7a07fff700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 0x7f7a1c116fd0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7a0c00d8d0 tx=0x7f7a0c00dbe0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.072+0000 7f7a07fff700 1 -- 192.168.123.105:0/3200894687 shutdown_connections 2026-03-10T08:54:12.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.072+0000 7f7a07fff700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7a0806c2e0 0x7f7a0806e7a0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.072+0000 7f7a07fff700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7a1c107d90 0x7f7a1c116a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.072+0000 7f7a07fff700 1 --2- 192.168.123.105:0/3200894687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7a1c10a700 0x7f7a1c116fd0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.073+0000 7f7a07fff700 1 -- 192.168.123.105:0/3200894687 >> 192.168.123.105:0/3200894687 conn(0x7f7a1c06dda0 msgr2=0x7f7a1c10c190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:12.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.073+0000 7f7a07fff700 1 -- 192.168.123.105:0/3200894687 shutdown_connections 2026-03-10T08:54:12.075 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.073+0000 7f7a07fff700 1 -- 192.168.123.105:0/3200894687 wait complete. 2026-03-10T08:54:12.095 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:54:12.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.193+0000 7f13bb5ed700 1 -- 192.168.123.105:0/583364178 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4075a40 msgr2=0x7f13b4077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.193+0000 7f13bb5ed700 1 --2- 192.168.123.105:0/583364178 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4075a40 0x7f13b4077ed0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f13ac00d3f0 tx=0x7f13ac00d700 comp rx=0 tx=0).stop 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.194+0000 7f13bb5ed700 1 -- 192.168.123.105:0/583364178 shutdown_connections 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.194+0000 7f13bb5ed700 1 --2- 192.168.123.105:0/583364178 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4075a40 0x7f13b4077ed0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.194+0000 7f13bb5ed700 1 --2- 192.168.123.105:0/583364178 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f13b4072b50 0x7f13b4072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.194+0000 7f13bb5ed700 1 -- 192.168.123.105:0/583364178 >> 192.168.123.105:0/583364178 conn(0x7f13b406dae0 msgr2=0x7f13b406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.194+0000 7f13bb5ed700 1 
-- 192.168.123.105:0/583364178 shutdown_connections 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.194+0000 7f13bb5ed700 1 -- 192.168.123.105:0/583364178 wait complete. 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13bb5ed700 1 Processor -- start 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13bb5ed700 1 -- start start 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13bb5ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f13b4072b50 0x7f13b40830f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13bb5ed700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 0x7f13b41b3180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13bb5ed700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13b4083b40 con 0x7f13b4072b50 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13bb5ed700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13b4083cb0 con 0x7f13b4083630 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13b8b88700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 0x7f13b41b3180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13b8b88700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 0x7f13b41b3180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42064/0 (socket says 192.168.123.105:42064) 2026-03-10T08:54:12.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13b8b88700 1 -- 192.168.123.105:0/2556812446 learned_addr learned my addr 192.168.123.105:0/2556812446 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:12.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13b8b88700 1 -- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f13b4072b50 msgr2=0x7f13b40830f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13b8b88700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f13b4072b50 0x7f13b40830f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.195+0000 7f13b8b88700 1 -- 192.168.123.105:0/2556812446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13ac007ed0 con 0x7f13b4083630 2026-03-10T08:54:12.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.196+0000 7f13b8b88700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 0x7f13b41b3180 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f13ac003c30 tx=0x7f13ac003d10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:12.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.196+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mon.1 
v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13ac01e070 con 0x7f13b4083630 2026-03-10T08:54:12.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.196+0000 7f13bb5ed700 1 -- 192.168.123.105:0/2556812446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f13b41b36c0 con 0x7f13b4083630 2026-03-10T08:54:12.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.196+0000 7f13bb5ed700 1 -- 192.168.123.105:0/2556812446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f13b41b3c10 con 0x7f13b4083630 2026-03-10T08:54:12.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.197+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f13ac00fcf0 con 0x7f13b4083630 2026-03-10T08:54:12.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.197+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13ac022410 con 0x7f13b4083630 2026-03-10T08:54:12.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.198+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f13ac02c430 con 0x7f13b4083630 2026-03-10T08:54:12.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.198+0000 7f13aa7fc700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13a006c530 0x7f13a006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.198+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 
5362+0+0 (secure 0 0 0) 0x7f13ac013070 con 0x7f13b4083630 2026-03-10T08:54:12.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.199+0000 7f13b9389700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13a006c530 0x7f13a006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.200+0000 7f13bb5ed700 1 -- 192.168.123.105:0/2556812446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1398005320 con 0x7f13b4083630 2026-03-10T08:54:12.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.200+0000 7f13b9389700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13a006c530 0x7f13a006e9f0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f13b00098a0 tx=0x7f13b0006d90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:12.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.208+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f13ac00fe60 con 0x7f13b4083630 2026-03-10T08:54:12.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.393+0000 7f13bb5ed700 1 -- 192.168.123.105:0/2556812446 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1398000bf0 con 0x7f13a006c530 2026-03-10T08:54:12.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.394+0000 7f13aa7fc700 1 -- 192.168.123.105:0/2556812446 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 
) v1 ==== 8+0+442 (secure 0 0 0) 0x7f1398000bf0 con 0x7f13a006c530 2026-03-10T08:54:12.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.397+0000 7f139ffff700 1 -- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13a006c530 msgr2=0x7f13a006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.397+0000 7f139ffff700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13a006c530 0x7f13a006e9f0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f13b00098a0 tx=0x7f13b0006d90 comp rx=0 tx=0).stop 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 -- 192.168.123.105:0/2556812446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 msgr2=0x7f13b41b3180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 0x7f13b41b3180 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f13ac003c30 tx=0x7f13ac003d10 comp rx=0 tx=0).stop 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 -- 192.168.123.105:0/2556812446 shutdown_connections 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f13a006c530 0x7f13a006e9f0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 --2- 192.168.123.105:0/2556812446 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f13b4072b50 0x7f13b40830f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 --2- 192.168.123.105:0/2556812446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f13b4083630 0x7f13b41b3180 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 -- 192.168.123.105:0/2556812446 >> 192.168.123.105:0/2556812446 conn(0x7f13b406dae0 msgr2=0x7f13b406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 -- 192.168.123.105:0/2556812446 shutdown_connections 2026-03-10T08:54:12.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.398+0000 7f139ffff700 1 -- 192.168.123.105:0/2556812446 wait complete. 
2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 -- 192.168.123.105:0/3594314679 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c075a40 msgr2=0x7fc77c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 --2- 192.168.123.105:0/3594314679 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c075a40 0x7fc77c077ed0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fc774009f20 tx=0x7fc77400a640 comp rx=0 tx=0).stop 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 -- 192.168.123.105:0/3594314679 shutdown_connections 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 --2- 192.168.123.105:0/3594314679 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c075a40 0x7fc77c077ed0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 --2- 192.168.123.105:0/3594314679 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c072b50 0x7fc77c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 -- 192.168.123.105:0/3594314679 >> 192.168.123.105:0/3594314679 conn(0x7fc77c06dae0 msgr2=0x7fc77c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.518+0000 7fc7819ec700 1 -- 192.168.123.105:0/3594314679 shutdown_connections 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 -- 192.168.123.105:0/3594314679 
wait complete. 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 Processor -- start 2026-03-10T08:54:12.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 -- start start 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c072b50 0x7fc77c083030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c083570 0x7fc77c1b30c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc77c083a80 con 0x7fc77c083570 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.519+0000 7fc7819ec700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc77c083bc0 con 0x7fc77c072b50 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77a7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c083570 0x7fc77c1b30c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77a7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c083570 0x7fc77c1b30c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:49554/0 (socket says 192.168.123.105:49554) 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77a7fc700 1 -- 192.168.123.105:0/110144120 learned_addr learned my addr 192.168.123.105:0/110144120 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:12.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77affd700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c072b50 0x7fc77c083030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77affd700 1 -- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c083570 msgr2=0x7fc77c1b30c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77affd700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c083570 0x7fc77c1b30c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77affd700 1 -- 192.168.123.105:0/110144120 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc77400a040 con 0x7fc77c072b50 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.521+0000 7fc77affd700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c072b50 0x7fc77c083030 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fc76c00a8d0 tx=0x7fc76c00ac90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.522+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc76c010070 con 0x7fc77c072b50 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.522+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc76c014410 con 0x7fc77c072b50 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.523+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc76c0135c0 con 0x7fc77c072b50 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.523+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc77c1b3660 con 0x7fc77c072b50 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.523+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc77c1b3b30 con 0x7fc77c072b50 2026-03-10T08:54:12.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.523+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc77c04ea90 con 0x7fc77c072b50 2026-03-10T08:54:12.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.526+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc76c0079f0 con 0x7fc77c072b50 2026-03-10T08:54:12.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.526+0000 
7fc7809ea700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc76406c2e0 0x7fc76406e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.527+0000 7fc77a7fc700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc76406c2e0 0x7fc76406e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.528+0000 7fc77a7fc700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc76406c2e0 0x7fc76406e7a0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fc77400b870 tx=0x7fc77400bbc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:12.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.529+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fc76c059e60 con 0x7fc77c072b50 2026-03-10T08:54:12.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.532+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc76c0592a0 con 0x7fc77c072b50 2026-03-10T08:54:12.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.713+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc77c1b3de0 con 0x7fc76406c2e0 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS 
STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (2m) 73s ago 3m 21.4M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (3m) 73s ago 3m 8032k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (2m) 73s ago 2m 8308k - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 73s ago 3m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (2m) 73s ago 2m 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (2m) 73s ago 3m 80.8M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:54:12.721 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (79s) 73s ago 79s 16.7M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (77s) 73s ago 77s 13.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (76s) 73s ago 76s 16.1M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (78s) 73s ago 78s 11.0M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (4m) 73s ago 4m 501M - 18.2.1 5be31c24972a 6ec0cdb38171 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (2m) 73s ago 2m 449M - 18.2.1 5be31c24972a 9cd801f2f7a7 2026-03-10T08:54:12.722 
INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 73s ago 4m 50.0M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 73s ago 2m 47.9M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 73s ago 3m 12.3M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (2m) 73s ago 2m 12.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 73s ago 2m 48.5M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 73s ago 2m 46.9M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 73s ago 2m 48.1M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 73s ago 2m 44.3M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (111s) 73s ago 111s 43.5M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (102s) 73s ago 102s 45.8M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 73s ago 3m 36.9M - 2.43.0 a07b618ecd1d e84b76e5c1c0 2026-03-10T08:54:12.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.719+0000 7fc7809ea700 1 -- 192.168.123.105:0/110144120 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fc77c1b3de0 con 0x7fc76406c2e0 2026-03-10T08:54:12.724 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc76406c2e0 msgr2=0x7fc76406e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc76406c2e0 0x7fc76406e7a0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fc77400b870 tx=0x7fc77400bbc0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c072b50 msgr2=0x7fc77c083030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c072b50 0x7fc77c083030 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fc76c00a8d0 tx=0x7fc76c00ac90 comp rx=0 tx=0).stop 2026-03-10T08:54:12.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 shutdown_connections 2026-03-10T08:54:12.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc76406c2e0 0x7fc76406e7a0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc77c072b50 0x7fc77c083030 unknown :-1 s=CLOSED pgs=75 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 --2- 192.168.123.105:0/110144120 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc77c083570 0x7fc77c1b30c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 >> 192.168.123.105:0/110144120 conn(0x7fc77c06dae0 msgr2=0x7fc77c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:12.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 shutdown_connections 2026-03-10T08:54:12.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.724+0000 7fc7819ec700 1 -- 192.168.123.105:0/110144120 wait complete. 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: pgmap v115: 65 pgs: 65 active+clean; 9.8 MiB data, 177 MiB used, 120 GiB / 120 GiB avail; 822 KiB/s wr, 33 op/s 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:54:12.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:12 vm08.local ceph-mon[57559]: Upgrade: Need to upgrade myself (mgr.vm05.rxwgjc) 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: pgmap v115: 65 pgs: 65 active+clean; 9.8 MiB data, 177 MiB used, 120 GiB / 120 GiB avail; 822 KiB/s wr, 33 op/s 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:54:12.842 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:12 vm05.local ceph-mon[49713]: Upgrade: Need to upgrade myself (mgr.vm05.rxwgjc) 2026-03-10T08:54:12.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.841+0000 7f07eb59e700 
1 -- 192.168.123.105:0/54450186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10af50 msgr2=0x7f07ec10d340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.841+0000 7f07eb59e700 1 --2- 192.168.123.105:0/54450186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10af50 0x7f07ec10d340 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f07e400c3a0 tx=0x7f07e400c6b0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.844+0000 7f07eb59e700 1 -- 192.168.123.105:0/54450186 shutdown_connections 2026-03-10T08:54:12.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.844+0000 7f07eb59e700 1 --2- 192.168.123.105:0/54450186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10af50 0x7f07ec10d340 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.844+0000 7f07eb59e700 1 --2- 192.168.123.105:0/54450186 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec10aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.844+0000 7f07eb59e700 1 -- 192.168.123.105:0/54450186 >> 192.168.123.105:0/54450186 conn(0x7f07ec06daa0 msgr2=0x7f07ec06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:12.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.844+0000 7f07eb59e700 1 -- 192.168.123.105:0/54450186 shutdown_connections 2026-03-10T08:54:12.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.844+0000 7f07eb59e700 1 -- 192.168.123.105:0/54450186 wait complete. 
2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07eb59e700 1 Processor -- start 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07eb59e700 1 -- start start 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07eb59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec19cc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07eb59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec19d170 0x7f07ec1b3270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07eb59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07ec19d5f0 con 0x7f07ec19d170 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07eb59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07ec19d760 con 0x7f07ec108620 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07ea59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec19cc30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07ea59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec19cc30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:42116/0 (socket says 192.168.123.105:42116) 2026-03-10T08:54:12.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07ea59c700 1 -- 192.168.123.105:0/1723307427 learned_addr learned my addr 192.168.123.105:0/1723307427 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:12.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.845+0000 7f07e9d9b700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec19d170 0x7f07ec1b3270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.846+0000 7f07ea59c700 1 -- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec19d170 msgr2=0x7f07ec1b3270 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:12.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.846+0000 7f07ea59c700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec19d170 0x7f07ec1b3270 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:12.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.846+0000 7f07ea59c700 1 -- 192.168.123.105:0/1723307427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f07e400c050 con 0x7f07ec108620 2026-03-10T08:54:12.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.846+0000 7f07ea59c700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec19cc30 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f07dc00b770 tx=0x7f07dc00ba80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:54:12.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.847+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07dc010840 con 0x7f07ec108620 2026-03-10T08:54:12.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.847+0000 7f07eb59e700 1 -- 192.168.123.105:0/1723307427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f07ec1b37b0 con 0x7f07ec108620 2026-03-10T08:54:12.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.847+0000 7f07eb59e700 1 -- 192.168.123.105:0/1723307427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f07ec1b3cb0 con 0x7f07ec108620 2026-03-10T08:54:12.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.847+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f07dc010e80 con 0x7f07ec108620 2026-03-10T08:54:12.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.847+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07dc00d590 con 0x7f07ec108620 2026-03-10T08:54:12.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.849+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f07dc0109a0 con 0x7f07ec108620 2026-03-10T08:54:12.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.849+0000 7f07db7fe700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f07d406c530 0x7f07d406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:12.850 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.850+0000 7f07e9d9b700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f07d406c530 0x7f07d406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:12.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.850+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f07dc08b1c0 con 0x7f07ec108620 2026-03-10T08:54:12.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.850+0000 7f07e9d9b700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f07d406c530 0x7f07d406e9f0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f07e400cb30 tx=0x7f07e4006040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:12.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.850+0000 7f07eb59e700 1 -- 192.168.123.105:0/1723307427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f07cc005320 con 0x7f07ec108620 2026-03-10T08:54:12.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:12.853+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f07dc059470 con 0x7f07ec108620 2026-03-10T08:54:13.127 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:54:13.127 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:54:13.127 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:54:13.127 
INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:54:13.127 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:54:13.127 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:54:13.127 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.122+0000 7f07eb59e700 1 -- 192.168.123.105:0/1723307427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f07cc006200 con 0x7f07ec108620 2026-03-10T08:54:13.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.123+0000 7f07db7fe700 1 -- 192.168.123.105:0/1723307427 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f07dc014240 con 0x7f07ec108620 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 -- 
192.168.123.105:0/1723307427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f07d406c530 msgr2=0x7f07d406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f07d406c530 0x7f07d406e9f0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f07e400cb30 tx=0x7f07e4006040 comp rx=0 tx=0).stop 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 -- 192.168.123.105:0/1723307427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 msgr2=0x7f07ec19cc30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec19cc30 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f07dc00b770 tx=0x7f07dc00ba80 comp rx=0 tx=0).stop 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 -- 192.168.123.105:0/1723307427 shutdown_connections 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f07d406c530 0x7f07d406e9f0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec108620 0x7f07ec19cc30 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.131 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 --2- 192.168.123.105:0/1723307427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec19d170 0x7f07ec1b3270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 -- 192.168.123.105:0/1723307427 >> 192.168.123.105:0/1723307427 conn(0x7f07ec06daa0 msgr2=0x7f07ec10ac20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 -- 192.168.123.105:0/1723307427 shutdown_connections 2026-03-10T08:54:13.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.128+0000 7f07d97fa700 1 -- 192.168.123.105:0/1723307427 wait complete. 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.257+0000 7fe265daa700 1 -- 192.168.123.105:0/2819195033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 msgr2=0x7fe2601114d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.257+0000 7fe265daa700 1 --2- 192.168.123.105:0/2819195033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601114d0 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7fe254009b00 tx=0x7fe254009e10 comp rx=0 tx=0).stop 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.258+0000 7fe265daa700 1 -- 192.168.123.105:0/2819195033 shutdown_connections 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.258+0000 7fe265daa700 1 --2- 192.168.123.105:0/2819195033 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601114d0 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.258+0000 7fe265daa700 1 --2- 192.168.123.105:0/2819195033 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe260107d90 0x7fe26010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.258+0000 7fe265daa700 1 -- 192.168.123.105:0/2819195033 >> 192.168.123.105:0/2819195033 conn(0x7fe26006dae0 msgr2=0x7fe26006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.259+0000 7fe265daa700 1 -- 192.168.123.105:0/2819195033 shutdown_connections 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.259+0000 7fe265daa700 1 -- 192.168.123.105:0/2819195033 wait complete. 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.259+0000 7fe265daa700 1 Processor -- start 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe265daa700 1 -- start start 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe265daa700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe260107d90 0x7fe2601a0fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe265daa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601a14e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe265daa700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2601a1b00 con 0x7fe26010a700 2026-03-10T08:54:13.262 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe265daa700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2601a1c40 con 0x7fe260107d90 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601a14e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601a14e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:49592/0 (socket says 192.168.123.105:49592) 2026-03-10T08:54:13.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 -- 192.168.123.105:0/468540228 learned_addr learned my addr 192.168.123.105:0/468540228 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 -- 192.168.123.105:0/468540228 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe260107d90 msgr2=0x7fe2601a0fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe260107d90 0x7fe2601a0fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 -- 192.168.123.105:0/468540228 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2540097e0 con 0x7fe26010a700 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25effd700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601a14e0 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fe254009fd0 tx=0x7fe254004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe25401d070 con 0x7fe26010a700 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.260+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe25400bb40 con 0x7fe26010a700 2026-03-10T08:54:13.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.261+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe25400f670 con 0x7fe26010a700 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.261+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe2601a6640 con 0x7fe26010a700 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.261+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe2601a6ab0 con 0x7fe26010a700 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.262+0000 7fe265daa700 1 -- 
192.168.123.105:0/468540228 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe26019b190 con 0x7fe26010a700 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.265+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe254004d50 con 0x7fe26010a700 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.265+0000 7fe25cff9700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe24806c380 0x7fe24806e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.266+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fe25408d660 con 0x7fe26010a700 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.266+0000 7fe25f7fe700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe24806c380 0x7fe24806e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.266+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe25405bab0 con 0x7fe26010a700 2026-03-10T08:54:13.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.276+0000 7fe25f7fe700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe24806c380 0x7fe24806e840 secure :-1 s=READY pgs=133 
cs=0 l=1 rev1=1 crypto rx=0x7fe250005950 tx=0x7fe2500058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:13.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.459+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fe26002ce00 con 0x7fe26010a700 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 
2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:13.465 
INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:13.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.464+0000 7fe25cff9700 1 -- 192.168.123.105:0/468540228 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 0) 0x7fe254027740 con 0x7fe26010a700 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe24806c380 msgr2=0x7fe24806e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe24806c380 0x7fe24806e840 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fe250005950 tx=0x7fe2500058e0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 -- 
192.168.123.105:0/468540228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 msgr2=0x7fe2601a14e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601a14e0 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fe254009fd0 tx=0x7fe254004970 comp rx=0 tx=0).stop 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 shutdown_connections 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe24806c380 0x7fe24806e840 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe260107d90 0x7fe2601a0fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 --2- 192.168.123.105:0/468540228 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe26010a700 0x7fe2601a14e0 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 >> 192.168.123.105:0/468540228 conn(0x7fe26006dae0 msgr2=0x7fe26006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:13.467 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 
1 -- 192.168.123.105:0/468540228 shutdown_connections 2026-03-10T08:54:13.467 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.466+0000 7fe265daa700 1 -- 192.168.123.105:0/468540228 wait complete. 2026-03-10T08:54:13.467 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- 192.168.123.105:0/3798989083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e4075a10 msgr2=0x7f97e4077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 --2- 192.168.123.105:0/3798989083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e4075a10 0x7f97e4077ea0 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7f97dc00d3f0 tx=0x7f97dc00d700 comp rx=0 tx=0).stop 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- 192.168.123.105:0/3798989083 shutdown_connections 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 --2- 192.168.123.105:0/3798989083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e4075a10 0x7f97e4077ea0 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 --2- 192.168.123.105:0/3798989083 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97e4072b20 0x7f97e4072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- 192.168.123.105:0/3798989083 >> 192.168.123.105:0/3798989083 conn(0x7f97e406daa0 msgr2=0x7f97e406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- 192.168.123.105:0/3798989083 shutdown_connections 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- 192.168.123.105:0/3798989083 wait complete. 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 Processor -- start 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- start start 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97e4072b20 0x7f97e4082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 0x7f97e4083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97e412e700 con 0x7f97e40834a0 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e9013700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97e412e870 con 0x7f97e4072b20 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e37fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 0x7f97e4083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.581 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e37fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 0x7f97e4083920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:49598/0 (socket says 192.168.123.105:49598) 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e37fe700 1 -- 192.168.123.105:0/2103523336 learned_addr learned my addr 192.168.123.105:0/2103523336 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.580+0000 7f97e3fff700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97e4072b20 0x7f97e4082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.581+0000 7f97e37fe700 1 -- 192.168.123.105:0/2103523336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97e4072b20 msgr2=0x7f97e4082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.581+0000 7f97e37fe700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97e4072b20 0x7f97e4082f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.581+0000 7f97e37fe700 1 -- 192.168.123.105:0/2103523336 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f97dc007ed0 con 0x7f97e40834a0 2026-03-10T08:54:13.582 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.581+0000 7f97e37fe700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 0x7f97e4083920 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f97dc003c60 tx=0x7f97dc003c90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:13.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.581+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f97dc01c070 con 0x7f97e40834a0 2026-03-10T08:54:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.582+0000 7f97e9013700 1 -- 192.168.123.105:0/2103523336 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97e412eaf0 con 0x7f97e40834a0 2026-03-10T08:54:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.582+0000 7f97e9013700 1 -- 192.168.123.105:0/2103523336 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97e412efe0 con 0x7f97e40834a0 2026-03-10T08:54:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.582+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f97dc00fb40 con 0x7f97e40834a0 2026-03-10T08:54:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.582+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f97dc017bf0 con 0x7f97e40834a0 2026-03-10T08:54:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.584+0000 7f97e9013700 1 -- 192.168.123.105:0/2103523336 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f97d0005320 con 0x7f97e40834a0 2026-03-10T08:54:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.584+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f97dc017d50 con 0x7f97e40834a0 2026-03-10T08:54:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.584+0000 7f97e17fa700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97cc06c600 0x7f97cc06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.584+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f97dc013070 con 0x7f97e40834a0 2026-03-10T08:54:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.585+0000 7f97e3fff700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97cc06c600 0x7f97cc06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.585+0000 7f97e3fff700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97cc06c600 0x7f97cc06eac0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f97d4009c80 tx=0x7f97d4009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:13.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.588+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f97dc05b4d0 con 
0x7f97e40834a0 2026-03-10T08:54:13.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:13 vm05.local ceph-mon[49713]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm08 2026-03-10T08:54:13.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:13 vm05.local ceph-mon[49713]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:13.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:13 vm05.local ceph-mon[49713]: from='client.24395 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:13.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:13 vm05.local ceph-mon[49713]: from='client.24399 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "0/23 daemons upgraded", 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:54:13.826 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.823+0000 7f97e9013700 1 -- 192.168.123.105:0/2103523336 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f97d0000bf0 con 0x7f97cc06c600 2026-03-10T08:54:13.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.825+0000 7f97e17fa700 1 -- 192.168.123.105:0/2103523336 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f97d0000bf0 con 0x7f97cc06c600 2026-03-10T08:54:13.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.831+0000 7f97caffd700 1 -- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97cc06c600 msgr2=0x7f97cc06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.831+0000 7f97caffd700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97cc06c600 0x7f97cc06eac0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f97d4009c80 tx=0x7f97d4009400 comp rx=0 tx=0).stop 2026-03-10T08:54:13.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.831+0000 7f97caffd700 1 -- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 msgr2=0x7f97e4083920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.831+0000 7f97caffd700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 0x7f97e4083920 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f97dc003c60 tx=0x7f97dc003c90 comp rx=0 tx=0).stop 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.832+0000 7f97caffd700 1 -- 
192.168.123.105:0/2103523336 shutdown_connections 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.832+0000 7f97caffd700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f97cc06c600 0x7f97cc06eac0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.832+0000 7f97caffd700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97e4072b20 0x7f97e4082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.832+0000 7f97caffd700 1 --2- 192.168.123.105:0/2103523336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97e40834a0 0x7f97e4083920 unknown :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.832+0000 7f97caffd700 1 -- 192.168.123.105:0/2103523336 >> 192.168.123.105:0/2103523336 conn(0x7f97e406daa0 msgr2=0x7f97e406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.833+0000 7f97caffd700 1 -- 192.168.123.105:0/2103523336 shutdown_connections 2026-03-10T08:54:13.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.833+0000 7f97caffd700 1 -- 192.168.123.105:0/2103523336 wait complete. 
2026-03-10T08:54:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.957+0000 7ff2fe162700 1 -- 192.168.123.105:0/3042909455 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8075a10 msgr2=0x7ff2f8077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.957+0000 7ff2fe162700 1 --2- 192.168.123.105:0/3042909455 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8075a10 0x7ff2f8077ea0 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7ff2f4008790 tx=0x7ff2f400ae50 comp rx=0 tx=0).stop 2026-03-10T08:54:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 -- 192.168.123.105:0/3042909455 shutdown_connections 2026-03-10T08:54:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 --2- 192.168.123.105:0/3042909455 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8075a10 0x7ff2f8077ea0 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 --2- 192.168.123.105:0/3042909455 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8072b20 0x7ff2f8072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 -- 192.168.123.105:0/3042909455 >> 192.168.123.105:0/3042909455 conn(0x7ff2f806daa0 msgr2=0x7ff2f806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 -- 192.168.123.105:0/3042909455 shutdown_connections 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 -- 192.168.123.105:0/3042909455 
wait complete. 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 Processor -- start 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.958+0000 7ff2fe162700 1 -- start start 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fe162700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8072b20 0x7ff2f8082e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fe162700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8083370 0x7ff2f80837f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fe162700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2f812e5d0 con 0x7ff2f8072b20 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fe162700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2f812e740 con 0x7ff2f8083370 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fd160700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8072b20 0x7ff2f8082e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fd160700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8072b20 0x7ff2f8082e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:49624/0 (socket says 192.168.123.105:49624) 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fd160700 1 -- 192.168.123.105:0/141525207 learned_addr learned my addr 192.168.123.105:0/141525207 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fc95f700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8083370 0x7ff2f80837f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fc95f700 1 -- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8072b20 msgr2=0x7ff2f8082e30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fc95f700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8072b20 0x7ff2f8082e30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.959+0000 7ff2fc95f700 1 -- 192.168.123.105:0/141525207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2f4008440 con 0x7ff2f8083370 2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.960+0000 7ff2fc95f700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8083370 0x7ff2f80837f0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7ff2f400b590 tx=0x7ff2f400be60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:54:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.960+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2f400a490 con 0x7ff2f8083370 2026-03-10T08:54:13.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.960+0000 7ff2fe162700 1 -- 192.168.123.105:0/141525207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2f812e9c0 con 0x7ff2f8083370 2026-03-10T08:54:13.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.960+0000 7ff2fe162700 1 -- 192.168.123.105:0/141525207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2f812ef10 con 0x7ff2f8083370 2026-03-10T08:54:13.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.962+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff2f400b920 con 0x7ff2f8083370 2026-03-10T08:54:13.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.962+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2f401cbc0 con 0x7ff2f8083370 2026-03-10T08:54:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.963+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff2f401c420 con 0x7ff2f8083370 2026-03-10T08:54:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.964+0000 7ff2ee7fc700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2e406c530 0x7ff2e406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:13.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:13 
vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/1723307427' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:54:13.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:13 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/468540228' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:54:13.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.965+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ff2f408c520 con 0x7ff2f8083370 2026-03-10T08:54:13.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.963+0000 7ff2fe162700 1 -- 192.168.123.105:0/141525207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff2dc005320 con 0x7ff2f8083370 2026-03-10T08:54:13.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.965+0000 7ff2fd160700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2e406c530 0x7ff2e406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:13.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.968+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff2f405a7d0 con 0x7ff2f8083370 2026-03-10T08:54:13.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:13.969+0000 7ff2fd160700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2e406c530 0x7ff2e406e9f0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7ff2f00095a0 tx=0x7ff2f000bd20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:14.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:13 vm08.local ceph-mon[57559]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm08 2026-03-10T08:54:14.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:13 vm08.local ceph-mon[57559]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:14.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:13 vm08.local ceph-mon[57559]: from='client.24395 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:14.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:13 vm08.local ceph-mon[57559]: from='client.24399 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:14.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:13 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/1723307427' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:54:14.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:13 vm08.local ceph-mon[57559]: from='client.? 
192.168.123.105:0/468540228' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:54:14.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.259+0000 7ff2fe162700 1 -- 192.168.123.105:0/141525207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff2dc005190 con 0x7ff2f8083370 2026-03-10T08:54:14.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.266+0000 7ff2ee7fc700 1 -- 192.168.123.105:0/141525207 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7ff2f402a090 con 0x7ff2f8083370 2026-03-10T08:54:14.267 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.275+0000 7ff2e3fff700 1 -- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2e406c530 msgr2=0x7ff2e406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.275+0000 7ff2e3fff700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2e406c530 0x7ff2e406e9f0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7ff2f00095a0 tx=0x7ff2f000bd20 comp rx=0 tx=0).stop 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.275+0000 7ff2e3fff700 1 -- 192.168.123.105:0/141525207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8083370 msgr2=0x7ff2f80837f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.275+0000 7ff2e3fff700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8083370 0x7ff2f80837f0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto 
rx=0x7ff2f400b590 tx=0x7ff2f400be60 comp rx=0 tx=0).stop 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 -- 192.168.123.105:0/141525207 shutdown_connections 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff2e406c530 0x7ff2e406e9f0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2f8072b20 0x7ff2f8082e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 --2- 192.168.123.105:0/141525207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2f8083370 0x7ff2f80837f0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:14.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 -- 192.168.123.105:0/141525207 >> 192.168.123.105:0/141525207 conn(0x7ff2f806daa0 msgr2=0x7ff2f806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:14.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 -- 192.168.123.105:0/141525207 shutdown_connections 2026-03-10T08:54:14.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:14.277+0000 7ff2e3fff700 1 -- 192.168.123.105:0/141525207 wait complete. 
2026-03-10T08:54:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:15 vm05.local ceph-mon[49713]: pgmap v116: 65 pgs: 65 active+clean; 13 MiB data, 194 MiB used, 120 GiB / 120 GiB avail; 1.1 MiB/s wr, 60 op/s 2026-03-10T08:54:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:15 vm05.local ceph-mon[49713]: from='client.14618 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:15 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/141525207' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:54:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:15 vm08.local ceph-mon[57559]: pgmap v116: 65 pgs: 65 active+clean; 13 MiB data, 194 MiB used, 120 GiB / 120 GiB avail; 1.1 MiB/s wr, 60 op/s 2026-03-10T08:54:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:15 vm08.local ceph-mon[57559]: from='client.14618 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:15 vm08.local ceph-mon[57559]: from='client.? 
192.168.123.105:0/141525207' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:54:16.184 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:16 vm08.local ceph-mon[57559]: pgmap v117: 65 pgs: 65 active+clean; 17 MiB data, 225 MiB used, 120 GiB / 120 GiB avail; 1.4 MiB/s wr, 103 op/s 2026-03-10T08:54:16.184 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:16 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:16 vm05.local ceph-mon[49713]: pgmap v117: 65 pgs: 65 active+clean; 17 MiB data, 225 MiB used, 120 GiB / 120 GiB avail; 1.4 MiB/s wr, 103 op/s 2026-03-10T08:54:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:16 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:18.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:18 vm05.local ceph-mon[49713]: pgmap v118: 65 pgs: 65 active+clean; 23 MiB data, 265 MiB used, 120 GiB / 120 GiB avail; 2.0 MiB/s wr, 197 op/s 2026-03-10T08:54:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:18 vm08.local ceph-mon[57559]: pgmap v118: 65 pgs: 65 active+clean; 23 MiB data, 265 MiB used, 120 GiB / 120 GiB avail; 2.0 MiB/s wr, 197 op/s 2026-03-10T08:54:20.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:20 vm05.local ceph-mon[49713]: pgmap v119: 65 pgs: 65 active+clean; 29 MiB data, 277 MiB used, 120 GiB / 120 GiB avail; 2.6 MiB/s wr, 229 op/s 2026-03-10T08:54:20.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:20 vm08.local ceph-mon[57559]: pgmap v119: 65 pgs: 65 active+clean; 29 MiB data, 277 MiB used, 120 GiB / 120 GiB avail; 2.6 MiB/s wr, 229 op/s 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:Note: switching to 
'75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: git switch -c <new-branch-name> 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:Or undo this operation with: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: git switch - 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T08:54:20.991 INFO:tasks.workunit.client.0.vm05.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T08:54:20.997 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-10T08:54:21.063 
INFO:tasks.workunit.client.0.vm05.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T08:54:21.065 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T08:54:21.065 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T08:54:21.148 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T08:54:21.190 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T08:54:21.229 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T08:54:21.231 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T08:54:21.231 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T08:54:21.273 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T08:54:21.279 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T08:54:21.279 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-10T08:54:21.342 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-10T08:54:21.343 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
2026-03-10T08:54:21.343 DEBUG:teuthology.orchestra.run.vm05:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-10T08:54:21.420 INFO:tasks.workunit.client.0.vm05.stderr:+ mkdir -p fsstress 2026-03-10T08:54:21.423 INFO:tasks.workunit.client.0.vm05.stderr:+ pushd fsstress 2026-03-10T08:54:21.424 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T08:54:21.424 INFO:tasks.workunit.client.0.vm05.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T08:54:23.068 INFO:tasks.workunit.client.0.vm05.stderr:+ tar xzf ltp-full.tgz 2026-03-10T08:54:23.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:22 vm05.local ceph-mon[49713]: pgmap v120: 65 pgs: 65 active+clean; 33 MiB data, 299 MiB used, 120 GiB / 120 GiB avail; 2.6 MiB/s wr, 273 op/s 2026-03-10T08:54:23.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:22 vm08.local ceph-mon[57559]: pgmap v120: 65 pgs: 65 active+clean; 33 MiB data, 299 MiB used, 120 GiB / 120 GiB avail; 2.6 MiB/s wr, 273 op/s 2026-03-10T08:54:24.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:24 vm05.local ceph-mon[49713]: pgmap v121: 65 pgs: 65 active+clean; 41 MiB data, 332 MiB used, 120 GiB / 120 GiB avail; 2.8 MiB/s wr, 276 op/s 2026-03-10T08:54:24.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:24 vm08.local ceph-mon[57559]: pgmap v121: 65 pgs: 65 active+clean; 41 MiB 
data, 332 MiB used, 120 GiB / 120 GiB avail; 2.8 MiB/s wr, 276 op/s 2026-03-10T08:54:26.934 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:26 vm05.local ceph-mon[49713]: pgmap v122: 65 pgs: 65 active+clean; 45 MiB data, 372 MiB used, 120 GiB / 120 GiB avail; 2.8 MiB/s wr, 289 op/s 2026-03-10T08:54:27.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:26 vm08.local ceph-mon[57559]: pgmap v122: 65 pgs: 65 active+clean; 45 MiB data, 372 MiB used, 120 GiB / 120 GiB avail; 2.8 MiB/s wr, 289 op/s 2026-03-10T08:54:28.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:28 vm05.local ceph-mon[49713]: pgmap v123: 65 pgs: 65 active+clean; 68 MiB data, 564 MiB used, 119 GiB / 120 GiB avail; 4.4 MiB/s wr, 392 op/s 2026-03-10T08:54:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:28 vm08.local ceph-mon[57559]: pgmap v123: 65 pgs: 65 active+clean; 68 MiB data, 564 MiB used, 119 GiB / 120 GiB avail; 4.4 MiB/s wr, 392 op/s 2026-03-10T08:54:30.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:30 vm05.local ceph-mon[49713]: pgmap v124: 65 pgs: 65 active+clean; 74 MiB data, 650 MiB used, 119 GiB / 120 GiB avail; 4.4 MiB/s wr, 359 op/s 2026-03-10T08:54:30.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:30 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:30 vm08.local ceph-mon[57559]: pgmap v124: 65 pgs: 65 active+clean; 74 MiB data, 650 MiB used, 119 GiB / 120 GiB avail; 4.4 MiB/s wr, 359 op/s 2026-03-10T08:54:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:30 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:33.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:32 vm05.local ceph-mon[49713]: pgmap v125: 65 
pgs: 65 active+clean; 76 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 4.1 MiB/s wr, 402 op/s 2026-03-10T08:54:33.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:32 vm08.local ceph-mon[57559]: pgmap v125: 65 pgs: 65 active+clean; 76 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 4.1 MiB/s wr, 402 op/s 2026-03-10T08:54:35.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:34 vm08.local ceph-mon[57559]: pgmap v126: 65 pgs: 65 active+clean; 82 MiB data, 733 MiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 411 op/s 2026-03-10T08:54:35.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:34 vm05.local ceph-mon[49713]: pgmap v126: 65 pgs: 65 active+clean; 82 MiB data, 733 MiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 411 op/s 2026-03-10T08:54:36.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:36 vm08.local ceph-mon[57559]: pgmap v127: 65 pgs: 65 active+clean; 84 MiB data, 752 MiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 423 op/s 2026-03-10T08:54:36.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:36 vm05.local ceph-mon[49713]: pgmap v127: 65 pgs: 65 active+clean; 84 MiB data, 752 MiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 423 op/s 2026-03-10T08:54:38.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:38 vm05.local ceph-mon[49713]: pgmap v128: 65 pgs: 65 active+clean; 103 MiB data, 858 MiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s wr, 526 op/s 2026-03-10T08:54:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:38 vm08.local ceph-mon[57559]: pgmap v128: 65 pgs: 65 active+clean; 103 MiB data, 858 MiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s wr, 526 op/s 2026-03-10T08:54:40.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:39 vm05.local ceph-mon[49713]: pgmap v129: 65 pgs: 65 active+clean; 109 MiB data, 913 MiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 434 op/s 2026-03-10T08:54:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:39 vm08.local ceph-mon[57559]: pgmap v129: 
65 pgs: 65 active+clean; 109 MiB data, 913 MiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 434 op/s 2026-03-10T08:54:42.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:42 vm05.local ceph-mon[49713]: pgmap v130: 65 pgs: 65 active+clean; 123 MiB data, 993 MiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 458 op/s 2026-03-10T08:54:42.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:42 vm08.local ceph-mon[57559]: pgmap v130: 65 pgs: 65 active+clean; 123 MiB data, 993 MiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 458 op/s 2026-03-10T08:54:44.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.522+0000 7f2e38b6f700 1 -- 192.168.123.105:0/1151550234 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 msgr2=0x7f2e2c0a50b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.523 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:44 vm05.local ceph-mon[49713]: pgmap v131: 65 pgs: 65 active+clean; 129 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 4.6 MiB/s wr, 439 op/s 2026-03-10T08:54:44.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.522+0000 7f2e38b6f700 1 --2- 192.168.123.105:0/1151550234 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 0x7f2e2c0a50b0 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7f2e28009b10 tx=0x7f2e28009e20 comp rx=0 tx=0).stop 2026-03-10T08:54:44.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.523+0000 7f2e38b6f700 1 -- 192.168.123.105:0/1151550234 shutdown_connections 2026-03-10T08:54:44.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.523+0000 7f2e38b6f700 1 --2- 192.168.123.105:0/1151550234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0a6250 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.523+0000 7f2e38b6f700 1 
--2- 192.168.123.105:0/1151550234 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 0x7f2e2c0a50b0 unknown :-1 s=CLOSED pgs=307 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.523+0000 7f2e38b6f700 1 -- 192.168.123.105:0/1151550234 >> 192.168.123.105:0/1151550234 conn(0x7f2e2c0a0150 msgr2=0x7f2e2c0a25b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:44.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.523+0000 7f2e38b6f700 1 -- 192.168.123.105:0/1151550234 shutdown_connections 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.523+0000 7f2e38b6f700 1 -- 192.168.123.105:0/1151550234 wait complete. 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.524+0000 7f2e38b6f700 1 Processor -- start 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.524+0000 7f2e38b6f700 1 -- start start 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.525+0000 7f2e38b6f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 0x7f2e2c0b3e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.525+0000 7f2e38b6f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0b4370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.525+0000 7f2e38b6f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e2c0b4990 con 0x7f2e2c0a4c90 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.525+0000 7f2e38b6f700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e2c150a10 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0b4370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0b4370 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:38628/0 (socket says 192.168.123.105:38628) 2026-03-10T08:54:44.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 -- 192.168.123.105:0/587717666 learned_addr learned my addr 192.168.123.105:0/587717666 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:44.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e337fe700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 0x7f2e2c0b3e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:44.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 -- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 msgr2=0x7f2e2c0b3e30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f2e2c0a4c90 0x7f2e2c0b3e30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 -- 192.168.123.105:0/587717666 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2e28009770 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.526+0000 7f2e32ffd700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0b4370 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f2e2000d8d0 tx=0x7f2e2000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:44.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.527+0000 7f2e30ff9700 1 -- 192.168.123.105:0/587717666 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e20009940 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.527+0000 7f2e38b6f700 1 -- 192.168.123.105:0/587717666 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2e2c150c10 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.527+0000 7f2e38b6f700 1 -- 192.168.123.105:0/587717666 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2e2c151110 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.528+0000 7f2e30ff9700 1 -- 192.168.123.105:0/587717666 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2e20010460 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.528+0000 7f2e30ff9700 1 -- 
192.168.123.105:0/587717666 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e2000f5d0 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.528+0000 7f2e38b6f700 1 -- 192.168.123.105:0/587717666 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2e18005320 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.529+0000 7f2e30ff9700 1 -- 192.168.123.105:0/587717666 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2e20009aa0 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.529+0000 7f2e30ff9700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2e2406c600 0x7f2e2406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:44.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.530+0000 7f2e30ff9700 1 -- 192.168.123.105:0/587717666 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f2e2008b610 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.530+0000 7f2e337fe700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2e2406c600 0x7f2e2406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:44.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.530+0000 7f2e337fe700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2e2406c600 0x7f2e2406eac0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f2e2800b5c0 
tx=0x7f2e28005ba0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:44.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.532+0000 7f2e30ff9700 1 -- 192.168.123.105:0/587717666 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2e200596b0 con 0x7f2e2c0a5dd0 2026-03-10T08:54:44.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:44 vm08.local ceph-mon[57559]: pgmap v131: 65 pgs: 65 active+clean; 129 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 4.6 MiB/s wr, 439 op/s 2026-03-10T08:54:44.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.699+0000 7f2e38b6f700 1 -- 192.168.123.105:0/587717666 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2e18000bf0 con 0x7f2e2406c600 2026-03-10T08:54:44.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.700+0000 7f2e30ff9700 1 -- 192.168.123.105:0/587717666 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f2e18000bf0 con 0x7f2e2406c600 2026-03-10T08:54:44.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.704+0000 7f2e1e7fc700 1 -- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2e2406c600 msgr2=0x7f2e2406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.704+0000 7f2e1e7fc700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2e2406c600 0x7f2e2406eac0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f2e2800b5c0 tx=0x7f2e28005ba0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.704+0000 7f2e1e7fc700 1 
-- 192.168.123.105:0/587717666 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 msgr2=0x7f2e2c0b4370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.704 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.704+0000 7f2e1e7fc700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0b4370 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f2e2000d8d0 tx=0x7f2e2000dc90 comp rx=0 tx=0).stop 2026-03-10T08:54:44.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.705+0000 7f2e1e7fc700 1 -- 192.168.123.105:0/587717666 shutdown_connections 2026-03-10T08:54:44.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.705+0000 7f2e1e7fc700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2e2406c600 0x7f2e2406eac0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.705+0000 7f2e1e7fc700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e2c0a4c90 0x7f2e2c0b3e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.705+0000 7f2e1e7fc700 1 --2- 192.168.123.105:0/587717666 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e2c0a5dd0 0x7f2e2c0b4370 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.705+0000 7f2e1e7fc700 1 -- 192.168.123.105:0/587717666 >> 192.168.123.105:0/587717666 conn(0x7f2e2c0a0150 msgr2=0x7f2e2c0a9000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:44.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.706+0000 
7f2e1e7fc700 1 -- 192.168.123.105:0/587717666 shutdown_connections 2026-03-10T08:54:44.706 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.706+0000 7f2e1e7fc700 1 -- 192.168.123.105:0/587717666 wait complete. 2026-03-10T08:54:44.718 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.807+0000 7fd2c28ba700 1 -- 192.168.123.105:0/456220543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc10a700 msgr2=0x7fd2bc10cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.808+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/456220543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc10a700 0x7fd2bc10cb90 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7fd2b400b3a0 tx=0x7fd2b400b6b0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.808+0000 7fd2c28ba700 1 -- 192.168.123.105:0/456220543 shutdown_connections 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.808+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/456220543 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc10a700 0x7fd2bc10cb90 unknown :-1 s=CLOSED pgs=308 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.808+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/456220543 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc107d90 0x7fd2bc10a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.808+0000 7fd2c28ba700 1 -- 192.168.123.105:0/456220543 >> 192.168.123.105:0/456220543 conn(0x7fd2bc06dae0 msgr2=0x7fd2bc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.809+0000 7fd2c28ba700 1 -- 192.168.123.105:0/456220543 shutdown_connections 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.809+0000 7fd2c28ba700 1 -- 192.168.123.105:0/456220543 wait complete. 2026-03-10T08:54:44.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.809+0000 7fd2c28ba700 1 Processor -- start 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.809+0000 7fd2c28ba700 1 -- start start 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2c28ba700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc107d90 0x7fd2bc116da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2c28ba700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc10a700 0x7fd2bc1172e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2c28ba700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd2bc117900 con 0x7fd2bc107d90 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2c28ba700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd2bc1b3440 con 0x7fd2bc10a700 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2bb7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc10a700 0x7fd2bc1172e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:44.810 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2bbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc107d90 0x7fd2bc116da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:44.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2bb7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc10a700 0x7fd2bc1172e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:38656/0 (socket says 192.168.123.105:38656) 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.810+0000 7fd2bb7fe700 1 -- 192.168.123.105:0/1781240596 learned_addr learned my addr 192.168.123.105:0/1781240596 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.811+0000 7fd2bbfff700 1 -- 192.168.123.105:0/1781240596 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc10a700 msgr2=0x7fd2bc1172e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.811+0000 7fd2bbfff700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc10a700 0x7fd2bc1172e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.811+0000 7fd2bbfff700 1 -- 192.168.123.105:0/1781240596 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd2b400b050 con 0x7fd2bc107d90 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.811+0000 7fd2bbfff700 1 --2- 
192.168.123.105:0/1781240596 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc107d90 0x7fd2bc116da0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7fd2ac00ca50 tx=0x7fd2ac00cd60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.811+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd2ac0049e0 con 0x7fd2bc107d90 2026-03-10T08:54:44.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.811+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd2bc1b3640 con 0x7fd2bc107d90 2026-03-10T08:54:44.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.812+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd2bc1b3b40 con 0x7fd2bc107d90 2026-03-10T08:54:44.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.813+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd2bc110c60 con 0x7fd2bc107d90 2026-03-10T08:54:44.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.817+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd2ac005500 con 0x7fd2bc107d90 2026-03-10T08:54:44.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.817+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd2ac007d10 con 0x7fd2bc107d90 2026-03-10T08:54:44.820 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.819+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd2ac022460 con 0x7fd2bc107d90 2026-03-10T08:54:44.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.819+0000 7fd2b97fa700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd2a406c530 0x7fd2a406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:44.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.820+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fd2ac08c6d0 con 0x7fd2bc107d90 2026-03-10T08:54:44.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.820+0000 7fd2bb7fe700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd2a406c530 0x7fd2a406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:44.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.822+0000 7fd2bb7fe700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd2a406c530 0x7fd2a406e9f0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fd2b400c010 tx=0x7fd2b400bab0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:44.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.823+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd2ac056f70 con 0x7fd2bc107d90 2026-03-10T08:54:44.987 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.987+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd2bc0611d0 con 0x7fd2a406c530 2026-03-10T08:54:44.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.988+0000 7fd2b97fa700 1 -- 192.168.123.105:0/1781240596 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fd2bc0611d0 con 0x7fd2a406c530 2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.990+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd2a406c530 msgr2=0x7fd2a406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd2a406c530 0x7fd2a406e9f0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fd2b400c010 tx=0x7fd2b400bab0 comp rx=0 tx=0).stop 2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc107d90 msgr2=0x7fd2bc116da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc107d90 0x7fd2bc116da0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7fd2ac00ca50 tx=0x7fd2ac00cd60 comp rx=0 tx=0).stop 2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 -- 
192.168.123.105:0/1781240596 shutdown_connections
2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd2a406c530 0x7fd2a406e9f0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2bc107d90 0x7fd2bc116da0 unknown :-1 s=CLOSED pgs=309 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 --2- 192.168.123.105:0/1781240596 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2bc10a700 0x7fd2bc1172e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:44.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 >> 192.168.123.105:0/1781240596 conn(0x7fd2bc06dae0 msgr2=0x7fd2bc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:54:44.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 shutdown_connections
2026-03-10T08:54:44.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:44.991+0000 7fd2c28ba700 1 -- 192.168.123.105:0/1781240596 wait complete.
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.076+0000 7f8577b98700 1 -- 192.168.123.105:0/3343860686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570072b50 msgr2=0x7f8570072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.076+0000 7f8577b98700 1 --2- 192.168.123.105:0/3343860686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570072b50 0x7f8570072f70 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7f856c005fd0 tx=0x7f856c0088d0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- 192.168.123.105:0/3343860686 shutdown_connections
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 --2- 192.168.123.105:0/3343860686 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570075a40 0x7f8570077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 --2- 192.168.123.105:0/3343860686 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570072b50 0x7f8570072f70 unknown :-1 s=CLOSED pgs=310 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- 192.168.123.105:0/3343860686 >> 192.168.123.105:0/3343860686 conn(0x7f857006dae0 msgr2=0x7f857006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- 192.168.123.105:0/3343860686 shutdown_connections
2026-03-10T08:54:45.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- 192.168.123.105:0/3343860686
wait complete.
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 Processor -- start
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- start start
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570075a40 0x7f8570080fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 0x7f857012e380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8570081a50 con 0x7f8570075a40
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8577b98700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8570081bc0 con 0x7f8570081510
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8575133700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 0x7f857012e380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8575133700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 0x7f857012e380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0
says I am v2:192.168.123.105:38666/0 (socket says 192.168.123.105:38666)
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.077+0000 7f8575133700 1 -- 192.168.123.105:0/3096371474 learned_addr learned my addr 192.168.123.105:0/3096371474 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.078+0000 7f8575934700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570075a40 0x7f8570080fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.078+0000 7f8575133700 1 -- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570075a40 msgr2=0x7f8570080fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.078+0000 7f8575133700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570075a40 0x7f8570080fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.078+0000 7f8575133700 1 -- 192.168.123.105:0/3096371474 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f856c0082d0 con 0x7f8570081510
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.078+0000 7f8575934700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570075a40 0x7f8570080fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.078+0000 7f8575133700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 0x7f857012e380 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f856800e580 tx=0x7f856800e890 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.079+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8568011600 con 0x7f8570081510
2026-03-10T08:54:45.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.079+0000 7f8577b98700 1 -- 192.168.123.105:0/3096371474 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f857012e9e0 con 0x7f8570081510
2026-03-10T08:54:45.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.079+0000 7f8577b98700 1 -- 192.168.123.105:0/3096371474 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f857012eee0 con 0x7f8570081510
2026-03-10T08:54:45.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.079+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8568011c40 con 0x7f8570081510
2026-03-10T08:54:45.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.079+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8568010df0 con 0x7f8570081510
2026-03-10T08:54:45.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.080+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8568018360 con
0x7f8570081510
2026-03-10T08:54:45.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.081+0000 7f8566ffd700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f855c06c600 0x7f855c06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:54:45.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.081+0000 7f8575934700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f855c06c600 0x7f855c06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:54:45.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.081+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f856808ccd0 con 0x7f8570081510
2026-03-10T08:54:45.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.082+0000 7f8577b98700 1 -- 192.168.123.105:0/3096371474 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8554005320 con 0x7f8570081510
2026-03-10T08:54:45.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.082+0000 7f8575934700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f855c06c600 0x7f855c06eac0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f856c005f00 tx=0x7f856c005040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:54:45.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.088+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8568057570
con 0x7f8570081510
2026-03-10T08:54:45.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.236+0000 7f8577b98700 1 -- 192.168.123.105:0/3096371474 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8554000bf0 con 0x7f855c06c600
2026-03-10T08:54:45.244 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T08:54:45.244 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (3m) 105s ago 3m 21.4M - 0.25.0 c8568f914cd2 3be9db7ff6a0
2026-03-10T08:54:45.244 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (4m) 105s ago 4m 8032k - 18.2.1 5be31c24972a 4f6a4fa3151a
2026-03-10T08:54:45.244 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (3m) 106s ago 3m 8308k - 18.2.1 5be31c24972a 17a1d108c2c1
2026-03-10T08:54:45.244 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 105s ago 3m 7407k - 18.2.1 5be31c24972a f9c585addcea
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 106s ago 3m 7415k - 18.2.1 5be31c24972a f0b88fc7f552
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (3m) 105s ago 3m 80.8M - 9.4.7 954c08fa6188 91937b4745e7
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (111s) 105s ago 111s 16.7M - 18.2.1 5be31c24972a 601862397d9b
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (110s) 105s ago 110s 13.7M - 18.2.1 5be31c24972a 550cdd24738b
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (109s) 106s ago 109s 16.1M - 18.2.1 5be31c24972a b9e55b365719
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08
running (111s) 106s ago 110s 11.0M - 18.2.1 5be31c24972a d58d1e2e6ff3
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (4m) 105s ago 4m 501M - 18.2.1 5be31c24972a 6ec0cdb38171
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (3m) 106s ago 3m 449M - 18.2.1 5be31c24972a 9cd801f2f7a7
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 105s ago 4m 50.0M 2048M 18.2.1 5be31c24972a 4cb0e74c8584
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 106s ago 3m 47.9M 2048M 18.2.1 5be31c24972a bca448418226
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 105s ago 3m 12.3M - 1.5.0 0da6a335fe13 3b3db2a6030c
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 106s ago 3m 12.7M - 1.5.0 0da6a335fe13 f55df0cefc61
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 105s ago 3m 48.5M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 105s ago 2m 46.9M 4096M 18.2.1 5be31c24972a 902f9ea11f1a
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 105s ago 2m 48.1M 4096M 18.2.1 5be31c24972a 32c0be0f86f2
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 106s ago 2m 44.3M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (2m) 106s ago 2m 43.5M 4096M 18.2.1 5be31c24972a 155c1482d81c
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (2m) 106s ago 2m 45.8M 4096M 18.2.1 5be31c24972a 21583bb58d82
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05
vm05 *:9095 running (3m) 105s ago 3m 36.9M - 2.43.0 a07b618ecd1d e84b76e5c1c0
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.242+0000 7f8566ffd700 1 -- 192.168.123.105:0/3096371474 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f8554000bf0 con 0x7f855c06c600
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 -- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f855c06c600 msgr2=0x7f855c06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f855c06c600 0x7f855c06eac0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f856c005f00 tx=0x7f856c005040 comp rx=0 tx=0).stop
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 -- 192.168.123.105:0/3096371474 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 msgr2=0x7f857012e380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 0x7f857012e380 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f856800e580 tx=0x7f856800e890 comp rx=0 tx=0).stop
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 -- 192.168.123.105:0/3096371474 shutdown_connections
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 --2- 192.168.123.105:0/3096371474 >>
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f855c06c600 0x7f855c06eac0 secure :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f856c005f00 tx=0x7f856c005040 comp rx=0 tx=0).stop
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8570075a40 0x7f8570080fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 --2- 192.168.123.105:0/3096371474 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8570081510 0x7f857012e380 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 -- 192.168.123.105:0/3096371474 >> 192.168.123.105:0/3096371474 conn(0x7f857006dae0 msgr2=0x7f857006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 -- 192.168.123.105:0/3096371474 shutdown_connections
2026-03-10T08:54:45.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.245+0000 7f8564ff9700 1 -- 192.168.123.105:0/3096371474 wait complete.
2026-03-10T08:54:45.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.338+0000 7f6cc39b6700 1 -- 192.168.123.105:0/308371270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc10a700 msgr2=0x7f6cbc10cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.338+0000 7f6cc39b6700 1 --2- 192.168.123.105:0/308371270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc10a700 0x7f6cbc10cb90 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f6cb400b3a0 tx=0x7f6cb400b6b0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.339 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:45 vm05.local ceph-mon[49713]: from='client.24413 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T08:54:45.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.338+0000 7f6cc39b6700 1 -- 192.168.123.105:0/308371270 shutdown_connections
2026-03-10T08:54:45.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.338+0000 7f6cc39b6700 1 --2- 192.168.123.105:0/308371270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc10a700 0x7f6cbc10cb90 unknown :-1 s=CLOSED pgs=311 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.338+0000 7f6cc39b6700 1 --2- 192.168.123.105:0/308371270 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cbc107d90 0x7f6cbc10a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.338+0000 7f6cc39b6700 1 -- 192.168.123.105:0/308371270 >> 192.168.123.105:0/308371270 conn(0x7f6cbc06dae0 msgr2=0x7f6cbc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:54:45.341
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.341+0000 7f6cc39b6700 1 -- 192.168.123.105:0/308371270 shutdown_connections
2026-03-10T08:54:45.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.341+0000 7f6cc39b6700 1 -- 192.168.123.105:0/308371270 wait complete.
2026-03-10T08:54:45.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc39b6700 1 Processor -- start
2026-03-10T08:54:45.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc39b6700 1 -- start start
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc39b6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 0x7f6cbc1a5440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc39b6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cbc1a5980 0x7f6cbc1aa9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc39b6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6cbc1a5e90 con 0x7f6cbc107d90
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc39b6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6cbc1a6000 con 0x7f6cbc1a5980
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.342+0000 7f6cc1752700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 0x7f6cbc1a5440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:54:45.343
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000 7f6cc1752700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 0x7f6cbc1a5440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39906/0 (socket says 192.168.123.105:39906)
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000 7f6cc1752700 1 -- 192.168.123.105:0/2674834928 learned_addr learned my addr 192.168.123.105:0/2674834928 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000 7f6cc1752700 1 -- 192.168.123.105:0/2674834928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cbc1a5980 msgr2=0x7f6cbc1aa9f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000 7f6cc0f51700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cbc1a5980 0x7f6cbc1aa9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000 7f6cc1752700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cbc1a5980 0x7f6cbc1aa9f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000 7f6cc1752700 1 -- 192.168.123.105:0/2674834928 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6cb400b050 con 0x7f6cbc107d90
2026-03-10T08:54:45.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.343+0000
7f6cc1752700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 0x7f6cbc1a5440 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7f6cb800b770 tx=0x7f6cb800bb30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:54:45.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.344+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6cb800f820 con 0x7f6cbc107d90
2026-03-10T08:54:45.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.344+0000 7f6cc39b6700 1 -- 192.168.123.105:0/2674834928 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6cbc10f520 con 0x7f6cbc107d90
2026-03-10T08:54:45.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.344+0000 7f6cc39b6700 1 -- 192.168.123.105:0/2674834928 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6cbc10fa70 con 0x7f6cbc107d90
2026-03-10T08:54:45.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.345+0000 7f6cc39b6700 1 -- 192.168.123.105:0/2674834928 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6cbc04ea90 con 0x7f6cbc107d90
2026-03-10T08:54:45.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.346+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6cb800fe60 con 0x7f6cbc107d90
2026-03-10T08:54:45.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.349+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6cb800d610 con 0x7f6cbc107d90
2026-03-10T08:54:45.349
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.349+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f6cb800d830 con 0x7f6cbc107d90
2026-03-10T08:54:45.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.349+0000 7f6cb27fc700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6ca806c6d0 0x7f6ca806eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:54:45.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.349+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f6cb808c7d0 con 0x7f6cbc107d90
2026-03-10T08:54:45.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.350+0000 7f6cc0f51700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6ca806c6d0 0x7f6ca806eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:54:45.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.351+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f6cb805ab00 con 0x7f6cbc107d90
2026-03-10T08:54:45.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.356+0000 7f6cc0f51700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6ca806c6d0 0x7f6ca806eb90 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f6cb400c010 tx=0x7f6cb40090d0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:54:45.547
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.547+0000 7f6cc39b6700 1 -- 192.168.123.105:0/2674834928 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f6cbc10fd50 con 0x7f6cbc107d90
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.547+0000 7f6cb27fc700 1 -- 192.168.123.105:0/2674834928 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f6cb805a690 con 0x7f6cbc107d90
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    "mon": {
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    "mgr": {
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    "osd": {
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    "mds": {
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    "overall": {
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1
(7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:    }
2026-03-10T08:54:45.548 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T08:54:45.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.550+0000 7f6ca7fff700 1 -- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6ca806c6d0 msgr2=0x7f6ca806eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.550+0000 7f6ca7fff700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6ca806c6d0 0x7f6ca806eb90 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f6cb400c010 tx=0x7f6cb40090d0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.550+0000 7f6ca7fff700 1 -- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 msgr2=0x7f6cbc1a5440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:54:45.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.551+0000 7f6ca7fff700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 0x7f6cbc1a5440 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7f6cb800b770 tx=0x7f6cb800bb30 comp rx=0 tx=0).stop
2026-03-10T08:54:45.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.552+0000 7f6ca7fff700 1 -- 192.168.123.105:0/2674834928 shutdown_connections
2026-03-10T08:54:45.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.552+0000 7f6ca7fff700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6ca806c6d0 0x7f6ca806eb90 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:54:45.552
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.552+0000 7f6ca7fff700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6cbc107d90 0x7f6cbc1a5440 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.552+0000 7f6ca7fff700 1 --2- 192.168.123.105:0/2674834928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6cbc1a5980 0x7f6cbc1aa9f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.552+0000 7f6ca7fff700 1 -- 192.168.123.105:0/2674834928 >> 192.168.123.105:0/2674834928 conn(0x7f6cbc06dae0 msgr2=0x7f6cbc06e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:45.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:45 vm08.local ceph-mon[57559]: from='client.24413 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:45.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.553+0000 7f6ca7fff700 1 -- 192.168.123.105:0/2674834928 shutdown_connections 2026-03-10T08:54:45.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.553+0000 7f6ca7fff700 1 -- 192.168.123.105:0/2674834928 wait complete. 
2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.649+0000 7f45faadb700 1 -- 192.168.123.105:0/1564442006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4072b20 msgr2=0x7f45f4072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.649+0000 7f45faadb700 1 --2- 192.168.123.105:0/1564442006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4072b20 0x7f45f4072f40 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f45f0008790 tx=0x7f45f000ae50 comp rx=0 tx=0).stop 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.649+0000 7f45faadb700 1 -- 192.168.123.105:0/1564442006 shutdown_connections 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.649+0000 7f45faadb700 1 --2- 192.168.123.105:0/1564442006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4075a10 0x7f45f4077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.649+0000 7f45faadb700 1 --2- 192.168.123.105:0/1564442006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4072b20 0x7f45f4072f40 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.649+0000 7f45faadb700 1 -- 192.168.123.105:0/1564442006 >> 192.168.123.105:0/1564442006 conn(0x7f45f406daa0 msgr2=0x7f45f406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 -- 192.168.123.105:0/1564442006 shutdown_connections 2026-03-10T08:54:45.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 -- 192.168.123.105:0/1564442006 
wait complete. 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 Processor -- start 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 -- start start 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4075a10 0x7f45f4082e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 0x7f45f40837e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45f412e5c0 con 0x7f45f4075a10 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45faadb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45f412e700 con 0x7f45f4083360 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45f92d8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 0x7f45f40837e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45f92d8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 0x7f45f40837e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:38682/0 (socket says 192.168.123.105:38682) 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45f92d8700 1 -- 192.168.123.105:0/3195162458 learned_addr learned my addr 192.168.123.105:0/3195162458 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.650+0000 7f45f9ad9700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4075a10 0x7f45f4082e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:45.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.651+0000 7f45f92d8700 1 -- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4075a10 msgr2=0x7f45f4082e20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.651+0000 7f45f92d8700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4075a10 0x7f45f4082e20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.651+0000 7f45f92d8700 1 -- 192.168.123.105:0/3195162458 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45f0008440 con 0x7f45f4083360 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.651+0000 7f45f92d8700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 0x7f45f40837e0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f45ec0060b0 tx=0x7f45ec0076f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.651+0000 7f45eaffd700 1 -- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f45ec010040 con 0x7f45f4083360 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.652+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f45f412e990 con 0x7f45f4083360 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.652+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f45f412eee0 con 0x7f45f4083360 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.652+0000 7f45eaffd700 1 -- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f45ec009240 con 0x7f45f4083360 2026-03-10T08:54:45.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.652+0000 7f45eaffd700 1 -- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f45ec016600 con 0x7f45f4083360 2026-03-10T08:54:45.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.653+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f45d8005320 con 0x7f45f4083360 2026-03-10T08:54:45.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.653+0000 7f45eaffd700 1 -- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f45ec004ad0 con 0x7f45f4083360 2026-03-10T08:54:45.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.654+0000 
7f45eaffd700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f45e006c600 0x7f45e006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:45.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.654+0000 7f45f9ad9700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f45e006c600 0x7f45e006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:45.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.654+0000 7f45eaffd700 1 -- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f45ec08c4a0 con 0x7f45f4083360 2026-03-10T08:54:45.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.655+0000 7f45f9ad9700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f45e006c600 0x7f45e006eac0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f45f0008760 tx=0x7f45f000b360 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:45.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.656+0000 7f45eaffd700 1 -- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f45ec05a7d0 con 0x7f45f4083360 2026-03-10T08:54:45.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.787+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f45d8005cc0 con 0x7f45f4083360 2026-03-10T08:54:45.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.789+0000 7f45eaffd700 1 
-- 192.168.123.105:0/3195162458 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 0) 0x7f45ec004d80 con 0x7f45f4083360 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:54:45.791 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:54:45.792 
INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:54:45.792 
INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:45.792 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:54:45.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f45e006c600 msgr2=0x7f45e006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:45.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f45e006c600 0x7f45e006eac0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f45f0008760 tx=0x7f45f000b360 comp rx=0 tx=0).stop 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 msgr2=0x7f45f40837e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 0x7f45f40837e0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f45ec0060b0 tx=0x7f45ec0076f0 comp rx=0 tx=0).stop 
2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 shutdown_connections 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f45e006c600 0x7f45e006eac0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f45f4075a10 0x7f45f4082e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 --2- 192.168.123.105:0/3195162458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f45f4083360 0x7f45f40837e0 secure :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f45ec0060b0 tx=0x7f45ec0076f0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.796+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 >> 192.168.123.105:0/3195162458 conn(0x7f45f406daa0 msgr2=0x7f45f406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.797+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 shutdown_connections 2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.797+0000 7f45faadb700 1 -- 192.168.123.105:0/3195162458 wait complete. 
2026-03-10T08:54:45.797 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:54:45.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- 192.168.123.105:0/1847915117 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0075a40 msgr2=0x7fb0a0077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:45.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 --2- 192.168.123.105:0/1847915117 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0075a40 0x7fb0a0077ed0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fb098009230 tx=0x7fb098009260 comp rx=0 tx=0).stop 2026-03-10T08:54:45.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- 192.168.123.105:0/1847915117 shutdown_connections 2026-03-10T08:54:45.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 --2- 192.168.123.105:0/1847915117 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0075a40 0x7fb0a0077ed0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 --2- 192.168.123.105:0/1847915117 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0072b50 0x7fb0a0072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- 192.168.123.105:0/1847915117 >> 192.168.123.105:0/1847915117 conn(0x7fb0a006dae0 msgr2=0x7fb0a006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- 192.168.123.105:0/1847915117 shutdown_connections 2026-03-10T08:54:45.869 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- 192.168.123.105:0/1847915117 wait complete. 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 Processor -- start 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- start start 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0072b50 0x7fb0a0083110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 0x7fb0a012e440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0a0083b60 con 0x7fb0a0083650 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.867+0000 7fb0a8698700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0a0083cd0 con 0x7fb0a0072b50 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 0x7fb0a012e440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 0x7fb0a012e440 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39934/0 (socket says 192.168.123.105:39934) 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 -- 192.168.123.105:0/3991413792 learned_addr learned my addr 192.168.123.105:0/3991413792 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a6434700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0072b50 0x7fb0a0083110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 -- 192.168.123.105:0/3991413792 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0072b50 msgr2=0x7fb0a0083110 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0072b50 0x7fb0a0083110 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 -- 192.168.123.105:0/3991413792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb098008ee0 con 0x7fb0a0083650 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a6434700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0072b50 0x7fb0a0083110 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.868+0000 7fb0a5c33700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 0x7fb0a012e440 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7fb098003fa0 tx=0x7fb098008e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:45.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.869+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb09801d070 con 0x7fb0a0083650 2026-03-10T08:54:45.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.869+0000 7fb0a8698700 1 -- 192.168.123.105:0/3991413792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb0a012e980 con 0x7fb0a0083650 2026-03-10T08:54:45.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.869+0000 7fb0a8698700 1 -- 192.168.123.105:0/3991413792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb0a012ee70 con 0x7fb0a0083650 2026-03-10T08:54:45.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.870+0000 7fb0a8698700 1 -- 192.168.123.105:0/3991413792 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb0a007c880 con 0x7fb0a0083650 2026-03-10T08:54:45.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.873+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb098007cb0 con 0x7fb0a0083650 2026-03-10T08:54:45.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.873+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mon.0 
v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb09800eaf0 con 0x7fb0a0083650 2026-03-10T08:54:45.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.873+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb09800ec50 con 0x7fb0a0083650 2026-03-10T08:54:45.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.874+0000 7fb0977fe700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb08c06c6d0 0x7fb08c06eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:45.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.874+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fb098012070 con 0x7fb0a0083650 2026-03-10T08:54:45.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.875+0000 7fb0a6434700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb08c06c6d0 0x7fb08c06eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:45.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.875+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb09805c480 con 0x7fb0a0083650 2026-03-10T08:54:45.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.875+0000 7fb0a6434700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb08c06c6d0 0x7fb08c06eb90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fb09c00b3c0 tx=0x7fb09c00d040 comp 
rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:45.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.998+0000 7fb0a8698700 1 -- 192.168.123.105:0/3991413792 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb0a00611d0 con 0x7fb08c06c6d0 2026-03-10T08:54:45.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:45.999+0000 7fb0977fe700 1 -- 192.168.123.105:0/3991413792 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fb0a00611d0 con 0x7fb08c06c6d0 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "0/23 daemons upgraded", 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:54:46.000 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 -- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb08c06c6d0 msgr2=0x7fb08c06eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb08c06c6d0 0x7fb08c06eb90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fb09c00b3c0 tx=0x7fb09c00d040 comp rx=0 tx=0).stop 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 -- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 msgr2=0x7fb0a012e440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 0x7fb0a012e440 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7fb098003fa0 tx=0x7fb098008e70 comp rx=0 tx=0).stop 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 -- 192.168.123.105:0/3991413792 shutdown_connections 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb08c06c6d0 0x7fb08c06eb90 secure :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fb09c00b3c0 tx=0x7fb09c00d040 comp rx=0 tx=0).stop 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 --2- 192.168.123.105:0/3991413792 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0a0072b50 0x7fb0a0083110 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 --2- 192.168.123.105:0/3991413792 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0a0083650 0x7fb0a012e440 unknown :-1 s=CLOSED pgs=314 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 -- 192.168.123.105:0/3991413792 >> 192.168.123.105:0/3991413792 conn(0x7fb0a006dae0 msgr2=0x7fb0a006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 -- 192.168.123.105:0/3991413792 shutdown_connections 2026-03-10T08:54:46.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.005+0000 7fb0957fa700 1 -- 192.168.123.105:0/3991413792 wait complete. 2026-03-10T08:54:46.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.076+0000 7f536fffe700 1 -- 192.168.123.105:0/1680928894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368072b50 msgr2=0x7f5368072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:46.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.076+0000 7f536fffe700 1 --2- 192.168.123.105:0/1680928894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368072b50 0x7f5368072f70 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f5364009b50 tx=0x7f5364009e60 comp rx=0 tx=0).stop 2026-03-10T08:54:46.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- 192.168.123.105:0/1680928894 shutdown_connections 2026-03-10T08:54:46.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 --2- 192.168.123.105:0/1680928894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5368075a40 0x7f5368077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 --2- 
192.168.123.105:0/1680928894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368072b50 0x7f5368072f70 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- 192.168.123.105:0/1680928894 >> 192.168.123.105:0/1680928894 conn(0x7f536806dae0 msgr2=0x7f536806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- 192.168.123.105:0/1680928894 shutdown_connections 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- 192.168.123.105:0/1680928894 wait complete. 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 Processor -- start 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- start start 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5368075a40 0x7f5368082fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 0x7f53681b3050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5368083a10 con 0x7f5368083500 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536fffe700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5368083b80 con 0x7f5368075a40 2026-03-10T08:54:46.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536d599700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 0x7f53681b3050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536d599700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 0x7f53681b3050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39956/0 (socket says 192.168.123.105:39956) 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.077+0000 7f536d599700 1 -- 192.168.123.105:0/4207642761 learned_addr learned my addr 192.168.123.105:0/4207642761 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.078+0000 7f536d599700 1 -- 192.168.123.105:0/4207642761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5368075a40 msgr2=0x7f5368082fc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.078+0000 7f536d599700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5368075a40 0x7f5368082fc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.078+0000 7f536d599700 1 -- 192.168.123.105:0/4207642761 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53640097e0 con 
0x7f5368083500 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.078+0000 7f536d599700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 0x7f53681b3050 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f536000c420 tx=0x7f536000c7e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:46.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.078+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f536000e030 con 0x7f5368083500 2026-03-10T08:54:46.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.079+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f53681b35f0 con 0x7f5368083500 2026-03-10T08:54:46.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.079+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f53681b3b40 con 0x7f5368083500 2026-03-10T08:54:46.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.080+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f536000f040 con 0x7f5368083500 2026-03-10T08:54:46.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.080+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5360014750 con 0x7f5368083500 2026-03-10T08:54:46.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.080+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 
0x7f5360014950 con 0x7f5368083500 2026-03-10T08:54:46.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.081+0000 7f535effd700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f535406e8f0 0x7f5354070db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:54:46.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.081+0000 7f536dd9a700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f535406e8f0 0x7f5354070db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:54:46.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.081+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f536008ca40 con 0x7f5368083500 2026-03-10T08:54:46.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.082+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f534c005320 con 0x7f5368083500 2026-03-10T08:54:46.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.082+0000 7f536dd9a700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f535406e8f0 0x7f5354070db0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f5364009b20 tx=0x7f536400b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:54:46.084 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.084+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 
0) 0x7f536005ad70 con 0x7f5368083500 2026-03-10T08:54:46.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.238+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f534c0059f0 con 0x7f5368083500 2026-03-10T08:54:46.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.238+0000 7f535effd700 1 -- 192.168.123.105:0/4207642761 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f536005a900 con 0x7f5368083500 2026-03-10T08:54:46.239 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:54:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.241+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f535406e8f0 msgr2=0x7f5354070db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.241+0000 7f536fffe700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f535406e8f0 0x7f5354070db0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f5364009b20 tx=0x7f536400b540 comp rx=0 tx=0).stop 2026-03-10T08:54:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.241+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 msgr2=0x7f53681b3050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:54:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.241+0000 7f536fffe700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 0x7f53681b3050 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f536000c420 tx=0x7f536000c7e0 comp rx=0 
tx=0).stop 2026-03-10T08:54:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.241+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 shutdown_connections 2026-03-10T08:54:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.241+0000 7f536fffe700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f535406e8f0 0x7f5354070db0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.242+0000 7f536fffe700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5368075a40 0x7f5368082fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.242+0000 7f536fffe700 1 --2- 192.168.123.105:0/4207642761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5368083500 0x7f53681b3050 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:54:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.242+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 >> 192.168.123.105:0/4207642761 conn(0x7f536806dae0 msgr2=0x7f536806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:54:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.242+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 shutdown_connections 2026-03-10T08:54:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:54:46.242+0000 7f536fffe700 1 -- 192.168.123.105:0/4207642761 wait complete. 
2026-03-10T08:54:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: from='client.14630 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: from='client.24419 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: pgmap v132: 65 pgs: 65 active+clean; 131 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 434 op/s 2026-03-10T08:54:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/2674834928' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:54:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/3195162458' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:54:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:46 vm08.local ceph-mon[57559]: from='client.? 
192.168.123.105:0/4207642761' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:54:46.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: from='client.14630 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:46.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: from='client.24419 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: pgmap v132: 65 pgs: 65 active+clean; 131 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 4.3 MiB/s wr, 434 op/s 2026-03-10T08:54:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:54:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/2674834928' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:54:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/3195162458' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:54:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:46 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/4207642761' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:54:47.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:47 vm05.local ceph-mon[49713]: from='client.14646 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:47.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:47 vm08.local ceph-mon[57559]: from='client.14646 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:54:48.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:48 vm05.local ceph-mon[49713]: pgmap v133: 65 pgs: 65 active+clean; 142 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s wr, 499 op/s 2026-03-10T08:54:48.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:48 vm08.local ceph-mon[57559]: pgmap v133: 65 pgs: 65 active+clean; 142 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5.1 MiB/s wr, 499 op/s 2026-03-10T08:54:50.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:50 vm05.local ceph-mon[49713]: pgmap v134: 65 pgs: 65 active+clean; 146 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 397 op/s 2026-03-10T08:54:50.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:50 vm08.local ceph-mon[57559]: pgmap v134: 65 pgs: 65 active+clean; 146 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 3.8 MiB/s wr, 397 op/s 2026-03-10T08:54:52.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:52 vm05.local ceph-mon[49713]: pgmap v135: 65 pgs: 65 active+clean; 155 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 4.0 MiB/s wr, 418 op/s 2026-03-10T08:54:52.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:52 vm08.local ceph-mon[57559]: pgmap v135: 65 pgs: 65 active+clean; 155 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 4.0 MiB/s wr, 418 op/s 2026-03-10T08:54:54.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:54 
vm05.local ceph-mon[49713]: pgmap v136: 65 pgs: 65 active+clean; 161 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.4 MiB/s wr, 382 op/s 2026-03-10T08:54:54.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:54 vm08.local ceph-mon[57559]: pgmap v136: 65 pgs: 65 active+clean; 161 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.4 MiB/s wr, 382 op/s 2026-03-10T08:54:55.547 INFO:tasks.workunit.client.1.vm08.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T08:54:55.555 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T08:54:55.555 INFO:tasks.workunit.client.1.vm08.stderr:+ make 2026-03-10T08:54:55.925 INFO:tasks.workunit.client.1.vm08.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T08:54:56.313 INFO:tasks.workunit.client.1.vm08.stderr:++ readlink -f fsstress 2026-03-10T08:54:56.315 INFO:tasks.workunit.client.1.vm08.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-10T08:54:56.315 INFO:tasks.workunit.client.1.vm08.stderr:+ popd 2026-03-10T08:54:56.316 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T08:54:56.316 INFO:tasks.workunit.client.1.vm08.stderr:+ popd 2026-03-10T08:54:56.317 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp 2026-03-10T08:54:56.317 INFO:tasks.workunit.client.1.vm08.stderr:++ mktemp -d -p . 
2026-03-10T08:54:56.320 INFO:tasks.workunit.client.1.vm08.stderr:+ T=./tmp.I077c4cT8V 2026-03-10T08:54:56.321 INFO:tasks.workunit.client.1.vm08.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.I077c4cT8V -l 1 -n 1000 -p 10 -v 2026-03-10T08:54:56.326 INFO:tasks.workunit.client.1.vm08.stdout:seed = 1772822806 2026-03-10T08:54:56.331 INFO:tasks.workunit.client.1.vm08.stdout:0/0: link - no file 2026-03-10T08:54:56.336 INFO:tasks.workunit.client.1.vm08.stdout:0/1: creat f0 x:0 0 0 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/0: dread - no filename 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/1: stat - no entries 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/2: rename - no filename 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/3: read - no filename 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/4: dwrite - no filename 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/5: chown . 
101 1 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:3/6: dread - no filename 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:0/2: write f0 [615905,70549] 0 2026-03-10T08:54:56.337 INFO:tasks.workunit.client.1.vm08.stdout:0/3: rmdir - no directory 2026-03-10T08:54:56.342 INFO:tasks.workunit.client.1.vm08.stdout:1/0: mknod c0 0 2026-03-10T08:54:56.342 INFO:tasks.workunit.client.1.vm08.stdout:0/4: creat f1 x:0 0 0 2026-03-10T08:54:56.343 INFO:tasks.workunit.client.1.vm08.stdout:5/0: read - no filename 2026-03-10T08:54:56.343 INFO:tasks.workunit.client.1.vm08.stdout:5/1: write - no filename 2026-03-10T08:54:56.343 INFO:tasks.workunit.client.1.vm08.stdout:5/2: write - no filename 2026-03-10T08:54:56.343 INFO:tasks.workunit.client.1.vm08.stdout:5/3: dread - no filename 2026-03-10T08:54:56.343 INFO:tasks.workunit.client.1.vm08.stdout:3/7: creat f0 x:0 0 0 2026-03-10T08:54:56.346 INFO:tasks.workunit.client.1.vm08.stdout:1/1: mkdir d1 0 2026-03-10T08:54:56.346 INFO:tasks.workunit.client.1.vm08.stdout:1/2: dwrite - no filename 2026-03-10T08:54:56.351 INFO:tasks.workunit.client.1.vm08.stdout:6/0: write - no filename 2026-03-10T08:54:56.351 INFO:tasks.workunit.client.1.vm08.stdout:6/1: dwrite - no filename 2026-03-10T08:54:56.351 INFO:tasks.workunit.client.1.vm08.stdout:3/8: creat f1 x:0 0 0 2026-03-10T08:54:56.353 INFO:tasks.workunit.client.1.vm08.stdout:5/4: mkdir d0 0 2026-03-10T08:54:56.353 INFO:tasks.workunit.client.1.vm08.stdout:4/0: creat f0 x:0 0 0 2026-03-10T08:54:56.353 INFO:tasks.workunit.client.1.vm08.stdout:4/1: fdatasync f0 0 2026-03-10T08:54:56.354 INFO:tasks.workunit.client.1.vm08.stdout:5/5: rename d0 to d0/d1 22 2026-03-10T08:54:56.354 INFO:tasks.workunit.client.1.vm08.stdout:5/6: write - no filename 2026-03-10T08:54:56.354 INFO:tasks.workunit.client.1.vm08.stdout:5/7: truncate - no filename 2026-03-10T08:54:56.355 INFO:tasks.workunit.client.1.vm08.stdout:4/2: chown f0 18 1 2026-03-10T08:54:56.355 
INFO:tasks.workunit.client.1.vm08.stdout:7/0: truncate - no filename 2026-03-10T08:54:56.355 INFO:tasks.workunit.client.1.vm08.stdout:7/1: dwrite - no filename 2026-03-10T08:54:56.358 INFO:tasks.workunit.client.1.vm08.stdout:0/5: dwrite f1 [0,4194304] 0 2026-03-10T08:54:56.359 INFO:tasks.workunit.client.1.vm08.stdout:4/3: creat f1 x:0 0 0 2026-03-10T08:54:56.368 INFO:tasks.workunit.client.1.vm08.stdout:3/9: link f1 f2 0 2026-03-10T08:54:56.374 INFO:tasks.workunit.client.1.vm08.stdout:6/2: creat f0 x:0 0 0 2026-03-10T08:54:56.374 INFO:tasks.workunit.client.1.vm08.stdout:0/6: write f1 [2379443,46122] 0 2026-03-10T08:54:56.374 INFO:tasks.workunit.client.1.vm08.stdout:9/0: rename - no filename 2026-03-10T08:54:56.374 INFO:tasks.workunit.client.1.vm08.stdout:8/0: dwrite - no filename 2026-03-10T08:54:56.374 INFO:tasks.workunit.client.1.vm08.stdout:3/10: write f2 [930849,44942] 0 2026-03-10T08:54:56.379 INFO:tasks.workunit.client.1.vm08.stdout:7/2: mkdir d0 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:0/7: unlink f0 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:6/3: creat f1 x:0 0 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:4/4: dwrite f1 [0,4194304] 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:2/0: stat - no entries 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:4/5: write f0 [413002,30219] 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:3/11: rename f0 to f3 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:9/1: creat f0 x:0 0 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:4/6: read f1 [2329401,110518] 0 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:9/2: read - f0 zero size 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:9/3: chown f0 0 1 2026-03-10T08:54:56.399 INFO:tasks.workunit.client.1.vm08.stdout:9/4: write f0 [212340,44718] 0 2026-03-10T08:54:56.399 
INFO:tasks.workunit.client.1.vm08.stdout:6/4: mknod c2 0 2026-03-10T08:54:56.400 INFO:tasks.workunit.client.1.vm08.stdout:8/1: creat f0 x:0 0 0 2026-03-10T08:54:56.400 INFO:tasks.workunit.client.1.vm08.stdout:0/8: dread f1 [0,4194304] 0 2026-03-10T08:54:56.400 INFO:tasks.workunit.client.1.vm08.stdout:8/2: stat f0 0 2026-03-10T08:54:56.400 INFO:tasks.workunit.client.1.vm08.stdout:8/3: write f0 [919902,128891] 0 2026-03-10T08:54:56.410 INFO:tasks.workunit.client.1.vm08.stdout:6/5: dwrite f1 [0,4194304] 0 2026-03-10T08:54:56.415 INFO:tasks.workunit.client.1.vm08.stdout:8/4: dwrite f0 [0,4194304] 0 2026-03-10T08:54:56.422 INFO:tasks.workunit.client.1.vm08.stdout:7/3: creat d0/f1 x:0 0 0 2026-03-10T08:54:56.422 INFO:tasks.workunit.client.1.vm08.stdout:2/1: creat f0 x:0 0 0 2026-03-10T08:54:56.422 INFO:tasks.workunit.client.1.vm08.stdout:9/5: link f0 f1 0 2026-03-10T08:54:56.422 INFO:tasks.workunit.client.1.vm08.stdout:6/6: truncate f0 181908 0 2026-03-10T08:54:56.423 INFO:tasks.workunit.client.1.vm08.stdout:2/2: dread - f0 zero size 2026-03-10T08:54:56.423 INFO:tasks.workunit.client.1.vm08.stdout:7/4: mkdir d0/d2 0 2026-03-10T08:54:56.423 INFO:tasks.workunit.client.1.vm08.stdout:6/7: creat f3 x:0 0 0 2026-03-10T08:54:56.424 INFO:tasks.workunit.client.1.vm08.stdout:7/5: write d0/f1 [455423,73552] 0 2026-03-10T08:54:56.433 INFO:tasks.workunit.client.1.vm08.stdout:6/8: creat f4 x:0 0 0 2026-03-10T08:54:56.439 INFO:tasks.workunit.client.1.vm08.stdout:7/6: mknod d0/d2/c3 0 2026-03-10T08:54:56.444 INFO:tasks.workunit.client.1.vm08.stdout:7/7: dwrite d0/f1 [0,4194304] 0 2026-03-10T08:54:56.447 INFO:tasks.workunit.client.1.vm08.stdout:2/3: dwrite f0 [0,4194304] 0 2026-03-10T08:54:56.447 INFO:tasks.workunit.client.1.vm08.stdout:6/9: link f0 f5 0 2026-03-10T08:54:56.458 INFO:tasks.workunit.client.1.vm08.stdout:6/10: unlink f4 0 2026-03-10T08:54:56.463 INFO:tasks.workunit.client.1.vm08.stdout:6/11: rename f0 to f6 0 2026-03-10T08:54:56.472 
INFO:tasks.workunit.client.1.vm08.stdout:6/12: dwrite f5 [0,4194304] 0 2026-03-10T08:54:56.482 INFO:tasks.workunit.client.1.vm08.stdout:6/13: creat f7 x:0 0 0 2026-03-10T08:54:56.482 INFO:tasks.workunit.client.1.vm08.stdout:6/14: readlink - no filename 2026-03-10T08:54:56.483 INFO:tasks.workunit.client.1.vm08.stdout:6/15: unlink f3 0 2026-03-10T08:54:56.484 INFO:tasks.workunit.client.1.vm08.stdout:6/16: write f1 [943632,38378] 0 2026-03-10T08:54:56.489 INFO:tasks.workunit.client.1.vm08.stdout:6/17: dwrite f5 [0,4194304] 0 2026-03-10T08:54:56.489 INFO:tasks.workunit.client.1.vm08.stdout:6/18: chown c2 2 1 2026-03-10T08:54:56.490 INFO:tasks.workunit.client.1.vm08.stdout:6/19: rmdir - no directory 2026-03-10T08:54:56.491 INFO:tasks.workunit.client.1.vm08.stdout:6/20: mknod c8 0 2026-03-10T08:54:56.491 INFO:tasks.workunit.client.1.vm08.stdout:6/21: readlink - no filename 2026-03-10T08:54:56.613 INFO:tasks.workunit.client.1.vm08.stdout:9/6: dread f1 [0,4194304] 0 2026-03-10T08:54:56.615 INFO:tasks.workunit.client.1.vm08.stdout:5/8: sync 2026-03-10T08:54:56.615 INFO:tasks.workunit.client.1.vm08.stdout:1/3: sync 2026-03-10T08:54:56.615 INFO:tasks.workunit.client.1.vm08.stdout:5/9: write - no filename 2026-03-10T08:54:56.615 INFO:tasks.workunit.client.1.vm08.stdout:5/10: chown d0 1993283 1 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/11: chown d0 82 1 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/12: read - no filename 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/13: chown d0 4445469 1 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/14: link - no file 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/15: truncate - no filename 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/16: write - no filename 2026-03-10T08:54:56.616 INFO:tasks.workunit.client.1.vm08.stdout:5/17: chown d0 80 1 2026-03-10T08:54:56.618 INFO:tasks.workunit.client.1.vm08.stdout:9/7: mkdir 
d2 0
2026-03-10T08:54:56.618 INFO:tasks.workunit.client.1.vm08.stdout:9/8: readlink - no filename
2026-03-10T08:54:56.619 INFO:tasks.workunit.client.1.vm08.stdout:9/9: dread f0 [0,4194304] 0
2026-03-10T08:54:56.630 INFO:tasks.workunit.client.1.vm08.stdout:1/4: mknod d1/c2 0
2026-03-10T08:54:56.641 INFO:tasks.workunit.client.1.vm08.stdout:5/18: mknod d0/c2 0
2026-03-10T08:54:56.649 INFO:tasks.workunit.client.1.vm08.stdout:9/10: creat d2/f3 x:0 0 0
2026-03-10T08:54:56.653 INFO:tasks.workunit.client.1.vm08.stdout:1/5: creat d1/f3 x:0 0 0
2026-03-10T08:54:56.654 INFO:tasks.workunit.client.1.vm08.stdout:5/19: rename d0/c2 to d0/c3 0
2026-03-10T08:54:56.655 INFO:tasks.workunit.client.1.vm08.stdout:5/20: read - no filename
2026-03-10T08:54:56.655 INFO:tasks.workunit.client.1.vm08.stdout:5/21: truncate - no filename
2026-03-10T08:54:56.659 INFO:tasks.workunit.client.1.vm08.stdout:9/11: creat d2/f4 x:0 0 0
2026-03-10T08:54:56.661 INFO:tasks.workunit.client.1.vm08.stdout:1/6: dwrite d1/f3 [0,4194304] 0
2026-03-10T08:54:56.662 INFO:tasks.workunit.client.1.vm08.stdout:2/4: fdatasync f0 0
2026-03-10T08:54:56.662 INFO:tasks.workunit.client.1.vm08.stdout:2/5: write f0 [2409394,8885] 0
2026-03-10T08:54:56.666 INFO:tasks.workunit.client.1.vm08.stdout:9/12: symlink d2/l5 0
2026-03-10T08:54:56.672 INFO:tasks.workunit.client.1.vm08.stdout:1/7: creat d1/f4 x:0 0 0
2026-03-10T08:54:56.677 INFO:tasks.workunit.client.1.vm08.stdout:4/7: dread f0 [0,4194304] 0
2026-03-10T08:54:56.677 INFO:tasks.workunit.client.1.vm08.stdout:9/13: creat d2/f6 x:0 0 0
2026-03-10T08:54:56.678 INFO:tasks.workunit.client.1.vm08.stdout:9/14: stat f1 0
2026-03-10T08:54:56.678 INFO:tasks.workunit.client.1.vm08.stdout:4/8: dread f0 [0,4194304] 0
2026-03-10T08:54:56.681 INFO:tasks.workunit.client.1.vm08.stdout:4/9: dread f1 [0,4194304] 0
2026-03-10T08:54:56.681 INFO:tasks.workunit.client.1.vm08.stdout:1/8: mknod d1/c5 0
2026-03-10T08:54:56.684 INFO:tasks.workunit.client.1.vm08.stdout:6/22: fdatasync f1 0
2026-03-10T08:54:56.688 INFO:tasks.workunit.client.1.vm08.stdout:6/23: dwrite f1 [0,4194304] 0
2026-03-10T08:54:56.700 INFO:tasks.workunit.client.1.vm08.stdout:4/10: rename f1 to f2 0
2026-03-10T08:54:56.707 INFO:tasks.workunit.client.1.vm08.stdout:6/24: mkdir d9 0
2026-03-10T08:54:56.718 INFO:tasks.workunit.client.1.vm08.stdout:6/25: chown c2 0 1
2026-03-10T08:54:56.730 INFO:tasks.workunit.client.1.vm08.stdout:4/11: sync
2026-03-10T08:54:56.731 INFO:tasks.workunit.client.1.vm08.stdout:4/12: chown f2 265073911 1
2026-03-10T08:54:56.734 INFO:tasks.workunit.client.1.vm08.stdout:4/13: unlink f0 0
2026-03-10T08:54:56.736 INFO:tasks.workunit.client.1.vm08.stdout:4/14: dread f2 [0,4194304] 0
2026-03-10T08:54:56.935 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:56 vm05.local ceph-mon[49713]: pgmap v137: 65 pgs: 65 active+clean; 167 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.4 MiB/s wr, 385 op/s
2026-03-10T08:54:57.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:56 vm08.local ceph-mon[57559]: pgmap v137: 65 pgs: 65 active+clean; 167 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3.4 MiB/s wr, 385 op/s
2026-03-10T08:54:57.213 INFO:tasks.workunit.client.1.vm08.stdout:3/12: unlink f3 0
2026-03-10T08:54:57.219 INFO:tasks.workunit.client.1.vm08.stdout:0/9: truncate f1 2759612 0
2026-03-10T08:54:57.224 INFO:tasks.workunit.client.1.vm08.stdout:3/13: sync
2026-03-10T08:54:57.224 INFO:tasks.workunit.client.1.vm08.stdout:3/14: chown f1 1139 1
2026-03-10T08:54:57.225 INFO:tasks.workunit.client.1.vm08.stdout:3/15: truncate f2 1618765 0
2026-03-10T08:54:57.225 INFO:tasks.workunit.client.1.vm08.stdout:2/6: getdents . 0
2026-03-10T08:54:57.225 INFO:tasks.workunit.client.1.vm08.stdout:3/16: write f1 [2087398,72742] 0
2026-03-10T08:54:57.226 INFO:tasks.workunit.client.1.vm08.stdout:3/17: write f1 [1294135,128021] 0
2026-03-10T08:54:57.232 INFO:tasks.workunit.client.1.vm08.stdout:6/26: write f1 [5175950,60062] 0
2026-03-10T08:54:57.233 INFO:tasks.workunit.client.1.vm08.stdout:6/27: write f1 [1060110,66717] 0
2026-03-10T08:54:57.234 INFO:tasks.workunit.client.1.vm08.stdout:9/15: fsync f1 0
2026-03-10T08:54:57.235 INFO:tasks.workunit.client.1.vm08.stdout:9/16: fsync f0 0
2026-03-10T08:54:57.235 INFO:tasks.workunit.client.1.vm08.stdout:9/17: dread f1 [0,4194304] 0
2026-03-10T08:54:57.237 INFO:tasks.workunit.client.1.vm08.stdout:8/5: write f0 [4294851,118343] 0
2026-03-10T08:54:57.238 INFO:tasks.workunit.client.1.vm08.stdout:8/6: write f0 [3351949,21899] 0
2026-03-10T08:54:57.239 INFO:tasks.workunit.client.1.vm08.stdout:7/8: getdents d0 0
2026-03-10T08:54:57.241 INFO:tasks.workunit.client.1.vm08.stdout:7/9: dread d0/f1 [0,4194304] 0
2026-03-10T08:54:57.243 INFO:tasks.workunit.client.1.vm08.stdout:7/10: dread d0/f1 [0,4194304] 0
2026-03-10T08:54:57.381 INFO:tasks.workunit.client.1.vm08.stdout:4/15: fsync f2 0
2026-03-10T08:54:57.385 INFO:tasks.workunit.client.1.vm08.stdout:1/9: truncate d1/f3 833088 0
2026-03-10T08:54:57.385 INFO:tasks.workunit.client.1.vm08.stdout:1/10: readlink - no filename
2026-03-10T08:54:57.386 INFO:tasks.workunit.client.1.vm08.stdout:4/16: sync
2026-03-10T08:54:57.386 INFO:tasks.workunit.client.1.vm08.stdout:4/17: chown f2 0 1
2026-03-10T08:54:57.391 INFO:tasks.workunit.client.1.vm08.stdout:4/18: dwrite f2 [4194304,4194304] 0
2026-03-10T08:54:57.448 INFO:tasks.workunit.client.1.vm08.stdout:2/7: mkdir d1 0
2026-03-10T08:54:57.448 INFO:tasks.workunit.client.1.vm08.stdout:2/8: stat d1 0
2026-03-10T08:54:57.450 INFO:tasks.workunit.client.1.vm08.stdout:3/18: mkdir d4 0
2026-03-10T08:54:57.455 INFO:tasks.workunit.client.1.vm08.stdout:9/18: symlink d2/l7 0
2026-03-10T08:54:57.455 INFO:tasks.workunit.client.1.vm08.stdout:9/19: dread f1 [0,4194304] 0
2026-03-10T08:54:57.456 INFO:tasks.workunit.client.1.vm08.stdout:9/20: dread - d2/f4 zero size
2026-03-10T08:54:57.456 INFO:tasks.workunit.client.1.vm08.stdout:8/7: mkdir d1 0
2026-03-10T08:54:57.456 INFO:tasks.workunit.client.1.vm08.stdout:8/8: chown d1 23 1
2026-03-10T08:54:57.457 INFO:tasks.workunit.client.1.vm08.stdout:9/21: dread f1 [0,4194304] 0
2026-03-10T08:54:57.458 INFO:tasks.workunit.client.1.vm08.stdout:5/22: rename d0/c3 to d0/c4 0
2026-03-10T08:54:57.459 INFO:tasks.workunit.client.1.vm08.stdout:8/9: dread f0 [0,4194304] 0
2026-03-10T08:54:57.459 INFO:tasks.workunit.client.1.vm08.stdout:8/10: fdatasync f0 0
2026-03-10T08:54:57.461 INFO:tasks.workunit.client.1.vm08.stdout:8/11: dread f0 [0,4194304] 0
2026-03-10T08:54:57.461 INFO:tasks.workunit.client.1.vm08.stdout:4/19: mknod c3 0
2026-03-10T08:54:57.461 INFO:tasks.workunit.client.1.vm08.stdout:4/20: rmdir - no directory
2026-03-10T08:54:57.461 INFO:tasks.workunit.client.1.vm08.stdout:4/21: read f2 [8040930,33548] 0
2026-03-10T08:54:57.462 INFO:tasks.workunit.client.1.vm08.stdout:8/12: stat d1 0
2026-03-10T08:54:57.467 INFO:tasks.workunit.client.1.vm08.stdout:6/28: creat d9/fa x:0 0 0
2026-03-10T08:54:57.468 INFO:tasks.workunit.client.1.vm08.stdout:6/29: dread - d9/fa zero size
2026-03-10T08:54:57.472 INFO:tasks.workunit.client.1.vm08.stdout:6/30: dwrite f1 [0,4194304] 0
2026-03-10T08:54:57.483 INFO:tasks.workunit.client.1.vm08.stdout:9/22: symlink d2/l8 0
2026-03-10T08:54:57.483 INFO:tasks.workunit.client.1.vm08.stdout:9/23: truncate f1 844817 0
2026-03-10T08:54:57.483 INFO:tasks.workunit.client.1.vm08.stdout:9/24: chown d2/l7 797825 1
2026-03-10T08:54:57.483 INFO:tasks.workunit.client.1.vm08.stdout:9/25: fdatasync f0 0
2026-03-10T08:54:57.484 INFO:tasks.workunit.client.1.vm08.stdout:5/23: creat d0/f5 x:0 0 0
2026-03-10T08:54:57.484 INFO:tasks.workunit.client.1.vm08.stdout:5/24: dread - d0/f5 zero size
2026-03-10T08:54:57.485 INFO:tasks.workunit.client.1.vm08.stdout:5/25: write d0/f5 [499266,82706] 0
2026-03-10T08:54:57.489 INFO:tasks.workunit.client.1.vm08.stdout:2/9: creat d1/f2 x:0 0 0
2026-03-10T08:54:57.498 INFO:tasks.workunit.client.1.vm08.stdout:4/22: creat f4 x:0 0 0
2026-03-10T08:54:57.498 INFO:tasks.workunit.client.1.vm08.stdout:3/19: mkdir d4/d5 0
2026-03-10T08:54:57.498 INFO:tasks.workunit.client.1.vm08.stdout:3/20: write f1 [974690,121244] 0
2026-03-10T08:54:57.498 INFO:tasks.workunit.client.1.vm08.stdout:3/21: chown f1 64478927 1
2026-03-10T08:54:57.498 INFO:tasks.workunit.client.1.vm08.stdout:6/31: rename c2 to d9/cb 0
2026-03-10T08:54:57.498 INFO:tasks.workunit.client.1.vm08.stdout:7/11: link d0/d2/c3 d0/c4 0
2026-03-10T08:54:57.511 INFO:tasks.workunit.client.1.vm08.stdout:2/10: creat d1/f3 x:0 0 0
2026-03-10T08:54:57.513 INFO:tasks.workunit.client.1.vm08.stdout:4/23: mkdir d5 0
2026-03-10T08:54:57.517 INFO:tasks.workunit.client.1.vm08.stdout:3/22: mknod d4/c6 0
2026-03-10T08:54:57.518 INFO:tasks.workunit.client.1.vm08.stdout:8/13: symlink d1/l2 0
2026-03-10T08:54:57.518 INFO:tasks.workunit.client.1.vm08.stdout:8/14: stat d1 0
2026-03-10T08:54:57.521 INFO:tasks.workunit.client.1.vm08.stdout:6/32: mkdir d9/dc 0
2026-03-10T08:54:57.526 INFO:tasks.workunit.client.1.vm08.stdout:7/12: dwrite d0/f1 [0,4194304] 0
2026-03-10T08:54:57.531 INFO:tasks.workunit.client.1.vm08.stdout:3/23: dread f1 [0,4194304] 0
2026-03-10T08:54:57.532 INFO:tasks.workunit.client.1.vm08.stdout:7/13: dwrite d0/f1 [0,4194304] 0
2026-03-10T08:54:57.536 INFO:tasks.workunit.client.1.vm08.stdout:3/24: dwrite f1 [0,4194304] 0
2026-03-10T08:54:57.560 INFO:tasks.workunit.client.1.vm08.stdout:2/11: write f0 [2163186,36248] 0
2026-03-10T08:54:57.563 INFO:tasks.workunit.client.1.vm08.stdout:4/24: dread f2 [4194304,4194304] 0
2026-03-10T08:54:57.569 INFO:tasks.workunit.client.1.vm08.stdout:8/15: dwrite f0 [4194304,4194304] 0
2026-03-10T08:54:57.573 INFO:tasks.workunit.client.1.vm08.stdout:2/12: unlink f0 0
2026-03-10T08:54:57.573 INFO:tasks.workunit.client.1.vm08.stdout:2/13: readlink - no filename
2026-03-10T08:54:57.581 INFO:tasks.workunit.client.1.vm08.stdout:6/33: mkdir d9/dc/dd 0
2026-03-10T08:54:57.582 INFO:tasks.workunit.client.1.vm08.stdout:6/34: chown d9/dc/dd 10522131 1
2026-03-10T08:54:57.582 INFO:tasks.workunit.client.1.vm08.stdout:8/16: mknod d1/c3 0
2026-03-10T08:54:57.588 INFO:tasks.workunit.client.1.vm08.stdout:8/17: symlink d1/l4 0
2026-03-10T08:54:57.599 INFO:tasks.workunit.client.1.vm08.stdout:8/18: chown d1/l2 14 1
2026-03-10T08:54:57.599 INFO:tasks.workunit.client.1.vm08.stdout:8/19: dread f0 [0,4194304] 0
2026-03-10T08:54:57.599 INFO:tasks.workunit.client.1.vm08.stdout:6/35: link f5 d9/dc/fe 0
2026-03-10T08:54:57.599 INFO:tasks.workunit.client.1.vm08.stdout:8/20: mknod d1/c5 0
2026-03-10T08:54:57.599 INFO:tasks.workunit.client.1.vm08.stdout:8/21: creat d1/f6 x:0 0 0
2026-03-10T08:54:57.599 INFO:tasks.workunit.client.1.vm08.stdout:8/22: truncate f0 9322347 0
2026-03-10T08:54:57.603 INFO:tasks.workunit.client.1.vm08.stdout:8/23: mkdir d1/d7 0
2026-03-10T08:54:57.605 INFO:tasks.workunit.client.1.vm08.stdout:8/24: creat d1/f8 x:0 0 0
2026-03-10T08:54:57.605 INFO:tasks.workunit.client.1.vm08.stdout:8/25: chown d1/f8 825 1
2026-03-10T08:54:57.607 INFO:tasks.workunit.client.1.vm08.stdout:8/26: dread f0 [4194304,4194304] 0
2026-03-10T08:54:57.607 INFO:tasks.workunit.client.1.vm08.stdout:8/27: fdatasync d1/f6 0
2026-03-10T08:54:57.608 INFO:tasks.workunit.client.1.vm08.stdout:8/28: readlink d1/l2 0
2026-03-10T08:54:57.617 INFO:tasks.workunit.client.1.vm08.stdout:7/14: fdatasync d0/f1 0
2026-03-10T08:54:57.617 INFO:tasks.workunit.client.1.vm08.stdout:8/29: unlink d1/c5 0
2026-03-10T08:54:57.627 INFO:tasks.workunit.client.1.vm08.stdout:7/15: sync
2026-03-10T08:54:57.630 INFO:tasks.workunit.client.1.vm08.stdout:7/16: dread d0/f1 [0,4194304] 0
2026-03-10T08:54:57.635 INFO:tasks.workunit.client.1.vm08.stdout:7/17: rename d0/d2/c3 to d0/d2/c5 0
2026-03-10T08:54:57.636 INFO:tasks.workunit.client.1.vm08.stdout:7/18: mknod d0/c6 0
2026-03-10T08:54:57.638 INFO:tasks.workunit.client.1.vm08.stdout:7/19: dread d0/f1 [0,4194304] 0
2026-03-10T08:54:57.751 INFO:tasks.workunit.client.1.vm08.stdout:0/10: fsync f1 0
2026-03-10T08:54:57.751 INFO:tasks.workunit.client.1.vm08.stdout:0/11: readlink - no filename
2026-03-10T08:54:57.753 INFO:tasks.workunit.client.1.vm08.stdout:0/12: mknod c2 0
2026-03-10T08:54:57.757 INFO:tasks.workunit.client.1.vm08.stdout:0/13: sync
2026-03-10T08:54:57.759 INFO:tasks.workunit.client.1.vm08.stdout:0/14: creat f3 x:0 0 0
2026-03-10T08:54:57.787 INFO:tasks.workunit.client.1.vm08.stdout:1/11: write d1/f3 [1683446,14388] 0
2026-03-10T08:54:57.828 INFO:tasks.workunit.client.1.vm08.stdout:5/26: link d0/c4 d0/c6 0
2026-03-10T08:54:57.830 INFO:tasks.workunit.client.1.vm08.stdout:9/26: getdents d2 0
2026-03-10T08:54:57.830 INFO:tasks.workunit.client.1.vm08.stdout:9/27: rename d2 to d2/d9 22
2026-03-10T08:54:57.830 INFO:tasks.workunit.client.1.vm08.stdout:9/28: readlink d2/l7 0
2026-03-10T08:54:57.830 INFO:tasks.workunit.client.1.vm08.stdout:9/29: fdatasync f0 0
2026-03-10T08:54:57.831 INFO:tasks.workunit.client.1.vm08.stdout:5/27: unlink d0/c4 0
2026-03-10T08:54:57.832 INFO:tasks.workunit.client.1.vm08.stdout:2/14: fsync d1/f2 0
2026-03-10T08:54:57.833 INFO:tasks.workunit.client.1.vm08.stdout:9/30: dread f1 [0,4194304] 0
2026-03-10T08:54:57.834 INFO:tasks.workunit.client.1.vm08.stdout:5/28: symlink d0/l7 0
2026-03-10T08:54:57.835 INFO:tasks.workunit.client.1.vm08.stdout:9/31: rename d2/f3 to d2/fa 0
2026-03-10T08:54:57.836 INFO:tasks.workunit.client.1.vm08.stdout:5/29: sync
2026-03-10T08:54:57.837 INFO:tasks.workunit.client.1.vm08.stdout:4/25: truncate f2 5860550 0
2026-03-10T08:54:57.837 INFO:tasks.workunit.client.1.vm08.stdout:2/15: dwrite d1/f2 [0,4194304] 0
2026-03-10T08:54:57.853 INFO:tasks.workunit.client.1.vm08.stdout:7/20: truncate d0/f1 3752763 0
2026-03-10T08:54:57.854 INFO:tasks.workunit.client.1.vm08.stdout:7/21: chown d0 100627 1
2026-03-10T08:54:57.856 INFO:tasks.workunit.client.1.vm08.stdout:3/25: truncate f1 763261 0
2026-03-10T08:54:57.859 INFO:tasks.workunit.client.1.vm08.stdout:5/30: link d0/l7 d0/l8 0
2026-03-10T08:54:57.861 INFO:tasks.workunit.client.1.vm08.stdout:6/36: fsync f6 0
2026-03-10T08:54:57.862 INFO:tasks.workunit.client.1.vm08.stdout:5/31: mknod d0/c9 0
2026-03-10T08:54:57.866 INFO:tasks.workunit.client.1.vm08.stdout:5/32: rename d0/f5 to d0/fa 0
2026-03-10T08:54:57.867 INFO:tasks.workunit.client.1.vm08.stdout:6/37: link f5 d9/dc/dd/ff 0
2026-03-10T08:54:57.867 INFO:tasks.workunit.client.1.vm08.stdout:6/38: chown d9/fa 3 1
2026-03-10T08:54:57.869 INFO:tasks.workunit.client.1.vm08.stdout:5/33: creat d0/fb x:0 0 0
2026-03-10T08:54:57.870 INFO:tasks.workunit.client.1.vm08.stdout:8/30: rmdir d1 39
2026-03-10T08:54:57.870 INFO:tasks.workunit.client.1.vm08.stdout:6/39: mkdir d9/d10 0
2026-03-10T08:54:57.874 INFO:tasks.workunit.client.1.vm08.stdout:6/40: mkdir d9/dc/d11 0
2026-03-10T08:54:57.874 INFO:tasks.workunit.client.1.vm08.stdout:6/41: chown d9 540 1
2026-03-10T08:54:57.874 INFO:tasks.workunit.client.1.vm08.stdout:6/42: truncate d9/fa 715246 0
2026-03-10T08:54:57.875 INFO:tasks.workunit.client.1.vm08.stdout:6/43: stat d9/dc/fe 0
2026-03-10T08:54:57.875 INFO:tasks.workunit.client.1.vm08.stdout:5/34: chown d0/c6 7505686 1
2026-03-10T08:54:57.875 INFO:tasks.workunit.client.1.vm08.stdout:5/35: stat d0/fb 0
2026-03-10T08:54:57.876 INFO:tasks.workunit.client.1.vm08.stdout:8/31: unlink d1/l2 0
2026-03-10T08:54:57.879 INFO:tasks.workunit.client.1.vm08.stdout:5/36: fdatasync d0/fa 0
2026-03-10T08:54:57.879 INFO:tasks.workunit.client.1.vm08.stdout:8/32: sync
2026-03-10T08:54:57.879 INFO:tasks.workunit.client.1.vm08.stdout:8/33: read - d1/f6 zero size
2026-03-10T08:54:57.881 INFO:tasks.workunit.client.1.vm08.stdout:6/44: creat d9/dc/dd/f12 x:0 0 0
2026-03-10T08:54:57.882 INFO:tasks.workunit.client.1.vm08.stdout:5/37: dwrite d0/fa [0,4194304] 0
2026-03-10T08:54:57.883 INFO:tasks.workunit.client.1.vm08.stdout:5/38: dread - d0/fb zero size
2026-03-10T08:54:57.883 INFO:tasks.workunit.client.1.vm08.stdout:5/39: write d0/fb [689876,43482] 0
2026-03-10T08:54:57.886 INFO:tasks.workunit.client.1.vm08.stdout:6/45: mkdir d9/d13 0
2026-03-10T08:54:57.887 INFO:tasks.workunit.client.1.vm08.stdout:6/46: sync
2026-03-10T08:54:57.893 INFO:tasks.workunit.client.1.vm08.stdout:6/47: write d9/dc/dd/f12 [126836,127258] 0
2026-03-10T08:54:57.897 INFO:tasks.workunit.client.1.vm08.stdout:5/40: symlink d0/lc 0
2026-03-10T08:54:57.897 INFO:tasks.workunit.client.1.vm08.stdout:5/41: truncate d0/fb 953665 0
2026-03-10T08:54:57.900 INFO:tasks.workunit.client.1.vm08.stdout:5/42: symlink d0/ld 0
2026-03-10T08:54:57.911 INFO:tasks.workunit.client.1.vm08.stdout:6/48: dread d9/dc/dd/ff [0,4194304] 0
2026-03-10T08:54:57.911 INFO:tasks.workunit.client.1.vm08.stdout:6/49: dread d9/fa [0,4194304] 0
2026-03-10T08:54:57.911 INFO:tasks.workunit.client.1.vm08.stdout:5/43: stat d0/l7 0
2026-03-10T08:54:57.911 INFO:tasks.workunit.client.1.vm08.stdout:6/50: mknod d9/d13/c14 0
2026-03-10T08:54:57.913 INFO:tasks.workunit.client.1.vm08.stdout:5/44: sync
2026-03-10T08:54:57.916 INFO:tasks.workunit.client.1.vm08.stdout:5/45: creat d0/fe x:0 0 0
2026-03-10T08:54:57.917 INFO:tasks.workunit.client.1.vm08.stdout:5/46: dread - d0/fe zero size
2026-03-10T08:54:57.918 INFO:tasks.workunit.client.1.vm08.stdout:5/47: creat d0/ff x:0 0 0
2026-03-10T08:54:57.973 INFO:tasks.workunit.client.1.vm08.stdout:1/12: truncate d1/f3 989927 0
2026-03-10T08:54:58.027 INFO:tasks.workunit.client.1.vm08.stdout:0/15: truncate f1 530946 0
2026-03-10T08:54:58.051 INFO:tasks.workunit.client.1.vm08.stdout:9/32: fsync d2/fa 0
2026-03-10T08:54:58.052 INFO:tasks.workunit.client.1.vm08.stdout:9/33: write f1 [1877066,40596] 0
2026-03-10T08:54:58.052 INFO:tasks.workunit.client.1.vm08.stdout:9/34: read - d2/fa zero size
2026-03-10T08:54:58.055 INFO:tasks.workunit.client.1.vm08.stdout:4/26: rename f2 to d5/f6 0
2026-03-10T08:54:58.057 INFO:tasks.workunit.client.1.vm08.stdout:9/35: dwrite f0 [0,4194304] 0
2026-03-10T08:54:58.058 INFO:tasks.workunit.client.1.vm08.stdout:9/36: chown d2/l7 272276 1
2026-03-10T08:54:58.058 INFO:tasks.workunit.client.1.vm08.stdout:9/37: read f0 [1082763,19891] 0
2026-03-10T08:54:58.058 INFO:tasks.workunit.client.1.vm08.stdout:9/38: stat d2 0
2026-03-10T08:54:58.078 INFO:tasks.workunit.client.1.vm08.stdout:2/16: truncate d1/f2 2572331 0
2026-03-10T08:54:58.103 INFO:tasks.workunit.client.1.vm08.stdout:8/34: getdents d1 0
2026-03-10T08:54:58.179 INFO:tasks.workunit.client.1.vm08.stdout:1/13: unlink d1/f3 0
2026-03-10T08:54:58.179 INFO:tasks.workunit.client.1.vm08.stdout:1/14: dread - d1/f4 zero size
2026-03-10T08:54:58.227 INFO:tasks.workunit.client.1.vm08.stdout:3/26: truncate f2 1442079 0
2026-03-10T08:54:58.339 INFO:tasks.workunit.client.1.vm08.stdout:5/48: rename d0/fa to d0/f10 0
2026-03-10T08:54:58.340 INFO:tasks.workunit.client.1.vm08.stdout:4/27: unlink f4 0
2026-03-10T08:54:58.353 INFO:tasks.workunit.client.1.vm08.stdout:9/39: creat d2/fb x:0 0 0
2026-03-10T08:54:58.370 INFO:tasks.workunit.client.1.vm08.stdout:7/22: rename d0/f1 to d0/d2/f7 0
2026-03-10T08:54:58.377 INFO:tasks.workunit.client.1.vm08.stdout:9/40: write f0 [2821361,17627] 0
2026-03-10T08:54:58.377 INFO:tasks.workunit.client.1.vm08.stdout:9/41: stat d2/f6 0
2026-03-10T08:54:58.379 INFO:tasks.workunit.client.1.vm08.stdout:2/17: link d1/f3 d1/f4 0
2026-03-10T08:54:58.383 INFO:tasks.workunit.client.1.vm08.stdout:8/35: mkdir d1/d7/d9 0
2026-03-10T08:54:58.387 INFO:tasks.workunit.client.1.vm08.stdout:3/27: dread f2 [0,4194304] 0
2026-03-10T08:54:58.389 INFO:tasks.workunit.client.1.vm08.stdout:6/51: rename f7 to d9/d13/f15 0
2026-03-10T08:54:58.389 INFO:tasks.workunit.client.1.vm08.stdout:9/42: sync
2026-03-10T08:54:58.390 INFO:tasks.workunit.client.1.vm08.stdout:9/43: chown d2/fa 943753 1
2026-03-10T08:54:58.390 INFO:tasks.workunit.client.1.vm08.stdout:9/44: dread - d2/fa zero size
2026-03-10T08:54:58.399 INFO:tasks.workunit.client.1.vm08.stdout:3/28: fdatasync f1 0
2026-03-10T08:54:58.402 INFO:tasks.workunit.client.1.vm08.stdout:8/36: rename d1/c3 to d1/d7/d9/ca 0
2026-03-10T08:54:58.402 INFO:tasks.workunit.client.1.vm08.stdout:8/37: readlink d1/l4 0
2026-03-10T08:54:58.403 INFO:tasks.workunit.client.1.vm08.stdout:8/38: stat d1/d7/d9 0
2026-03-10T08:54:58.408 INFO:tasks.workunit.client.1.vm08.stdout:9/45: symlink d2/lc 0
2026-03-10T08:54:58.419 INFO:tasks.workunit.client.1.vm08.stdout:1/15: getdents d1 0
2026-03-10T08:54:58.427 INFO:tasks.workunit.client.1.vm08.stdout:8/39: creat d1/d7/d9/fb x:0 0 0
2026-03-10T08:54:58.429 INFO:tasks.workunit.client.1.vm08.stdout:9/46: mkdir d2/dd 0
2026-03-10T08:54:58.429 INFO:tasks.workunit.client.1.vm08.stdout:9/47: truncate d2/fa 275777 0
2026-03-10T08:54:58.430 INFO:tasks.workunit.client.1.vm08.stdout:9/48: write f0 [502447,89354] 0
2026-03-10T08:54:58.430 INFO:tasks.workunit.client.1.vm08.stdout:9/49: write f1 [2043814,53120] 0
2026-03-10T08:54:58.431 INFO:tasks.workunit.client.1.vm08.stdout:9/50: write d2/f6 [71316,78210] 0
2026-03-10T08:54:58.431 INFO:tasks.workunit.client.1.vm08.stdout:9/51: readlink d2/l7 0
2026-03-10T08:54:58.431 INFO:tasks.workunit.client.1.vm08.stdout:9/52: chown d2/l5 3602 1
2026-03-10T08:54:58.432 INFO:tasks.workunit.client.1.vm08.stdout:9/53: write d2/fb [638916,62195] 0
2026-03-10T08:54:58.434 INFO:tasks.workunit.client.1.vm08.stdout:9/54: dread f0 [0,4194304] 0
2026-03-10T08:54:58.437 INFO:tasks.workunit.client.1.vm08.stdout:1/16: rename d1/c2 to d1/c6 0
2026-03-10T08:54:58.438 INFO:tasks.workunit.client.1.vm08.stdout:1/17: write d1/f4 [160636,13681] 0
2026-03-10T08:54:58.438 INFO:tasks.workunit.client.1.vm08.stdout:9/55: dread d2/f6 [0,4194304] 0
2026-03-10T08:54:58.439 INFO:tasks.workunit.client.1.vm08.stdout:9/56: write d2/fb [1091702,102655] 0
2026-03-10T08:54:58.440 INFO:tasks.workunit.client.1.vm08.stdout:9/57: chown d2/fb 629 1
2026-03-10T08:54:58.444 INFO:tasks.workunit.client.1.vm08.stdout:9/58: fdatasync d2/fb 0
2026-03-10T08:54:58.479 INFO:tasks.workunit.client.1.vm08.stdout:3/29: creat d4/d5/f7 x:0 0 0
2026-03-10T08:54:58.482 INFO:tasks.workunit.client.1.vm08.stdout:8/40: symlink d1/d7/lc 0
2026-03-10T08:54:58.483 INFO:tasks.workunit.client.1.vm08.stdout:8/41: write d1/d7/d9/fb [537558,13922] 0
2026-03-10T08:54:58.483 INFO:tasks.workunit.client.1.vm08.stdout:8/42: write d1/f8 [602734,19424] 0
2026-03-10T08:54:58.487 INFO:tasks.workunit.client.1.vm08.stdout:4/28: read d5/f6 [1449134,27065] 0
2026-03-10T08:54:58.487 INFO:tasks.workunit.client.1.vm08.stdout:4/29: read d5/f6 [2916139,20453] 0
2026-03-10T08:54:58.489 INFO:tasks.workunit.client.1.vm08.stdout:5/49: dread - d0/fe zero size
2026-03-10T08:54:58.490 INFO:tasks.workunit.client.1.vm08.stdout:4/30: dread d5/f6 [0,4194304] 0
2026-03-10T08:54:58.492 INFO:tasks.workunit.client.1.vm08.stdout:1/18: mkdir d1/d7 0
2026-03-10T08:54:58.494 INFO:tasks.workunit.client.1.vm08.stdout:1/19: dread d1/f4 [0,4194304] 0
2026-03-10T08:54:58.505 INFO:tasks.workunit.client.1.vm08.stdout:7/23: dread d0/d2/f7 [0,4194304] 0
2026-03-10T08:54:58.506 INFO:tasks.workunit.client.1.vm08.stdout:3/30: mkdir d4/d5/d8 0
2026-03-10T08:54:58.507 INFO:tasks.workunit.client.1.vm08.stdout:3/31: dread f2 [0,4194304] 0
2026-03-10T08:54:58.507 INFO:tasks.workunit.client.1.vm08.stdout:3/32: rename d4 to d4/d9 22
2026-03-10T08:54:58.508 INFO:tasks.workunit.client.1.vm08.stdout:8/43: mkdir d1/d7/d9/dd 0
2026-03-10T08:54:58.509 INFO:tasks.workunit.client.1.vm08.stdout:2/18: rmdir d1 39
2026-03-10T08:54:58.514 INFO:tasks.workunit.client.1.vm08.stdout:4/31: fsync d5/f6 0
2026-03-10T08:54:58.520 INFO:tasks.workunit.client.1.vm08.stdout:1/20: creat d1/f8 x:0 0 0
2026-03-10T08:54:58.520 INFO:tasks.workunit.client.1.vm08.stdout:1/21: chown d1/f8 744208 1
2026-03-10T08:54:58.521 INFO:tasks.workunit.client.1.vm08.stdout:1/22: dread - d1/f8 zero size
2026-03-10T08:54:58.522 INFO:tasks.workunit.client.1.vm08.stdout:6/52: write d9/dc/fe [4384995,62107] 0
2026-03-10T08:54:58.523 INFO:tasks.workunit.client.1.vm08.stdout:9/59: symlink d2/dd/le 0
2026-03-10T08:54:58.534 INFO:tasks.workunit.client.1.vm08.stdout:3/33: creat d4/d5/fa x:0 0 0
2026-03-10T08:54:58.540 INFO:tasks.workunit.client.1.vm08.stdout:8/44: dwrite f0 [4194304,4194304] 0
2026-03-10T08:54:58.541 INFO:tasks.workunit.client.1.vm08.stdout:4/32: symlink d5/l7 0
2026-03-10T08:54:58.559 INFO:tasks.workunit.client.1.vm08.stdout:9/60: write f1 [5029346,124685] 0
2026-03-10T08:54:58.559 INFO:tasks.workunit.client.1.vm08.stdout:9/61: truncate d2/f4 584276 0
2026-03-10T08:54:58.563 INFO:tasks.workunit.client.1.vm08.stdout:2/19: symlink d1/l5 0
2026-03-10T08:54:58.563 INFO:tasks.workunit.client.1.vm08.stdout:2/20: chown d1/f4 112584065 1
2026-03-10T08:54:58.565 INFO:tasks.workunit.client.1.vm08.stdout:5/50: mkdir d0/d11 0
2026-03-10T08:54:58.567 INFO:tasks.workunit.client.1.vm08.stdout:8/45: mkdir d1/de 0
2026-03-10T08:54:58.568 INFO:tasks.workunit.client.1.vm08.stdout:8/46: write f0 [3633893,54747] 0
2026-03-10T08:54:58.570 INFO:tasks.workunit.client.1.vm08.stdout:4/33: creat d5/f8 x:0 0 0
2026-03-10T08:54:58.573 INFO:tasks.workunit.client.1.vm08.stdout:8/47: dwrite f0 [0,4194304] 0
2026-03-10T08:54:58.575 INFO:tasks.workunit.client.1.vm08.stdout:1/23: symlink d1/d7/l9 0
2026-03-10T08:54:58.576 INFO:tasks.workunit.client.1.vm08.stdout:1/24: chown d1/c5 13 1
2026-03-10T08:54:58.578 INFO:tasks.workunit.client.1.vm08.stdout:6/53: symlink d9/d10/l16 0
2026-03-10T08:54:58.579 INFO:tasks.workunit.client.1.vm08.stdout:6/54: stat d9/d13 0
2026-03-10T08:54:58.579 INFO:tasks.workunit.client.1.vm08.stdout:6/55: chown d9/dc/dd/ff 718 1
2026-03-10T08:54:58.585 INFO:tasks.workunit.client.1.vm08.stdout:8/48: dread d1/d7/d9/fb [0,4194304] 0
2026-03-10T08:54:58.586 INFO:tasks.workunit.client.1.vm08.stdout:1/25: dwrite d1/f8 [0,4194304] 0
2026-03-10T08:54:58.597 INFO:tasks.workunit.client.1.vm08.stdout:9/62: dread d2/f6 [0,4194304] 0
2026-03-10T08:54:58.598 INFO:tasks.workunit.client.1.vm08.stdout:3/34: mknod d4/d5/d8/cb 0
2026-03-10T08:54:58.600 INFO:tasks.workunit.client.1.vm08.stdout:2/21: symlink d1/l6 0
2026-03-10T08:54:58.606 INFO:tasks.workunit.client.1.vm08.stdout:4/34: mknod d5/c9 0
2026-03-10T08:54:58.618 INFO:tasks.workunit.client.1.vm08.stdout:6/56: creat d9/dc/dd/f17 x:0 0 0
2026-03-10T08:54:58.618 INFO:tasks.workunit.client.1.vm08.stdout:6/57: fdatasync f6 0
2026-03-10T08:54:58.618 INFO:tasks.workunit.client.1.vm08.stdout:8/49: creat d1/d7/ff x:0 0 0
2026-03-10T08:54:58.618 INFO:tasks.workunit.client.1.vm08.stdout:8/50: dread - d1/d7/ff zero size
2026-03-10T08:54:58.621 INFO:tasks.workunit.client.1.vm08.stdout:1/26: mkdir d1/da 0
2026-03-10T08:54:58.624 INFO:tasks.workunit.client.1.vm08.stdout:0/16: write f1 [1522164,35711] 0
2026-03-10T08:54:58.625 INFO:tasks.workunit.client.1.vm08.stdout:9/63: symlink d2/dd/lf 0
2026-03-10T08:54:58.626 INFO:tasks.workunit.client.1.vm08.stdout:3/35: unlink f2 0
2026-03-10T08:54:58.626 INFO:tasks.workunit.client.1.vm08.stdout:0/17: write f1 [1150701,28531] 0
2026-03-10T08:54:58.627 INFO:tasks.workunit.client.1.vm08.stdout:3/36: dread - d4/d5/fa zero size
2026-03-10T08:54:58.627 INFO:tasks.workunit.client.1.vm08.stdout:1/27: dwrite d1/f8 [0,4194304] 0
2026-03-10T08:54:58.642 INFO:tasks.workunit.client.1.vm08.stdout:2/22: creat d1/f7 x:0 0 0
2026-03-10T08:54:58.649 INFO:tasks.workunit.client.1.vm08.stdout:6/58: unlink d9/dc/fe 0
2026-03-10T08:54:58.653 INFO:tasks.workunit.client.1.vm08.stdout:4/35: dwrite d5/f6 [4194304,4194304] 0
2026-03-10T08:54:58.655 INFO:tasks.workunit.client.1.vm08.stdout:9/64: fdatasync d2/f6 0
2026-03-10T08:54:58.663 INFO:tasks.workunit.client.1.vm08.stdout:0/18: creat f4 x:0 0 0
2026-03-10T08:54:58.663 INFO:tasks.workunit.client.1.vm08.stdout:3/37: creat d4/d5/fc x:0 0 0
2026-03-10T08:54:58.663 INFO:tasks.workunit.client.1.vm08.stdout:0/19: write f1 [2020681,36978] 0
2026-03-10T08:54:58.664 INFO:tasks.workunit.client.1.vm08.stdout:0/20: write f1 [2434053,52314] 0
2026-03-10T08:54:58.664 INFO:tasks.workunit.client.1.vm08.stdout:3/38: chown d4/d5/fa 16748443 1
2026-03-10T08:54:58.667 INFO:tasks.workunit.client.1.vm08.stdout:2/23: creat d1/f8 x:0 0 0
2026-03-10T08:54:58.673 INFO:tasks.workunit.client.1.vm08.stdout:7/24: dwrite d0/d2/f7 [4194304,4194304] 0
2026-03-10T08:54:58.674 INFO:tasks.workunit.client.1.vm08.stdout:6/59: mknod d9/d13/c18 0
2026-03-10T08:54:58.678 INFO:tasks.workunit.client.1.vm08.stdout:9/65: creat d2/f10 x:0 0 0
2026-03-10T08:54:58.680 INFO:tasks.workunit.client.1.vm08.stdout:7/25: dread d0/d2/f7 [0,4194304] 0
2026-03-10T08:54:58.680 INFO:tasks.workunit.client.1.vm08.stdout:0/21: creat f5 x:0 0 0
2026-03-10T08:54:58.681 INFO:tasks.workunit.client.1.vm08.stdout:3/39: rmdir d4 39
2026-03-10T08:54:58.684 INFO:tasks.workunit.client.1.vm08.stdout:6/60: fsync d9/d13/f15 0
2026-03-10T08:54:58.685 INFO:tasks.workunit.client.1.vm08.stdout:6/61: readlink d9/d10/l16 0
2026-03-10T08:54:58.685 INFO:tasks.workunit.client.1.vm08.stdout:6/62: write f1 [2799035,25003] 0
2026-03-10T08:54:58.688 INFO:tasks.workunit.client.1.vm08.stdout:9/66: fsync d2/f6 0
2026-03-10T08:54:58.689 INFO:tasks.workunit.client.1.vm08.stdout:9/67: write d2/fa [567876,41392] 0
2026-03-10T08:54:58.689 INFO:tasks.workunit.client.1.vm08.stdout:7/26: rmdir d0 39
2026-03-10T08:54:58.690 INFO:tasks.workunit.client.1.vm08.stdout:0/22: mkdir d6 0
2026-03-10T08:54:58.695 INFO:tasks.workunit.client.1.vm08.stdout:9/68: unlink d2/f10 0
2026-03-10T08:54:58.703 INFO:tasks.workunit.client.1.vm08.stdout:5/51: mknod d0/c12 0
2026-03-10T08:54:58.703 INFO:tasks.workunit.client.1.vm08.stdout:3/40: dwrite d4/d5/fc [0,4194304] 0
2026-03-10T08:54:58.705 INFO:tasks.workunit.client.1.vm08.stdout:3/41: write d4/d5/f7 [979189,67361] 0
2026-03-10T08:54:58.713 INFO:tasks.workunit.client.1.vm08.stdout:4/36: rmdir d5 39
2026-03-10T08:54:58.715 INFO:tasks.workunit.client.1.vm08.stdout:0/23: mknod d6/c7 0
2026-03-10T08:54:58.715 INFO:tasks.workunit.client.1.vm08.stdout:0/24: write f5 [782909,38262] 0
2026-03-10T08:54:58.716 INFO:tasks.workunit.client.1.vm08.stdout:4/37: dread - d5/f8 zero size
2026-03-10T08:54:58.718 INFO:tasks.workunit.client.1.vm08.stdout:5/52: symlink d0/l13 0
2026-03-10T08:54:58.719 INFO:tasks.workunit.client.1.vm08.stdout:4/38: symlink d5/la 0
2026-03-10T08:54:58.719 INFO:tasks.workunit.client.1.vm08.stdout:7/27: link d0/c6 d0/c8 0
2026-03-10T08:54:58.720 INFO:tasks.workunit.client.1.vm08.stdout:4/39: fdatasync d5/f8 0
2026-03-10T08:54:58.728 INFO:tasks.workunit.client.1.vm08.stdout:7/28: chown d0/c8 41 1
2026-03-10T08:54:58.744 INFO:tasks.workunit.client.1.vm08.stdout:5/53: dwrite d0/ff [0,4194304] 0
2026-03-10T08:54:58.744 INFO:tasks.workunit.client.1.vm08.stdout:5/54: truncate d0/fe 876351 0
2026-03-10T08:54:58.744 INFO:tasks.workunit.client.1.vm08.stdout:5/55: getdents d0/d11 0
2026-03-10T08:54:58.744 INFO:tasks.workunit.client.1.vm08.stdout:5/56: rename d0/d11 to d0/d11/d14 22
2026-03-10T08:54:58.744 INFO:tasks.workunit.client.1.vm08.stdout:5/57: write d0/fe [1011143,47868] 0
2026-03-10T08:54:58.744 INFO:tasks.workunit.client.1.vm08.stdout:5/58: dread d0/ff [0,4194304] 0
2026-03-10T08:54:58.746 INFO:tasks.workunit.client.1.vm08.stdout:5/59: creat d0/f15 x:0 0 0
2026-03-10T08:54:58.746 INFO:tasks.workunit.client.1.vm08.stdout:5/60: chown d0/f15 1828398 1
2026-03-10T08:54:58.747 INFO:tasks.workunit.client.1.vm08.stdout:5/61: write d0/fe [95093,129145] 0
2026-03-10T08:54:58.754 INFO:tasks.workunit.client.1.vm08.stdout:5/62: creat d0/f16 x:0 0 0
2026-03-10T08:54:58.762 INFO:tasks.workunit.client.1.vm08.stdout:5/63: dread d0/ff [0,4194304] 0
2026-03-10T08:54:58.771 INFO:tasks.workunit.client.1.vm08.stdout:1/28: fdatasync d1/f8 0
2026-03-10T08:54:58.799 INFO:tasks.workunit.client.1.vm08.stdout:0/25: dread f1 [0,4194304] 0
2026-03-10T08:54:58.800 INFO:tasks.workunit.client.1.vm08.stdout:0/26: write f5 [1831833,22912] 0
2026-03-10T08:54:58.801 INFO:tasks.workunit.client.1.vm08.stdout:8/51: rename d1/d7 to d1/d10 0
2026-03-10T08:54:58.802 INFO:tasks.workunit.client.1.vm08.stdout:8/52: dread - d1/d10/ff zero size
2026-03-10T08:54:58.808 INFO:tasks.workunit.client.1.vm08.stdout:2/24: fsync d1/f8 0
2026-03-10T08:54:58.812 INFO:tasks.workunit.client.1.vm08.stdout:8/53: creat d1/f11 x:0 0 0
2026-03-10T08:54:58.813 INFO:tasks.workunit.client.1.vm08.stdout:6/63: truncate f5 3659161 0
2026-03-10T08:54:58.814 INFO:tasks.workunit.client.1.vm08.stdout:2/25: dwrite d1/f8 [0,4194304] 0
2026-03-10T08:54:58.817 INFO:tasks.workunit.client.1.vm08.stdout:7/29: truncate d0/d2/f7 2977210 0
2026-03-10T08:54:58.822 INFO:tasks.workunit.client.1.vm08.stdout:0/27: fdatasync f5 0
2026-03-10T08:54:58.822 INFO:tasks.workunit.client.1.vm08.stdout:6/64: rmdir d9/d10 39
2026-03-10T08:54:58.824 INFO:tasks.workunit.client.1.vm08.stdout:2/26: creat d1/f9 x:0 0 0
2026-03-10T08:54:58.826 INFO:tasks.workunit.client.1.vm08.stdout:2/27: mkdir d1/da 0
2026-03-10T08:54:58.826 INFO:tasks.workunit.client.1.vm08.stdout:2/28: chown d1/f7 1099 1
2026-03-10T08:54:58.827 INFO:tasks.workunit.client.1.vm08.stdout:2/29: fdatasync d1/f7 0
2026-03-10T08:54:58.828 INFO:tasks.workunit.client.1.vm08.stdout:7/30: rename d0/c8 to d0/d2/c9 0
2026-03-10T08:54:58.832 INFO:tasks.workunit.client.1.vm08.stdout:7/31: unlink d0/c6 0
2026-03-10T08:54:58.847 INFO:tasks.workunit.client.1.vm08.stdout:3/42: fdatasync d4/d5/fc 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:0/28: getdents d6 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:0/29: dread - f4 zero size
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:7/32: chown d0/d2/c5 19960535 1
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:7/33: chown d0/d2/c5 1872 1
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:9/69: truncate d2/f4 334658 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:4/40: getdents d5 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:9/70: dread f0 [0,4194304] 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:3/43: mknod d4/d5/d8/cd 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:0/30: creat d6/f8 x:0 0 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:3/44: read d4/d5/fc [4191857,114685] 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:9/71: write f1 [3247811,27125] 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:7/34: mkdir d0/d2/da 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:4/41: creat d5/fb x:0 0 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:4/42: truncate d5/fb 179563 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:4/43: fdatasync d5/f8 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:3/45: dread d4/d5/f7 [0,4194304] 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:0/31: creat d6/f9 x:0 0 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:0/32: fsync f4 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:0/33: write d6/f8 [699865,97914] 0
2026-03-10T08:54:58.848 INFO:tasks.workunit.client.1.vm08.stdout:7/35: rmdir d0/d2 39
2026-03-10T08:54:58.850 INFO:tasks.workunit.client.1.vm08.stdout:3/46: creat d4/d5/fe x:0 0 0
2026-03-10T08:54:58.852 INFO:tasks.workunit.client.1.vm08.stdout:0/34: rename d6/f8 to d6/fa 0
2026-03-10T08:54:58.854 INFO:tasks.workunit.client.1.vm08.stdout:7/36: chown d0/d2/f7 143496499 1
2026-03-10T08:54:58.857 INFO:tasks.workunit.client.1.vm08.stdout:3/47: fsync d4/d5/f7 0
2026-03-10T08:54:58.864 INFO:tasks.workunit.client.1.vm08.stdout:3/48: creat d4/d5/d8/ff x:0 0 0
2026-03-10T08:54:58.864 INFO:tasks.workunit.client.1.vm08.stdout:3/49: dread - d4/d5/d8/ff zero size
2026-03-10T08:54:58.867 INFO:tasks.workunit.client.1.vm08.stdout:3/50: rename d4/d5/fe to d4/f10 0
2026-03-10T08:54:58.871 INFO:tasks.workunit.client.1.vm08.stdout:3/51: creat d4/d5/f11 x:0 0 0
2026-03-10T08:54:58.873 INFO:tasks.workunit.client.1.vm08.stdout:3/52: creat d4/d5/f12 x:0 0 0
2026-03-10T08:54:58.874 INFO:tasks.workunit.client.1.vm08.stdout:3/53: chown d4/f10 538680551 1
2026-03-10T08:54:58.876 INFO:tasks.workunit.client.1.vm08.stdout:3/54: mknod d4/d5/d8/c13 0
2026-03-10T08:54:58.914 INFO:tasks.workunit.client.1.vm08.stdout:5/64: write d0/f10 [2205361,118330] 0
2026-03-10T08:54:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/65: mknod d0/d11/c17 0
2026-03-10T08:54:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/66: write d0/ff [1338919,12849] 0
2026-03-10T08:54:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/67: mkdir d0/d11/d18 0
2026-03-10T08:54:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/68: rename d0/c12 to d0/d11/d18/c19 0
2026-03-10T08:54:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/69: creat d0/d11/d18/f1a x:0 0 0
2026-03-10T08:54:58.929 INFO:tasks.workunit.client.1.vm08.stdout:5/70: write d0/fb [1884694,49958] 0
2026-03-10T08:54:58.987 INFO:tasks.workunit.client.1.vm08.stdout:6/65: sync
2026-03-10T08:54:58.987 INFO:tasks.workunit.client.1.vm08.stdout:2/30: sync
2026-03-10T08:54:58.988 INFO:tasks.workunit.client.1.vm08.stdout:9/72: sync
2026-03-10T08:54:58.988 INFO:tasks.workunit.client.1.vm08.stdout:0/35: sync
2026-03-10T08:54:58.988 INFO:tasks.workunit.client.1.vm08.stdout:3/55: sync
2026-03-10T08:54:58.988 INFO:tasks.workunit.client.1.vm08.stdout:9/73: chown d2/lc 7248 1
2026-03-10T08:54:58.988 INFO:tasks.workunit.client.1.vm08.stdout:0/36: chown f1 207420267 1
2026-03-10T08:54:58.990 INFO:tasks.workunit.client.1.vm08.stdout:2/31: sync
2026-03-10T08:54:58.991 INFO:tasks.workunit.client.1.vm08.stdout:2/32: stat d1/da 0
2026-03-10T08:54:58.993 INFO:tasks.workunit.client.1.vm08.stdout:2/33: dread d1/f2 [0,4194304] 0
2026-03-10T08:54:59.001 INFO:tasks.workunit.client.1.vm08.stdout:3/56: dread f1 [0,4194304] 0
2026-03-10T08:54:59.005 INFO:tasks.workunit.client.1.vm08.stdout:0/37: dread d6/fa [0,4194304] 0
2026-03-10T08:54:59.005 INFO:tasks.workunit.client.1.vm08.stdout:0/38: write d6/f9 [498499,36694] 0
2026-03-10T08:54:59.005 INFO:tasks.workunit.client.1.vm08.stdout:0/39: chown f3 424179463 1
2026-03-10T08:54:59.006 INFO:tasks.workunit.client.1.vm08.stdout:0/40: fsync f3 0
2026-03-10T08:54:59.007 INFO:tasks.workunit.client.1.vm08.stdout:0/41: dread d6/fa [0,4194304] 0
2026-03-10T08:54:59.007 INFO:tasks.workunit.client.1.vm08.stdout:9/74: symlink d2/dd/l11 0
2026-03-10T08:54:59.014 INFO:tasks.workunit.client.1.vm08.stdout:3/57: fdatasync f1 0
2026-03-10T08:54:59.017 INFO:tasks.workunit.client.1.vm08.stdout:3/58: dread d4/d5/fc [0,4194304] 0
2026-03-10T08:54:59.018 INFO:tasks.workunit.client.1.vm08.stdout:6/66: mknod d9/c19 0
2026-03-10T08:54:59.020 INFO:tasks.workunit.client.1.vm08.stdout:1/29: dwrite d1/f4 [0,4194304] 0
2026-03-10T08:54:59.028 INFO:tasks.workunit.client.1.vm08.stdout:6/67: mkdir d9/d13/d1a 0
2026-03-10T08:54:59.029 INFO:tasks.workunit.client.1.vm08.stdout:6/68: fdatasync f1 0
2026-03-10T08:54:59.030 INFO:tasks.workunit.client.1.vm08.stdout:0/42: link d6/f9 d6/fb 0
2026-03-10T08:54:59.034 INFO:tasks.workunit.client.1.vm08.stdout:1/30: unlink d1/f4 0
2026-03-10T08:54:59.035 INFO:tasks.workunit.client.1.vm08.stdout:1/31: stat d1/f8 0
2026-03-10T08:54:59.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:54:58 vm08.local ceph-mon[57559]: pgmap v138: 65 pgs: 65 active+clean; 183 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 4.6 MiB/s wr, 463 op/s
2026-03-10T08:54:59.064 INFO:tasks.workunit.client.1.vm08.stdout:8/54: fsync d1/f11 0
2026-03-10T08:54:59.068 INFO:tasks.workunit.client.1.vm08.stdout:8/55: dwrite f0 [4194304,4194304] 0
2026-03-10T08:54:59.071 INFO:tasks.workunit.client.1.vm08.stdout:7/37: fsync d0/d2/f7 0
2026-03-10T08:54:59.073 INFO:tasks.workunit.client.1.vm08.stdout:7/38: dread d0/d2/f7 [0,4194304] 0
2026-03-10T08:54:59.074 INFO:tasks.workunit.client.1.vm08.stdout:7/39: dread d0/d2/f7 [0,4194304] 0
2026-03-10T08:54:59.087 INFO:tasks.workunit.client.1.vm08.stdout:9/75: read d2/f4 [269409,64391] 0
2026-03-10T08:54:59.093 INFO:tasks.workunit.client.1.vm08.stdout:4/44: getdents d5 0
2026-03-10T08:54:59.132 INFO:tasks.workunit.client.1.vm08.stdout:5/71: getdents d0/d11/d18 0
2026-03-10T08:54:59.160 INFO:tasks.workunit.client.1.vm08.stdout:2/34: write d1/f4 [122667,15500] 0
2026-03-10T08:54:59.163 INFO:tasks.workunit.client.1.vm08.stdout:3/59: truncate f1 295060 0
2026-03-10T08:54:59.164 INFO:tasks.workunit.client.1.vm08.stdout:2/35: dread d1/f4 [0,4194304] 0
2026-03-10T08:54:59.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:54:58 vm05.local ceph-mon[49713]: pgmap v138: 65 pgs: 65 active+clean; 183 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 4.6 MiB/s wr, 463 op/s
2026-03-10T08:54:59.327 INFO:tasks.workunit.client.1.vm08.stdout:0/43: fdatasync f1 0
2026-03-10T08:54:59.327 INFO:tasks.workunit.client.1.vm08.stdout:6/69: creat d9/dc/f1b x:0 0 0
2026-03-10T08:54:59.327 INFO:tasks.workunit.client.1.vm08.stdout:0/44: truncate f4 760137 0
2026-03-10T08:54:59.328 INFO:tasks.workunit.client.1.vm08.stdout:0/45: write f4 [637857,99020] 0
2026-03-10T08:54:59.337 INFO:tasks.workunit.client.1.vm08.stdout:8/56: creat d1/d10/f12 x:0 0 0
2026-03-10T08:54:59.347 INFO:tasks.workunit.client.1.vm08.stdout:9/76: rename d2/l8 to d2/l12 0
2026-03-10T08:54:59.348 INFO:tasks.workunit.client.1.vm08.stdout:3/60: creat d4/f14 x:0 0 0
2026-03-10T08:54:59.348 INFO:tasks.workunit.client.1.vm08.stdout:3/61: dread - d4/f14 zero size
2026-03-10T08:54:59.349 INFO:tasks.workunit.client.1.vm08.stdout:3/62: fsync d4/d5/f12 0
2026-03-10T08:54:59.349 INFO:tasks.workunit.client.1.vm08.stdout:2/36: rename d1/f8 to d1/da/fb 0
2026-03-10T08:54:59.353 INFO:tasks.workunit.client.1.vm08.stdout:0/46: creat d6/fc x:0 0 0 2026-03-10T08:54:59.354 INFO:tasks.workunit.client.1.vm08.stdout:1/32: creat d1/da/fb x:0 0 0 2026-03-10T08:54:59.357 INFO:tasks.workunit.client.1.vm08.stdout:7/40: mknod d0/cb 0 2026-03-10T08:54:59.359 INFO:tasks.workunit.client.1.vm08.stdout:8/57: fdatasync d1/d10/d9/fb 0 2026-03-10T08:54:59.360 INFO:tasks.workunit.client.1.vm08.stdout:8/58: stat d1/d10/lc 0 2026-03-10T08:54:59.360 INFO:tasks.workunit.client.1.vm08.stdout:8/59: chown d1/l4 3 1 2026-03-10T08:54:59.361 INFO:tasks.workunit.client.1.vm08.stdout:8/60: dread - d1/d10/ff zero size 2026-03-10T08:54:59.362 INFO:tasks.workunit.client.1.vm08.stdout:8/61: truncate d1/f11 974550 0 2026-03-10T08:54:59.362 INFO:tasks.workunit.client.1.vm08.stdout:4/45: getdents d5 0 2026-03-10T08:54:59.365 INFO:tasks.workunit.client.1.vm08.stdout:5/72: mkdir d0/d1b 0 2026-03-10T08:54:59.371 INFO:tasks.workunit.client.1.vm08.stdout:3/63: rename d4/d5 to d4/d15 0 2026-03-10T08:54:59.375 INFO:tasks.workunit.client.1.vm08.stdout:6/70: symlink d9/d10/l1c 0 2026-03-10T08:54:59.379 INFO:tasks.workunit.client.1.vm08.stdout:0/47: mkdir d6/dd 0 2026-03-10T08:54:59.379 INFO:tasks.workunit.client.1.vm08.stdout:0/48: chown d6/dd 254047945 1 2026-03-10T08:54:59.379 INFO:tasks.workunit.client.1.vm08.stdout:0/49: dread - d6/fc zero size 2026-03-10T08:54:59.380 INFO:tasks.workunit.client.1.vm08.stdout:1/33: unlink d1/d7/l9 0 2026-03-10T08:54:59.380 INFO:tasks.workunit.client.1.vm08.stdout:1/34: dread - d1/da/fb zero size 2026-03-10T08:54:59.386 INFO:tasks.workunit.client.1.vm08.stdout:4/46: symlink d5/lc 0 2026-03-10T08:54:59.389 INFO:tasks.workunit.client.1.vm08.stdout:9/77: link d2/fb d2/f13 0 2026-03-10T08:54:59.389 INFO:tasks.workunit.client.1.vm08.stdout:5/73: unlink d0/l13 0 2026-03-10T08:54:59.391 INFO:tasks.workunit.client.1.vm08.stdout:6/71: mknod d9/dc/c1d 0 2026-03-10T08:54:59.394 INFO:tasks.workunit.client.1.vm08.stdout:8/62: mkdir 
d1/d10/d9/dd/d13 0 2026-03-10T08:54:59.394 INFO:tasks.workunit.client.1.vm08.stdout:8/63: chown f0 24 1 2026-03-10T08:54:59.397 INFO:tasks.workunit.client.1.vm08.stdout:4/47: creat d5/fd x:0 0 0 2026-03-10T08:54:59.397 INFO:tasks.workunit.client.1.vm08.stdout:4/48: dread - d5/fd zero size 2026-03-10T08:54:59.401 INFO:tasks.workunit.client.1.vm08.stdout:5/74: mkdir d0/d11/d18/d1c 0 2026-03-10T08:54:59.404 INFO:tasks.workunit.client.1.vm08.stdout:0/50: link d6/fa d6/fe 0 2026-03-10T08:54:59.405 INFO:tasks.workunit.client.1.vm08.stdout:8/64: fsync d1/d10/d9/fb 0 2026-03-10T08:54:59.405 INFO:tasks.workunit.client.1.vm08.stdout:8/65: read d1/f11 [490553,41573] 0 2026-03-10T08:54:59.406 INFO:tasks.workunit.client.1.vm08.stdout:8/66: dread - d1/d10/f12 zero size 2026-03-10T08:54:59.408 INFO:tasks.workunit.client.1.vm08.stdout:4/49: mkdir d5/de 0 2026-03-10T08:54:59.411 INFO:tasks.workunit.client.1.vm08.stdout:3/64: link d4/d15/d8/c13 d4/d15/c16 0 2026-03-10T08:54:59.414 INFO:tasks.workunit.client.1.vm08.stdout:0/51: mknod d6/cf 0 2026-03-10T08:54:59.414 INFO:tasks.workunit.client.1.vm08.stdout:3/65: dwrite d4/d15/fa [0,4194304] 0 2026-03-10T08:54:59.415 INFO:tasks.workunit.client.1.vm08.stdout:3/66: dread - d4/f10 zero size 2026-03-10T08:54:59.417 INFO:tasks.workunit.client.1.vm08.stdout:8/67: symlink d1/d10/d9/dd/l14 0 2026-03-10T08:54:59.417 INFO:tasks.workunit.client.1.vm08.stdout:8/68: write d1/d10/f12 [907411,98374] 0 2026-03-10T08:54:59.418 INFO:tasks.workunit.client.1.vm08.stdout:4/50: symlink d5/lf 0 2026-03-10T08:54:59.420 INFO:tasks.workunit.client.1.vm08.stdout:4/51: dread d5/f6 [4194304,4194304] 0 2026-03-10T08:54:59.420 INFO:tasks.workunit.client.1.vm08.stdout:4/52: fdatasync d5/fb 0 2026-03-10T08:54:59.421 INFO:tasks.workunit.client.1.vm08.stdout:4/53: truncate d5/f8 1011609 0 2026-03-10T08:54:59.423 INFO:tasks.workunit.client.1.vm08.stdout:4/54: dread d5/f6 [4194304,4194304] 0 2026-03-10T08:54:59.435 INFO:tasks.workunit.client.1.vm08.stdout:8/69: dwrite 
d1/f11 [0,4194304] 0 2026-03-10T08:54:59.435 INFO:tasks.workunit.client.1.vm08.stdout:5/75: creat d0/f1d x:0 0 0 2026-03-10T08:54:59.438 INFO:tasks.workunit.client.1.vm08.stdout:4/55: mknod d5/c10 0 2026-03-10T08:54:59.438 INFO:tasks.workunit.client.1.vm08.stdout:4/56: chown d5/c10 6442496 1 2026-03-10T08:54:59.444 INFO:tasks.workunit.client.1.vm08.stdout:5/76: readlink d0/ld 0 2026-03-10T08:54:59.446 INFO:tasks.workunit.client.1.vm08.stdout:8/70: fsync d1/d10/d9/fb 0 2026-03-10T08:54:59.451 INFO:tasks.workunit.client.1.vm08.stdout:4/57: creat d5/de/f11 x:0 0 0 2026-03-10T08:54:59.452 INFO:tasks.workunit.client.1.vm08.stdout:8/71: symlink d1/d10/d9/dd/l15 0 2026-03-10T08:54:59.457 INFO:tasks.workunit.client.1.vm08.stdout:8/72: mkdir d1/d10/d9/dd/d16 0 2026-03-10T08:54:59.460 INFO:tasks.workunit.client.1.vm08.stdout:8/73: rename d1/f6 to d1/f17 0 2026-03-10T08:54:59.463 INFO:tasks.workunit.client.1.vm08.stdout:4/58: link d5/l7 d5/l12 0 2026-03-10T08:54:59.463 INFO:tasks.workunit.client.1.vm08.stdout:4/59: truncate d5/fd 423327 0 2026-03-10T08:54:59.463 INFO:tasks.workunit.client.1.vm08.stdout:8/74: rename d1/de to d1/d10/d9/dd/d18 0 2026-03-10T08:54:59.463 INFO:tasks.workunit.client.1.vm08.stdout:8/75: write f0 [9446011,75052] 0 2026-03-10T08:54:59.480 INFO:tasks.workunit.client.1.vm08.stdout:3/67: fsync d4/f14 0 2026-03-10T08:54:59.480 INFO:tasks.workunit.client.1.vm08.stdout:7/41: write d0/d2/f7 [1591638,107191] 0 2026-03-10T08:54:59.487 INFO:tasks.workunit.client.1.vm08.stdout:3/68: mkdir d4/d15/d17 0 2026-03-10T08:54:59.487 INFO:tasks.workunit.client.1.vm08.stdout:3/69: read - d4/d15/d8/ff zero size 2026-03-10T08:54:59.488 INFO:tasks.workunit.client.1.vm08.stdout:3/70: dread - d4/d15/f12 zero size 2026-03-10T08:54:59.492 INFO:tasks.workunit.client.1.vm08.stdout:3/71: dwrite d4/d15/f7 [0,4194304] 0 2026-03-10T08:54:59.500 INFO:tasks.workunit.client.1.vm08.stdout:8/76: write d1/d10/d9/fb [1364824,23043] 0 2026-03-10T08:54:59.508 
INFO:tasks.workunit.client.1.vm08.stdout:2/37: truncate d1/f4 1119622 0 2026-03-10T08:54:59.508 INFO:tasks.workunit.client.1.vm08.stdout:8/77: symlink d1/d10/l19 0 2026-03-10T08:54:59.509 INFO:tasks.workunit.client.1.vm08.stdout:2/38: write d1/f7 [947227,29276] 0 2026-03-10T08:54:59.509 INFO:tasks.workunit.client.1.vm08.stdout:7/42: mknod d0/d2/da/cc 0 2026-03-10T08:54:59.510 INFO:tasks.workunit.client.1.vm08.stdout:7/43: fdatasync d0/d2/f7 0 2026-03-10T08:54:59.510 INFO:tasks.workunit.client.1.vm08.stdout:8/78: symlink d1/l1a 0 2026-03-10T08:54:59.513 INFO:tasks.workunit.client.1.vm08.stdout:2/39: rename d1/l6 to d1/da/lc 0 2026-03-10T08:54:59.514 INFO:tasks.workunit.client.1.vm08.stdout:7/44: dwrite d0/d2/f7 [0,4194304] 0 2026-03-10T08:54:59.523 INFO:tasks.workunit.client.1.vm08.stdout:8/79: mknod d1/c1b 0 2026-03-10T08:54:59.523 INFO:tasks.workunit.client.1.vm08.stdout:2/40: write d1/f2 [2502551,53544] 0 2026-03-10T08:54:59.524 INFO:tasks.workunit.client.1.vm08.stdout:3/72: sync 2026-03-10T08:54:59.524 INFO:tasks.workunit.client.1.vm08.stdout:3/73: chown d4 29288 1 2026-03-10T08:54:59.532 INFO:tasks.workunit.client.1.vm08.stdout:1/35: rmdir d1/d7 0 2026-03-10T08:54:59.533 INFO:tasks.workunit.client.1.vm08.stdout:1/36: dread - d1/da/fb zero size 2026-03-10T08:54:59.535 INFO:tasks.workunit.client.1.vm08.stdout:2/41: creat d1/fd x:0 0 0 2026-03-10T08:54:59.536 INFO:tasks.workunit.client.1.vm08.stdout:2/42: truncate d1/fd 582448 0 2026-03-10T08:54:59.536 INFO:tasks.workunit.client.1.vm08.stdout:1/37: dread d1/f8 [0,4194304] 0 2026-03-10T08:54:59.536 INFO:tasks.workunit.client.1.vm08.stdout:3/74: creat d4/f18 x:0 0 0 2026-03-10T08:54:59.540 INFO:tasks.workunit.client.1.vm08.stdout:8/80: rename d1/d10/d9/ca to d1/d10/d9/c1c 0 2026-03-10T08:54:59.546 INFO:tasks.workunit.client.1.vm08.stdout:2/43: dread d1/da/fb [0,4194304] 0 2026-03-10T08:54:59.546 INFO:tasks.workunit.client.1.vm08.stdout:1/38: dread d1/f8 [0,4194304] 0 2026-03-10T08:54:59.546 
INFO:tasks.workunit.client.1.vm08.stdout:1/39: fdatasync d1/f8 0 2026-03-10T08:54:59.546 INFO:tasks.workunit.client.1.vm08.stdout:1/40: dread - d1/da/fb zero size 2026-03-10T08:54:59.552 INFO:tasks.workunit.client.1.vm08.stdout:2/44: creat d1/da/fe x:0 0 0 2026-03-10T08:54:59.552 INFO:tasks.workunit.client.1.vm08.stdout:9/78: dwrite d2/f6 [0,4194304] 0 2026-03-10T08:54:59.554 INFO:tasks.workunit.client.1.vm08.stdout:8/81: dwrite f0 [0,4194304] 0 2026-03-10T08:54:59.561 INFO:tasks.workunit.client.1.vm08.stdout:0/52: rmdir d6 39 2026-03-10T08:54:59.562 INFO:tasks.workunit.client.1.vm08.stdout:8/82: truncate d1/d10/ff 826956 0 2026-03-10T08:54:59.563 INFO:tasks.workunit.client.1.vm08.stdout:9/79: dread d2/f4 [0,4194304] 0 2026-03-10T08:54:59.568 INFO:tasks.workunit.client.1.vm08.stdout:2/45: mkdir d1/da/df 0 2026-03-10T08:54:59.573 INFO:tasks.workunit.client.1.vm08.stdout:0/53: chown d6/fc 14892 1 2026-03-10T08:54:59.573 INFO:tasks.workunit.client.1.vm08.stdout:9/80: dwrite d2/fa [0,4194304] 0 2026-03-10T08:54:59.574 INFO:tasks.workunit.client.1.vm08.stdout:8/83: write d1/f17 [712403,40483] 0 2026-03-10T08:54:59.579 INFO:tasks.workunit.client.1.vm08.stdout:0/54: write f1 [1362439,94547] 0 2026-03-10T08:54:59.580 INFO:tasks.workunit.client.1.vm08.stdout:9/81: write d2/fa [2861958,75527] 0 2026-03-10T08:54:59.580 INFO:tasks.workunit.client.1.vm08.stdout:9/82: chown d2/fb 634398 1 2026-03-10T08:54:59.583 INFO:tasks.workunit.client.1.vm08.stdout:0/55: dread f4 [0,4194304] 0 2026-03-10T08:54:59.584 INFO:tasks.workunit.client.1.vm08.stdout:1/41: truncate d1/f8 1450712 0 2026-03-10T08:54:59.585 INFO:tasks.workunit.client.1.vm08.stdout:2/46: mkdir d1/da/d10 0 2026-03-10T08:54:59.585 INFO:tasks.workunit.client.1.vm08.stdout:1/42: chown d1/da 124 1 2026-03-10T08:54:59.585 INFO:tasks.workunit.client.1.vm08.stdout:1/43: chown d1/c5 249401 1 2026-03-10T08:54:59.586 INFO:tasks.workunit.client.1.vm08.stdout:1/44: chown d1/c5 53636 1 2026-03-10T08:54:59.591 
INFO:tasks.workunit.client.1.vm08.stdout:8/84: dwrite d1/f17 [0,4194304] 0 2026-03-10T08:54:59.592 INFO:tasks.workunit.client.1.vm08.stdout:8/85: write d1/f11 [2615726,128208] 0 2026-03-10T08:54:59.592 INFO:tasks.workunit.client.1.vm08.stdout:8/86: fsync d1/f8 0 2026-03-10T08:54:59.595 INFO:tasks.workunit.client.1.vm08.stdout:4/60: getdents d5/de 0 2026-03-10T08:54:59.604 INFO:tasks.workunit.client.1.vm08.stdout:4/61: stat d5/lf 0 2026-03-10T08:54:59.605 INFO:tasks.workunit.client.1.vm08.stdout:5/77: truncate d0/ff 1157827 0 2026-03-10T08:54:59.606 INFO:tasks.workunit.client.1.vm08.stdout:9/83: mknod d2/c14 0 2026-03-10T08:54:59.612 INFO:tasks.workunit.client.1.vm08.stdout:9/84: chown d2 182592721 1 2026-03-10T08:54:59.612 INFO:tasks.workunit.client.1.vm08.stdout:6/72: write f5 [2661017,128620] 0 2026-03-10T08:54:59.613 INFO:tasks.workunit.client.1.vm08.stdout:6/73: dread d9/dc/dd/ff [0,4194304] 0 2026-03-10T08:54:59.614 INFO:tasks.workunit.client.1.vm08.stdout:6/74: chown d9/d13/c18 888 1 2026-03-10T08:54:59.615 INFO:tasks.workunit.client.1.vm08.stdout:8/87: symlink d1/d10/d9/l1d 0 2026-03-10T08:54:59.616 INFO:tasks.workunit.client.1.vm08.stdout:8/88: write d1/d10/ff [922691,81701] 0 2026-03-10T08:54:59.618 INFO:tasks.workunit.client.1.vm08.stdout:5/78: rmdir d0/d11 39 2026-03-10T08:54:59.618 INFO:tasks.workunit.client.1.vm08.stdout:5/79: write d0/f10 [5221566,73482] 0 2026-03-10T08:54:59.622 INFO:tasks.workunit.client.1.vm08.stdout:9/85: mkdir d2/dd/d15 0 2026-03-10T08:54:59.622 INFO:tasks.workunit.client.1.vm08.stdout:9/86: write d2/f6 [1772648,29460] 0 2026-03-10T08:54:59.623 INFO:tasks.workunit.client.1.vm08.stdout:9/87: read f1 [803826,27955] 0 2026-03-10T08:54:59.623 INFO:tasks.workunit.client.1.vm08.stdout:0/56: mkdir d6/dd/d10 0 2026-03-10T08:54:59.625 INFO:tasks.workunit.client.1.vm08.stdout:2/47: symlink d1/da/df/l11 0 2026-03-10T08:54:59.626 INFO:tasks.workunit.client.1.vm08.stdout:2/48: readlink d1/da/df/l11 0 2026-03-10T08:54:59.631 
INFO:tasks.workunit.client.1.vm08.stdout:7/45: truncate d0/d2/f7 366984 0 2026-03-10T08:54:59.631 INFO:tasks.workunit.client.1.vm08.stdout:6/75: mkdir d9/d10/d1e 0 2026-03-10T08:54:59.633 INFO:tasks.workunit.client.1.vm08.stdout:8/89: creat d1/d10/f1e x:0 0 0 2026-03-10T08:54:59.643 INFO:tasks.workunit.client.1.vm08.stdout:3/75: truncate d4/d15/f7 2676687 0 2026-03-10T08:54:59.644 INFO:tasks.workunit.client.1.vm08.stdout:3/76: write d4/d15/fa [4049783,58840] 0 2026-03-10T08:54:59.644 INFO:tasks.workunit.client.1.vm08.stdout:3/77: readlink - no filename 2026-03-10T08:54:59.656 INFO:tasks.workunit.client.1.vm08.stdout:9/88: fsync f1 0 2026-03-10T08:54:59.656 INFO:tasks.workunit.client.1.vm08.stdout:9/89: chown d2/f13 1270533976 1 2026-03-10T08:54:59.661 INFO:tasks.workunit.client.1.vm08.stdout:0/57: creat d6/f11 x:0 0 0 2026-03-10T08:54:59.661 INFO:tasks.workunit.client.1.vm08.stdout:0/58: dread - f3 zero size 2026-03-10T08:54:59.661 INFO:tasks.workunit.client.1.vm08.stdout:0/59: fsync d6/fc 0 2026-03-10T08:54:59.666 INFO:tasks.workunit.client.1.vm08.stdout:2/49: mkdir d1/da/df/d12 0 2026-03-10T08:54:59.669 INFO:tasks.workunit.client.1.vm08.stdout:2/50: dwrite d1/f9 [0,4194304] 0 2026-03-10T08:54:59.677 INFO:tasks.workunit.client.1.vm08.stdout:6/76: symlink d9/dc/l1f 0 2026-03-10T08:54:59.679 INFO:tasks.workunit.client.1.vm08.stdout:4/62: link d5/la d5/de/l13 0 2026-03-10T08:54:59.680 INFO:tasks.workunit.client.1.vm08.stdout:5/80: rename d0/f15 to d0/d11/f1e 0 2026-03-10T08:54:59.681 INFO:tasks.workunit.client.1.vm08.stdout:5/81: dread - d0/f16 zero size 2026-03-10T08:54:59.683 INFO:tasks.workunit.client.1.vm08.stdout:4/63: dwrite d5/de/f11 [0,4194304] 0 2026-03-10T08:54:59.688 INFO:tasks.workunit.client.1.vm08.stdout:4/64: dread d5/fd [0,4194304] 0 2026-03-10T08:54:59.688 INFO:tasks.workunit.client.1.vm08.stdout:4/65: chown c3 371256 1 2026-03-10T08:54:59.690 INFO:tasks.workunit.client.1.vm08.stdout:3/78: dread f1 [0,4194304] 0 2026-03-10T08:54:59.691 
INFO:tasks.workunit.client.1.vm08.stdout:9/90: creat d2/dd/f16 x:0 0 0 2026-03-10T08:54:59.699 INFO:tasks.workunit.client.1.vm08.stdout:2/51: fdatasync d1/da/fb 0 2026-03-10T08:54:59.700 INFO:tasks.workunit.client.1.vm08.stdout:8/90: symlink d1/d10/d9/dd/d13/l1f 0 2026-03-10T08:54:59.701 INFO:tasks.workunit.client.1.vm08.stdout:8/91: read d1/f8 [415169,40718] 0 2026-03-10T08:54:59.702 INFO:tasks.workunit.client.1.vm08.stdout:8/92: write d1/f11 [4909142,58962] 0 2026-03-10T08:54:59.702 INFO:tasks.workunit.client.1.vm08.stdout:6/77: rename d9/d10/l16 to d9/d10/l20 0 2026-03-10T08:54:59.706 INFO:tasks.workunit.client.1.vm08.stdout:7/46: read d0/d2/f7 [289790,48395] 0 2026-03-10T08:54:59.706 INFO:tasks.workunit.client.1.vm08.stdout:5/82: readlink d0/l8 0 2026-03-10T08:54:59.707 INFO:tasks.workunit.client.1.vm08.stdout:8/93: dwrite d1/d10/ff [0,4194304] 0 2026-03-10T08:54:59.711 INFO:tasks.workunit.client.1.vm08.stdout:2/52: dread d1/f7 [0,4194304] 0 2026-03-10T08:54:59.711 INFO:tasks.workunit.client.1.vm08.stdout:2/53: stat d1/da/fb 0 2026-03-10T08:54:59.719 INFO:tasks.workunit.client.1.vm08.stdout:8/94: dread d1/d10/f12 [0,4194304] 0 2026-03-10T08:54:59.720 INFO:tasks.workunit.client.1.vm08.stdout:8/95: chown d1/d10/d9 16524750 1 2026-03-10T08:54:59.720 INFO:tasks.workunit.client.1.vm08.stdout:8/96: fdatasync d1/f8 0 2026-03-10T08:54:59.722 INFO:tasks.workunit.client.1.vm08.stdout:4/66: creat d5/f14 x:0 0 0 2026-03-10T08:54:59.724 INFO:tasks.workunit.client.1.vm08.stdout:3/79: chown d4/d15/c16 150 1 2026-03-10T08:54:59.727 INFO:tasks.workunit.client.1.vm08.stdout:1/45: truncate d1/f8 56072 0 2026-03-10T08:54:59.732 INFO:tasks.workunit.client.1.vm08.stdout:4/67: dwrite d5/fb [0,4194304] 0 2026-03-10T08:54:59.732 INFO:tasks.workunit.client.1.vm08.stdout:6/78: symlink d9/dc/l21 0 2026-03-10T08:54:59.737 INFO:tasks.workunit.client.1.vm08.stdout:7/47: rmdir d0/d2/da 39 2026-03-10T08:54:59.738 INFO:tasks.workunit.client.1.vm08.stdout:5/83: symlink d0/d11/l1f 0 
2026-03-10T08:54:59.743 INFO:tasks.workunit.client.1.vm08.stdout:8/97: symlink d1/d10/l20 0 2026-03-10T08:54:59.748 INFO:tasks.workunit.client.1.vm08.stdout:9/91: creat d2/dd/d15/f17 x:0 0 0 2026-03-10T08:54:59.748 INFO:tasks.workunit.client.1.vm08.stdout:5/84: dwrite d0/d11/d18/f1a [0,4194304] 0 2026-03-10T08:54:59.748 INFO:tasks.workunit.client.1.vm08.stdout:3/80: mknod d4/c19 0 2026-03-10T08:54:59.748 INFO:tasks.workunit.client.1.vm08.stdout:3/81: write d4/d15/fa [713387,18276] 0 2026-03-10T08:54:59.748 INFO:tasks.workunit.client.1.vm08.stdout:0/60: dwrite d6/fa [0,4194304] 0 2026-03-10T08:54:59.748 INFO:tasks.workunit.client.1.vm08.stdout:0/61: chown d6/f9 15081 1 2026-03-10T08:54:59.749 INFO:tasks.workunit.client.1.vm08.stdout:0/62: chown d6/f9 1818 1 2026-03-10T08:54:59.750 INFO:tasks.workunit.client.1.vm08.stdout:8/98: dread f0 [4194304,4194304] 0 2026-03-10T08:54:59.751 INFO:tasks.workunit.client.1.vm08.stdout:8/99: chown d1/d10/f1e 271315 1 2026-03-10T08:54:59.761 INFO:tasks.workunit.client.1.vm08.stdout:4/68: mknod d5/c15 0 2026-03-10T08:54:59.762 INFO:tasks.workunit.client.1.vm08.stdout:8/100: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:54:59.762 INFO:tasks.workunit.client.1.vm08.stdout:4/69: write d5/f14 [466291,45893] 0 2026-03-10T08:54:59.764 INFO:tasks.workunit.client.1.vm08.stdout:7/48: unlink d0/c4 0 2026-03-10T08:54:59.764 INFO:tasks.workunit.client.1.vm08.stdout:7/49: readlink - no filename 2026-03-10T08:54:59.764 INFO:tasks.workunit.client.1.vm08.stdout:6/79: rename c8 to d9/d13/d1a/c22 0 2026-03-10T08:54:59.767 INFO:tasks.workunit.client.1.vm08.stdout:0/63: mknod d6/c12 0 2026-03-10T08:54:59.767 INFO:tasks.workunit.client.1.vm08.stdout:5/85: stat d0/c9 0 2026-03-10T08:54:59.772 INFO:tasks.workunit.client.1.vm08.stdout:5/86: write d0/f16 [710006,111934] 0 2026-03-10T08:54:59.773 INFO:tasks.workunit.client.1.vm08.stdout:5/87: write d0/fe [912576,23855] 0 2026-03-10T08:54:59.787 INFO:tasks.workunit.client.1.vm08.stdout:1/46: write d1/f8 
[1066872,84321] 0 2026-03-10T08:54:59.787 INFO:tasks.workunit.client.1.vm08.stdout:8/101: mkdir d1/d10/d9/dd/d16/d21 0 2026-03-10T08:54:59.787 INFO:tasks.workunit.client.1.vm08.stdout:1/47: fdatasync d1/da/fb 0 2026-03-10T08:54:59.791 INFO:tasks.workunit.client.1.vm08.stdout:4/70: rename d5/l7 to d5/de/l16 0 2026-03-10T08:54:59.796 INFO:tasks.workunit.client.1.vm08.stdout:2/54: link d1/f4 d1/da/f13 0 2026-03-10T08:54:59.796 INFO:tasks.workunit.client.1.vm08.stdout:4/71: chown d5/f6 94370427 1 2026-03-10T08:54:59.796 INFO:tasks.workunit.client.1.vm08.stdout:6/80: mkdir d9/dc/d11/d23 0 2026-03-10T08:54:59.796 INFO:tasks.workunit.client.1.vm08.stdout:1/48: creat d1/fc x:0 0 0 2026-03-10T08:54:59.796 INFO:tasks.workunit.client.1.vm08.stdout:6/81: mknod d9/dc/dd/c24 0 2026-03-10T08:54:59.798 INFO:tasks.workunit.client.1.vm08.stdout:6/82: write f1 [4337810,25584] 0 2026-03-10T08:54:59.798 INFO:tasks.workunit.client.1.vm08.stdout:2/55: dwrite d1/f9 [0,4194304] 0 2026-03-10T08:54:59.801 INFO:tasks.workunit.client.1.vm08.stdout:0/64: getdents d6/dd 0 2026-03-10T08:54:59.801 INFO:tasks.workunit.client.1.vm08.stdout:2/56: chown d1/da/df 1286470930 1 2026-03-10T08:54:59.806 INFO:tasks.workunit.client.1.vm08.stdout:1/49: unlink d1/da/fb 0 2026-03-10T08:54:59.807 INFO:tasks.workunit.client.1.vm08.stdout:8/102: dwrite d1/d10/f12 [0,4194304] 0 2026-03-10T08:54:59.818 INFO:tasks.workunit.client.1.vm08.stdout:6/83: write f5 [1980236,47152] 0 2026-03-10T08:54:59.820 INFO:tasks.workunit.client.1.vm08.stdout:0/65: mkdir d6/dd/d13 0 2026-03-10T08:54:59.821 INFO:tasks.workunit.client.1.vm08.stdout:1/50: dwrite d1/fc [0,4194304] 0 2026-03-10T08:54:59.822 INFO:tasks.workunit.client.1.vm08.stdout:1/51: write d1/fc [225921,43760] 0 2026-03-10T08:54:59.824 INFO:tasks.workunit.client.1.vm08.stdout:8/103: dwrite d1/d10/d9/fb [0,4194304] 0 2026-03-10T08:54:59.824 INFO:tasks.workunit.client.1.vm08.stdout:1/52: read d1/fc [1967989,104857] 0 2026-03-10T08:54:59.832 
INFO:tasks.workunit.client.1.vm08.stdout:8/104: creat d1/d10/d9/dd/d16/f22 x:0 0 0 2026-03-10T08:54:59.834 INFO:tasks.workunit.client.1.vm08.stdout:0/66: mknod d6/dd/c14 0 2026-03-10T08:54:59.835 INFO:tasks.workunit.client.1.vm08.stdout:0/67: chown f4 1 1 2026-03-10T08:54:59.835 INFO:tasks.workunit.client.1.vm08.stdout:8/105: creat d1/d10/f23 x:0 0 0 2026-03-10T08:54:59.836 INFO:tasks.workunit.client.1.vm08.stdout:8/106: write d1/d10/f1e [169669,11554] 0 2026-03-10T08:54:59.838 INFO:tasks.workunit.client.1.vm08.stdout:8/107: write f0 [9990358,112913] 0 2026-03-10T08:54:59.839 INFO:tasks.workunit.client.1.vm08.stdout:8/108: chown d1/d10/f12 53558 1 2026-03-10T08:54:59.840 INFO:tasks.workunit.client.1.vm08.stdout:8/109: creat d1/d10/d9/dd/d13/f24 x:0 0 0 2026-03-10T08:54:59.841 INFO:tasks.workunit.client.1.vm08.stdout:8/110: write d1/d10/d9/dd/d16/f22 [940127,73804] 0 2026-03-10T08:54:59.999 INFO:tasks.workunit.client.1.vm08.stdout:4/72: sync 2026-03-10T08:55:00.002 INFO:tasks.workunit.client.1.vm08.stdout:4/73: dwrite d5/fb [0,4194304] 0 2026-03-10T08:55:00.008 INFO:tasks.workunit.client.1.vm08.stdout:4/74: link d5/de/l13 d5/l17 0 2026-03-10T08:55:00.010 INFO:tasks.workunit.client.1.vm08.stdout:4/75: mknod d5/c18 0 2026-03-10T08:55:00.010 INFO:tasks.workunit.client.1.vm08.stdout:4/76: readlink d5/lf 0 2026-03-10T08:55:00.012 INFO:tasks.workunit.client.1.vm08.stdout:2/57: sync 2026-03-10T08:55:00.015 INFO:tasks.workunit.client.1.vm08.stdout:4/77: dwrite d5/fb [0,4194304] 0 2026-03-10T08:55:00.230 INFO:tasks.workunit.client.1.vm08.stdout:3/82: fsync d4/d15/fa 0 2026-03-10T08:55:00.234 INFO:tasks.workunit.client.1.vm08.stdout:3/83: creat d4/d15/f1a x:0 0 0 2026-03-10T08:55:00.242 INFO:tasks.workunit.client.1.vm08.stdout:9/92: rmdir d2 39 2026-03-10T08:55:00.245 INFO:tasks.workunit.client.1.vm08.stdout:3/84: mknod d4/d15/d17/c1b 0 2026-03-10T08:55:00.247 INFO:tasks.workunit.client.1.vm08.stdout:3/85: mknod d4/c1c 0 2026-03-10T08:55:00.249 
INFO:tasks.workunit.client.1.vm08.stdout:7/50: rmdir d0 39 2026-03-10T08:55:00.258 INFO:tasks.workunit.client.1.vm08.stdout:9/93: rename f0 to d2/dd/f18 0 2026-03-10T08:55:00.277 INFO:tasks.workunit.client.1.vm08.stdout:3/86: mkdir d4/d15/d8/d1d 0 2026-03-10T08:55:00.277 INFO:tasks.workunit.client.1.vm08.stdout:3/87: fsync d4/f14 0 2026-03-10T08:55:00.277 INFO:tasks.workunit.client.1.vm08.stdout:3/88: stat d4/c1c 0 2026-03-10T08:55:00.281 INFO:tasks.workunit.client.1.vm08.stdout:2/58: dread d1/fd [0,4194304] 0 2026-03-10T08:55:00.288 INFO:tasks.workunit.client.1.vm08.stdout:8/111: dread d1/d10/d9/dd/d16/f22 [0,4194304] 0 2026-03-10T08:55:00.289 INFO:tasks.workunit.client.1.vm08.stdout:8/112: dread - d1/d10/f23 zero size 2026-03-10T08:55:00.292 INFO:tasks.workunit.client.1.vm08.stdout:9/94: rmdir d2 39 2026-03-10T08:55:00.298 INFO:tasks.workunit.client.1.vm08.stdout:3/89: truncate f1 1158341 0 2026-03-10T08:55:00.310 INFO:tasks.workunit.client.1.vm08.stdout:8/113: mkdir d1/d10/d9/dd/d25 0 2026-03-10T08:55:00.311 INFO:tasks.workunit.client.1.vm08.stdout:8/114: chown d1/f11 146098 1 2026-03-10T08:55:00.313 INFO:tasks.workunit.client.1.vm08.stdout:8/115: dread d1/f8 [0,4194304] 0 2026-03-10T08:55:00.316 INFO:tasks.workunit.client.1.vm08.stdout:7/51: mknod d0/d2/da/cd 0 2026-03-10T08:55:00.316 INFO:tasks.workunit.client.1.vm08.stdout:7/52: readlink - no filename 2026-03-10T08:55:00.320 INFO:tasks.workunit.client.1.vm08.stdout:2/59: rename d1/da/f13 to d1/da/df/f14 0 2026-03-10T08:55:00.333 INFO:tasks.workunit.client.1.vm08.stdout:8/116: creat d1/f26 x:0 0 0 2026-03-10T08:55:00.333 INFO:tasks.workunit.client.1.vm08.stdout:8/117: chown f0 114 1 2026-03-10T08:55:00.334 INFO:tasks.workunit.client.1.vm08.stdout:7/53: write d0/d2/f7 [1053987,119376] 0 2026-03-10T08:55:00.334 INFO:tasks.workunit.client.1.vm08.stdout:7/54: write d0/d2/f7 [1705375,114758] 0 2026-03-10T08:55:00.342 INFO:tasks.workunit.client.1.vm08.stdout:2/60: chown d1/da/lc 1322 1 2026-03-10T08:55:00.348 
INFO:tasks.workunit.client.1.vm08.stdout:9/95: truncate d2/f13 1114816 0 2026-03-10T08:55:00.357 INFO:tasks.workunit.client.1.vm08.stdout:4/78: fdatasync d5/fb 0 2026-03-10T08:55:00.364 INFO:tasks.workunit.client.1.vm08.stdout:7/55: unlink d0/d2/da/cd 0 2026-03-10T08:55:00.373 INFO:tasks.workunit.client.1.vm08.stdout:2/61: dwrite d1/fd [0,4194304] 0 2026-03-10T08:55:00.374 INFO:tasks.workunit.client.1.vm08.stdout:2/62: dread - d1/da/fe zero size 2026-03-10T08:55:00.375 INFO:tasks.workunit.client.1.vm08.stdout:8/118: mkdir d1/d10/d9/dd/d25/d27 0 2026-03-10T08:55:00.377 INFO:tasks.workunit.client.1.vm08.stdout:1/53: truncate d1/fc 1756139 0 2026-03-10T08:55:00.377 INFO:tasks.workunit.client.1.vm08.stdout:4/79: creat d5/f19 x:0 0 0 2026-03-10T08:55:00.377 INFO:tasks.workunit.client.1.vm08.stdout:1/54: chown d1/fc 136535 1 2026-03-10T08:55:00.378 INFO:tasks.workunit.client.1.vm08.stdout:2/63: dread d1/fd [0,4194304] 0 2026-03-10T08:55:00.386 INFO:tasks.workunit.client.1.vm08.stdout:6/84: truncate f5 2559934 0 2026-03-10T08:55:00.390 INFO:tasks.workunit.client.1.vm08.stdout:8/119: creat d1/d10/d9/dd/d16/f28 x:0 0 0 2026-03-10T08:55:00.395 INFO:tasks.workunit.client.1.vm08.stdout:8/120: write d1/d10/f12 [1888128,124289] 0 2026-03-10T08:55:00.395 INFO:tasks.workunit.client.1.vm08.stdout:4/80: creat d5/f1a x:0 0 0 2026-03-10T08:55:00.397 INFO:tasks.workunit.client.1.vm08.stdout:4/81: dwrite d5/f19 [0,4194304] 0 2026-03-10T08:55:00.398 INFO:tasks.workunit.client.1.vm08.stdout:4/82: fsync d5/fb 0 2026-03-10T08:55:00.398 INFO:tasks.workunit.client.1.vm08.stdout:0/68: write d6/fb [674518,12825] 0 2026-03-10T08:55:00.409 INFO:tasks.workunit.client.1.vm08.stdout:6/85: rmdir d9/dc/d11 39 2026-03-10T08:55:00.409 INFO:tasks.workunit.client.1.vm08.stdout:6/86: chown d9/dc/dd 168 1 2026-03-10T08:55:00.416 INFO:tasks.workunit.client.1.vm08.stdout:8/121: symlink d1/d10/d9/l29 0 2026-03-10T08:55:00.416 INFO:tasks.workunit.client.1.vm08.stdout:8/122: stat d1/d10/l19 0 
2026-03-10T08:55:00.417 INFO:tasks.workunit.client.1.vm08.stdout:8/123: write d1/d10/f1e [715702,55321] 0 2026-03-10T08:55:00.420 INFO:tasks.workunit.client.1.vm08.stdout:7/56: creat d0/fe x:0 0 0 2026-03-10T08:55:00.420 INFO:tasks.workunit.client.1.vm08.stdout:4/83: creat d5/de/f1b x:0 0 0 2026-03-10T08:55:00.423 INFO:tasks.workunit.client.1.vm08.stdout:2/64: rename d1/f3 to d1/f15 0 2026-03-10T08:55:00.439 INFO:tasks.workunit.client.1.vm08.stdout:7/57: creat d0/d2/da/ff x:0 0 0 2026-03-10T08:55:00.442 INFO:tasks.workunit.client.1.vm08.stdout:0/69: link d6/fc d6/f15 0 2026-03-10T08:55:00.443 INFO:tasks.workunit.client.1.vm08.stdout:0/70: write d6/f11 [730186,52949] 0 2026-03-10T08:55:00.443 INFO:tasks.workunit.client.1.vm08.stdout:2/65: symlink d1/da/l16 0 2026-03-10T08:55:00.451 INFO:tasks.workunit.client.1.vm08.stdout:0/71: fdatasync f4 0 2026-03-10T08:55:00.452 INFO:tasks.workunit.client.1.vm08.stdout:2/66: dread d1/f7 [0,4194304] 0 2026-03-10T08:55:00.452 INFO:tasks.workunit.client.1.vm08.stdout:2/67: fdatasync d1/f9 0 2026-03-10T08:55:00.454 INFO:tasks.workunit.client.1.vm08.stdout:7/58: symlink d0/l10 0 2026-03-10T08:55:00.465 INFO:tasks.workunit.client.1.vm08.stdout:2/68: mknod d1/da/df/c17 0 2026-03-10T08:55:00.469 INFO:tasks.workunit.client.1.vm08.stdout:4/84: link d5/l12 d5/de/l1c 0 2026-03-10T08:55:00.470 INFO:tasks.workunit.client.1.vm08.stdout:4/85: truncate d5/f1a 706481 0 2026-03-10T08:55:00.470 INFO:tasks.workunit.client.1.vm08.stdout:4/86: fdatasync d5/de/f11 0 2026-03-10T08:55:00.472 INFO:tasks.workunit.client.1.vm08.stdout:7/59: mkdir d0/d11 0 2026-03-10T08:55:00.473 INFO:tasks.workunit.client.1.vm08.stdout:7/60: chown d0/d2/da/ff 1496 1 2026-03-10T08:55:00.475 INFO:tasks.workunit.client.1.vm08.stdout:4/87: creat d5/f1d x:0 0 0 2026-03-10T08:55:00.478 INFO:tasks.workunit.client.1.vm08.stdout:4/88: creat d5/f1e x:0 0 0 2026-03-10T08:55:00.478 INFO:tasks.workunit.client.1.vm08.stdout:4/89: stat d5/lc 0 2026-03-10T08:55:00.486 
INFO:tasks.workunit.client.1.vm08.stdout:7/61: dwrite d0/d2/da/ff [0,4194304] 0 2026-03-10T08:55:00.487 INFO:tasks.workunit.client.1.vm08.stdout:4/90: rename d5/de/f11 to d5/de/f1f 0 2026-03-10T08:55:00.488 INFO:tasks.workunit.client.1.vm08.stdout:2/69: sync 2026-03-10T08:55:00.490 INFO:tasks.workunit.client.1.vm08.stdout:2/70: fdatasync d1/f2 0 2026-03-10T08:55:00.495 INFO:tasks.workunit.client.1.vm08.stdout:4/91: symlink d5/l20 0 2026-03-10T08:55:00.499 INFO:tasks.workunit.client.1.vm08.stdout:4/92: dread d5/f14 [0,4194304] 0 2026-03-10T08:55:00.509 INFO:tasks.workunit.client.1.vm08.stdout:4/93: creat d5/f21 x:0 0 0 2026-03-10T08:55:00.509 INFO:tasks.workunit.client.1.vm08.stdout:7/62: creat d0/d2/f12 x:0 0 0 2026-03-10T08:55:00.509 INFO:tasks.workunit.client.1.vm08.stdout:7/63: chown d0/d2 936685 1 2026-03-10T08:55:00.509 INFO:tasks.workunit.client.1.vm08.stdout:7/64: stat d0/fe 0 2026-03-10T08:55:00.509 INFO:tasks.workunit.client.1.vm08.stdout:7/65: write d0/fe [684964,59672] 0 2026-03-10T08:55:00.514 INFO:tasks.workunit.client.1.vm08.stdout:7/66: dwrite d0/d2/da/ff [0,4194304] 0 2026-03-10T08:55:00.526 INFO:tasks.workunit.client.1.vm08.stdout:4/94: dwrite d5/de/f1f [0,4194304] 0 2026-03-10T08:55:00.540 INFO:tasks.workunit.client.1.vm08.stdout:4/95: unlink d5/c18 0 2026-03-10T08:55:00.544 INFO:tasks.workunit.client.1.vm08.stdout:4/96: rename d5/l20 to d5/l22 0 2026-03-10T08:55:00.544 INFO:tasks.workunit.client.1.vm08.stdout:4/97: chown d5/lf 4936 1 2026-03-10T08:55:00.544 INFO:tasks.workunit.client.1.vm08.stdout:4/98: stat d5/de/f1f 0 2026-03-10T08:55:00.547 INFO:tasks.workunit.client.1.vm08.stdout:4/99: mkdir d5/d23 0 2026-03-10T08:55:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:00 vm08.local ceph-mon[57559]: pgmap v139: 65 pgs: 65 active+clean; 213 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 7.6 MiB/s wr, 386 op/s 2026-03-10T08:55:00.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:00 vm05.local ceph-mon[49713]: pgmap v139: 65 
pgs: 65 active+clean; 213 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 7.6 MiB/s wr, 386 op/s 2026-03-10T08:55:00.784 INFO:tasks.workunit.client.1.vm08.stdout:3/90: getdents d4 0 2026-03-10T08:55:00.787 INFO:tasks.workunit.client.1.vm08.stdout:5/88: dwrite d0/ff [0,4194304] 0 2026-03-10T08:55:00.787 INFO:tasks.workunit.client.1.vm08.stdout:3/91: creat d4/d15/d8/f1e x:0 0 0 2026-03-10T08:55:00.793 INFO:tasks.workunit.client.1.vm08.stdout:3/92: dwrite d4/d15/fc [0,4194304] 0 2026-03-10T08:55:00.793 INFO:tasks.workunit.client.1.vm08.stdout:3/93: truncate d4/d15/f1a 788537 0 2026-03-10T08:55:00.794 INFO:tasks.workunit.client.1.vm08.stdout:3/94: chown d4/d15/f12 410 1 2026-03-10T08:55:00.802 INFO:tasks.workunit.client.1.vm08.stdout:5/89: mknod d0/c20 0 2026-03-10T08:55:00.805 INFO:tasks.workunit.client.1.vm08.stdout:1/55: dwrite d1/fc [0,4194304] 0 2026-03-10T08:55:00.806 INFO:tasks.workunit.client.1.vm08.stdout:1/56: chown d1/fc 737 1 2026-03-10T08:55:00.807 INFO:tasks.workunit.client.1.vm08.stdout:6/87: truncate f6 2910719 0 2026-03-10T08:55:00.817 INFO:tasks.workunit.client.1.vm08.stdout:6/88: fsync f1 0 2026-03-10T08:55:00.817 INFO:tasks.workunit.client.1.vm08.stdout:3/95: dread d4/d15/f1a [0,4194304] 0 2026-03-10T08:55:00.817 INFO:tasks.workunit.client.1.vm08.stdout:1/57: rmdir d1 39 2026-03-10T08:55:00.820 INFO:tasks.workunit.client.1.vm08.stdout:7/67: fsync d0/d2/da/ff 0 2026-03-10T08:55:00.821 INFO:tasks.workunit.client.1.vm08.stdout:8/124: dwrite d1/f17 [4194304,4194304] 0 2026-03-10T08:55:00.821 INFO:tasks.workunit.client.1.vm08.stdout:7/68: fsync d0/d2/f7 0 2026-03-10T08:55:00.822 INFO:tasks.workunit.client.1.vm08.stdout:5/90: symlink d0/l21 0 2026-03-10T08:55:00.822 INFO:tasks.workunit.client.1.vm08.stdout:0/72: fsync d6/f15 0 2026-03-10T08:55:00.823 INFO:tasks.workunit.client.1.vm08.stdout:5/91: write d0/f10 [5726427,70242] 0 2026-03-10T08:55:00.834 INFO:tasks.workunit.client.1.vm08.stdout:8/125: creat d1/d10/f2a x:0 0 0 2026-03-10T08:55:00.834 
INFO:tasks.workunit.client.1.vm08.stdout:8/126: chown d1/d10/ff 179319783 1 2026-03-10T08:55:00.837 INFO:tasks.workunit.client.1.vm08.stdout:2/71: rmdir d1 39 2026-03-10T08:55:00.838 INFO:tasks.workunit.client.1.vm08.stdout:8/127: dwrite d1/d10/f23 [0,4194304] 0 2026-03-10T08:55:00.839 INFO:tasks.workunit.client.1.vm08.stdout:0/73: truncate f4 1740329 0 2026-03-10T08:55:00.840 INFO:tasks.workunit.client.1.vm08.stdout:0/74: chown f3 174061 1 2026-03-10T08:55:00.842 INFO:tasks.workunit.client.1.vm08.stdout:3/96: sync 2026-03-10T08:55:00.852 INFO:tasks.workunit.client.1.vm08.stdout:8/128: creat d1/d10/d9/dd/d18/f2b x:0 0 0 2026-03-10T08:55:00.852 INFO:tasks.workunit.client.1.vm08.stdout:8/129: truncate d1/f26 864633 0 2026-03-10T08:55:00.853 INFO:tasks.workunit.client.1.vm08.stdout:4/100: getdents d5 0 2026-03-10T08:55:00.856 INFO:tasks.workunit.client.1.vm08.stdout:3/97: creat d4/d15/d8/f1f x:0 0 0 2026-03-10T08:55:00.856 INFO:tasks.workunit.client.1.vm08.stdout:5/92: creat d0/d11/d18/d1c/f22 x:0 0 0 2026-03-10T08:55:00.858 INFO:tasks.workunit.client.1.vm08.stdout:8/130: mkdir d1/d2c 0 2026-03-10T08:55:00.860 INFO:tasks.workunit.client.1.vm08.stdout:0/75: link d6/fb d6/f16 0 2026-03-10T08:55:00.861 INFO:tasks.workunit.client.1.vm08.stdout:3/98: mkdir d4/d15/d17/d20 0 2026-03-10T08:55:00.863 INFO:tasks.workunit.client.1.vm08.stdout:2/72: creat d1/da/d10/f18 x:0 0 0 2026-03-10T08:55:00.863 INFO:tasks.workunit.client.1.vm08.stdout:2/73: fsync d1/f9 0 2026-03-10T08:55:00.864 INFO:tasks.workunit.client.1.vm08.stdout:4/101: symlink d5/d23/l24 0 2026-03-10T08:55:00.865 INFO:tasks.workunit.client.1.vm08.stdout:4/102: write d5/f19 [675707,38679] 0 2026-03-10T08:55:00.867 INFO:tasks.workunit.client.1.vm08.stdout:3/99: dread d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:00.885 INFO:tasks.workunit.client.1.vm08.stdout:7/69: link d0/d2/c9 d0/c13 0 2026-03-10T08:55:00.893 INFO:tasks.workunit.client.1.vm08.stdout:0/76: dwrite d6/fe [4194304,4194304] 0 2026-03-10T08:55:00.893 
INFO:tasks.workunit.client.1.vm08.stdout:7/70: dread d0/d2/da/ff [0,4194304] 0 2026-03-10T08:55:00.893 INFO:tasks.workunit.client.1.vm08.stdout:2/74: link d1/da/d10/f18 d1/f19 0 2026-03-10T08:55:00.893 INFO:tasks.workunit.client.1.vm08.stdout:5/93: getdents d0/d11 0 2026-03-10T08:55:00.894 INFO:tasks.workunit.client.1.vm08.stdout:3/100: creat d4/d15/d8/d1d/f21 x:0 0 0 2026-03-10T08:55:00.894 INFO:tasks.workunit.client.1.vm08.stdout:3/101: write d4/d15/d8/ff [855266,18974] 0 2026-03-10T08:55:00.894 INFO:tasks.workunit.client.1.vm08.stdout:5/94: dwrite d0/d11/d18/f1a [0,4194304] 0 2026-03-10T08:55:00.894 INFO:tasks.workunit.client.1.vm08.stdout:2/75: symlink d1/da/d10/l1a 0 2026-03-10T08:55:00.896 INFO:tasks.workunit.client.1.vm08.stdout:0/77: mkdir d6/dd/d13/d17 0 2026-03-10T08:55:00.896 INFO:tasks.workunit.client.1.vm08.stdout:3/102: mknod d4/d15/c22 0 2026-03-10T08:55:00.896 INFO:tasks.workunit.client.1.vm08.stdout:5/95: write d0/d11/f1e [351877,75700] 0 2026-03-10T08:55:00.897 INFO:tasks.workunit.client.1.vm08.stdout:5/96: dread - d0/d11/d18/d1c/f22 zero size 2026-03-10T08:55:00.900 INFO:tasks.workunit.client.1.vm08.stdout:5/97: unlink d0/f1d 0 2026-03-10T08:55:00.901 INFO:tasks.workunit.client.1.vm08.stdout:5/98: read d0/f10 [3294836,117387] 0 2026-03-10T08:55:00.901 INFO:tasks.workunit.client.1.vm08.stdout:5/99: fsync d0/f16 0 2026-03-10T08:55:00.910 INFO:tasks.workunit.client.1.vm08.stdout:2/76: getdents d1/da/df 0 2026-03-10T08:55:00.910 INFO:tasks.workunit.client.1.vm08.stdout:3/103: link d4/d15/d8/c13 d4/d15/d8/c23 0 2026-03-10T08:55:00.912 INFO:tasks.workunit.client.1.vm08.stdout:5/100: rename d0/f10 to d0/d11/d18/f23 0 2026-03-10T08:55:00.913 INFO:tasks.workunit.client.1.vm08.stdout:3/104: creat d4/d15/d8/f24 x:0 0 0 2026-03-10T08:55:00.914 INFO:tasks.workunit.client.1.vm08.stdout:3/105: dread - d4/f14 zero size 2026-03-10T08:55:00.915 INFO:tasks.workunit.client.1.vm08.stdout:3/106: chown d4/f10 489315 1 2026-03-10T08:55:00.916 
INFO:tasks.workunit.client.1.vm08.stdout:2/77: rename d1/da/df to d1/da/d10/d1b 0 2026-03-10T08:55:00.920 INFO:tasks.workunit.client.1.vm08.stdout:5/101: dwrite d0/d11/f1e [0,4194304] 0 2026-03-10T08:55:00.923 INFO:tasks.workunit.client.1.vm08.stdout:2/78: truncate d1/f4 871590 0 2026-03-10T08:55:00.927 INFO:tasks.workunit.client.1.vm08.stdout:5/102: symlink d0/d11/d18/l24 0 2026-03-10T08:55:00.927 INFO:tasks.workunit.client.1.vm08.stdout:5/103: chown d0/d11/d18/d1c/f22 5453 1 2026-03-10T08:55:00.927 INFO:tasks.workunit.client.1.vm08.stdout:2/79: mkdir d1/da/d10/d1b/d1c 0 2026-03-10T08:55:00.934 INFO:tasks.workunit.client.1.vm08.stdout:2/80: creat d1/da/d10/d1b/d12/f1d x:0 0 0 2026-03-10T08:55:00.934 INFO:tasks.workunit.client.1.vm08.stdout:5/104: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:00.940 INFO:tasks.workunit.client.1.vm08.stdout:2/81: mkdir d1/da/d10/d1b/d12/d1e 0 2026-03-10T08:55:00.942 INFO:tasks.workunit.client.1.vm08.stdout:5/105: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:00.944 INFO:tasks.workunit.client.1.vm08.stdout:5/106: write d0/d11/f1e [1439600,75025] 0 2026-03-10T08:55:00.948 INFO:tasks.workunit.client.1.vm08.stdout:2/82: dwrite d1/da/fe [0,4194304] 0 2026-03-10T08:55:00.967 INFO:tasks.workunit.client.1.vm08.stdout:5/107: dwrite d0/d11/d18/f23 [0,4194304] 0 2026-03-10T08:55:00.970 INFO:tasks.workunit.client.1.vm08.stdout:2/83: write d1/fd [3225679,20977] 0 2026-03-10T08:55:00.973 INFO:tasks.workunit.client.1.vm08.stdout:5/108: dwrite d0/fb [0,4194304] 0 2026-03-10T08:55:00.981 INFO:tasks.workunit.client.1.vm08.stdout:5/109: dread d0/d11/d18/f1a [0,4194304] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/110: dwrite d0/d11/d18/f1a [0,4194304] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/111: write d0/ff [1448844,23271] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/112: write d0/fe [1850023,16355] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/113: write 
d0/d11/f1e [651722,113051] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:2/84: write d1/da/d10/f18 [942349,47395] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/114: dwrite d0/d11/d18/f1a [4194304,4194304] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/115: stat d0 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/116: creat d0/d11/f25 x:0 0 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/117: truncate d0/d11/f1e 4993972 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:2/85: creat d1/da/d10/d1b/d12/d1e/f1f x:0 0 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/118: mknod d0/d11/d18/d1c/c26 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:2/86: dread d1/fd [0,4194304] 0 2026-03-10T08:55:00.998 INFO:tasks.workunit.client.1.vm08.stdout:5/119: chown d0/c6 88125749 1 2026-03-10T08:55:01.004 INFO:tasks.workunit.client.1.vm08.stdout:2/87: rename d1/f7 to d1/da/d10/d1b/d1c/f20 0 2026-03-10T08:55:01.006 INFO:tasks.workunit.client.1.vm08.stdout:2/88: creat d1/da/f21 x:0 0 0 2026-03-10T08:55:01.008 INFO:tasks.workunit.client.1.vm08.stdout:2/89: write d1/fd [4697045,99712] 0 2026-03-10T08:55:01.009 INFO:tasks.workunit.client.1.vm08.stdout:2/90: read - d1/da/d10/d1b/d12/d1e/f1f zero size 2026-03-10T08:55:01.011 INFO:tasks.workunit.client.1.vm08.stdout:2/91: mkdir d1/da/d10/d1b/d12/d22 0 2026-03-10T08:55:01.012 INFO:tasks.workunit.client.1.vm08.stdout:2/92: mkdir d1/da/d10/d1b/d12/d23 0 2026-03-10T08:55:01.015 INFO:tasks.workunit.client.1.vm08.stdout:2/93: creat d1/da/d10/d1b/d12/d23/f24 x:0 0 0 2026-03-10T08:55:01.017 INFO:tasks.workunit.client.1.vm08.stdout:2/94: mknod d1/da/c25 0 2026-03-10T08:55:01.017 INFO:tasks.workunit.client.1.vm08.stdout:2/95: readlink d1/l5 0 2026-03-10T08:55:01.020 INFO:tasks.workunit.client.1.vm08.stdout:2/96: mknod d1/da/d10/d1b/d12/d1e/c26 0 2026-03-10T08:55:01.028 
INFO:tasks.workunit.client.1.vm08.stdout:0/78: sync 2026-03-10T08:55:01.051 INFO:tasks.workunit.client.1.vm08.stdout:0/79: dread d6/fa [0,4194304] 0 2026-03-10T08:55:01.051 INFO:tasks.workunit.client.1.vm08.stdout:0/80: dread d6/fe [4194304,4194304] 0 2026-03-10T08:55:01.051 INFO:tasks.workunit.client.1.vm08.stdout:0/81: write f3 [528684,115621] 0 2026-03-10T08:55:01.051 INFO:tasks.workunit.client.1.vm08.stdout:0/82: dwrite d6/f11 [0,4194304] 0 2026-03-10T08:55:01.051 INFO:tasks.workunit.client.1.vm08.stdout:0/83: chown d6/fa 10525 1 2026-03-10T08:55:01.111 INFO:tasks.workunit.client.1.vm08.stdout:0/84: creat d6/f18 x:0 0 0 2026-03-10T08:55:01.122 INFO:tasks.workunit.client.1.vm08.stdout:0/85: mkdir d6/dd/d13/d17/d19 0 2026-03-10T08:55:01.125 INFO:tasks.workunit.client.1.vm08.stdout:0/86: symlink d6/dd/d13/l1a 0 2026-03-10T08:55:01.125 INFO:tasks.workunit.client.1.vm08.stdout:0/87: chown d6/dd/d13 384373 1 2026-03-10T08:55:01.127 INFO:tasks.workunit.client.1.vm08.stdout:0/88: dread f1 [0,4194304] 0 2026-03-10T08:55:01.136 INFO:tasks.workunit.client.1.vm08.stdout:9/96: dread d2/fb [0,4194304] 0 2026-03-10T08:55:01.140 INFO:tasks.workunit.client.1.vm08.stdout:9/97: symlink d2/l19 0 2026-03-10T08:55:01.140 INFO:tasks.workunit.client.1.vm08.stdout:9/98: creat d2/f1a x:0 0 0 2026-03-10T08:55:01.141 INFO:tasks.workunit.client.1.vm08.stdout:9/99: creat d2/dd/d15/f1b x:0 0 0 2026-03-10T08:55:01.144 INFO:tasks.workunit.client.1.vm08.stdout:9/100: unlink d2/dd/l11 0 2026-03-10T08:55:01.146 INFO:tasks.workunit.client.1.vm08.stdout:9/101: fsync d2/f4 0 2026-03-10T08:55:01.146 INFO:tasks.workunit.client.1.vm08.stdout:9/102: fdatasync d2/dd/d15/f1b 0 2026-03-10T08:55:01.148 INFO:tasks.workunit.client.1.vm08.stdout:0/89: sync 2026-03-10T08:55:01.157 INFO:tasks.workunit.client.1.vm08.stdout:0/90: creat d6/dd/d10/f1b x:0 0 0 2026-03-10T08:55:01.175 INFO:tasks.workunit.client.1.vm08.stdout:3/107: fdatasync d4/d15/d8/ff 0 2026-03-10T08:55:01.187 
INFO:tasks.workunit.client.1.vm08.stdout:3/108: write d4/d15/f1a [27756,102647] 0 2026-03-10T08:55:01.188 INFO:tasks.workunit.client.1.vm08.stdout:3/109: read d4/d15/fc [2682110,93972] 0 2026-03-10T08:55:01.203 INFO:tasks.workunit.client.1.vm08.stdout:3/110: sync 2026-03-10T08:55:01.207 INFO:tasks.workunit.client.1.vm08.stdout:3/111: getdents d4/d15/d17/d20 0 2026-03-10T08:55:01.208 INFO:tasks.workunit.client.1.vm08.stdout:3/112: dread - d4/f10 zero size 2026-03-10T08:55:01.217 INFO:tasks.workunit.client.1.vm08.stdout:3/113: link d4/d15/d8/c13 d4/d15/d17/d20/c25 0 2026-03-10T08:55:01.220 INFO:tasks.workunit.client.1.vm08.stdout:3/114: mknod d4/d15/c26 0 2026-03-10T08:55:01.222 INFO:tasks.workunit.client.1.vm08.stdout:3/115: chown d4/c19 0 1 2026-03-10T08:55:01.284 INFO:tasks.workunit.client.1.vm08.stdout:6/89: write f6 [2268478,108127] 0 2026-03-10T08:55:01.292 INFO:tasks.workunit.client.1.vm08.stdout:6/90: dread d9/fa [0,4194304] 0 2026-03-10T08:55:01.293 INFO:tasks.workunit.client.1.vm08.stdout:6/91: read d9/dc/dd/f12 [218823,63021] 0 2026-03-10T08:55:01.294 INFO:tasks.workunit.client.1.vm08.stdout:9/103: dwrite d2/f13 [0,4194304] 0 2026-03-10T08:55:01.295 INFO:tasks.workunit.client.1.vm08.stdout:6/92: write d9/dc/f1b [255171,46450] 0 2026-03-10T08:55:01.300 INFO:tasks.workunit.client.1.vm08.stdout:9/104: dwrite d2/fa [0,4194304] 0 2026-03-10T08:55:01.303 INFO:tasks.workunit.client.1.vm08.stdout:6/93: dwrite d9/dc/dd/ff [0,4194304] 0 2026-03-10T08:55:01.306 INFO:tasks.workunit.client.1.vm08.stdout:6/94: write f6 [674249,27234] 0 2026-03-10T08:55:01.327 INFO:tasks.workunit.client.1.vm08.stdout:9/105: fsync d2/dd/f18 0 2026-03-10T08:55:01.331 INFO:tasks.workunit.client.1.vm08.stdout:8/131: dwrite d1/d10/f23 [4194304,4194304] 0 2026-03-10T08:55:01.331 INFO:tasks.workunit.client.1.vm08.stdout:1/58: truncate d1/fc 3063882 0 2026-03-10T08:55:01.333 INFO:tasks.workunit.client.1.vm08.stdout:6/95: unlink f6 0 2026-03-10T08:55:01.339 
INFO:tasks.workunit.client.1.vm08.stdout:8/132: dread d1/d10/f1e [0,4194304] 0 2026-03-10T08:55:01.342 INFO:tasks.workunit.client.1.vm08.stdout:9/106: mknod d2/dd/d15/c1c 0 2026-03-10T08:55:01.349 INFO:tasks.workunit.client.1.vm08.stdout:7/71: rename d0/d2 to d0/d14 0 2026-03-10T08:55:01.349 INFO:tasks.workunit.client.1.vm08.stdout:7/72: fsync d0/d14/f7 0 2026-03-10T08:55:01.349 INFO:tasks.workunit.client.1.vm08.stdout:1/59: creat d1/fd x:0 0 0 2026-03-10T08:55:01.350 INFO:tasks.workunit.client.1.vm08.stdout:7/73: truncate d0/d14/f12 833859 0 2026-03-10T08:55:01.350 INFO:tasks.workunit.client.1.vm08.stdout:7/74: stat d0/l10 0 2026-03-10T08:55:01.354 INFO:tasks.workunit.client.1.vm08.stdout:6/96: creat d9/d10/f25 x:0 0 0 2026-03-10T08:55:01.355 INFO:tasks.workunit.client.1.vm08.stdout:6/97: fsync f1 0 2026-03-10T08:55:01.357 INFO:tasks.workunit.client.1.vm08.stdout:9/107: symlink d2/dd/l1d 0 2026-03-10T08:55:01.357 INFO:tasks.workunit.client.1.vm08.stdout:9/108: chown d2/dd/d15/c1c 72561222 1 2026-03-10T08:55:01.357 INFO:tasks.workunit.client.1.vm08.stdout:9/109: chown d2/f4 7 1 2026-03-10T08:55:01.363 INFO:tasks.workunit.client.1.vm08.stdout:4/103: write d5/fd [745642,3627] 0 2026-03-10T08:55:01.364 INFO:tasks.workunit.client.1.vm08.stdout:8/133: rename d1/d10/f1e to d1/d10/f2d 0 2026-03-10T08:55:01.366 INFO:tasks.workunit.client.1.vm08.stdout:9/110: mkdir d2/dd/d15/d1e 0 2026-03-10T08:55:01.373 INFO:tasks.workunit.client.1.vm08.stdout:7/75: mknod d0/d11/c15 0 2026-03-10T08:55:01.374 INFO:tasks.workunit.client.1.vm08.stdout:7/76: write d0/d14/f7 [143808,107054] 0 2026-03-10T08:55:01.374 INFO:tasks.workunit.client.1.vm08.stdout:7/77: readlink d0/l10 0 2026-03-10T08:55:01.397 INFO:tasks.workunit.client.1.vm08.stdout:3/116: fsync d4/d15/d8/f24 0 2026-03-10T08:55:01.404 INFO:tasks.workunit.client.1.vm08.stdout:2/97: unlink d1/f15 0 2026-03-10T08:55:01.422 INFO:tasks.workunit.client.1.vm08.stdout:6/98: fdatasync d9/dc/dd/ff 0 2026-03-10T08:55:01.425 
INFO:tasks.workunit.client.1.vm08.stdout:6/99: dread - d9/dc/dd/f17 zero size 2026-03-10T08:55:01.426 INFO:tasks.workunit.client.1.vm08.stdout:6/100: write f1 [3810336,119827] 0 2026-03-10T08:55:01.428 INFO:tasks.workunit.client.1.vm08.stdout:7/78: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:01.430 INFO:tasks.workunit.client.1.vm08.stdout:7/79: write d0/d14/da/ff [4297814,117524] 0 2026-03-10T08:55:01.431 INFO:tasks.workunit.client.1.vm08.stdout:7/80: write d0/d14/f12 [194488,72463] 0 2026-03-10T08:55:01.433 INFO:tasks.workunit.client.1.vm08.stdout:6/101: creat d9/d10/f26 x:0 0 0 2026-03-10T08:55:01.437 INFO:tasks.workunit.client.1.vm08.stdout:7/81: write d0/d14/f7 [802651,36136] 0 2026-03-10T08:55:01.440 INFO:tasks.workunit.client.1.vm08.stdout:6/102: creat d9/dc/dd/f27 x:0 0 0 2026-03-10T08:55:01.441 INFO:tasks.workunit.client.1.vm08.stdout:6/103: write f5 [191422,105284] 0 2026-03-10T08:55:01.441 INFO:tasks.workunit.client.1.vm08.stdout:6/104: chown d9/d13 30274289 1 2026-03-10T08:55:01.448 INFO:tasks.workunit.client.1.vm08.stdout:7/82: creat d0/f16 x:0 0 0 2026-03-10T08:55:01.448 INFO:tasks.workunit.client.1.vm08.stdout:7/83: read d0/d14/f7 [1549261,48128] 0 2026-03-10T08:55:01.449 INFO:tasks.workunit.client.1.vm08.stdout:7/84: stat d0/d14/f7 0 2026-03-10T08:55:01.450 INFO:tasks.workunit.client.1.vm08.stdout:6/105: sync 2026-03-10T08:55:01.451 INFO:tasks.workunit.client.1.vm08.stdout:6/106: read f5 [2457799,31982] 0 2026-03-10T08:55:01.451 INFO:tasks.workunit.client.1.vm08.stdout:6/107: chown d9 153738 1 2026-03-10T08:55:01.451 INFO:tasks.workunit.client.1.vm08.stdout:6/108: readlink d9/dc/l21 0 2026-03-10T08:55:01.451 INFO:tasks.workunit.client.1.vm08.stdout:6/109: read f5 [2769947,6849] 0 2026-03-10T08:55:01.468 INFO:tasks.workunit.client.1.vm08.stdout:7/85: getdents d0/d14/da 0 2026-03-10T08:55:01.469 INFO:tasks.workunit.client.1.vm08.stdout:6/110: link d9/d10/l20 d9/d13/l28 0 2026-03-10T08:55:01.473 INFO:tasks.workunit.client.1.vm08.stdout:7/86: dwrite 
d0/f16 [0,4194304] 0 2026-03-10T08:55:01.475 INFO:tasks.workunit.client.1.vm08.stdout:7/87: write d0/f16 [1967477,67132] 0 2026-03-10T08:55:01.476 INFO:tasks.workunit.client.1.vm08.stdout:7/88: write d0/d14/f12 [1166903,32708] 0 2026-03-10T08:55:01.481 INFO:tasks.workunit.client.1.vm08.stdout:7/89: dread d0/fe [0,4194304] 0 2026-03-10T08:55:01.484 INFO:tasks.workunit.client.1.vm08.stdout:6/111: creat d9/dc/d11/f29 x:0 0 0 2026-03-10T08:55:01.485 INFO:tasks.workunit.client.1.vm08.stdout:6/112: fdatasync d9/dc/f1b 0 2026-03-10T08:55:01.486 INFO:tasks.workunit.client.1.vm08.stdout:7/90: dwrite d0/d14/da/ff [0,4194304] 0 2026-03-10T08:55:01.495 INFO:tasks.workunit.client.1.vm08.stdout:7/91: dwrite d0/fe [0,4194304] 0 2026-03-10T08:55:01.502 INFO:tasks.workunit.client.1.vm08.stdout:5/120: rename d0/d11/d18/d1c to d0/d11/d27 0 2026-03-10T08:55:01.523 INFO:tasks.workunit.client.1.vm08.stdout:7/92: symlink d0/d14/l17 0 2026-03-10T08:55:01.523 INFO:tasks.workunit.client.1.vm08.stdout:7/93: stat d0/f16 0 2026-03-10T08:55:01.524 INFO:tasks.workunit.client.1.vm08.stdout:5/121: symlink d0/d1b/l28 0 2026-03-10T08:55:01.524 INFO:tasks.workunit.client.1.vm08.stdout:5/122: write d0/d11/d27/f22 [725789,36441] 0 2026-03-10T08:55:01.527 INFO:tasks.workunit.client.1.vm08.stdout:5/123: dread d0/d11/d18/f1a [0,4194304] 0 2026-03-10T08:55:01.528 INFO:tasks.workunit.client.1.vm08.stdout:6/113: link d9/fa d9/d10/d1e/f2a 0 2026-03-10T08:55:01.529 INFO:tasks.workunit.client.1.vm08.stdout:6/114: chown d9/d13/c14 2583935 1 2026-03-10T08:55:01.529 INFO:tasks.workunit.client.1.vm08.stdout:6/115: write d9/d10/f26 [944077,81443] 0 2026-03-10T08:55:01.542 INFO:tasks.workunit.client.1.vm08.stdout:5/124: sync 2026-03-10T08:55:01.542 INFO:tasks.workunit.client.1.vm08.stdout:6/116: dread d9/d10/d1e/f2a [0,4194304] 0 2026-03-10T08:55:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:01 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": 
"osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:01.665 INFO:tasks.workunit.client.1.vm08.stdout:0/91: rename d6/dd/d10 to d6/dd/d13/d17/d1c 0 2026-03-10T08:55:01.665 INFO:tasks.workunit.client.1.vm08.stdout:4/104: rename d5 to d5/d25 22 2026-03-10T08:55:01.670 INFO:tasks.workunit.client.1.vm08.stdout:8/134: rename d1/d10/d9/dd/d16 to d1/d2c/d2e 0 2026-03-10T08:55:01.672 INFO:tasks.workunit.client.1.vm08.stdout:8/135: write d1/d10/d9/fb [1901554,42176] 0 2026-03-10T08:55:01.673 INFO:tasks.workunit.client.1.vm08.stdout:4/105: mknod d5/c26 0 2026-03-10T08:55:01.680 INFO:tasks.workunit.client.1.vm08.stdout:9/111: rename d2/c14 to d2/dd/d15/c1f 0 2026-03-10T08:55:01.684 INFO:tasks.workunit.client.1.vm08.stdout:8/136: unlink d1/f17 0 2026-03-10T08:55:01.684 INFO:tasks.workunit.client.1.vm08.stdout:9/112: dwrite d2/f6 [0,4194304] 0 2026-03-10T08:55:01.685 INFO:tasks.workunit.client.1.vm08.stdout:8/137: fdatasync d1/d10/ff 0 2026-03-10T08:55:01.686 INFO:tasks.workunit.client.1.vm08.stdout:8/138: dread - d1/d10/d9/dd/d13/f24 zero size 2026-03-10T08:55:01.687 INFO:tasks.workunit.client.1.vm08.stdout:0/92: link f5 d6/dd/d13/d17/f1d 0 2026-03-10T08:55:01.687 INFO:tasks.workunit.client.1.vm08.stdout:8/139: chown d1/d2c/d2e/d21 34592 1 2026-03-10T08:55:01.702 INFO:tasks.workunit.client.1.vm08.stdout:9/113: mknod d2/c20 0 2026-03-10T08:55:01.706 INFO:tasks.workunit.client.1.vm08.stdout:9/114: dwrite d2/dd/d15/f17 [0,4194304] 0 2026-03-10T08:55:01.708 INFO:tasks.workunit.client.1.vm08.stdout:8/140: unlink d1/d10/ff 0 2026-03-10T08:55:01.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:01 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:01.716 INFO:tasks.workunit.client.1.vm08.stdout:3/117: rename d4/d15/c16 to d4/d15/d17/c27 0 2026-03-10T08:55:01.716 INFO:tasks.workunit.client.1.vm08.stdout:8/141: write d1/d10/f2a [177112,49958] 0 
2026-03-10T08:55:01.716 INFO:tasks.workunit.client.1.vm08.stdout:3/118: write d4/d15/d8/d1d/f21 [163158,40268] 0 2026-03-10T08:55:01.716 INFO:tasks.workunit.client.1.vm08.stdout:3/119: truncate d4/d15/f11 453889 0 2026-03-10T08:55:01.716 INFO:tasks.workunit.client.1.vm08.stdout:8/142: dread - d1/d10/d9/dd/d13/f24 zero size 2026-03-10T08:55:01.727 INFO:tasks.workunit.client.1.vm08.stdout:8/143: sync 2026-03-10T08:55:01.727 INFO:tasks.workunit.client.1.vm08.stdout:8/144: dread - d1/d10/d9/dd/d18/f2b zero size 2026-03-10T08:55:01.730 INFO:tasks.workunit.client.1.vm08.stdout:8/145: dread d1/d10/f23 [4194304,4194304] 0 2026-03-10T08:55:01.733 INFO:tasks.workunit.client.1.vm08.stdout:2/98: rename d1/f2 to d1/da/f27 0 2026-03-10T08:55:01.733 INFO:tasks.workunit.client.1.vm08.stdout:8/146: dwrite d1/d10/f12 [0,4194304] 0 2026-03-10T08:55:01.744 INFO:tasks.workunit.client.1.vm08.stdout:3/120: truncate d4/d15/fc 973204 0 2026-03-10T08:55:01.746 INFO:tasks.workunit.client.1.vm08.stdout:4/106: getdents d5/d23 0 2026-03-10T08:55:01.749 INFO:tasks.workunit.client.1.vm08.stdout:3/121: dwrite d4/d15/f12 [0,4194304] 0 2026-03-10T08:55:01.755 INFO:tasks.workunit.client.1.vm08.stdout:9/115: mkdir d2/dd/d15/d1e/d21 0 2026-03-10T08:55:01.764 INFO:tasks.workunit.client.1.vm08.stdout:2/99: creat d1/da/d10/d1b/f28 x:0 0 0 2026-03-10T08:55:01.765 INFO:tasks.workunit.client.1.vm08.stdout:2/100: read d1/f9 [4043757,73692] 0 2026-03-10T08:55:01.767 INFO:tasks.workunit.client.1.vm08.stdout:7/94: rename d0/l10 to d0/d14/da/l18 0 2026-03-10T08:55:01.776 INFO:tasks.workunit.client.1.vm08.stdout:8/147: symlink d1/d10/d9/l2f 0 2026-03-10T08:55:01.776 INFO:tasks.workunit.client.1.vm08.stdout:1/60: truncate d1/f8 469185 0 2026-03-10T08:55:01.782 INFO:tasks.workunit.client.1.vm08.stdout:3/122: dwrite d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:01.791 INFO:tasks.workunit.client.1.vm08.stdout:3/123: dwrite d4/f14 [0,4194304] 0 2026-03-10T08:55:01.792 INFO:tasks.workunit.client.1.vm08.stdout:6/117: getdents 
d9/d10 0 2026-03-10T08:55:01.792 INFO:tasks.workunit.client.1.vm08.stdout:2/101: mknod d1/da/d10/d1b/d12/d1e/c29 0 2026-03-10T08:55:01.797 INFO:tasks.workunit.client.1.vm08.stdout:5/125: rename d0/d11/d27/f22 to d0/d11/f29 0 2026-03-10T08:55:01.801 INFO:tasks.workunit.client.1.vm08.stdout:7/95: rename d0/d11 to d0/d11/d19 22 2026-03-10T08:55:01.802 INFO:tasks.workunit.client.1.vm08.stdout:8/148: creat d1/d2c/f30 x:0 0 0 2026-03-10T08:55:01.803 INFO:tasks.workunit.client.1.vm08.stdout:1/61: mkdir d1/da/de 0 2026-03-10T08:55:01.804 INFO:tasks.workunit.client.1.vm08.stdout:7/96: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:01.806 INFO:tasks.workunit.client.1.vm08.stdout:8/149: dread d1/f11 [0,4194304] 0 2026-03-10T08:55:01.817 INFO:tasks.workunit.client.1.vm08.stdout:2/102: creat d1/da/d10/d1b/f2a x:0 0 0 2026-03-10T08:55:01.834 INFO:tasks.workunit.client.1.vm08.stdout:6/118: mknod d9/d13/d1a/c2b 0 2026-03-10T08:55:01.834 INFO:tasks.workunit.client.1.vm08.stdout:6/119: dread - d9/d13/f15 zero size 2026-03-10T08:55:01.842 INFO:tasks.workunit.client.1.vm08.stdout:0/93: getdents d6/dd 0 2026-03-10T08:55:01.847 INFO:tasks.workunit.client.1.vm08.stdout:5/126: link d0/d11/d18/f1a d0/d11/d27/f2a 0 2026-03-10T08:55:01.850 INFO:tasks.workunit.client.1.vm08.stdout:0/94: mknod d6/dd/c1e 0 2026-03-10T08:55:01.852 INFO:tasks.workunit.client.1.vm08.stdout:7/97: link d0/d14/da/cc d0/c1a 0 2026-03-10T08:55:01.853 INFO:tasks.workunit.client.1.vm08.stdout:7/98: stat d0 0 2026-03-10T08:55:01.854 INFO:tasks.workunit.client.1.vm08.stdout:3/124: getdents d4/d15 0 2026-03-10T08:55:01.858 INFO:tasks.workunit.client.1.vm08.stdout:0/95: mkdir d6/dd/d13/d17/d1f 0 2026-03-10T08:55:01.859 INFO:tasks.workunit.client.1.vm08.stdout:3/125: symlink d4/d15/d17/l28 0 2026-03-10T08:55:01.861 INFO:tasks.workunit.client.1.vm08.stdout:7/99: rename d0/d14/c9 to d0/d11/c1b 0 2026-03-10T08:55:01.862 INFO:tasks.workunit.client.1.vm08.stdout:7/100: chown d0/d14/da/ff 75044416 1 2026-03-10T08:55:01.862 
INFO:tasks.workunit.client.1.vm08.stdout:9/116: getdents d2/dd/d15/d1e 0 2026-03-10T08:55:01.864 INFO:tasks.workunit.client.1.vm08.stdout:0/96: rename d6/dd/d13/d17/d19 to d6/dd/d13/d17/d1f/d20 0 2026-03-10T08:55:01.864 INFO:tasks.workunit.client.1.vm08.stdout:3/126: symlink d4/d15/d17/d20/l29 0 2026-03-10T08:55:01.866 INFO:tasks.workunit.client.1.vm08.stdout:9/117: creat d2/dd/d15/f22 x:0 0 0 2026-03-10T08:55:01.871 INFO:tasks.workunit.client.1.vm08.stdout:4/107: write d5/f14 [429250,105817] 0 2026-03-10T08:55:01.873 INFO:tasks.workunit.client.1.vm08.stdout:1/62: dread d1/f8 [0,4194304] 0 2026-03-10T08:55:01.874 INFO:tasks.workunit.client.1.vm08.stdout:0/97: creat d6/dd/d13/d17/d1f/d20/f21 x:0 0 0 2026-03-10T08:55:01.874 INFO:tasks.workunit.client.1.vm08.stdout:1/63: fsync d1/fd 0 2026-03-10T08:55:01.875 INFO:tasks.workunit.client.1.vm08.stdout:0/98: write d6/dd/d13/d17/d1c/f1b [997914,11299] 0 2026-03-10T08:55:01.875 INFO:tasks.workunit.client.1.vm08.stdout:9/118: dwrite d2/dd/d15/f17 [0,4194304] 0 2026-03-10T08:55:01.877 INFO:tasks.workunit.client.1.vm08.stdout:7/101: mkdir d0/d1c 0 2026-03-10T08:55:01.884 INFO:tasks.workunit.client.1.vm08.stdout:0/99: dwrite f3 [0,4194304] 0 2026-03-10T08:55:01.886 INFO:tasks.workunit.client.1.vm08.stdout:1/64: creat d1/da/ff x:0 0 0 2026-03-10T08:55:01.895 INFO:tasks.workunit.client.1.vm08.stdout:7/102: mknod d0/d11/c1d 0 2026-03-10T08:55:01.897 INFO:tasks.workunit.client.1.vm08.stdout:0/100: symlink d6/l22 0 2026-03-10T08:55:01.899 INFO:tasks.workunit.client.1.vm08.stdout:4/108: link d5/f1e d5/d23/f27 0 2026-03-10T08:55:01.900 INFO:tasks.workunit.client.1.vm08.stdout:4/109: write d5/f19 [2158441,60658] 0 2026-03-10T08:55:01.900 INFO:tasks.workunit.client.1.vm08.stdout:1/65: creat d1/f10 x:0 0 0 2026-03-10T08:55:01.902 INFO:tasks.workunit.client.1.vm08.stdout:7/103: readlink d0/d14/da/l18 0 2026-03-10T08:55:01.902 INFO:tasks.workunit.client.1.vm08.stdout:6/120: fsync f5 0 2026-03-10T08:55:01.918 
INFO:tasks.workunit.client.1.vm08.stdout:6/121: chown d9/dc/d11 7787 1 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:4/110: truncate d5/f6 1379207 0 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:8/150: write d1/d10/f2d [1179510,86714] 0 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:5/127: truncate d0/fb 2095385 0 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:6/122: dread d9/d10/f26 [0,4194304] 0 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:4/111: dread d5/de/f1f [0,4194304] 0 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:8/151: symlink d1/d10/d9/dd/d18/l31 0 2026-03-10T08:55:01.919 INFO:tasks.workunit.client.1.vm08.stdout:5/128: mknod d0/d1b/c2b 0 2026-03-10T08:55:01.920 INFO:tasks.workunit.client.1.vm08.stdout:5/129: read d0/d11/f29 [289997,118367] 0 2026-03-10T08:55:01.920 INFO:tasks.workunit.client.1.vm08.stdout:8/152: dwrite d1/d10/f12 [0,4194304] 0 2026-03-10T08:55:01.922 INFO:tasks.workunit.client.1.vm08.stdout:7/104: stat d0/c13 0 2026-03-10T08:55:01.922 INFO:tasks.workunit.client.1.vm08.stdout:8/153: dread - d1/d10/d9/dd/d13/f24 zero size 2026-03-10T08:55:01.931 INFO:tasks.workunit.client.1.vm08.stdout:9/119: rmdir d2/dd 39 2026-03-10T08:55:01.933 INFO:tasks.workunit.client.1.vm08.stdout:0/101: sync 2026-03-10T08:55:01.935 INFO:tasks.workunit.client.1.vm08.stdout:0/102: readlink d6/dd/d13/l1a 0 2026-03-10T08:55:01.936 INFO:tasks.workunit.client.1.vm08.stdout:1/66: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:01.936 INFO:tasks.workunit.client.1.vm08.stdout:6/123: mkdir d9/dc/d11/d23/d2c 0 2026-03-10T08:55:01.936 INFO:tasks.workunit.client.1.vm08.stdout:4/112: stat d5/de/l13 0 2026-03-10T08:55:01.941 INFO:tasks.workunit.client.1.vm08.stdout:5/130: rename d0/ld to d0/d1b/l2c 0 2026-03-10T08:55:01.942 INFO:tasks.workunit.client.1.vm08.stdout:4/113: write d5/f14 [928920,108519] 0 2026-03-10T08:55:01.943 INFO:tasks.workunit.client.1.vm08.stdout:9/120: 
dread d2/fb [0,4194304] 0 2026-03-10T08:55:01.945 INFO:tasks.workunit.client.1.vm08.stdout:4/114: stat d5/f14 0 2026-03-10T08:55:01.948 INFO:tasks.workunit.client.1.vm08.stdout:0/103: dread d6/fa [4194304,4194304] 0 2026-03-10T08:55:01.953 INFO:tasks.workunit.client.1.vm08.stdout:3/127: dwrite d4/d15/fc [0,4194304] 0 2026-03-10T08:55:01.961 INFO:tasks.workunit.client.1.vm08.stdout:7/105: mkdir d0/d1e 0 2026-03-10T08:55:01.961 INFO:tasks.workunit.client.1.vm08.stdout:6/124: rename d9/dc/l1f to d9/d10/l2d 0 2026-03-10T08:55:01.961 INFO:tasks.workunit.client.1.vm08.stdout:9/121: read - d2/dd/d15/f22 zero size 2026-03-10T08:55:01.962 INFO:tasks.workunit.client.1.vm08.stdout:6/125: write d9/d10/f25 [64714,129210] 0 2026-03-10T08:55:01.963 INFO:tasks.workunit.client.1.vm08.stdout:9/122: write d2/dd/f16 [141262,20234] 0 2026-03-10T08:55:01.965 INFO:tasks.workunit.client.1.vm08.stdout:9/123: symlink d2/dd/d15/l23 0 2026-03-10T08:55:01.966 INFO:tasks.workunit.client.1.vm08.stdout:9/124: read d2/fa [99745,8625] 0 2026-03-10T08:55:01.966 INFO:tasks.workunit.client.1.vm08.stdout:9/125: write d2/fa [1553243,118437] 0 2026-03-10T08:55:01.967 INFO:tasks.workunit.client.1.vm08.stdout:6/126: chown d9/dc/l21 36570249 1 2026-03-10T08:55:01.969 INFO:tasks.workunit.client.1.vm08.stdout:9/126: mkdir d2/dd/d15/d1e/d24 0 2026-03-10T08:55:01.969 INFO:tasks.workunit.client.1.vm08.stdout:9/127: truncate d2/dd/f16 298265 0 2026-03-10T08:55:01.970 INFO:tasks.workunit.client.1.vm08.stdout:7/106: rmdir d0/d1e 0 2026-03-10T08:55:01.972 INFO:tasks.workunit.client.1.vm08.stdout:1/67: sync 2026-03-10T08:55:01.973 INFO:tasks.workunit.client.1.vm08.stdout:1/68: write d1/da/ff [128612,51344] 0 2026-03-10T08:55:01.976 INFO:tasks.workunit.client.1.vm08.stdout:9/128: mkdir d2/dd/d15/d1e/d25 0 2026-03-10T08:55:01.977 INFO:tasks.workunit.client.1.vm08.stdout:7/107: mkdir d0/d11/d1f 0 2026-03-10T08:55:01.977 INFO:tasks.workunit.client.1.vm08.stdout:6/127: mknod d9/dc/d11/d23/d2c/c2e 0 2026-03-10T08:55:01.978 
INFO:tasks.workunit.client.1.vm08.stdout:7/108: fsync d0/d14/f12 0 2026-03-10T08:55:01.984 INFO:tasks.workunit.client.1.vm08.stdout:1/69: dwrite d1/fd [0,4194304] 0 2026-03-10T08:55:01.985 INFO:tasks.workunit.client.1.vm08.stdout:1/70: read d1/f8 [1173793,106737] 0 2026-03-10T08:55:01.987 INFO:tasks.workunit.client.1.vm08.stdout:9/129: mknod d2/dd/d15/d1e/d24/c26 0 2026-03-10T08:55:01.991 INFO:tasks.workunit.client.1.vm08.stdout:7/109: dread d0/f16 [0,4194304] 0 2026-03-10T08:55:01.991 INFO:tasks.workunit.client.1.vm08.stdout:4/115: sync 2026-03-10T08:55:01.994 INFO:tasks.workunit.client.1.vm08.stdout:1/71: sync 2026-03-10T08:55:01.996 INFO:tasks.workunit.client.1.vm08.stdout:7/110: unlink d0/d14/da/l18 0 2026-03-10T08:55:01.996 INFO:tasks.workunit.client.1.vm08.stdout:7/111: stat d0/d14 0 2026-03-10T08:55:01.998 INFO:tasks.workunit.client.1.vm08.stdout:4/116: creat d5/f28 x:0 0 0 2026-03-10T08:55:01.999 INFO:tasks.workunit.client.1.vm08.stdout:4/117: read - d5/de/f1b zero size 2026-03-10T08:55:02.001 INFO:tasks.workunit.client.1.vm08.stdout:1/72: creat d1/f11 x:0 0 0 2026-03-10T08:55:02.003 INFO:tasks.workunit.client.1.vm08.stdout:9/130: unlink d2/l12 0 2026-03-10T08:55:02.004 INFO:tasks.workunit.client.1.vm08.stdout:9/131: truncate d2/dd/d15/f1b 571145 0 2026-03-10T08:55:02.009 INFO:tasks.workunit.client.1.vm08.stdout:4/118: creat d5/d23/f29 x:0 0 0 2026-03-10T08:55:02.012 INFO:tasks.workunit.client.1.vm08.stdout:7/112: mknod d0/d14/c20 0 2026-03-10T08:55:02.015 INFO:tasks.workunit.client.1.vm08.stdout:1/73: creat d1/da/de/f12 x:0 0 0 2026-03-10T08:55:02.017 INFO:tasks.workunit.client.1.vm08.stdout:7/113: dread d0/d14/da/ff [0,4194304] 0 2026-03-10T08:55:02.023 INFO:tasks.workunit.client.1.vm08.stdout:9/132: link d2/dd/le d2/dd/d15/d1e/d24/l27 0 2026-03-10T08:55:02.028 INFO:tasks.workunit.client.1.vm08.stdout:7/114: dread d0/d14/da/ff [0,4194304] 0 2026-03-10T08:55:02.032 INFO:tasks.workunit.client.1.vm08.stdout:2/103: write d1/da/d10/d1b/d12/f1d [98110,119557] 0 
2026-03-10T08:55:02.044 INFO:tasks.workunit.client.1.vm08.stdout:2/104: dwrite d1/da/d10/d1b/d12/d23/f24 [0,4194304] 0 2026-03-10T08:55:02.044 INFO:tasks.workunit.client.1.vm08.stdout:2/105: write d1/f19 [100790,126330] 0 2026-03-10T08:55:02.045 INFO:tasks.workunit.client.1.vm08.stdout:1/74: dwrite d1/fc [0,4194304] 0 2026-03-10T08:55:02.048 INFO:tasks.workunit.client.1.vm08.stdout:4/119: link d5/de/l1c d5/l2a 0 2026-03-10T08:55:02.056 INFO:tasks.workunit.client.1.vm08.stdout:9/133: symlink d2/l28 0 2026-03-10T08:55:02.056 INFO:tasks.workunit.client.1.vm08.stdout:2/106: rename d1/da/d10/d1b/f2a to d1/da/f2b 0 2026-03-10T08:55:02.061 INFO:tasks.workunit.client.1.vm08.stdout:5/131: rmdir d0 39 2026-03-10T08:55:02.064 INFO:tasks.workunit.client.1.vm08.stdout:4/120: mknod d5/d23/c2b 0 2026-03-10T08:55:02.074 INFO:tasks.workunit.client.1.vm08.stdout:8/154: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:02.074 INFO:tasks.workunit.client.1.vm08.stdout:4/121: dwrite d5/f28 [0,4194304] 0 2026-03-10T08:55:02.075 INFO:tasks.workunit.client.1.vm08.stdout:2/107: creat d1/da/d10/f2c x:0 0 0 2026-03-10T08:55:02.076 INFO:tasks.workunit.client.1.vm08.stdout:1/75: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:02.076 INFO:tasks.workunit.client.1.vm08.stdout:7/115: dwrite d0/fe [0,4194304] 0 2026-03-10T08:55:02.078 INFO:tasks.workunit.client.1.vm08.stdout:4/122: dwrite d5/f28 [0,4194304] 0 2026-03-10T08:55:02.087 INFO:tasks.workunit.client.1.vm08.stdout:9/134: truncate f1 2815787 0 2026-03-10T08:55:02.097 INFO:tasks.workunit.client.1.vm08.stdout:1/76: sync 2026-03-10T08:55:02.102 INFO:tasks.workunit.client.1.vm08.stdout:8/155: rename d1/d10/d9/dd/d18/f2b to d1/d2c/d2e/d21/f32 0 2026-03-10T08:55:02.106 INFO:tasks.workunit.client.1.vm08.stdout:2/108: mkdir d1/da/d10/d2d 0 2026-03-10T08:55:02.108 INFO:tasks.workunit.client.1.vm08.stdout:8/156: dwrite d1/d10/d9/fb [0,4194304] 0 2026-03-10T08:55:02.108 INFO:tasks.workunit.client.1.vm08.stdout:4/123: mknod d5/d23/c2c 0 2026-03-10T08:55:02.108 
INFO:tasks.workunit.client.1.vm08.stdout:0/104: truncate d6/fe 7373398 0 2026-03-10T08:55:02.109 INFO:tasks.workunit.client.1.vm08.stdout:0/105: chown d6/c7 89038812 1 2026-03-10T08:55:02.114 INFO:tasks.workunit.client.1.vm08.stdout:7/116: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:02.114 INFO:tasks.workunit.client.1.vm08.stdout:7/117: chown d0/d11/d1f 90 1 2026-03-10T08:55:02.114 INFO:tasks.workunit.client.1.vm08.stdout:1/77: creat d1/da/f13 x:0 0 0 2026-03-10T08:55:02.114 INFO:tasks.workunit.client.1.vm08.stdout:1/78: readlink - no filename 2026-03-10T08:55:02.114 INFO:tasks.workunit.client.1.vm08.stdout:3/128: truncate d4/d15/f1a 661246 0 2026-03-10T08:55:02.115 INFO:tasks.workunit.client.1.vm08.stdout:3/129: readlink d4/d15/d17/d20/l29 0 2026-03-10T08:55:02.115 INFO:tasks.workunit.client.1.vm08.stdout:1/79: read d1/da/ff [135927,52157] 0 2026-03-10T08:55:02.116 INFO:tasks.workunit.client.1.vm08.stdout:3/130: write d4/d15/f11 [648619,109635] 0 2026-03-10T08:55:02.119 INFO:tasks.workunit.client.1.vm08.stdout:4/124: mknod d5/c2d 0 2026-03-10T08:55:02.120 INFO:tasks.workunit.client.1.vm08.stdout:4/125: fdatasync d5/de/f1b 0 2026-03-10T08:55:02.122 INFO:tasks.workunit.client.1.vm08.stdout:8/157: write d1/d2c/d2e/d21/f32 [1033922,103232] 0 2026-03-10T08:55:02.123 INFO:tasks.workunit.client.1.vm08.stdout:4/126: dwrite d5/f14 [0,4194304] 0 2026-03-10T08:55:02.124 INFO:tasks.workunit.client.1.vm08.stdout:7/118: rmdir d0/d14 39 2026-03-10T08:55:02.125 INFO:tasks.workunit.client.1.vm08.stdout:8/158: truncate d1/d2c/d2e/f28 995460 0 2026-03-10T08:55:02.127 INFO:tasks.workunit.client.1.vm08.stdout:2/109: dwrite d1/da/f27 [0,4194304] 0 2026-03-10T08:55:02.149 INFO:tasks.workunit.client.1.vm08.stdout:1/80: symlink d1/l14 0 2026-03-10T08:55:02.149 INFO:tasks.workunit.client.1.vm08.stdout:1/81: write d1/fc [61849,6641] 0 2026-03-10T08:55:02.149 INFO:tasks.workunit.client.1.vm08.stdout:1/82: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:02.155 
INFO:tasks.workunit.client.1.vm08.stdout:8/159: dread d1/f11 [0,4194304] 0 2026-03-10T08:55:02.170 INFO:tasks.workunit.client.1.vm08.stdout:2/110: mknod d1/c2e 0 2026-03-10T08:55:02.170 INFO:tasks.workunit.client.1.vm08.stdout:2/111: chown d1/da/fb 2018 1 2026-03-10T08:55:02.171 INFO:tasks.workunit.client.1.vm08.stdout:2/112: rename d1/da/d10/d1b to d1/da/d10/d1b/d12/d22/d2f 22 2026-03-10T08:55:02.178 INFO:tasks.workunit.client.1.vm08.stdout:1/83: mknod d1/c15 0 2026-03-10T08:55:02.180 INFO:tasks.workunit.client.1.vm08.stdout:1/84: write d1/fc [2203154,87217] 0 2026-03-10T08:55:02.181 INFO:tasks.workunit.client.1.vm08.stdout:8/160: unlink d1/d10/f12 0 2026-03-10T08:55:02.182 INFO:tasks.workunit.client.1.vm08.stdout:7/119: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:02.183 INFO:tasks.workunit.client.1.vm08.stdout:6/128: dwrite d9/dc/dd/ff [0,4194304] 0 2026-03-10T08:55:02.185 INFO:tasks.workunit.client.1.vm08.stdout:5/132: dread d0/fb [0,4194304] 0 2026-03-10T08:55:02.187 INFO:tasks.workunit.client.1.vm08.stdout:0/106: getdents d6/dd/d13/d17/d1f/d20 0 2026-03-10T08:55:02.187 INFO:tasks.workunit.client.1.vm08.stdout:0/107: chown d6/dd/d13/d17/d1c 667 1 2026-03-10T08:55:02.188 INFO:tasks.workunit.client.1.vm08.stdout:1/85: rename c0 to d1/da/de/c16 0 2026-03-10T08:55:02.189 INFO:tasks.workunit.client.1.vm08.stdout:2/113: symlink d1/da/d10/d1b/l30 0 2026-03-10T08:55:02.189 INFO:tasks.workunit.client.1.vm08.stdout:2/114: stat d1/c2e 0 2026-03-10T08:55:02.190 INFO:tasks.workunit.client.1.vm08.stdout:7/120: dwrite d0/d14/f12 [0,4194304] 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:0/108: mknod d6/dd/d13/c23 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:1/86: unlink d1/f10 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:2/115: rename d1/da/d10/d1b/d12/f1d to d1/da/d10/d1b/d12/d23/f31 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:6/129: dwrite d9/dc/dd/f27 [0,4194304] 0 
2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:1/87: dread d1/fd [0,4194304] 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:2/116: mknod d1/da/d10/d1b/d12/d1e/c32 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:6/130: creat d9/d13/f2f x:0 0 0 2026-03-10T08:55:02.207 INFO:tasks.workunit.client.1.vm08.stdout:0/109: mkdir d6/dd/d13/d17/d1c/d24 0 2026-03-10T08:55:02.217 INFO:tasks.workunit.client.1.vm08.stdout:6/131: write f5 [323168,78672] 0 2026-03-10T08:55:02.218 INFO:tasks.workunit.client.1.vm08.stdout:0/110: creat d6/f25 x:0 0 0 2026-03-10T08:55:02.218 INFO:tasks.workunit.client.1.vm08.stdout:2/117: symlink d1/da/d10/d1b/d12/d22/l33 0 2026-03-10T08:55:02.219 INFO:tasks.workunit.client.1.vm08.stdout:6/132: creat d9/d10/f30 x:0 0 0 2026-03-10T08:55:02.221 INFO:tasks.workunit.client.1.vm08.stdout:2/118: mknod d1/da/d10/c34 0 2026-03-10T08:55:02.222 INFO:tasks.workunit.client.1.vm08.stdout:0/111: fdatasync d6/fe 0 2026-03-10T08:55:02.225 INFO:tasks.workunit.client.1.vm08.stdout:0/112: mkdir d6/dd/d13/d17/d1c/d26 0 2026-03-10T08:55:02.226 INFO:tasks.workunit.client.1.vm08.stdout:0/113: fdatasync d6/f25 0 2026-03-10T08:55:02.228 INFO:tasks.workunit.client.1.vm08.stdout:6/133: dwrite d9/dc/dd/f27 [0,4194304] 0 2026-03-10T08:55:02.229 INFO:tasks.workunit.client.1.vm08.stdout:2/119: dwrite d1/da/fe [0,4194304] 0 2026-03-10T08:55:02.245 INFO:tasks.workunit.client.1.vm08.stdout:6/134: creat d9/dc/d11/f31 x:0 0 0 2026-03-10T08:55:02.246 INFO:tasks.workunit.client.1.vm08.stdout:2/120: unlink d1/da/d10/d1b/d12/d1e/c29 0 2026-03-10T08:55:02.246 INFO:tasks.workunit.client.1.vm08.stdout:2/121: stat d1/da/d10 0 2026-03-10T08:55:02.248 INFO:tasks.workunit.client.1.vm08.stdout:6/135: rename d9/dc/dd to d9/d10/d1e/d32 0 2026-03-10T08:55:02.248 INFO:tasks.workunit.client.1.vm08.stdout:6/136: write d9/d13/f2f [753088,108901] 0 2026-03-10T08:55:02.251 INFO:tasks.workunit.client.1.vm08.stdout:2/122: rename d1/c2e to 
d1/da/d10/d2d/c35 0 2026-03-10T08:55:02.252 INFO:tasks.workunit.client.1.vm08.stdout:2/123: mkdir d1/d36 0 2026-03-10T08:55:02.253 INFO:tasks.workunit.client.1.vm08.stdout:6/137: symlink d9/dc/l33 0 2026-03-10T08:55:02.256 INFO:tasks.workunit.client.1.vm08.stdout:6/138: chown d9/cb 43 1 2026-03-10T08:55:02.256 INFO:tasks.workunit.client.1.vm08.stdout:2/124: dread d1/f4 [0,4194304] 0 2026-03-10T08:55:02.257 INFO:tasks.workunit.client.1.vm08.stdout:6/139: rmdir d9/dc/d11/d23/d2c 39 2026-03-10T08:55:02.258 INFO:tasks.workunit.client.1.vm08.stdout:2/125: creat d1/da/d10/d1b/d12/d23/f37 x:0 0 0 2026-03-10T08:55:02.259 INFO:tasks.workunit.client.1.vm08.stdout:6/140: dread d9/d10/d1e/d32/f12 [0,4194304] 0 2026-03-10T08:55:02.260 INFO:tasks.workunit.client.1.vm08.stdout:6/141: dread - d9/d10/f30 zero size 2026-03-10T08:55:02.261 INFO:tasks.workunit.client.1.vm08.stdout:2/126: symlink d1/da/d10/d1b/d1c/l38 0 2026-03-10T08:55:02.261 INFO:tasks.workunit.client.1.vm08.stdout:2/127: chown d1/da/d10/d1b/d12/d1e 107537 1 2026-03-10T08:55:02.263 INFO:tasks.workunit.client.1.vm08.stdout:2/128: chown d1/da/d10/d1b/f28 226 1 2026-03-10T08:55:02.265 INFO:tasks.workunit.client.1.vm08.stdout:2/129: chown d1/da/lc 548704 1 2026-03-10T08:55:02.267 INFO:tasks.workunit.client.1.vm08.stdout:2/130: rmdir d1/da/d10/d2d 39 2026-03-10T08:55:02.276 INFO:tasks.workunit.client.1.vm08.stdout:2/131: rename d1/da/f27 to d1/da/d10/f39 0 2026-03-10T08:55:02.287 INFO:tasks.workunit.client.1.vm08.stdout:2/132: unlink d1/da/f21 0 2026-03-10T08:55:02.289 INFO:tasks.workunit.client.1.vm08.stdout:2/133: rmdir d1/da/d10/d1b 39 2026-03-10T08:55:02.291 INFO:tasks.workunit.client.1.vm08.stdout:2/134: write d1/da/d10/f39 [2017299,66902] 0 2026-03-10T08:55:02.293 INFO:tasks.workunit.client.1.vm08.stdout:2/135: write d1/da/d10/d1b/f28 [831433,58350] 0 2026-03-10T08:55:02.295 INFO:tasks.workunit.client.1.vm08.stdout:2/136: symlink d1/da/d10/d1b/d1c/l3a 0 2026-03-10T08:55:02.297 
INFO:tasks.workunit.client.1.vm08.stdout:2/137: creat d1/da/d10/d1b/d12/f3b x:0 0 0 2026-03-10T08:55:02.302 INFO:tasks.workunit.client.1.vm08.stdout:2/138: dread d1/da/d10/f39 [0,4194304] 0 2026-03-10T08:55:02.302 INFO:tasks.workunit.client.1.vm08.stdout:2/139: dread - d1/da/d10/f2c zero size 2026-03-10T08:55:02.808 INFO:tasks.workunit.client.1.vm08.stdout:8/161: dread d1/d2c/d2e/d21/f32 [0,4194304] 0 2026-03-10T08:55:02.814 INFO:tasks.workunit.client.1.vm08.stdout:8/162: creat d1/d2c/f33 x:0 0 0 2026-03-10T08:55:02.860 INFO:tasks.workunit.client.1.vm08.stdout:4/127: getdents d5 0 2026-03-10T08:55:02.860 INFO:tasks.workunit.client.1.vm08.stdout:4/128: truncate d5/d23/f29 726349 0 2026-03-10T08:55:02.872 INFO:tasks.workunit.client.1.vm08.stdout:3/131: truncate d4/d15/fc 1951280 0 2026-03-10T08:55:02.873 INFO:tasks.workunit.client.1.vm08.stdout:3/132: stat d4/f18 0 2026-03-10T08:55:02.888 INFO:tasks.workunit.client.1.vm08.stdout:1/88: link d1/da/de/c16 d1/da/de/c17 0 2026-03-10T08:55:02.889 INFO:tasks.workunit.client.1.vm08.stdout:1/89: fsync d1/da/ff 0 2026-03-10T08:55:02.894 INFO:tasks.workunit.client.1.vm08.stdout:1/90: dwrite d1/fc [0,4194304] 0 2026-03-10T08:55:02.896 INFO:tasks.workunit.client.1.vm08.stdout:7/121: dwrite d0/d14/f12 [4194304,4194304] 0 2026-03-10T08:55:02.897 INFO:tasks.workunit.client.1.vm08.stdout:7/122: readlink d0/d14/l17 0 2026-03-10T08:55:02.902 INFO:tasks.workunit.client.1.vm08.stdout:1/91: mkdir d1/da/d18 0 2026-03-10T08:55:02.902 INFO:tasks.workunit.client.1.vm08.stdout:1/92: chown d1/c5 4 1 2026-03-10T08:55:02.903 INFO:tasks.workunit.client.1.vm08.stdout:5/133: write d0/d11/d27/f2a [9281562,29522] 0 2026-03-10T08:55:02.904 INFO:tasks.workunit.client.1.vm08.stdout:5/134: read d0/d11/d18/f1a [100059,3677] 0 2026-03-10T08:55:02.911 INFO:tasks.workunit.client.1.vm08.stdout:1/93: write d1/fd [3217848,119166] 0 2026-03-10T08:55:02.911 INFO:tasks.workunit.client.1.vm08.stdout:1/94: read d1/fd [2477463,111388] 0 2026-03-10T08:55:02.915 
INFO:tasks.workunit.client.1.vm08.stdout:5/135: creat d0/d11/f2d x:0 0 0 2026-03-10T08:55:02.916 INFO:tasks.workunit.client.1.vm08.stdout:7/123: mknod d0/d1c/c21 0 2026-03-10T08:55:02.916 INFO:tasks.workunit.client.1.vm08.stdout:1/95: dwrite d1/da/f13 [0,4194304] 0 2026-03-10T08:55:02.932 INFO:tasks.workunit.client.1.vm08.stdout:1/96: unlink d1/c6 0 2026-03-10T08:55:02.933 INFO:tasks.workunit.client.1.vm08.stdout:5/136: creat d0/f2e x:0 0 0 2026-03-10T08:55:02.933 INFO:tasks.workunit.client.1.vm08.stdout:5/137: dread - d0/d11/f2d zero size 2026-03-10T08:55:02.934 INFO:tasks.workunit.client.1.vm08.stdout:5/138: chown d0/f16 95 1 2026-03-10T08:55:02.934 INFO:tasks.workunit.client.1.vm08.stdout:1/97: creat d1/da/de/f19 x:0 0 0 2026-03-10T08:55:02.936 INFO:tasks.workunit.client.1.vm08.stdout:5/139: rename d0/f2e to d0/d1b/f2f 0 2026-03-10T08:55:02.936 INFO:tasks.workunit.client.1.vm08.stdout:5/140: chown d0/d11/f2d 255059 1 2026-03-10T08:55:02.937 INFO:tasks.workunit.client.1.vm08.stdout:1/98: creat d1/da/de/f1a x:0 0 0 2026-03-10T08:55:02.945 INFO:tasks.workunit.client.1.vm08.stdout:5/141: link d0/d11/d18/l24 d0/d1b/l30 0 2026-03-10T08:55:02.945 INFO:tasks.workunit.client.1.vm08.stdout:0/114: dwrite f4 [0,4194304] 0 2026-03-10T08:55:02.954 INFO:tasks.workunit.client.1.vm08.stdout:0/115: dread f5 [0,4194304] 0 2026-03-10T08:55:02.959 INFO:tasks.workunit.client.1.vm08.stdout:0/116: dread d6/f11 [0,4194304] 0 2026-03-10T08:55:02.960 INFO:tasks.workunit.client.1.vm08.stdout:5/142: dwrite d0/d11/d18/f1a [4194304,4194304] 0 2026-03-10T08:55:02.962 INFO:tasks.workunit.client.1.vm08.stdout:5/143: write d0/d11/f1e [3413948,2367] 0 2026-03-10T08:55:02.963 INFO:tasks.workunit.client.1.vm08.stdout:5/144: write d0/d11/d18/f1a [5647578,14563] 0 2026-03-10T08:55:02.963 INFO:tasks.workunit.client.1.vm08.stdout:5/145: fsync d0/ff 0 2026-03-10T08:55:02.974 INFO:tasks.workunit.client.1.vm08.stdout:0/117: stat d6/dd/c14 0 2026-03-10T08:55:02.986 
INFO:tasks.workunit.client.1.vm08.stdout:0/118: dwrite f3 [0,4194304] 0 2026-03-10T08:55:02.986 INFO:tasks.workunit.client.1.vm08.stdout:5/146: rename d0/d1b/l28 to d0/d11/d18/l31 0 2026-03-10T08:55:02.990 INFO:tasks.workunit.client.1.vm08.stdout:5/147: rmdir d0/d1b 39 2026-03-10T08:55:02.995 INFO:tasks.workunit.client.1.vm08.stdout:6/142: truncate d9/d10/d1e/d32/ff 1910293 0 2026-03-10T08:55:03.011 INFO:tasks.workunit.client.1.vm08.stdout:5/148: dread d0/fe [0,4194304] 0 2026-03-10T08:55:03.035 INFO:tasks.workunit.client.1.vm08.stdout:2/140: dwrite d1/da/d10/d1b/d1c/f20 [0,4194304] 0 2026-03-10T08:55:03.042 INFO:tasks.workunit.client.1.vm08.stdout:2/141: dwrite d1/f19 [0,4194304] 0 2026-03-10T08:55:03.199 INFO:tasks.workunit.client.1.vm08.stdout:2/142: sync 2026-03-10T08:55:03.204 INFO:tasks.workunit.client.1.vm08.stdout:2/143: creat d1/da/d10/d2d/f3c x:0 0 0 2026-03-10T08:55:03.208 INFO:tasks.workunit.client.1.vm08.stdout:2/144: creat d1/da/d10/d1b/d1c/f3d x:0 0 0 2026-03-10T08:55:03.214 INFO:tasks.workunit.client.1.vm08.stdout:2/145: link d1/f9 d1/da/d10/d2d/f3e 0 2026-03-10T08:55:03.214 INFO:tasks.workunit.client.1.vm08.stdout:2/146: write d1/da/d10/f2c [811779,115460] 0 2026-03-10T08:55:03.216 INFO:tasks.workunit.client.1.vm08.stdout:2/147: truncate d1/da/d10/d2d/f3c 1017723 0 2026-03-10T08:55:03.219 INFO:tasks.workunit.client.1.vm08.stdout:2/148: creat d1/da/d10/d1b/d1c/f3f x:0 0 0 2026-03-10T08:55:03.219 INFO:tasks.workunit.client.1.vm08.stdout:2/149: readlink d1/l5 0 2026-03-10T08:55:03.222 INFO:tasks.workunit.client.1.vm08.stdout:2/150: rename d1/da/d10/d1b/d12/d23/f24 to d1/da/f40 0 2026-03-10T08:55:03.232 INFO:tasks.workunit.client.1.vm08.stdout:2/151: rename d1/da/d10/d2d/f3c to d1/da/f41 0 2026-03-10T08:55:03.243 INFO:tasks.workunit.client.1.vm08.stdout:2/152: dwrite d1/da/d10/d1b/d12/f3b [0,4194304] 0 2026-03-10T08:55:03.259 INFO:tasks.workunit.client.1.vm08.stdout:9/135: truncate f1 1172524 0 2026-03-10T08:55:03.284 
INFO:tasks.workunit.client.1.vm08.stdout:8/163: truncate d1/d10/d9/fb 3870418 0 2026-03-10T08:55:03.285 INFO:tasks.workunit.client.1.vm08.stdout:8/164: write d1/d2c/d2e/f28 [196178,106002] 0 2026-03-10T08:55:03.286 INFO:tasks.workunit.client.1.vm08.stdout:8/165: truncate d1/d2c/f30 343421 0 2026-03-10T08:55:03.291 INFO:tasks.workunit.client.1.vm08.stdout:8/166: mkdir d1/d10/d9/dd/d18/d34 0 2026-03-10T08:55:03.293 INFO:tasks.workunit.client.1.vm08.stdout:8/167: rename d1/d10/d9/l1d to d1/d2c/l35 0 2026-03-10T08:55:03.295 INFO:tasks.workunit.client.1.vm08.stdout:8/168: unlink d1/d2c/d2e/f28 0 2026-03-10T08:55:03.302 INFO:tasks.workunit.client.1.vm08.stdout:3/133: read d4/d15/f1a [85862,92045] 0 2026-03-10T08:55:03.315 INFO:tasks.workunit.client.1.vm08.stdout:3/134: chown d4/d15/d17/d20 16030165 1 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:3/135: dread - d4/d15/d8/f1e zero size 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:3/136: read d4/d15/f7 [421683,76832] 0 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:8/169: dread d1/d10/f23 [0,4194304] 0 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:3/137: truncate d4/d15/d8/f1f 541853 0 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:8/170: symlink d1/d10/d9/dd/l36 0 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:3/138: dread d4/d15/f11 [0,4194304] 0 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:8/171: write d1/d2c/d2e/f22 [3065,119618] 0 2026-03-10T08:55:03.316 INFO:tasks.workunit.client.1.vm08.stdout:3/139: mkdir d4/d15/d8/d2a 0 2026-03-10T08:55:03.319 INFO:tasks.workunit.client.1.vm08.stdout:8/172: symlink d1/d10/d9/dd/d18/d34/l37 0 2026-03-10T08:55:03.331 INFO:tasks.workunit.client.1.vm08.stdout:4/129: write d5/f6 [280916,1415] 0 2026-03-10T08:55:03.333 INFO:tasks.workunit.client.1.vm08.stdout:4/130: creat d5/d23/f2e x:0 0 0 2026-03-10T08:55:03.334 
INFO:tasks.workunit.client.1.vm08.stdout:4/131: mkdir d5/d2f 0 2026-03-10T08:55:03.335 INFO:tasks.workunit.client.1.vm08.stdout:3/140: sync 2026-03-10T08:55:03.338 INFO:tasks.workunit.client.1.vm08.stdout:4/132: symlink d5/d23/l30 0 2026-03-10T08:55:03.347 INFO:tasks.workunit.client.1.vm08.stdout:3/141: dread f1 [0,4194304] 0 2026-03-10T08:55:03.353 INFO:tasks.workunit.client.1.vm08.stdout:4/133: symlink d5/d23/l31 0 2026-03-10T08:55:03.353 INFO:tasks.workunit.client.1.vm08.stdout:4/134: write d5/f6 [1913636,45105] 0 2026-03-10T08:55:03.354 INFO:tasks.workunit.client.1.vm08.stdout:4/135: dread - d5/d23/f2e zero size 2026-03-10T08:55:03.360 INFO:tasks.workunit.client.1.vm08.stdout:2/153: read d1/da/d10/f2c [109595,111725] 0 2026-03-10T08:55:03.361 INFO:tasks.workunit.client.1.vm08.stdout:2/154: chown d1/da/d10/d1b/d1c/f3f 18511782 1 2026-03-10T08:55:03.363 INFO:tasks.workunit.client.1.vm08.stdout:3/142: rename d4/c1c to d4/d15/d8/d2a/c2b 0 2026-03-10T08:55:03.378 INFO:tasks.workunit.client.1.vm08.stdout:4/136: rmdir d5/de 39 2026-03-10T08:55:03.384 INFO:tasks.workunit.client.1.vm08.stdout:3/143: mkdir d4/d15/d8/d2c 0 2026-03-10T08:55:03.388 INFO:tasks.workunit.client.1.vm08.stdout:2/155: mkdir d1/da/d10/d42 0 2026-03-10T08:55:03.391 INFO:tasks.workunit.client.1.vm08.stdout:3/144: write d4/d15/f7 [3750568,44141] 0 2026-03-10T08:55:03.391 INFO:tasks.workunit.client.1.vm08.stdout:4/137: symlink d5/l32 0 2026-03-10T08:55:03.391 INFO:tasks.workunit.client.1.vm08.stdout:4/138: read - d5/f1d zero size 2026-03-10T08:55:03.392 INFO:tasks.workunit.client.1.vm08.stdout:4/139: stat d5/d23/c2b 0 2026-03-10T08:55:03.393 INFO:tasks.workunit.client.1.vm08.stdout:2/156: mkdir d1/d43 0 2026-03-10T08:55:03.394 INFO:tasks.workunit.client.1.vm08.stdout:2/157: chown d1/da/d10/d1b/d1c/l3a 1328 1 2026-03-10T08:55:03.395 INFO:tasks.workunit.client.1.vm08.stdout:2/158: chown d1/da/l16 1041843855 1 2026-03-10T08:55:03.395 INFO:tasks.workunit.client.1.vm08.stdout:2/159: chown d1/da/f41 1 1 
2026-03-10T08:55:03.396 INFO:tasks.workunit.client.1.vm08.stdout:2/160: write d1/da/d10/d1b/d1c/f20 [1162989,12801] 0 2026-03-10T08:55:03.397 INFO:tasks.workunit.client.1.vm08.stdout:3/145: creat d4/d15/d8/d1d/f2d x:0 0 0 2026-03-10T08:55:03.398 INFO:tasks.workunit.client.1.vm08.stdout:2/161: chown d1/da/d10/l1a 230263688 1 2026-03-10T08:55:03.400 INFO:tasks.workunit.client.1.vm08.stdout:7/124: rmdir d0 39 2026-03-10T08:55:03.404 INFO:tasks.workunit.client.1.vm08.stdout:3/146: mknod d4/d15/d8/d2a/c2e 0 2026-03-10T08:55:03.407 INFO:tasks.workunit.client.1.vm08.stdout:4/140: rename d5/c2d to d5/de/c33 0 2026-03-10T08:55:03.408 INFO:tasks.workunit.client.1.vm08.stdout:4/141: fdatasync d5/f8 0 2026-03-10T08:55:03.409 INFO:tasks.workunit.client.1.vm08.stdout:7/125: chown d0/f16 118669001 1 2026-03-10T08:55:03.412 INFO:tasks.workunit.client.1.vm08.stdout:4/142: dwrite d5/f8 [0,4194304] 0 2026-03-10T08:55:03.413 INFO:tasks.workunit.client.1.vm08.stdout:7/126: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:03.415 INFO:tasks.workunit.client.1.vm08.stdout:2/162: truncate d1/da/d10/d1b/f14 1912266 0 2026-03-10T08:55:03.416 INFO:tasks.workunit.client.1.vm08.stdout:1/99: rmdir d1/da/de 39 2026-03-10T08:55:03.419 INFO:tasks.workunit.client.1.vm08.stdout:1/100: readlink d1/l14 0 2026-03-10T08:55:03.419 INFO:tasks.workunit.client.1.vm08.stdout:1/101: write d1/fc [2015468,5610] 0 2026-03-10T08:55:03.424 INFO:tasks.workunit.client.1.vm08.stdout:3/147: symlink d4/d15/d8/d2c/l2f 0 2026-03-10T08:55:03.425 INFO:tasks.workunit.client.1.vm08.stdout:4/143: dwrite d5/f1d [0,4194304] 0 2026-03-10T08:55:03.427 INFO:tasks.workunit.client.1.vm08.stdout:4/144: stat d5/fd 0 2026-03-10T08:55:03.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:03 vm05.local ceph-mon[49713]: pgmap v140: 65 pgs: 65 active+clean; 273 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 312 KiB/s rd, 14 MiB/s wr, 392 op/s 2026-03-10T08:55:03.494 INFO:tasks.workunit.client.1.vm08.stdout:0/119: write d6/f15 
[560857,109139] 0 2026-03-10T08:55:03.500 INFO:tasks.workunit.client.1.vm08.stdout:3/148: symlink d4/d15/d8/d2c/l30 0 2026-03-10T08:55:03.500 INFO:tasks.workunit.client.1.vm08.stdout:3/149: readlink d4/d15/d17/d20/l29 0 2026-03-10T08:55:03.501 INFO:tasks.workunit.client.1.vm08.stdout:6/143: dwrite d9/d10/d1e/d32/ff [0,4194304] 0 2026-03-10T08:55:03.507 INFO:tasks.workunit.client.1.vm08.stdout:6/144: dwrite f5 [0,4194304] 0 2026-03-10T08:55:03.517 INFO:tasks.workunit.client.1.vm08.stdout:7/127: mknod d0/d14/c22 0 2026-03-10T08:55:03.518 INFO:tasks.workunit.client.1.vm08.stdout:4/145: mknod d5/d23/c34 0 2026-03-10T08:55:03.519 INFO:tasks.workunit.client.1.vm08.stdout:4/146: dread - d5/de/f1b zero size 2026-03-10T08:55:03.519 INFO:tasks.workunit.client.1.vm08.stdout:1/102: getdents d1/da/d18 0 2026-03-10T08:55:03.520 INFO:tasks.workunit.client.1.vm08.stdout:4/147: chown d5/d23/f2e 389 1 2026-03-10T08:55:03.521 INFO:tasks.workunit.client.1.vm08.stdout:3/150: stat d4/d15/d8/c13 0 2026-03-10T08:55:03.524 INFO:tasks.workunit.client.1.vm08.stdout:0/120: symlink d6/dd/d13/d17/d1c/d26/l27 0 2026-03-10T08:55:03.526 INFO:tasks.workunit.client.1.vm08.stdout:3/151: symlink d4/d15/d17/l31 0 2026-03-10T08:55:03.529 INFO:tasks.workunit.client.1.vm08.stdout:1/103: mkdir d1/da/d1b 0 2026-03-10T08:55:03.529 INFO:tasks.workunit.client.1.vm08.stdout:1/104: chown d1 43337 1 2026-03-10T08:55:03.529 INFO:tasks.workunit.client.1.vm08.stdout:5/149: link d0/d11/d18/l31 d0/d11/d18/l32 0 2026-03-10T08:55:03.530 INFO:tasks.workunit.client.1.vm08.stdout:4/148: symlink d5/d2f/l35 0 2026-03-10T08:55:03.532 INFO:tasks.workunit.client.1.vm08.stdout:0/121: unlink d6/dd/d13/d17/d1c/d26/l27 0 2026-03-10T08:55:03.533 INFO:tasks.workunit.client.1.vm08.stdout:3/152: creat d4/d15/d8/d2c/f32 x:0 0 0 2026-03-10T08:55:03.536 INFO:tasks.workunit.client.1.vm08.stdout:4/149: mkdir d5/d23/d36 0 2026-03-10T08:55:03.538 INFO:tasks.workunit.client.1.vm08.stdout:3/153: mknod d4/d15/d8/d1d/c33 0 2026-03-10T08:55:03.541 
INFO:tasks.workunit.client.1.vm08.stdout:4/150: symlink d5/de/l37 0 2026-03-10T08:55:03.543 INFO:tasks.workunit.client.1.vm08.stdout:3/154: dwrite d4/d15/d8/f1f [0,4194304] 0 2026-03-10T08:55:03.547 INFO:tasks.workunit.client.1.vm08.stdout:4/151: write d5/de/f1b [836967,73082] 0 2026-03-10T08:55:03.550 INFO:tasks.workunit.client.1.vm08.stdout:1/105: rename d1/c15 to d1/da/c1c 0 2026-03-10T08:55:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:03 vm08.local ceph-mon[57559]: pgmap v140: 65 pgs: 65 active+clean; 273 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 312 KiB/s rd, 14 MiB/s wr, 392 op/s 2026-03-10T08:55:03.552 INFO:tasks.workunit.client.1.vm08.stdout:0/122: creat d6/dd/f28 x:0 0 0 2026-03-10T08:55:03.553 INFO:tasks.workunit.client.1.vm08.stdout:3/155: dread d4/d15/f11 [0,4194304] 0 2026-03-10T08:55:03.554 INFO:tasks.workunit.client.1.vm08.stdout:5/150: dwrite d0/d1b/f2f [0,4194304] 0 2026-03-10T08:55:03.558 INFO:tasks.workunit.client.1.vm08.stdout:4/152: creat d5/de/f38 x:0 0 0 2026-03-10T08:55:03.567 INFO:tasks.workunit.client.1.vm08.stdout:3/156: unlink d4/d15/c22 0 2026-03-10T08:55:03.567 INFO:tasks.workunit.client.1.vm08.stdout:0/123: creat d6/dd/d13/d17/f29 x:0 0 0 2026-03-10T08:55:03.567 INFO:tasks.workunit.client.1.vm08.stdout:5/151: mknod d0/d11/d18/c33 0 2026-03-10T08:55:03.567 INFO:tasks.workunit.client.1.vm08.stdout:1/106: write d1/da/de/f12 [110810,130102] 0 2026-03-10T08:55:03.571 INFO:tasks.workunit.client.1.vm08.stdout:0/124: mknod d6/dd/d13/d17/d1c/c2a 0 2026-03-10T08:55:03.571 INFO:tasks.workunit.client.1.vm08.stdout:1/107: rmdir d1/da/de 39 2026-03-10T08:55:03.572 INFO:tasks.workunit.client.1.vm08.stdout:0/125: chown d6/f16 1805022 1 2026-03-10T08:55:03.572 INFO:tasks.workunit.client.1.vm08.stdout:0/126: write f4 [2369794,110205] 0 2026-03-10T08:55:03.572 INFO:tasks.workunit.client.1.vm08.stdout:4/153: mknod d5/c39 0 2026-03-10T08:55:03.573 INFO:tasks.workunit.client.1.vm08.stdout:0/127: write d6/f25 [66943,54436] 0 
2026-03-10T08:55:03.577 INFO:tasks.workunit.client.1.vm08.stdout:1/108: creat d1/da/d18/f1d x:0 0 0 2026-03-10T08:55:03.578 INFO:tasks.workunit.client.1.vm08.stdout:4/154: creat d5/d2f/f3a x:0 0 0 2026-03-10T08:55:03.578 INFO:tasks.workunit.client.1.vm08.stdout:1/109: truncate d1/da/ff 357604 0 2026-03-10T08:55:03.579 INFO:tasks.workunit.client.1.vm08.stdout:4/155: creat d5/de/f3b x:0 0 0 2026-03-10T08:55:03.579 INFO:tasks.workunit.client.1.vm08.stdout:4/156: chown d5/l22 6358 1 2026-03-10T08:55:03.580 INFO:tasks.workunit.client.1.vm08.stdout:1/110: write d1/da/de/f19 [484838,106903] 0 2026-03-10T08:55:03.580 INFO:tasks.workunit.client.1.vm08.stdout:4/157: write d5/f28 [4367793,27509] 0 2026-03-10T08:55:03.582 INFO:tasks.workunit.client.1.vm08.stdout:1/111: chown d1/da/d1b 701 1 2026-03-10T08:55:03.584 INFO:tasks.workunit.client.1.vm08.stdout:4/158: unlink d5/c26 0 2026-03-10T08:55:03.587 INFO:tasks.workunit.client.1.vm08.stdout:4/159: truncate d5/d23/f27 871950 0 2026-03-10T08:55:03.589 INFO:tasks.workunit.client.1.vm08.stdout:1/112: dwrite d1/da/f13 [0,4194304] 0 2026-03-10T08:55:03.589 INFO:tasks.workunit.client.1.vm08.stdout:2/163: unlink d1/da/f40 0 2026-03-10T08:55:03.597 INFO:tasks.workunit.client.1.vm08.stdout:2/164: creat d1/da/d10/d1b/d12/d23/f44 x:0 0 0 2026-03-10T08:55:03.598 INFO:tasks.workunit.client.1.vm08.stdout:9/136: truncate d2/dd/d15/f17 2759280 0 2026-03-10T08:55:03.598 INFO:tasks.workunit.client.1.vm08.stdout:2/165: write d1/da/d10/d1b/d12/f3b [4140431,28318] 0 2026-03-10T08:55:03.603 INFO:tasks.workunit.client.1.vm08.stdout:1/113: rmdir d1/da/d1b 0 2026-03-10T08:55:03.603 INFO:tasks.workunit.client.1.vm08.stdout:9/137: dread d2/fa [0,4194304] 0 2026-03-10T08:55:03.603 INFO:tasks.workunit.client.1.vm08.stdout:2/166: creat d1/da/d10/d1b/d12/d22/f45 x:0 0 0 2026-03-10T08:55:03.609 INFO:tasks.workunit.client.1.vm08.stdout:9/138: symlink d2/dd/l29 0 2026-03-10T08:55:03.620 INFO:tasks.workunit.client.1.vm08.stdout:2/167: symlink d1/da/d10/d1b/l46 0 
2026-03-10T08:55:03.620 INFO:tasks.workunit.client.1.vm08.stdout:1/114: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:03.620 INFO:tasks.workunit.client.1.vm08.stdout:9/139: rename d2/dd/d15/d1e/d24/c26 to d2/dd/c2a 0 2026-03-10T08:55:03.620 INFO:tasks.workunit.client.1.vm08.stdout:9/140: stat d2/fb 0 2026-03-10T08:55:03.621 INFO:tasks.workunit.client.1.vm08.stdout:5/152: sync 2026-03-10T08:55:03.621 INFO:tasks.workunit.client.1.vm08.stdout:3/157: sync 2026-03-10T08:55:03.624 INFO:tasks.workunit.client.1.vm08.stdout:0/128: sync 2026-03-10T08:55:03.624 INFO:tasks.workunit.client.1.vm08.stdout:5/153: sync 2026-03-10T08:55:03.624 INFO:tasks.workunit.client.1.vm08.stdout:4/160: sync 2026-03-10T08:55:03.629 INFO:tasks.workunit.client.1.vm08.stdout:1/115: dread d1/f8 [0,4194304] 0 2026-03-10T08:55:03.631 INFO:tasks.workunit.client.1.vm08.stdout:5/154: creat d0/d11/d18/f34 x:0 0 0 2026-03-10T08:55:03.643 INFO:tasks.workunit.client.1.vm08.stdout:0/129: rmdir d6/dd/d13/d17/d1f 39 2026-03-10T08:55:03.643 INFO:tasks.workunit.client.1.vm08.stdout:4/161: dwrite d5/f21 [0,4194304] 0 2026-03-10T08:55:03.643 INFO:tasks.workunit.client.1.vm08.stdout:1/116: chown d1/da/de/f19 101 1 2026-03-10T08:55:03.643 INFO:tasks.workunit.client.1.vm08.stdout:4/162: rename d5/d2f to d5/d2f/d3c 22 2026-03-10T08:55:03.643 INFO:tasks.workunit.client.1.vm08.stdout:0/130: unlink f1 0 2026-03-10T08:55:03.643 INFO:tasks.workunit.client.1.vm08.stdout:8/173: write d1/d10/f23 [1170683,34934] 0 2026-03-10T08:55:03.645 INFO:tasks.workunit.client.1.vm08.stdout:4/163: dread d5/f14 [0,4194304] 0 2026-03-10T08:55:03.647 INFO:tasks.workunit.client.1.vm08.stdout:8/174: mknod d1/d2c/d2e/c38 0 2026-03-10T08:55:03.650 INFO:tasks.workunit.client.1.vm08.stdout:3/158: getdents d4 0 2026-03-10T08:55:03.656 INFO:tasks.workunit.client.1.vm08.stdout:0/131: mknod d6/dd/c2b 0 2026-03-10T08:55:03.656 INFO:tasks.workunit.client.1.vm08.stdout:3/159: creat d4/d15/d17/f34 x:0 0 0 2026-03-10T08:55:03.656 
INFO:tasks.workunit.client.1.vm08.stdout:0/132: read d6/f16 [260937,118509] 0 2026-03-10T08:55:03.656 INFO:tasks.workunit.client.1.vm08.stdout:8/175: dwrite d1/d10/f2a [0,4194304] 0 2026-03-10T08:55:03.659 INFO:tasks.workunit.client.1.vm08.stdout:1/117: write d1/fd [3598467,41118] 0 2026-03-10T08:55:03.659 INFO:tasks.workunit.client.1.vm08.stdout:8/176: readlink d1/d10/d9/l2f 0 2026-03-10T08:55:03.659 INFO:tasks.workunit.client.1.vm08.stdout:5/155: getdents d0/d11/d27 0 2026-03-10T08:55:03.660 INFO:tasks.workunit.client.1.vm08.stdout:0/133: creat d6/f2c x:0 0 0 2026-03-10T08:55:03.663 INFO:tasks.workunit.client.1.vm08.stdout:3/160: dwrite d4/d15/d8/f1f [0,4194304] 0 2026-03-10T08:55:03.663 INFO:tasks.workunit.client.1.vm08.stdout:3/161: rename d4/d15 to d4/d15/d8/d2a/d35 22 2026-03-10T08:55:03.674 INFO:tasks.workunit.client.1.vm08.stdout:5/156: rename d0 to d0/d1b/d35 22 2026-03-10T08:55:03.676 INFO:tasks.workunit.client.1.vm08.stdout:3/162: symlink d4/d15/d8/l36 0 2026-03-10T08:55:03.678 INFO:tasks.workunit.client.1.vm08.stdout:2/168: truncate d1/da/d10/d1b/f14 2153258 0 2026-03-10T08:55:03.679 INFO:tasks.workunit.client.1.vm08.stdout:3/163: chown d4/d15/d17/l31 112 1 2026-03-10T08:55:03.679 INFO:tasks.workunit.client.1.vm08.stdout:3/164: write d4/d15/d17/f34 [571047,22053] 0 2026-03-10T08:55:03.684 INFO:tasks.workunit.client.1.vm08.stdout:2/169: fdatasync d1/f9 0 2026-03-10T08:55:03.688 INFO:tasks.workunit.client.1.vm08.stdout:2/170: mknod d1/da/d10/d1b/d12/d23/c47 0 2026-03-10T08:55:03.690 INFO:tasks.workunit.client.1.vm08.stdout:0/134: dwrite d6/dd/d13/d17/d1c/f1b [0,4194304] 0 2026-03-10T08:55:03.690 INFO:tasks.workunit.client.1.vm08.stdout:1/118: rename d1/da/f13 to d1/da/f1e 0 2026-03-10T08:55:03.692 INFO:tasks.workunit.client.1.vm08.stdout:3/165: rename d4/d15/d8/f1f to d4/d15/d8/f37 0 2026-03-10T08:55:03.702 INFO:tasks.workunit.client.1.vm08.stdout:5/157: dwrite d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:03.703 
INFO:tasks.workunit.client.1.vm08.stdout:5/158: chown d0/d11/f1e 48991 1 2026-03-10T08:55:03.709 INFO:tasks.workunit.client.1.vm08.stdout:5/159: dread d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:03.719 INFO:tasks.workunit.client.1.vm08.stdout:5/160: readlink d0/d11/l1f 0 2026-03-10T08:55:03.719 INFO:tasks.workunit.client.1.vm08.stdout:0/135: dwrite d6/dd/f28 [0,4194304] 0 2026-03-10T08:55:03.719 INFO:tasks.workunit.client.1.vm08.stdout:0/136: write d6/fc [1177440,30486] 0 2026-03-10T08:55:03.724 INFO:tasks.workunit.client.1.vm08.stdout:2/171: getdents d1/d43 0 2026-03-10T08:55:03.725 INFO:tasks.workunit.client.1.vm08.stdout:3/166: mknod d4/c38 0 2026-03-10T08:55:03.725 INFO:tasks.workunit.client.1.vm08.stdout:3/167: chown d4/d15 773 1 2026-03-10T08:55:03.726 INFO:tasks.workunit.client.1.vm08.stdout:3/168: write d4/d15/f7 [297073,74874] 0 2026-03-10T08:55:03.733 INFO:tasks.workunit.client.1.vm08.stdout:3/169: dwrite d4/d15/fa [0,4194304] 0 2026-03-10T08:55:03.738 INFO:tasks.workunit.client.1.vm08.stdout:0/137: mkdir d6/dd/d13/d17/d1f/d2d 0 2026-03-10T08:55:03.739 INFO:tasks.workunit.client.1.vm08.stdout:1/119: sync 2026-03-10T08:55:03.748 INFO:tasks.workunit.client.1.vm08.stdout:6/145: mknod d9/dc/c34 0 2026-03-10T08:55:03.749 INFO:tasks.workunit.client.1.vm08.stdout:1/120: creat d1/f1f x:0 0 0 2026-03-10T08:55:03.750 INFO:tasks.workunit.client.1.vm08.stdout:1/121: write d1/f1f [937346,84174] 0 2026-03-10T08:55:03.750 INFO:tasks.workunit.client.1.vm08.stdout:1/122: stat d1/f8 0 2026-03-10T08:55:03.752 INFO:tasks.workunit.client.1.vm08.stdout:5/161: creat d0/f36 x:0 0 0 2026-03-10T08:55:03.753 INFO:tasks.workunit.client.1.vm08.stdout:3/170: fsync d4/f10 0 2026-03-10T08:55:03.755 INFO:tasks.workunit.client.1.vm08.stdout:6/146: creat d9/d13/f35 x:0 0 0 2026-03-10T08:55:03.756 INFO:tasks.workunit.client.1.vm08.stdout:6/147: chown d9/c19 8 1 2026-03-10T08:55:03.757 INFO:tasks.workunit.client.1.vm08.stdout:5/162: creat d0/d11/f37 x:0 0 0 2026-03-10T08:55:03.758 
INFO:tasks.workunit.client.1.vm08.stdout:3/171: symlink d4/d15/d8/d2a/l39 0 2026-03-10T08:55:03.759 INFO:tasks.workunit.client.1.vm08.stdout:7/128: dwrite d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:03.760 INFO:tasks.workunit.client.1.vm08.stdout:0/138: mknod d6/dd/d13/d17/d1f/d2d/c2e 0 2026-03-10T08:55:03.772 INFO:tasks.workunit.client.1.vm08.stdout:6/148: creat d9/d13/f36 x:0 0 0 2026-03-10T08:55:03.774 INFO:tasks.workunit.client.1.vm08.stdout:1/123: mkdir d1/da/d20 0 2026-03-10T08:55:03.775 INFO:tasks.workunit.client.1.vm08.stdout:1/124: fdatasync d1/da/de/f19 0 2026-03-10T08:55:03.775 INFO:tasks.workunit.client.1.vm08.stdout:1/125: write d1/f11 [997101,45666] 0 2026-03-10T08:55:03.777 INFO:tasks.workunit.client.1.vm08.stdout:5/163: mknod d0/d11/c38 0 2026-03-10T08:55:03.780 INFO:tasks.workunit.client.1.vm08.stdout:7/129: mknod d0/d1c/c23 0 2026-03-10T08:55:03.780 INFO:tasks.workunit.client.1.vm08.stdout:7/130: chown d0/d14/da/ff 56810 1 2026-03-10T08:55:03.791 INFO:tasks.workunit.client.1.vm08.stdout:7/131: dread d0/fe [0,4194304] 0 2026-03-10T08:55:03.799 INFO:tasks.workunit.client.1.vm08.stdout:5/164: creat d0/d1b/f39 x:0 0 0 2026-03-10T08:55:03.799 INFO:tasks.workunit.client.1.vm08.stdout:1/126: fdatasync d1/da/f1e 0 2026-03-10T08:55:03.799 INFO:tasks.workunit.client.1.vm08.stdout:1/127: fdatasync d1/fc 0 2026-03-10T08:55:03.800 INFO:tasks.workunit.client.1.vm08.stdout:7/132: creat d0/d14/da/f24 x:0 0 0 2026-03-10T08:55:03.800 INFO:tasks.workunit.client.1.vm08.stdout:7/133: stat d0 0 2026-03-10T08:55:03.801 INFO:tasks.workunit.client.1.vm08.stdout:6/149: mknod d9/dc/d11/d23/c37 0 2026-03-10T08:55:03.807 INFO:tasks.workunit.client.1.vm08.stdout:5/165: chown d0/d11/d18/l32 57488032 1 2026-03-10T08:55:03.810 INFO:tasks.workunit.client.1.vm08.stdout:1/128: rmdir d1/da/d18 39 2026-03-10T08:55:03.812 INFO:tasks.workunit.client.1.vm08.stdout:6/150: symlink d9/d13/d1a/l38 0 2026-03-10T08:55:03.813 INFO:tasks.workunit.client.1.vm08.stdout:6/151: chown d9/d10/f30 
154833353 1 2026-03-10T08:55:03.815 INFO:tasks.workunit.client.1.vm08.stdout:6/152: fsync d9/d10/f25 0 2026-03-10T08:55:03.819 INFO:tasks.workunit.client.1.vm08.stdout:0/139: truncate d6/f9 1695245 0 2026-03-10T08:55:03.819 INFO:tasks.workunit.client.1.vm08.stdout:0/140: write d6/f2c [240679,3502] 0 2026-03-10T08:55:03.820 INFO:tasks.workunit.client.1.vm08.stdout:5/166: creat d0/d1b/f3a x:0 0 0 2026-03-10T08:55:03.832 INFO:tasks.workunit.client.1.vm08.stdout:0/141: dwrite d6/fc [0,4194304] 0 2026-03-10T08:55:03.835 INFO:tasks.workunit.client.1.vm08.stdout:2/172: dwrite d1/f9 [0,4194304] 0 2026-03-10T08:55:03.839 INFO:tasks.workunit.client.1.vm08.stdout:2/173: write d1/da/d10/d1b/d12/d23/f44 [811455,68507] 0 2026-03-10T08:55:03.851 INFO:tasks.workunit.client.1.vm08.stdout:5/167: creat d0/d11/d27/f3b x:0 0 0 2026-03-10T08:55:03.855 INFO:tasks.workunit.client.1.vm08.stdout:1/129: creat d1/da/d20/f21 x:0 0 0 2026-03-10T08:55:03.859 INFO:tasks.workunit.client.1.vm08.stdout:0/142: rename d6/dd/d13/d17/d1c to d6/dd/d13/d17/d1f/d20/d2f 0 2026-03-10T08:55:03.861 INFO:tasks.workunit.client.1.vm08.stdout:7/134: creat d0/f25 x:0 0 0 2026-03-10T08:55:03.864 INFO:tasks.workunit.client.1.vm08.stdout:2/174: rename d1/da/d10/f2c to d1/f48 0 2026-03-10T08:55:03.875 INFO:tasks.workunit.client.1.vm08.stdout:2/175: write d1/da/d10/d1b/d1c/f3f [562227,65440] 0 2026-03-10T08:55:03.875 INFO:tasks.workunit.client.1.vm08.stdout:5/168: symlink d0/l3c 0 2026-03-10T08:55:03.875 INFO:tasks.workunit.client.1.vm08.stdout:7/135: mknod d0/d11/c26 0 2026-03-10T08:55:03.875 INFO:tasks.workunit.client.1.vm08.stdout:2/176: rmdir d1/da/d10/d1b/d1c 39 2026-03-10T08:55:03.880 INFO:tasks.workunit.client.1.vm08.stdout:7/136: creat d0/d11/f27 x:0 0 0 2026-03-10T08:55:03.884 INFO:tasks.workunit.client.1.vm08.stdout:7/137: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:03.891 INFO:tasks.workunit.client.1.vm08.stdout:1/130: link d1/da/d18/f1d d1/da/f22 0 2026-03-10T08:55:03.894 
INFO:tasks.workunit.client.1.vm08.stdout:7/138: unlink d0/c13 0 2026-03-10T08:55:03.894 INFO:tasks.workunit.client.1.vm08.stdout:1/131: dread d1/f1f [0,4194304] 0 2026-03-10T08:55:03.901 INFO:tasks.workunit.client.1.vm08.stdout:2/177: getdents d1/da/d10/d1b/d12/d22 0 2026-03-10T08:55:03.903 INFO:tasks.workunit.client.1.vm08.stdout:7/139: symlink d0/d11/d1f/l28 0 2026-03-10T08:55:03.904 INFO:tasks.workunit.client.1.vm08.stdout:7/140: chown d0/fe 4105 1 2026-03-10T08:55:03.905 INFO:tasks.workunit.client.1.vm08.stdout:1/132: sync 2026-03-10T08:55:03.909 INFO:tasks.workunit.client.1.vm08.stdout:7/141: write d0/fe [4700617,62375] 0 2026-03-10T08:55:03.912 INFO:tasks.workunit.client.1.vm08.stdout:9/141: write d2/dd/f18 [24112,47469] 0 2026-03-10T08:55:03.913 INFO:tasks.workunit.client.1.vm08.stdout:7/142: write d0/d11/f27 [728008,87804] 0 2026-03-10T08:55:03.914 INFO:tasks.workunit.client.1.vm08.stdout:7/143: dread d0/d14/da/ff [4194304,4194304] 0 2026-03-10T08:55:03.916 INFO:tasks.workunit.client.1.vm08.stdout:1/133: mkdir d1/da/d23 0 2026-03-10T08:55:03.917 INFO:tasks.workunit.client.1.vm08.stdout:3/172: rmdir d4 39 2026-03-10T08:55:03.918 INFO:tasks.workunit.client.1.vm08.stdout:7/144: mkdir d0/d11/d1f/d29 0 2026-03-10T08:55:03.919 INFO:tasks.workunit.client.1.vm08.stdout:7/145: chown d0/d11/d1f/d29 21 1 2026-03-10T08:55:03.925 INFO:tasks.workunit.client.1.vm08.stdout:3/173: write d4/d15/d8/f1e [1012213,13608] 0 2026-03-10T08:55:03.926 INFO:tasks.workunit.client.1.vm08.stdout:4/164: rmdir d5 39 2026-03-10T08:55:03.928 INFO:tasks.workunit.client.1.vm08.stdout:3/174: chown d4/d15/d8/d1d/f2d 14178664 1 2026-03-10T08:55:03.931 INFO:tasks.workunit.client.1.vm08.stdout:8/177: link d1/d10/lc d1/l39 0 2026-03-10T08:55:03.932 INFO:tasks.workunit.client.1.vm08.stdout:8/178: write d1/d10/d9/dd/d13/f24 [78694,20761] 0 2026-03-10T08:55:03.934 INFO:tasks.workunit.client.1.vm08.stdout:4/165: write d5/de/f3b [369099,21902] 0 2026-03-10T08:55:03.935 
INFO:tasks.workunit.client.1.vm08.stdout:4/166: truncate d5/de/f38 753144 0 2026-03-10T08:55:03.937 INFO:tasks.workunit.client.1.vm08.stdout:3/175: mknod d4/d15/d17/c3a 0 2026-03-10T08:55:03.938 INFO:tasks.workunit.client.1.vm08.stdout:7/146: mknod d0/d11/d1f/d29/c2a 0 2026-03-10T08:55:03.938 INFO:tasks.workunit.client.1.vm08.stdout:7/147: stat d0 0 2026-03-10T08:55:03.944 INFO:tasks.workunit.client.1.vm08.stdout:6/153: truncate f5 1604697 0 2026-03-10T08:55:03.947 INFO:tasks.workunit.client.1.vm08.stdout:9/142: getdents d2/dd/d15/d1e 0 2026-03-10T08:55:03.966 INFO:tasks.workunit.client.1.vm08.stdout:5/169: rmdir d0 39 2026-03-10T08:55:03.966 INFO:tasks.workunit.client.1.vm08.stdout:7/148: mknod d0/d14/da/c2b 0 2026-03-10T08:55:03.968 INFO:tasks.workunit.client.1.vm08.stdout:1/134: getdents d1/da/d20 0 2026-03-10T08:55:03.970 INFO:tasks.workunit.client.1.vm08.stdout:8/179: creat d1/d10/d9/dd/d25/d27/f3a x:0 0 0 2026-03-10T08:55:03.972 INFO:tasks.workunit.client.1.vm08.stdout:7/149: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:03.974 INFO:tasks.workunit.client.1.vm08.stdout:9/143: dwrite d2/fa [0,4194304] 0 2026-03-10T08:55:03.976 INFO:tasks.workunit.client.1.vm08.stdout:9/144: write d2/fa [2544435,65708] 0 2026-03-10T08:55:03.995 INFO:tasks.workunit.client.1.vm08.stdout:1/135: rename d1/da/d23 to d1/da/de/d24 0 2026-03-10T08:55:03.997 INFO:tasks.workunit.client.1.vm08.stdout:8/180: unlink d1/f11 0 2026-03-10T08:55:03.997 INFO:tasks.workunit.client.1.vm08.stdout:0/143: write d6/fb [2303850,60059] 0 2026-03-10T08:55:03.998 INFO:tasks.workunit.client.1.vm08.stdout:8/181: write d1/f8 [1390073,57427] 0 2026-03-10T08:55:03.998 INFO:tasks.workunit.client.1.vm08.stdout:7/150: mkdir d0/d11/d1f/d2c 0 2026-03-10T08:55:04.000 INFO:tasks.workunit.client.1.vm08.stdout:5/170: dread d0/ff [0,4194304] 0 2026-03-10T08:55:04.001 INFO:tasks.workunit.client.1.vm08.stdout:9/145: creat d2/dd/d15/d1e/d24/f2b x:0 0 0 2026-03-10T08:55:04.010 INFO:tasks.workunit.client.1.vm08.stdout:8/182: 
creat d1/d10/f3b x:0 0 0 2026-03-10T08:55:04.017 INFO:tasks.workunit.client.1.vm08.stdout:4/167: write d5/d23/f27 [914990,100277] 0 2026-03-10T08:55:04.017 INFO:tasks.workunit.client.1.vm08.stdout:5/171: creat d0/d11/d27/f3d x:0 0 0 2026-03-10T08:55:04.018 INFO:tasks.workunit.client.1.vm08.stdout:9/146: mknod d2/dd/d15/d1e/d24/c2c 0 2026-03-10T08:55:04.018 INFO:tasks.workunit.client.1.vm08.stdout:5/172: fdatasync d0/d11/d27/f3b 0 2026-03-10T08:55:04.023 INFO:tasks.workunit.client.1.vm08.stdout:2/178: write d1/da/d10/d1b/f14 [1927752,67066] 0 2026-03-10T08:55:04.023 INFO:tasks.workunit.client.1.vm08.stdout:3/176: getdents d4/d15/d17 0 2026-03-10T08:55:04.023 INFO:tasks.workunit.client.1.vm08.stdout:2/179: write d1/da/d10/d2d/f3e [1207165,24789] 0 2026-03-10T08:55:04.023 INFO:tasks.workunit.client.1.vm08.stdout:5/173: dread - d0/d11/d27/f3d zero size 2026-03-10T08:55:04.024 INFO:tasks.workunit.client.1.vm08.stdout:4/168: read d5/de/f38 [285369,41014] 0 2026-03-10T08:55:04.025 INFO:tasks.workunit.client.1.vm08.stdout:4/169: fdatasync d5/fb 0 2026-03-10T08:55:04.029 INFO:tasks.workunit.client.1.vm08.stdout:1/136: creat d1/da/f25 x:0 0 0 2026-03-10T08:55:04.029 INFO:tasks.workunit.client.1.vm08.stdout:4/170: dwrite d5/f8 [0,4194304] 0 2026-03-10T08:55:04.031 INFO:tasks.workunit.client.1.vm08.stdout:4/171: chown d5/de 1429 1 2026-03-10T08:55:04.032 INFO:tasks.workunit.client.1.vm08.stdout:3/177: rmdir d4/d15/d8/d1d 39 2026-03-10T08:55:04.033 INFO:tasks.workunit.client.1.vm08.stdout:4/172: write d5/f8 [5210860,126851] 0 2026-03-10T08:55:04.042 INFO:tasks.workunit.client.1.vm08.stdout:5/174: mkdir d0/d11/d3e 0 2026-03-10T08:55:04.042 INFO:tasks.workunit.client.1.vm08.stdout:9/147: creat d2/dd/d15/d1e/d21/f2d x:0 0 0 2026-03-10T08:55:04.043 INFO:tasks.workunit.client.1.vm08.stdout:1/137: mkdir d1/da/de/d24/d26 0 2026-03-10T08:55:04.044 INFO:tasks.workunit.client.1.vm08.stdout:1/138: write d1/f11 [2056143,117899] 0 2026-03-10T08:55:04.045 
INFO:tasks.workunit.client.1.vm08.stdout:0/144: getdents d6/dd 0 2026-03-10T08:55:04.047 INFO:tasks.workunit.client.1.vm08.stdout:3/178: symlink d4/l3b 0 2026-03-10T08:55:04.048 INFO:tasks.workunit.client.1.vm08.stdout:3/179: chown d4/d15/d8/d2a/l39 4050 1 2026-03-10T08:55:04.048 INFO:tasks.workunit.client.1.vm08.stdout:8/183: fsync d1/d10/d9/dd/d25/d27/f3a 0 2026-03-10T08:55:04.051 INFO:tasks.workunit.client.1.vm08.stdout:2/180: unlink d1/da/d10/d1b/d1c/l38 0 2026-03-10T08:55:04.055 INFO:tasks.workunit.client.1.vm08.stdout:9/148: creat d2/dd/f2e x:0 0 0 2026-03-10T08:55:04.059 INFO:tasks.workunit.client.1.vm08.stdout:0/145: mknod d6/dd/d13/d17/d1f/d20/d2f/c30 0 2026-03-10T08:55:04.060 INFO:tasks.workunit.client.1.vm08.stdout:7/151: rmdir d0/d11 39 2026-03-10T08:55:04.060 INFO:tasks.workunit.client.1.vm08.stdout:0/146: chown d6/dd/c14 217960897 1 2026-03-10T08:55:04.062 INFO:tasks.workunit.client.1.vm08.stdout:3/180: dread d4/d15/d8/d1d/f21 [0,4194304] 0 2026-03-10T08:55:04.064 INFO:tasks.workunit.client.1.vm08.stdout:5/175: symlink d0/d11/d3e/l3f 0 2026-03-10T08:55:04.066 INFO:tasks.workunit.client.1.vm08.stdout:9/149: fsync d2/f13 0 2026-03-10T08:55:04.067 INFO:tasks.workunit.client.1.vm08.stdout:1/139: rename d1/da/ff to d1/da/de/f27 0 2026-03-10T08:55:04.067 INFO:tasks.workunit.client.1.vm08.stdout:8/184: sync 2026-03-10T08:55:04.068 INFO:tasks.workunit.client.1.vm08.stdout:3/181: dwrite d4/f10 [0,4194304] 0 2026-03-10T08:55:04.068 INFO:tasks.workunit.client.1.vm08.stdout:8/185: stat d1/d2c/f30 0 2026-03-10T08:55:04.068 INFO:tasks.workunit.client.1.vm08.stdout:6/154: write d9/d10/d1e/d32/ff [623127,61035] 0 2026-03-10T08:55:04.070 INFO:tasks.workunit.client.1.vm08.stdout:6/155: truncate d9/d13/f15 595930 0 2026-03-10T08:55:04.075 INFO:tasks.workunit.client.1.vm08.stdout:4/173: creat d5/f3d x:0 0 0 2026-03-10T08:55:04.075 INFO:tasks.workunit.client.1.vm08.stdout:5/176: readlink d0/d1b/l2c 0 2026-03-10T08:55:04.075 INFO:tasks.workunit.client.1.vm08.stdout:5/177: 
chown d0/c6 466491 1 2026-03-10T08:55:04.078 INFO:tasks.workunit.client.1.vm08.stdout:5/178: write d0/f16 [834130,34307] 0 2026-03-10T08:55:04.078 INFO:tasks.workunit.client.1.vm08.stdout:5/179: chown d0/d11/d27/f2a 25816288 1 2026-03-10T08:55:04.081 INFO:tasks.workunit.client.1.vm08.stdout:9/150: readlink d2/dd/le 0 2026-03-10T08:55:04.081 INFO:tasks.workunit.client.1.vm08.stdout:5/180: sync 2026-03-10T08:55:04.082 INFO:tasks.workunit.client.1.vm08.stdout:5/181: dread - d0/d11/f2d zero size 2026-03-10T08:55:04.084 INFO:tasks.workunit.client.1.vm08.stdout:0/147: rename d6/dd/d13/c23 to d6/c31 0 2026-03-10T08:55:04.084 INFO:tasks.workunit.client.1.vm08.stdout:5/182: fsync d0/d11/d18/f1a 0 2026-03-10T08:55:04.087 INFO:tasks.workunit.client.1.vm08.stdout:5/183: stat d0/d1b/f2f 0 2026-03-10T08:55:04.087 INFO:tasks.workunit.client.1.vm08.stdout:3/182: creat d4/d15/d17/f3c x:0 0 0 2026-03-10T08:55:04.088 INFO:tasks.workunit.client.1.vm08.stdout:8/186: mkdir d1/d10/d9/dd/d18/d3c 0 2026-03-10T08:55:04.089 INFO:tasks.workunit.client.1.vm08.stdout:3/183: dread - d4/d15/d8/f24 zero size 2026-03-10T08:55:04.091 INFO:tasks.workunit.client.1.vm08.stdout:3/184: readlink d4/d15/d8/d2c/l2f 0 2026-03-10T08:55:04.091 INFO:tasks.workunit.client.1.vm08.stdout:9/151: dread d2/fb [0,4194304] 0 2026-03-10T08:55:04.092 INFO:tasks.workunit.client.1.vm08.stdout:9/152: stat d2/dd/d15 0 2026-03-10T08:55:04.093 INFO:tasks.workunit.client.1.vm08.stdout:9/153: chown d2/dd/l29 1550609818 1 2026-03-10T08:55:04.093 INFO:tasks.workunit.client.1.vm08.stdout:0/148: dwrite d6/fb [0,4194304] 0 2026-03-10T08:55:04.094 INFO:tasks.workunit.client.1.vm08.stdout:9/154: write d2/dd/d15/f22 [435615,50789] 0 2026-03-10T08:55:04.094 INFO:tasks.workunit.client.1.vm08.stdout:9/155: chown d2/l19 3262223 1 2026-03-10T08:55:04.100 INFO:tasks.workunit.client.1.vm08.stdout:9/156: truncate d2/dd/d15/d1e/d21/f2d 262982 0 2026-03-10T08:55:04.111 INFO:tasks.workunit.client.1.vm08.stdout:4/174: rename d5/d23/c2b to d5/de/c3e 
0 2026-03-10T08:55:04.111 INFO:tasks.workunit.client.1.vm08.stdout:1/140: mknod d1/da/de/d24/d26/c28 0 2026-03-10T08:55:04.112 INFO:tasks.workunit.client.1.vm08.stdout:1/141: rename d1/da/de to d1/da/de/d24/d29 22 2026-03-10T08:55:04.113 INFO:tasks.workunit.client.1.vm08.stdout:5/184: unlink d0/d1b/f3a 0 2026-03-10T08:55:04.113 INFO:tasks.workunit.client.1.vm08.stdout:8/187: chown d1/d10/d9/c1c 25918767 1 2026-03-10T08:55:04.113 INFO:tasks.workunit.client.1.vm08.stdout:7/152: creat d0/d14/f2d x:0 0 0 2026-03-10T08:55:04.115 INFO:tasks.workunit.client.1.vm08.stdout:3/185: creat d4/d15/d8/d2c/f3d x:0 0 0 2026-03-10T08:55:04.115 INFO:tasks.workunit.client.1.vm08.stdout:0/149: mkdir d6/dd/d13/d32 0 2026-03-10T08:55:04.116 INFO:tasks.workunit.client.1.vm08.stdout:0/150: write d6/f15 [2863700,118817] 0 2026-03-10T08:55:04.120 INFO:tasks.workunit.client.1.vm08.stdout:2/181: getdents d1/da/d10/d1b/d12 0 2026-03-10T08:55:04.129 INFO:tasks.workunit.client.1.vm08.stdout:0/151: dwrite d6/f16 [0,4194304] 0 2026-03-10T08:55:04.130 INFO:tasks.workunit.client.1.vm08.stdout:3/186: dwrite d4/d15/d17/f3c [0,4194304] 0 2026-03-10T08:55:04.130 INFO:tasks.workunit.client.1.vm08.stdout:7/153: dwrite d0/d14/da/f24 [0,4194304] 0 2026-03-10T08:55:04.131 INFO:tasks.workunit.client.1.vm08.stdout:9/157: creat d2/dd/d15/d1e/d24/f2f x:0 0 0 2026-03-10T08:55:04.131 INFO:tasks.workunit.client.1.vm08.stdout:4/175: mknod d5/de/c3f 0 2026-03-10T08:55:04.138 INFO:tasks.workunit.client.1.vm08.stdout:5/185: rmdir d0/d11 39 2026-03-10T08:55:04.145 INFO:tasks.workunit.client.1.vm08.stdout:8/188: write d1/d2c/d2e/d21/f32 [1911754,71829] 0 2026-03-10T08:55:04.145 INFO:tasks.workunit.client.1.vm08.stdout:0/152: symlink d6/dd/d13/d17/d1f/d20/d2f/l33 0 2026-03-10T08:55:04.145 INFO:tasks.workunit.client.1.vm08.stdout:6/156: creat d9/dc/f39 x:0 0 0 2026-03-10T08:55:04.147 INFO:tasks.workunit.client.1.vm08.stdout:1/142: rename d1/da/c1c to d1/da/de/d24/c2a 0 2026-03-10T08:55:04.147 
INFO:tasks.workunit.client.1.vm08.stdout:9/158: dread d2/dd/d15/f1b [0,4194304] 0 2026-03-10T08:55:04.151 INFO:tasks.workunit.client.1.vm08.stdout:1/143: write d1/da/d20/f21 [826021,104470] 0 2026-03-10T08:55:04.152 INFO:tasks.workunit.client.1.vm08.stdout:1/144: write d1/da/de/f19 [329070,61439] 0 2026-03-10T08:55:04.156 INFO:tasks.workunit.client.1.vm08.stdout:1/145: chown d1/da/de 1558 1 2026-03-10T08:55:04.160 INFO:tasks.workunit.client.1.vm08.stdout:9/159: dwrite d2/f6 [4194304,4194304] 0 2026-03-10T08:55:04.160 INFO:tasks.workunit.client.1.vm08.stdout:0/153: dread d6/fb [0,4194304] 0 2026-03-10T08:55:04.170 INFO:tasks.workunit.client.1.vm08.stdout:7/154: mknod d0/c2e 0 2026-03-10T08:55:04.171 INFO:tasks.workunit.client.1.vm08.stdout:3/187: dread d4/f14 [0,4194304] 0 2026-03-10T08:55:04.172 INFO:tasks.workunit.client.1.vm08.stdout:5/186: mkdir d0/d40 0 2026-03-10T08:55:04.172 INFO:tasks.workunit.client.1.vm08.stdout:2/182: rmdir d1/d36 0 2026-03-10T08:55:04.173 INFO:tasks.workunit.client.1.vm08.stdout:2/183: dread - d1/da/d10/d1b/d1c/f3d zero size 2026-03-10T08:55:04.174 INFO:tasks.workunit.client.1.vm08.stdout:2/184: chown d1/da/d10/d1b/d1c/f3f 28476119 1 2026-03-10T08:55:04.183 INFO:tasks.workunit.client.1.vm08.stdout:9/160: link d2/dd/f18 d2/dd/d15/d1e/d24/f30 0 2026-03-10T08:55:04.183 INFO:tasks.workunit.client.1.vm08.stdout:1/146: symlink d1/da/l2b 0 2026-03-10T08:55:04.184 INFO:tasks.workunit.client.1.vm08.stdout:0/154: creat d6/dd/d13/d32/f34 x:0 0 0 2026-03-10T08:55:04.185 INFO:tasks.workunit.client.1.vm08.stdout:6/157: creat d9/d10/d1e/d32/f3a x:0 0 0 2026-03-10T08:55:04.187 INFO:tasks.workunit.client.1.vm08.stdout:7/155: mkdir d0/d14/d2f 0 2026-03-10T08:55:04.189 INFO:tasks.workunit.client.1.vm08.stdout:7/156: write d0/f25 [990888,94513] 0 2026-03-10T08:55:04.189 INFO:tasks.workunit.client.1.vm08.stdout:2/185: getdents d1/d43 0 2026-03-10T08:55:04.193 INFO:tasks.workunit.client.1.vm08.stdout:9/161: unlink d2/f1a 0 2026-03-10T08:55:04.199 
INFO:tasks.workunit.client.1.vm08.stdout:2/186: dwrite d1/f4 [0,4194304] 0 2026-03-10T08:55:04.202 INFO:tasks.workunit.client.1.vm08.stdout:6/158: mknod d9/dc/d11/d23/c3b 0 2026-03-10T08:55:04.204 INFO:tasks.workunit.client.1.vm08.stdout:5/187: link d0/d11/l1f d0/d11/d3e/l41 0 2026-03-10T08:55:04.233 INFO:tasks.workunit.client.1.vm08.stdout:7/157: rmdir d0/d11/d1f/d29 39 2026-03-10T08:55:04.234 INFO:tasks.workunit.client.1.vm08.stdout:9/162: mkdir d2/dd/d15/d1e/d25/d31 0 2026-03-10T08:55:04.234 INFO:tasks.workunit.client.1.vm08.stdout:5/188: chown d0/d11/d27/f2a 89022494 1 2026-03-10T08:55:04.234 INFO:tasks.workunit.client.1.vm08.stdout:7/158: truncate d0/d11/f27 1728209 0 2026-03-10T08:55:04.234 INFO:tasks.workunit.client.1.vm08.stdout:9/163: rmdir d2/dd/d15/d1e 39 2026-03-10T08:55:04.234 INFO:tasks.workunit.client.1.vm08.stdout:5/189: write d0/fe [391886,47979] 0 2026-03-10T08:55:04.234 INFO:tasks.workunit.client.1.vm08.stdout:6/159: dwrite d9/dc/f1b [0,4194304] 0 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:9/164: chown d2/dd/d15/d1e/d24/f30 531 1 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:5/190: dread - d0/d11/d18/f34 zero size 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:9/165: dread d2/dd/d15/f1b [0,4194304] 0 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:5/191: dwrite d0/ff [0,4194304] 0 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:9/166: mkdir d2/dd/d15/d1e/d25/d32 0 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:2/187: getdents d1/da/d10/d1b/d1c 0 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:6/160: dread - d9/d10/d1e/d32/f17 zero size 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:6/161: truncate d9/d13/f36 1042490 0 2026-03-10T08:55:04.235 INFO:tasks.workunit.client.1.vm08.stdout:2/188: write d1/da/d10/d1b/d12/d1e/f1f [755949,71087] 0 2026-03-10T08:55:04.236 
INFO:tasks.workunit.client.1.vm08.stdout:5/192: creat d0/d40/f42 x:0 0 0 2026-03-10T08:55:04.236 INFO:tasks.workunit.client.1.vm08.stdout:6/162: write d9/d10/f25 [449280,28474] 0 2026-03-10T08:55:04.238 INFO:tasks.workunit.client.1.vm08.stdout:6/163: chown d9/dc/c1d 317522 1 2026-03-10T08:55:04.238 INFO:tasks.workunit.client.1.vm08.stdout:2/189: write d1/da/d10/d1b/d1c/f20 [90615,106797] 0 2026-03-10T08:55:04.238 INFO:tasks.workunit.client.1.vm08.stdout:9/167: dwrite d2/f6 [0,4194304] 0 2026-03-10T08:55:04.240 INFO:tasks.workunit.client.1.vm08.stdout:6/164: symlink d9/d10/l3c 0 2026-03-10T08:55:04.244 INFO:tasks.workunit.client.1.vm08.stdout:9/168: write d2/dd/d15/f22 [159891,122576] 0 2026-03-10T08:55:04.288 INFO:tasks.workunit.client.1.vm08.stdout:2/190: dwrite d1/da/d10/d1b/d1c/f3f [0,4194304] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/193: creat d0/f43 x:0 0 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/194: chown d0/d1b 21003408 1 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:2/191: write d1/da/d10/d1b/d12/d23/f44 [1054623,127395] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/169: rename d2/dd/d15/d1e/d24/f2f to d2/dd/d15/d1e/d24/f33 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/170: chown d2/f4 180857 1 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/195: dwrite d0/d1b/f39 [0,4194304] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/171: dread d2/dd/f16 [0,4194304] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/172: fdatasync d2/f4 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/196: mknod d0/c44 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/173: creat d2/dd/d15/d1e/d24/f34 x:0 0 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/197: truncate d0/d1b/f39 4514018 0 2026-03-10T08:55:04.289 
INFO:tasks.workunit.client.1.vm08.stdout:9/174: write d2/dd/d15/f22 [320050,121927] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/198: readlink d0/d11/d18/l32 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/175: write d2/fa [36931,102347] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/199: dread d0/d1b/f2f [0,4194304] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/200: mkdir d0/d11/d3e/d45 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/201: chown d0/d11/d18/f34 143832885 1 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/202: stat d0/d11/f1e 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/176: link d2/f13 d2/f35 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/203: mkdir d0/d46 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/177: truncate d2/dd/d15/d1e/d24/f34 866534 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/204: write d0/f16 [925835,83525] 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/178: stat d2/dd/d15 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/205: mknod d0/d40/c47 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:5/206: fsync d0/d11/d27/f3b 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:9/179: truncate d2/f4 85657 0 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:8/189: sync 2026-03-10T08:55:04.289 INFO:tasks.workunit.client.1.vm08.stdout:8/190: write d1/d2c/f33 [523741,105278] 0 2026-03-10T08:55:04.290 INFO:tasks.workunit.client.1.vm08.stdout:8/191: fdatasync f0 0 2026-03-10T08:55:04.291 INFO:tasks.workunit.client.1.vm08.stdout:8/192: readlink d1/d10/d9/dd/l14 0 2026-03-10T08:55:04.295 INFO:tasks.workunit.client.1.vm08.stdout:8/193: mkdir d1/d10/d9/dd/d3d 0 2026-03-10T08:55:04.299 INFO:tasks.workunit.client.1.vm08.stdout:8/194: mknod d1/d10/c3e 
0 2026-03-10T08:55:04.306 INFO:tasks.workunit.client.1.vm08.stdout:5/207: dread d0/d11/d18/f23 [0,4194304] 0 2026-03-10T08:55:04.310 INFO:tasks.workunit.client.1.vm08.stdout:5/208: creat d0/d11/d3e/f48 x:0 0 0 2026-03-10T08:55:04.314 INFO:tasks.workunit.client.1.vm08.stdout:5/209: rename d0/d11/d18/c19 to d0/d11/c49 0 2026-03-10T08:55:04.351 INFO:tasks.workunit.client.1.vm08.stdout:5/210: creat d0/d11/d3e/d45/f4a x:0 0 0 2026-03-10T08:55:04.351 INFO:tasks.workunit.client.1.vm08.stdout:5/211: mkdir d0/d40/d4b 0 2026-03-10T08:55:04.495 INFO:tasks.workunit.client.1.vm08.stdout:5/212: sync 2026-03-10T08:55:04.497 INFO:tasks.workunit.client.1.vm08.stdout:2/192: read d1/da/d10/d1b/f28 [663907,111598] 0 2026-03-10T08:55:04.509 INFO:tasks.workunit.client.1.vm08.stdout:2/193: creat d1/da/d10/d1b/d12/d1e/f49 x:0 0 0 2026-03-10T08:55:04.721 INFO:tasks.workunit.client.1.vm08.stdout:2/194: sync 2026-03-10T08:55:04.722 INFO:tasks.workunit.client.1.vm08.stdout:2/195: truncate d1/f48 1542380 0 2026-03-10T08:55:04.722 INFO:tasks.workunit.client.1.vm08.stdout:2/196: stat d1/da/d10/d1b/f14 0 2026-03-10T08:55:04.848 INFO:tasks.workunit.client.1.vm08.stdout:2/197: sync 2026-03-10T08:55:05.065 INFO:tasks.workunit.client.1.vm08.stdout:3/188: truncate d4/d15/d8/f37 1424409 0 2026-03-10T08:55:05.068 INFO:tasks.workunit.client.1.vm08.stdout:3/189: mknod d4/d15/d8/d2a/c3e 0 2026-03-10T08:55:05.072 INFO:tasks.workunit.client.1.vm08.stdout:3/190: dwrite d4/d15/d8/d2c/f3d [0,4194304] 0 2026-03-10T08:55:05.086 INFO:tasks.workunit.client.1.vm08.stdout:1/147: dwrite d1/f1f [0,4194304] 0 2026-03-10T08:55:05.092 INFO:tasks.workunit.client.1.vm08.stdout:1/148: dwrite d1/fc [0,4194304] 0 2026-03-10T08:55:05.106 INFO:tasks.workunit.client.1.vm08.stdout:0/155: truncate d6/f25 67308 0 2026-03-10T08:55:05.112 INFO:tasks.workunit.client.1.vm08.stdout:0/156: truncate d6/dd/d13/d17/f1d 1675716 0 2026-03-10T08:55:05.113 INFO:tasks.workunit.client.1.vm08.stdout:0/157: creat d6/dd/f35 x:0 0 0 
2026-03-10T08:55:05.115 INFO:tasks.workunit.client.1.vm08.stdout:0/158: dread f3 [0,4194304] 0 2026-03-10T08:55:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:04 vm05.local ceph-mon[49713]: pgmap v141: 65 pgs: 65 active+clean; 347 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail; 839 KiB/s rd, 23 MiB/s wr, 380 op/s 2026-03-10T08:55:05.289 INFO:tasks.workunit.client.1.vm08.stdout:7/159: fdatasync d0/d14/da/f24 0 2026-03-10T08:55:05.291 INFO:tasks.workunit.client.1.vm08.stdout:7/160: dread d0/fe [0,4194304] 0 2026-03-10T08:55:05.299 INFO:tasks.workunit.client.1.vm08.stdout:7/161: creat d0/d11/d1f/d2c/f30 x:0 0 0 2026-03-10T08:55:05.300 INFO:tasks.workunit.client.1.vm08.stdout:7/162: symlink d0/d1c/l31 0 2026-03-10T08:55:05.302 INFO:tasks.workunit.client.1.vm08.stdout:7/163: rename d0/f16 to d0/d1c/f32 0 2026-03-10T08:55:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:04 vm08.local ceph-mon[57559]: pgmap v141: 65 pgs: 65 active+clean; 347 MiB data, 2.0 GiB used, 118 GiB / 120 GiB avail; 839 KiB/s rd, 23 MiB/s wr, 380 op/s 2026-03-10T08:55:05.304 INFO:tasks.workunit.client.1.vm08.stdout:7/164: creat d0/d11/d1f/d2c/f33 x:0 0 0 2026-03-10T08:55:05.304 INFO:tasks.workunit.client.1.vm08.stdout:7/165: chown d0/d11/d1f 5 1 2026-03-10T08:55:05.319 INFO:tasks.workunit.client.1.vm08.stdout:7/166: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:05.342 INFO:tasks.workunit.client.1.vm08.stdout:2/198: read d1/da/d10/d1b/d12/d1e/f1f [171585,24344] 0 2026-03-10T08:55:05.348 INFO:tasks.workunit.client.1.vm08.stdout:2/199: symlink d1/da/d10/d1b/d1c/l4a 0 2026-03-10T08:55:05.348 INFO:tasks.workunit.client.1.vm08.stdout:2/200: chown d1/da/d10/d1b/d12/d23 0 1 2026-03-10T08:55:05.349 INFO:tasks.workunit.client.1.vm08.stdout:2/201: write d1/da/fe [4091401,86859] 0 2026-03-10T08:55:05.349 INFO:tasks.workunit.client.1.vm08.stdout:4/176: read d5/f19 [1952774,119518] 0 2026-03-10T08:55:05.351 INFO:tasks.workunit.client.1.vm08.stdout:2/202: truncate d1/f48 2267475 0 
2026-03-10T08:55:05.359 INFO:tasks.workunit.client.1.vm08.stdout:2/203: rename d1/da/f2b to d1/d43/f4b 0 2026-03-10T08:55:05.371 INFO:tasks.workunit.client.1.vm08.stdout:4/177: sync 2026-03-10T08:55:05.372 INFO:tasks.workunit.client.1.vm08.stdout:4/178: chown d5/d23/c34 20152317 1 2026-03-10T08:55:05.380 INFO:tasks.workunit.client.1.vm08.stdout:2/204: creat d1/da/d10/d2d/f4c x:0 0 0 2026-03-10T08:55:05.386 INFO:tasks.workunit.client.1.vm08.stdout:2/205: rename d1/da/f41 to d1/da/d10/d2d/f4d 0 2026-03-10T08:55:05.396 INFO:tasks.workunit.client.1.vm08.stdout:4/179: getdents d5/d2f 0 2026-03-10T08:55:05.396 INFO:tasks.workunit.client.1.vm08.stdout:4/180: truncate d5/d23/f27 1607142 0 2026-03-10T08:55:05.397 INFO:tasks.workunit.client.1.vm08.stdout:4/181: chown d5/d23/l24 389 1 2026-03-10T08:55:05.404 INFO:tasks.workunit.client.1.vm08.stdout:2/206: getdents d1 0 2026-03-10T08:55:05.406 INFO:tasks.workunit.client.1.vm08.stdout:2/207: write d1/da/d10/d1b/d12/d22/f45 [913264,64073] 0 2026-03-10T08:55:05.408 INFO:tasks.workunit.client.1.vm08.stdout:4/182: getdents d5/de 0 2026-03-10T08:55:05.410 INFO:tasks.workunit.client.1.vm08.stdout:2/208: creat d1/da/f4e x:0 0 0 2026-03-10T08:55:05.427 INFO:tasks.workunit.client.1.vm08.stdout:6/165: dread d9/d10/f26 [0,4194304] 0 2026-03-10T08:55:05.439 INFO:tasks.workunit.client.1.vm08.stdout:2/209: dread d1/f48 [0,4194304] 0 2026-03-10T08:55:05.439 INFO:tasks.workunit.client.1.vm08.stdout:2/210: readlink d1/da/lc 0 2026-03-10T08:55:05.442 INFO:tasks.workunit.client.1.vm08.stdout:2/211: dread d1/f4 [0,4194304] 0 2026-03-10T08:55:05.443 INFO:tasks.workunit.client.1.vm08.stdout:6/166: link d9/d10/d1e/d32/ff d9/dc/d11/d23/d2c/f3d 0 2026-03-10T08:55:05.445 INFO:tasks.workunit.client.1.vm08.stdout:2/212: dwrite d1/da/f4e [0,4194304] 0 2026-03-10T08:55:05.453 INFO:tasks.workunit.client.1.vm08.stdout:6/167: symlink d9/dc/l3e 0 2026-03-10T08:55:05.454 INFO:tasks.workunit.client.1.vm08.stdout:9/180: truncate d2/fa 1347646 0 
2026-03-10T08:55:05.455 INFO:tasks.workunit.client.1.vm08.stdout:2/213: mkdir d1/d43/d4f 0 2026-03-10T08:55:05.457 INFO:tasks.workunit.client.1.vm08.stdout:9/181: dread d2/f35 [0,4194304] 0 2026-03-10T08:55:05.462 INFO:tasks.workunit.client.1.vm08.stdout:6/168: mknod d9/dc/d11/c3f 0 2026-03-10T08:55:05.463 INFO:tasks.workunit.client.1.vm08.stdout:6/169: fsync d9/dc/f1b 0 2026-03-10T08:55:05.463 INFO:tasks.workunit.client.1.vm08.stdout:2/214: creat d1/da/f50 x:0 0 0 2026-03-10T08:55:05.471 INFO:tasks.workunit.client.1.vm08.stdout:8/195: dread f0 [0,4194304] 0 2026-03-10T08:55:05.479 INFO:tasks.workunit.client.1.vm08.stdout:9/182: symlink d2/l36 0 2026-03-10T08:55:05.520 INFO:tasks.workunit.client.1.vm08.stdout:8/196: mknod d1/d10/d9/dd/d25/c3f 0 2026-03-10T08:55:05.523 INFO:tasks.workunit.client.1.vm08.stdout:8/197: dread d1/f26 [0,4194304] 0 2026-03-10T08:55:05.529 INFO:tasks.workunit.client.1.vm08.stdout:9/183: mkdir d2/dd/d15/d1e/d37 0 2026-03-10T08:55:05.529 INFO:tasks.workunit.client.1.vm08.stdout:2/215: dread d1/da/fe [0,4194304] 0 2026-03-10T08:55:05.531 INFO:tasks.workunit.client.1.vm08.stdout:2/216: write d1/da/d10/d1b/d12/d23/f44 [8790,55365] 0 2026-03-10T08:55:05.534 INFO:tasks.workunit.client.1.vm08.stdout:9/184: dread d2/dd/d15/d1e/d24/f30 [0,4194304] 0 2026-03-10T08:55:05.536 INFO:tasks.workunit.client.1.vm08.stdout:9/185: dread d2/f13 [0,4194304] 0 2026-03-10T08:55:05.536 INFO:tasks.workunit.client.1.vm08.stdout:9/186: stat d2/f35 0 2026-03-10T08:55:05.545 INFO:tasks.workunit.client.1.vm08.stdout:8/198: mkdir d1/d10/d9/dd/d13/d40 0 2026-03-10T08:55:05.569 INFO:tasks.workunit.client.1.vm08.stdout:9/187: symlink d2/dd/d15/d1e/d25/l38 0 2026-03-10T08:55:05.574 INFO:tasks.workunit.client.1.vm08.stdout:8/199: rename d1/d2c/f33 to d1/d10/d9/dd/f41 0 2026-03-10T08:55:05.575 INFO:tasks.workunit.client.1.vm08.stdout:2/217: symlink d1/da/d10/d42/l51 0 2026-03-10T08:55:05.586 INFO:tasks.workunit.client.1.vm08.stdout:9/188: mkdir d2/dd/d15/d1e/d39 0 
2026-03-10T08:55:05.591 INFO:tasks.workunit.client.1.vm08.stdout:8/200: rename d1/d10/d9/c1c to d1/d10/d9/dd/d25/c42 0 2026-03-10T08:55:05.603 INFO:tasks.workunit.client.1.vm08.stdout:2/218: unlink d1/da/d10/d1b/d1c/f3f 0 2026-03-10T08:55:05.603 INFO:tasks.workunit.client.1.vm08.stdout:2/219: dread - d1/d43/f4b zero size 2026-03-10T08:55:05.610 INFO:tasks.workunit.client.1.vm08.stdout:9/189: dread d2/f4 [0,4194304] 0 2026-03-10T08:55:05.610 INFO:tasks.workunit.client.1.vm08.stdout:9/190: chown d2/dd 110901 1 2026-03-10T08:55:05.611 INFO:tasks.workunit.client.1.vm08.stdout:8/201: creat d1/d2c/f43 x:0 0 0 2026-03-10T08:55:05.617 INFO:tasks.workunit.client.1.vm08.stdout:2/220: dread d1/da/d10/d1b/f14 [0,4194304] 0 2026-03-10T08:55:05.622 INFO:tasks.workunit.client.1.vm08.stdout:8/202: rename d1/d2c/d2e to d1/d10/d9/dd/d25/d27/d44 0 2026-03-10T08:55:05.625 INFO:tasks.workunit.client.1.vm08.stdout:9/191: unlink d2/dd/c2a 0 2026-03-10T08:55:05.630 INFO:tasks.workunit.client.1.vm08.stdout:5/213: truncate d0/d11/f29 772389 0 2026-03-10T08:55:05.637 INFO:tasks.workunit.client.1.vm08.stdout:2/221: dwrite d1/f4 [0,4194304] 0 2026-03-10T08:55:05.638 INFO:tasks.workunit.client.1.vm08.stdout:9/192: creat d2/dd/d15/d1e/d21/f3a x:0 0 0 2026-03-10T08:55:05.639 INFO:tasks.workunit.client.1.vm08.stdout:8/203: dwrite d1/d10/d9/dd/f41 [0,4194304] 0 2026-03-10T08:55:05.640 INFO:tasks.workunit.client.1.vm08.stdout:8/204: write d1/d10/d9/dd/f41 [3933449,40313] 0 2026-03-10T08:55:05.642 INFO:tasks.workunit.client.1.vm08.stdout:2/222: write d1/f48 [1507742,119369] 0 2026-03-10T08:55:05.650 INFO:tasks.workunit.client.1.vm08.stdout:2/223: dwrite d1/da/d10/d1b/d12/d22/f45 [0,4194304] 0 2026-03-10T08:55:05.657 INFO:tasks.workunit.client.1.vm08.stdout:9/193: rename d2/dd/d15/l23 to d2/dd/d15/d1e/l3b 0 2026-03-10T08:55:05.663 INFO:tasks.workunit.client.1.vm08.stdout:8/205: mknod d1/d2c/c45 0 2026-03-10T08:55:05.671 INFO:tasks.workunit.client.1.vm08.stdout:9/194: mknod d2/dd/d15/d1e/d21/c3c 0 
2026-03-10T08:55:05.672 INFO:tasks.workunit.client.1.vm08.stdout:8/206: creat d1/d10/d9/dd/d13/f46 x:0 0 0 2026-03-10T08:55:05.673 INFO:tasks.workunit.client.1.vm08.stdout:8/207: readlink d1/d10/d9/dd/d13/l1f 0 2026-03-10T08:55:05.677 INFO:tasks.workunit.client.1.vm08.stdout:2/224: mkdir d1/d43/d4f/d52 0 2026-03-10T08:55:05.677 INFO:tasks.workunit.client.1.vm08.stdout:2/225: write d1/da/d10/d1b/d12/d1e/f49 [61636,127031] 0 2026-03-10T08:55:05.680 INFO:tasks.workunit.client.1.vm08.stdout:9/195: mknod d2/dd/d15/d1e/d25/c3d 0 2026-03-10T08:55:05.683 INFO:tasks.workunit.client.1.vm08.stdout:8/208: creat d1/d2c/f47 x:0 0 0 2026-03-10T08:55:05.684 INFO:tasks.workunit.client.1.vm08.stdout:2/226: rmdir d1/d43/d4f 39 2026-03-10T08:55:05.691 INFO:tasks.workunit.client.1.vm08.stdout:9/196: creat d2/dd/d15/d1e/d37/f3e x:0 0 0 2026-03-10T08:55:05.693 INFO:tasks.workunit.client.1.vm08.stdout:8/209: getdents d1/d10/d9/dd/d13/d40 0 2026-03-10T08:55:05.694 INFO:tasks.workunit.client.1.vm08.stdout:2/227: mknod d1/da/d10/d1b/c53 0 2026-03-10T08:55:05.695 INFO:tasks.workunit.client.1.vm08.stdout:8/210: chown d1/d10/d9/dd/d25/c3f 17 1 2026-03-10T08:55:05.696 INFO:tasks.workunit.client.1.vm08.stdout:9/197: dwrite d2/dd/d15/d1e/d21/f3a [0,4194304] 0 2026-03-10T08:55:05.711 INFO:tasks.workunit.client.1.vm08.stdout:8/211: readlink d1/l39 0 2026-03-10T08:55:05.714 INFO:tasks.workunit.client.1.vm08.stdout:2/228: rename d1/da/lc to d1/da/d10/d2d/l54 0 2026-03-10T08:55:05.715 INFO:tasks.workunit.client.1.vm08.stdout:8/212: symlink d1/d10/d9/dd/d25/d27/d44/l48 0 2026-03-10T08:55:05.716 INFO:tasks.workunit.client.1.vm08.stdout:8/213: write d1/d2c/f30 [1122950,116743] 0 2026-03-10T08:55:05.718 INFO:tasks.workunit.client.1.vm08.stdout:9/198: rename d2/f6 to d2/dd/d15/d1e/d24/f3f 0 2026-03-10T08:55:05.719 INFO:tasks.workunit.client.1.vm08.stdout:9/199: chown d2/fb 249032 1 2026-03-10T08:55:05.720 INFO:tasks.workunit.client.1.vm08.stdout:9/200: stat d2/dd/d15/d1e/d24/f2b 0 2026-03-10T08:55:05.720 
INFO:tasks.workunit.client.1.vm08.stdout:8/214: read d1/d10/f2d [1237529,66377] 0 2026-03-10T08:55:05.727 INFO:tasks.workunit.client.1.vm08.stdout:8/215: symlink d1/d2c/l49 0 2026-03-10T08:55:05.730 INFO:tasks.workunit.client.1.vm08.stdout:9/201: symlink d2/dd/d15/d1e/d37/l40 0 2026-03-10T08:55:05.730 INFO:tasks.workunit.client.1.vm08.stdout:9/202: write d2/dd/f2e [511237,57664] 0 2026-03-10T08:55:05.731 INFO:tasks.workunit.client.1.vm08.stdout:9/203: dread d2/f4 [0,4194304] 0 2026-03-10T08:55:05.732 INFO:tasks.workunit.client.1.vm08.stdout:9/204: dread f1 [0,4194304] 0 2026-03-10T08:55:05.732 INFO:tasks.workunit.client.1.vm08.stdout:9/205: readlink d2/dd/d15/d1e/d24/l27 0 2026-03-10T08:55:05.739 INFO:tasks.workunit.client.1.vm08.stdout:8/216: creat d1/d10/d9/dd/d25/d27/d44/d21/f4a x:0 0 0 2026-03-10T08:55:05.739 INFO:tasks.workunit.client.1.vm08.stdout:8/217: read - d1/d10/d9/dd/d13/f46 zero size 2026-03-10T08:55:05.740 INFO:tasks.workunit.client.1.vm08.stdout:8/218: truncate d1/d2c/f43 696983 0 2026-03-10T08:55:05.741 INFO:tasks.workunit.client.1.vm08.stdout:8/219: chown d1/d10/d9/dd/d25/d27 51913358 1 2026-03-10T08:55:05.744 INFO:tasks.workunit.client.1.vm08.stdout:9/206: dwrite d2/dd/d15/d1e/d24/f3f [0,4194304] 0 2026-03-10T08:55:05.745 INFO:tasks.workunit.client.1.vm08.stdout:9/207: dread - d2/dd/d15/d1e/d24/f2b zero size 2026-03-10T08:55:05.751 INFO:tasks.workunit.client.1.vm08.stdout:8/220: symlink d1/d10/d9/dd/d25/d27/d44/l4b 0 2026-03-10T08:55:05.757 INFO:tasks.workunit.client.1.vm08.stdout:8/221: mknod d1/d10/d9/dd/d18/c4c 0 2026-03-10T08:55:05.765 INFO:tasks.workunit.client.1.vm08.stdout:9/208: mkdir d2/d41 0 2026-03-10T08:55:05.766 INFO:tasks.workunit.client.1.vm08.stdout:8/222: mkdir d1/d10/d9/d4d 0 2026-03-10T08:55:05.767 INFO:tasks.workunit.client.1.vm08.stdout:8/223: write d1/d10/d9/dd/f41 [1844920,7060] 0 2026-03-10T08:55:05.771 INFO:tasks.workunit.client.1.vm08.stdout:9/209: truncate d2/dd/d15/f17 1265764 0 2026-03-10T08:55:05.771 
INFO:tasks.workunit.client.1.vm08.stdout:9/210: stat d2/dd/d15/d1e/d21 0 2026-03-10T08:55:05.774 INFO:tasks.workunit.client.1.vm08.stdout:9/211: dwrite d2/dd/d15/d1e/d37/f3e [0,4194304] 0 2026-03-10T08:55:05.774 INFO:tasks.workunit.client.1.vm08.stdout:8/224: link d1/d10/d9/dd/d25/d27/d44/d21/f4a d1/d10/d9/dd/d18/d3c/f4e 0 2026-03-10T08:55:05.774 INFO:tasks.workunit.client.1.vm08.stdout:9/212: chown d2/dd/d15/d1e/d25 15 1 2026-03-10T08:55:05.778 INFO:tasks.workunit.client.1.vm08.stdout:8/225: write d1/d10/d9/dd/d13/f46 [506260,124279] 0 2026-03-10T08:55:05.789 INFO:tasks.workunit.client.1.vm08.stdout:9/213: chown d2/dd/d15/f17 47 1 2026-03-10T08:55:05.790 INFO:tasks.workunit.client.1.vm08.stdout:9/214: creat d2/dd/d15/d1e/f42 x:0 0 0 2026-03-10T08:55:05.792 INFO:tasks.workunit.client.1.vm08.stdout:9/215: fdatasync d2/dd/d15/d1e/d24/f30 0 2026-03-10T08:55:05.795 INFO:tasks.workunit.client.1.vm08.stdout:9/216: symlink d2/dd/l43 0 2026-03-10T08:55:05.795 INFO:tasks.workunit.client.1.vm08.stdout:9/217: chown d2/fa 0 1 2026-03-10T08:55:05.799 INFO:tasks.workunit.client.1.vm08.stdout:9/218: creat d2/dd/d15/f44 x:0 0 0 2026-03-10T08:55:05.805 INFO:tasks.workunit.client.1.vm08.stdout:8/226: dread d1/d10/d9/dd/d13/f46 [0,4194304] 0 2026-03-10T08:55:05.807 INFO:tasks.workunit.client.1.vm08.stdout:8/227: mkdir d1/d4f 0 2026-03-10T08:55:05.810 INFO:tasks.workunit.client.1.vm08.stdout:8/228: dwrite d1/d10/f23 [0,4194304] 0 2026-03-10T08:55:05.810 INFO:tasks.workunit.client.1.vm08.stdout:8/229: chown d1/d10/d9/dd/d18/d3c 28607522 1 2026-03-10T08:55:05.832 INFO:tasks.workunit.client.1.vm08.stdout:8/230: link d1/d10/d9/dd/d25/c3f d1/d10/d9/dd/d25/d27/d44/d21/c50 0 2026-03-10T08:55:05.834 INFO:tasks.workunit.client.1.vm08.stdout:8/231: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51 0 2026-03-10T08:55:05.835 INFO:tasks.workunit.client.1.vm08.stdout:8/232: dread - d1/d10/d9/dd/d25/d27/d44/d21/f4a zero size 2026-03-10T08:55:05.839 INFO:tasks.workunit.client.1.vm08.stdout:8/233: creat 
d1/d10/d9/dd/d25/d27/f52 x:0 0 0 2026-03-10T08:55:05.842 INFO:tasks.workunit.client.1.vm08.stdout:8/234: truncate d1/d10/d9/dd/d13/f24 930166 0 2026-03-10T08:55:05.846 INFO:tasks.workunit.client.1.vm08.stdout:8/235: mknod d1/d10/d9/dd/d25/d27/d44/d21/d51/c53 0 2026-03-10T08:55:05.848 INFO:tasks.workunit.client.1.vm08.stdout:8/236: symlink d1/d10/d9/dd/l54 0 2026-03-10T08:55:05.854 INFO:tasks.workunit.client.1.vm08.stdout:8/237: mknod d1/d10/d9/dd/d3d/c55 0 2026-03-10T08:55:05.855 INFO:tasks.workunit.client.1.vm08.stdout:8/238: chown d1/d10/d9/dd 793465082 1 2026-03-10T08:55:05.857 INFO:tasks.workunit.client.1.vm08.stdout:8/239: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/f56 x:0 0 0 2026-03-10T08:55:05.858 INFO:tasks.workunit.client.1.vm08.stdout:8/240: fsync d1/d10/f2a 0 2026-03-10T08:55:05.989 INFO:tasks.workunit.client.1.vm08.stdout:1/149: write d1/da/de/f27 [610983,85768] 0 2026-03-10T08:55:05.990 INFO:tasks.workunit.client.1.vm08.stdout:3/191: dwrite f1 [0,4194304] 0 2026-03-10T08:55:06.000 INFO:tasks.workunit.client.1.vm08.stdout:3/192: unlink d4/d15/d8/d2c/l30 0 2026-03-10T08:55:06.001 INFO:tasks.workunit.client.1.vm08.stdout:1/150: rename d1/da/de/d24/d26/c28 to d1/da/de/d24/c2c 0 2026-03-10T08:55:06.001 INFO:tasks.workunit.client.1.vm08.stdout:1/151: readlink d1/da/l2b 0 2026-03-10T08:55:06.002 INFO:tasks.workunit.client.1.vm08.stdout:0/159: chown d6/dd/d13/d17/d1f/d20/d2f/l33 794391283 1 2026-03-10T08:55:06.002 INFO:tasks.workunit.client.1.vm08.stdout:1/152: stat d1/da/de/f19 0 2026-03-10T08:55:06.004 INFO:tasks.workunit.client.1.vm08.stdout:3/193: dwrite d4/d15/d8/d2c/f3d [0,4194304] 0 2026-03-10T08:55:06.008 INFO:tasks.workunit.client.1.vm08.stdout:3/194: rename d4/f10 to d4/d15/f3f 0 2026-03-10T08:55:06.020 INFO:tasks.workunit.client.1.vm08.stdout:3/195: mknod d4/d15/d17/d20/c40 0 2026-03-10T08:55:06.025 INFO:tasks.workunit.client.1.vm08.stdout:0/160: dread d6/f11 [0,4194304] 0 2026-03-10T08:55:06.025 INFO:tasks.workunit.client.1.vm08.stdout:3/196: dwrite 
d4/d15/f12 [0,4194304] 0 2026-03-10T08:55:06.027 INFO:tasks.workunit.client.1.vm08.stdout:0/161: dwrite d6/dd/d13/d17/d1f/d20/d2f/f1b [0,4194304] 0 2026-03-10T08:55:06.054 INFO:tasks.workunit.client.1.vm08.stdout:3/197: creat d4/d15/d8/f41 x:0 0 0 2026-03-10T08:55:06.082 INFO:tasks.workunit.client.1.vm08.stdout:7/167: truncate d0/d14/f12 6464305 0 2026-03-10T08:55:06.085 INFO:tasks.workunit.client.1.vm08.stdout:7/168: symlink d0/d11/d1f/l34 0 2026-03-10T08:55:06.087 INFO:tasks.workunit.client.1.vm08.stdout:7/169: rmdir d0/d1c 39 2026-03-10T08:55:06.090 INFO:tasks.workunit.client.1.vm08.stdout:0/162: sync 2026-03-10T08:55:06.094 INFO:tasks.workunit.client.1.vm08.stdout:4/183: write d5/de/f38 [1137604,115107] 0 2026-03-10T08:55:06.094 INFO:tasks.workunit.client.1.vm08.stdout:4/184: readlink d5/lf 0 2026-03-10T08:55:06.097 INFO:tasks.workunit.client.1.vm08.stdout:0/163: dwrite d6/f16 [0,4194304] 0 2026-03-10T08:55:06.097 INFO:tasks.workunit.client.1.vm08.stdout:7/170: mknod d0/c35 0 2026-03-10T08:55:06.098 INFO:tasks.workunit.client.1.vm08.stdout:7/171: dread - d0/d11/d1f/d2c/f30 zero size 2026-03-10T08:55:06.099 INFO:tasks.workunit.client.1.vm08.stdout:0/164: write d6/f9 [2817272,26720] 0 2026-03-10T08:55:06.103 INFO:tasks.workunit.client.1.vm08.stdout:4/185: creat d5/de/f40 x:0 0 0 2026-03-10T08:55:06.103 INFO:tasks.workunit.client.1.vm08.stdout:4/186: readlink d5/l17 0 2026-03-10T08:55:06.105 INFO:tasks.workunit.client.1.vm08.stdout:4/187: truncate d5/d23/f29 936694 0 2026-03-10T08:55:06.126 INFO:tasks.workunit.client.1.vm08.stdout:0/165: mknod d6/dd/d13/d32/c36 0 2026-03-10T08:55:06.133 INFO:tasks.workunit.client.1.vm08.stdout:4/188: rename d5/de/f3b to d5/de/f41 0 2026-03-10T08:55:06.133 INFO:tasks.workunit.client.1.vm08.stdout:4/189: chown d5/d23/d36 165597 1 2026-03-10T08:55:06.140 INFO:tasks.workunit.client.1.vm08.stdout:7/172: mkdir d0/d11/d1f/d29/d36 0 2026-03-10T08:55:06.142 INFO:tasks.workunit.client.1.vm08.stdout:2/229: fsync d1/da/f50 0 
2026-03-10T08:55:06.143 INFO:tasks.workunit.client.1.vm08.stdout:7/173: dwrite d0/d11/d1f/d2c/f30 [0,4194304] 0 2026-03-10T08:55:06.149 INFO:tasks.workunit.client.1.vm08.stdout:4/190: write d5/f19 [1175393,19773] 0 2026-03-10T08:55:06.158 INFO:tasks.workunit.client.1.vm08.stdout:2/230: rename d1/da/fe to d1/da/d10/d1b/d12/f55 0 2026-03-10T08:55:06.158 INFO:tasks.workunit.client.1.vm08.stdout:2/231: stat d1/l5 0 2026-03-10T08:55:06.159 INFO:tasks.workunit.client.1.vm08.stdout:6/170: dwrite d9/d10/d1e/f2a [0,4194304] 0 2026-03-10T08:55:06.159 INFO:tasks.workunit.client.1.vm08.stdout:6/171: read f5 [1103606,124597] 0 2026-03-10T08:55:06.160 INFO:tasks.workunit.client.1.vm08.stdout:6/172: truncate d9/dc/d11/f29 684415 0 2026-03-10T08:55:06.160 INFO:tasks.workunit.client.1.vm08.stdout:6/173: write d9/d10/f30 [723310,128222] 0 2026-03-10T08:55:06.164 INFO:tasks.workunit.client.1.vm08.stdout:0/166: creat d6/dd/d13/d17/d1f/d20/d2f/d24/f37 x:0 0 0 2026-03-10T08:55:06.179 INFO:tasks.workunit.client.1.vm08.stdout:7/174: mknod d0/d1c/c37 0 2026-03-10T08:55:06.187 INFO:tasks.workunit.client.1.vm08.stdout:7/175: creat d0/d14/da/f38 x:0 0 0 2026-03-10T08:55:06.190 INFO:tasks.workunit.client.1.vm08.stdout:4/191: creat d5/f42 x:0 0 0 2026-03-10T08:55:06.191 INFO:tasks.workunit.client.1.vm08.stdout:2/232: rename d1/da/f4e to d1/d43/f56 0 2026-03-10T08:55:06.191 INFO:tasks.workunit.client.1.vm08.stdout:4/192: dread - d5/d23/f2e zero size 2026-03-10T08:55:06.192 INFO:tasks.workunit.client.1.vm08.stdout:7/176: dwrite d0/d11/d1f/d2c/f33 [0,4194304] 0 2026-03-10T08:55:06.200 INFO:tasks.workunit.client.1.vm08.stdout:6/174: creat d9/dc/d11/d23/f40 x:0 0 0 2026-03-10T08:55:06.208 INFO:tasks.workunit.client.1.vm08.stdout:2/233: rmdir d1 39 2026-03-10T08:55:06.211 INFO:tasks.workunit.client.1.vm08.stdout:7/177: unlink d0/d1c/c37 0 2026-03-10T08:55:06.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:05 vm05.local ceph-mon[49713]: pgmap v142: 65 pgs: 65 active+clean; 405 MiB data, 2.2 
GiB used, 118 GiB / 120 GiB avail; 1.3 MiB/s rd, 34 MiB/s wr, 366 op/s 2026-03-10T08:55:06.213 INFO:tasks.workunit.client.1.vm08.stdout:6/175: rename d9/d13/d1a to d9/dc/d11/d23/d2c/d41 0 2026-03-10T08:55:06.215 INFO:tasks.workunit.client.1.vm08.stdout:7/178: dwrite d0/d11/d1f/d2c/f33 [0,4194304] 0 2026-03-10T08:55:06.225 INFO:tasks.workunit.client.1.vm08.stdout:2/234: truncate d1/da/d10/f39 4427339 0 2026-03-10T08:55:06.225 INFO:tasks.workunit.client.1.vm08.stdout:2/235: dread - d1/da/d10/d1b/d1c/f3d zero size 2026-03-10T08:55:06.226 INFO:tasks.workunit.client.1.vm08.stdout:4/193: symlink d5/d23/d36/l43 0 2026-03-10T08:55:06.226 INFO:tasks.workunit.client.1.vm08.stdout:4/194: chown d5/f3d 27567133 1 2026-03-10T08:55:06.228 INFO:tasks.workunit.client.1.vm08.stdout:4/195: read d5/de/f41 [24032,90142] 0 2026-03-10T08:55:06.232 INFO:tasks.workunit.client.1.vm08.stdout:7/179: creat d0/d11/f39 x:0 0 0 2026-03-10T08:55:06.234 INFO:tasks.workunit.client.1.vm08.stdout:2/236: creat d1/da/d10/d1b/d12/d23/f57 x:0 0 0 2026-03-10T08:55:06.235 INFO:tasks.workunit.client.1.vm08.stdout:4/196: rename d5/f1a to d5/d23/d36/f44 0 2026-03-10T08:55:06.236 INFO:tasks.workunit.client.1.vm08.stdout:7/180: dwrite d0/d11/d1f/d2c/f30 [0,4194304] 0 2026-03-10T08:55:06.252 INFO:tasks.workunit.client.1.vm08.stdout:2/237: unlink d1/f9 0 2026-03-10T08:55:06.253 INFO:tasks.workunit.client.1.vm08.stdout:2/238: write d1/da/f50 [73342,35493] 0 2026-03-10T08:55:06.255 INFO:tasks.workunit.client.1.vm08.stdout:4/197: dwrite d5/f14 [0,4194304] 0 2026-03-10T08:55:06.282 INFO:tasks.workunit.client.1.vm08.stdout:2/239: rmdir d1/da/d10/d42 39 2026-03-10T08:55:06.282 INFO:tasks.workunit.client.1.vm08.stdout:6/176: getdents d9/dc/d11/d23/d2c/d41 0 2026-03-10T08:55:06.282 INFO:tasks.workunit.client.1.vm08.stdout:6/177: chown d9/dc/d11/f31 23 1 2026-03-10T08:55:06.282 INFO:tasks.workunit.client.1.vm08.stdout:2/240: dwrite d1/f4 [0,4194304] 0 2026-03-10T08:55:06.282 INFO:tasks.workunit.client.1.vm08.stdout:2/241: 
dwrite d1/da/d10/d1b/d1c/f20 [0,4194304] 0 2026-03-10T08:55:06.283 INFO:tasks.workunit.client.1.vm08.stdout:7/181: link d0/d14/da/c2b d0/d14/da/c3a 0 2026-03-10T08:55:06.301 INFO:tasks.workunit.client.1.vm08.stdout:7/182: mkdir d0/d11/d1f/d29/d3b 0 2026-03-10T08:55:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:05 vm08.local ceph-mon[57559]: pgmap v142: 65 pgs: 65 active+clean; 405 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 1.3 MiB/s rd, 34 MiB/s wr, 366 op/s 2026-03-10T08:55:06.305 INFO:tasks.workunit.client.1.vm08.stdout:4/198: dread d5/f1d [0,4194304] 0 2026-03-10T08:55:06.308 INFO:tasks.workunit.client.1.vm08.stdout:7/183: symlink d0/d11/d1f/d2c/l3c 0 2026-03-10T08:55:06.308 INFO:tasks.workunit.client.1.vm08.stdout:7/184: chown d0/d1c/c23 2345 1 2026-03-10T08:55:06.310 INFO:tasks.workunit.client.1.vm08.stdout:6/178: creat d9/d10/d1e/d32/f42 x:0 0 0 2026-03-10T08:55:06.311 INFO:tasks.workunit.client.1.vm08.stdout:4/199: rename d5/d23/f29 to d5/d2f/f45 0 2026-03-10T08:55:06.313 INFO:tasks.workunit.client.1.vm08.stdout:9/219: truncate d2/dd/d15/d1e/d24/f30 771492 0 2026-03-10T08:55:06.313 INFO:tasks.workunit.client.1.vm08.stdout:9/220: stat d2/c20 0 2026-03-10T08:55:06.313 INFO:tasks.workunit.client.1.vm08.stdout:7/185: mkdir d0/d11/d1f/d29/d3d 0 2026-03-10T08:55:06.315 INFO:tasks.workunit.client.1.vm08.stdout:9/221: truncate d2/dd/d15/d1e/d24/f33 746469 0 2026-03-10T08:55:06.315 INFO:tasks.workunit.client.1.vm08.stdout:9/222: chown d2/dd/d15/d1e/d24/f33 0 1 2026-03-10T08:55:06.316 INFO:tasks.workunit.client.1.vm08.stdout:6/179: fsync d9/dc/d11/d23/d2c/f3d 0 2026-03-10T08:55:06.316 INFO:tasks.workunit.client.1.vm08.stdout:9/223: chown d2/dd/d15/d1e/d21/c3c 1 1 2026-03-10T08:55:06.318 INFO:tasks.workunit.client.1.vm08.stdout:4/200: symlink d5/d23/l46 0 2026-03-10T08:55:06.318 INFO:tasks.workunit.client.1.vm08.stdout:4/201: dread - d5/d2f/f3a zero size 2026-03-10T08:55:06.319 INFO:tasks.workunit.client.1.vm08.stdout:4/202: dread - d5/d23/f2e 
zero size 2026-03-10T08:55:06.320 INFO:tasks.workunit.client.1.vm08.stdout:7/186: mknod d0/d1c/c3e 0 2026-03-10T08:55:06.323 INFO:tasks.workunit.client.1.vm08.stdout:6/180: stat d9/d10/l20 0 2026-03-10T08:55:06.324 INFO:tasks.workunit.client.1.vm08.stdout:6/181: write d9/d13/f35 [925655,102049] 0 2026-03-10T08:55:06.329 INFO:tasks.workunit.client.1.vm08.stdout:2/242: dread d1/da/d10/f18 [0,4194304] 0 2026-03-10T08:55:06.330 INFO:tasks.workunit.client.1.vm08.stdout:6/182: creat d9/dc/d11/d23/d2c/f43 x:0 0 0 2026-03-10T08:55:06.331 INFO:tasks.workunit.client.1.vm08.stdout:9/224: creat d2/dd/d15/d1e/d25/d32/f45 x:0 0 0 2026-03-10T08:55:06.334 INFO:tasks.workunit.client.1.vm08.stdout:7/187: stat d0/d14/da/c2b 0 2026-03-10T08:55:06.335 INFO:tasks.workunit.client.1.vm08.stdout:6/183: mkdir d9/dc/d11/d23/d2c/d44 0 2026-03-10T08:55:06.336 INFO:tasks.workunit.client.1.vm08.stdout:6/184: chown d9/dc/d11/d23/d2c/d44 267 1 2026-03-10T08:55:06.337 INFO:tasks.workunit.client.1.vm08.stdout:6/185: chown d9/dc/d11 122929 1 2026-03-10T08:55:06.339 INFO:tasks.workunit.client.1.vm08.stdout:6/186: symlink d9/dc/d11/d23/d2c/l45 0 2026-03-10T08:55:06.342 INFO:tasks.workunit.client.1.vm08.stdout:6/187: dwrite d9/d13/f36 [0,4194304] 0 2026-03-10T08:55:06.355 INFO:tasks.workunit.client.1.vm08.stdout:6/188: symlink d9/d13/l46 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:6/189: creat d9/dc/d11/f47 x:0 0 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:6/190: dwrite d9/d10/f30 [0,4194304] 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:6/191: creat d9/d10/d1e/d32/f48 x:0 0 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:6/192: link d9/dc/f1b d9/dc/d11/d23/d2c/f49 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:6/193: dread d9/dc/f1b [0,4194304] 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:6/194: dwrite d9/d13/f2f [0,4194304] 0 2026-03-10T08:55:06.389 
INFO:tasks.workunit.client.1.vm08.stdout:6/195: write d9/d10/f30 [3195676,90741] 0 2026-03-10T08:55:06.389 INFO:tasks.workunit.client.1.vm08.stdout:8/241: rmdir d1/d10 39 2026-03-10T08:55:06.468 INFO:tasks.workunit.client.1.vm08.stdout:2/243: sync 2026-03-10T08:55:06.469 INFO:tasks.workunit.client.1.vm08.stdout:2/244: write d1/da/d10/d1b/d12/d23/f57 [503271,127265] 0 2026-03-10T08:55:06.475 INFO:tasks.workunit.client.1.vm08.stdout:7/188: sync 2026-03-10T08:55:06.475 INFO:tasks.workunit.client.1.vm08.stdout:8/242: sync 2026-03-10T08:55:06.479 INFO:tasks.workunit.client.1.vm08.stdout:2/245: creat d1/da/d10/d42/f58 x:0 0 0 2026-03-10T08:55:06.499 INFO:tasks.workunit.client.1.vm08.stdout:7/189: mknod d0/d11/d1f/d29/d3b/c3f 0 2026-03-10T08:55:06.500 INFO:tasks.workunit.client.1.vm08.stdout:1/153: truncate d1/da/d20/f21 930325 0 2026-03-10T08:55:06.512 INFO:tasks.workunit.client.1.vm08.stdout:7/190: fsync d0/d11/d1f/d2c/f30 0 2026-03-10T08:55:06.623 INFO:tasks.workunit.client.1.vm08.stdout:4/203: fsync d5/de/f41 0 2026-03-10T08:55:06.648 INFO:tasks.workunit.client.1.vm08.stdout:6/196: write d9/dc/d11/d23/d2c/f3d [1794408,65387] 0 2026-03-10T08:55:06.658 INFO:tasks.workunit.client.1.vm08.stdout:0/167: truncate d6/fb 791933 0 2026-03-10T08:55:06.660 INFO:tasks.workunit.client.1.vm08.stdout:6/197: sync 2026-03-10T08:55:06.662 INFO:tasks.workunit.client.1.vm08.stdout:0/168: mkdir d6/dd/d13/d17/d1f/d2d/d38 0 2026-03-10T08:55:06.663 INFO:tasks.workunit.client.1.vm08.stdout:6/198: dread d9/d10/d1e/f2a [0,4194304] 0 2026-03-10T08:55:06.678 INFO:tasks.workunit.client.1.vm08.stdout:5/214: write d0/d11/f29 [1217437,17326] 0 2026-03-10T08:55:06.679 INFO:tasks.workunit.client.1.vm08.stdout:5/215: write d0/d11/d18/f34 [890176,53285] 0 2026-03-10T08:55:06.682 INFO:tasks.workunit.client.1.vm08.stdout:0/169: truncate d6/fa 231356 0 2026-03-10T08:55:06.687 INFO:tasks.workunit.client.1.vm08.stdout:5/216: mknod d0/d11/d3e/d45/c4c 0 2026-03-10T08:55:06.690 
INFO:tasks.workunit.client.1.vm08.stdout:5/217: sync 2026-03-10T08:55:06.691 INFO:tasks.workunit.client.1.vm08.stdout:5/218: fdatasync d0/d11/d18/f34 0 2026-03-10T08:55:06.691 INFO:tasks.workunit.client.1.vm08.stdout:0/170: mkdir d6/dd/d13/d17/d1f/d2d/d39 0 2026-03-10T08:55:06.698 INFO:tasks.workunit.client.1.vm08.stdout:0/171: chown d6/f9 2093 1 2026-03-10T08:55:06.699 INFO:tasks.workunit.client.1.vm08.stdout:0/172: write d6/dd/d13/d17/f29 [710786,70930] 0 2026-03-10T08:55:06.700 INFO:tasks.workunit.client.1.vm08.stdout:0/173: dread d6/f2c [0,4194304] 0 2026-03-10T08:55:06.702 INFO:tasks.workunit.client.1.vm08.stdout:5/219: unlink d0/d11/c49 0 2026-03-10T08:55:06.703 INFO:tasks.workunit.client.1.vm08.stdout:5/220: truncate d0/d11/d27/f3b 268204 0 2026-03-10T08:55:06.704 INFO:tasks.workunit.client.1.vm08.stdout:5/221: sync 2026-03-10T08:55:06.705 INFO:tasks.workunit.client.1.vm08.stdout:0/174: mknod d6/dd/d13/d17/d1f/d20/d2f/d26/c3a 0 2026-03-10T08:55:06.708 INFO:tasks.workunit.client.1.vm08.stdout:0/175: dwrite d6/f18 [0,4194304] 0 2026-03-10T08:55:06.716 INFO:tasks.workunit.client.1.vm08.stdout:5/222: dread d0/fb [0,4194304] 0 2026-03-10T08:55:06.718 INFO:tasks.workunit.client.1.vm08.stdout:7/191: rename d0/d14/da to d0/d11/d1f/d29/d3d/d40 0 2026-03-10T08:55:06.720 INFO:tasks.workunit.client.1.vm08.stdout:5/223: creat d0/d11/d3e/f4d x:0 0 0 2026-03-10T08:55:06.725 INFO:tasks.workunit.client.1.vm08.stdout:0/176: creat d6/dd/d13/d17/d1f/d2d/d39/f3b x:0 0 0 2026-03-10T08:55:06.733 INFO:tasks.workunit.client.1.vm08.stdout:0/177: symlink d6/dd/d13/d17/d1f/d20/d2f/d26/l3c 0 2026-03-10T08:55:06.737 INFO:tasks.workunit.client.1.vm08.stdout:4/204: rename d5/l2a to d5/d23/l47 0 2026-03-10T08:55:06.739 INFO:tasks.workunit.client.1.vm08.stdout:5/224: mkdir d0/d40/d4b/d4e 0 2026-03-10T08:55:06.740 INFO:tasks.workunit.client.1.vm08.stdout:5/225: dread - d0/d11/d3e/f4d zero size 2026-03-10T08:55:06.740 INFO:tasks.workunit.client.1.vm08.stdout:0/178: creat d6/dd/d13/d32/f3d x:0 
0 0 2026-03-10T08:55:06.740 INFO:tasks.workunit.client.1.vm08.stdout:5/226: chown d0/d11/d3e/d45 78727 1 2026-03-10T08:55:06.741 INFO:tasks.workunit.client.1.vm08.stdout:5/227: fdatasync d0/d11/d3e/d45/f4a 0 2026-03-10T08:55:06.744 INFO:tasks.workunit.client.1.vm08.stdout:5/228: dread d0/ff [0,4194304] 0 2026-03-10T08:55:06.744 INFO:tasks.workunit.client.1.vm08.stdout:5/229: truncate d0/f36 246956 0 2026-03-10T08:55:06.745 INFO:tasks.workunit.client.1.vm08.stdout:9/225: rmdir d2/dd/d15/d1e/d25/d32 39 2026-03-10T08:55:06.745 INFO:tasks.workunit.client.1.vm08.stdout:7/192: creat d0/f41 x:0 0 0 2026-03-10T08:55:06.747 INFO:tasks.workunit.client.1.vm08.stdout:4/205: rename d5/d2f/l35 to d5/d23/l48 0 2026-03-10T08:55:06.750 INFO:tasks.workunit.client.1.vm08.stdout:5/230: rename d0/d11/d18/f1a to d0/d11/d18/f4f 0 2026-03-10T08:55:06.753 INFO:tasks.workunit.client.1.vm08.stdout:7/193: unlink d0/d11/d1f/d29/c2a 0 2026-03-10T08:55:06.758 INFO:tasks.workunit.client.1.vm08.stdout:0/179: rename d6/dd/d13/d17/d1f/d20/d2f/f1b to d6/dd/d13/d17/d1f/d20/f3e 0 2026-03-10T08:55:06.759 INFO:tasks.workunit.client.1.vm08.stdout:6/199: rmdir d9/dc 39 2026-03-10T08:55:06.761 INFO:tasks.workunit.client.1.vm08.stdout:4/206: dwrite d5/d2f/f45 [0,4194304] 0 2026-03-10T08:55:06.767 INFO:tasks.workunit.client.1.vm08.stdout:4/207: sync 2026-03-10T08:55:06.784 INFO:tasks.workunit.client.1.vm08.stdout:5/231: mkdir d0/d11/d27/d50 0 2026-03-10T08:55:06.784 INFO:tasks.workunit.client.1.vm08.stdout:9/226: symlink d2/dd/d15/d1e/d39/l46 0 2026-03-10T08:55:06.784 INFO:tasks.workunit.client.1.vm08.stdout:5/232: chown d0/d11/d18/c33 23630565 1 2026-03-10T08:55:06.786 INFO:tasks.workunit.client.1.vm08.stdout:7/194: fdatasync d0/fe 0 2026-03-10T08:55:06.789 INFO:tasks.workunit.client.1.vm08.stdout:5/233: dwrite d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:06.795 INFO:tasks.workunit.client.1.vm08.stdout:5/234: fsync d0/f43 0 2026-03-10T08:55:06.795 INFO:tasks.workunit.client.1.vm08.stdout:0/180: rmdir 
d6/dd/d13/d17/d1f/d20/d2f/d26 39 2026-03-10T08:55:06.795 INFO:tasks.workunit.client.1.vm08.stdout:6/200: rename d9/d10/f26 to d9/d13/f4a 0 2026-03-10T08:55:06.795 INFO:tasks.workunit.client.1.vm08.stdout:4/208: mkdir d5/d23/d49 0 2026-03-10T08:55:06.811 INFO:tasks.workunit.client.1.vm08.stdout:5/235: dwrite d0/d1b/f2f [0,4194304] 0 2026-03-10T08:55:06.813 INFO:tasks.workunit.client.1.vm08.stdout:2/246: rmdir d1/da/d10/d42 39 2026-03-10T08:55:06.823 INFO:tasks.workunit.client.1.vm08.stdout:8/243: dwrite d1/f26 [0,4194304] 0 2026-03-10T08:55:06.828 INFO:tasks.workunit.client.1.vm08.stdout:1/154: rmdir d1/da 39 2026-03-10T08:55:06.828 INFO:tasks.workunit.client.1.vm08.stdout:4/209: unlink d5/de/f40 0 2026-03-10T08:55:06.832 INFO:tasks.workunit.client.1.vm08.stdout:6/201: write d9/dc/d11/f47 [482981,25445] 0 2026-03-10T08:55:06.832 INFO:tasks.workunit.client.1.vm08.stdout:9/227: mknod d2/c47 0 2026-03-10T08:55:06.834 INFO:tasks.workunit.client.1.vm08.stdout:6/202: fsync d9/d13/f36 0 2026-03-10T08:55:06.834 INFO:tasks.workunit.client.1.vm08.stdout:6/203: chown d9/dc/d11/d23/f40 94761170 1 2026-03-10T08:55:06.846 INFO:tasks.workunit.client.1.vm08.stdout:8/244: write d1/d10/d9/dd/d13/f24 [593260,72753] 0 2026-03-10T08:55:06.848 INFO:tasks.workunit.client.1.vm08.stdout:5/236: dread d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:06.852 INFO:tasks.workunit.client.1.vm08.stdout:1/155: truncate d1/da/de/f1a 961628 0 2026-03-10T08:55:06.858 INFO:tasks.workunit.client.1.vm08.stdout:9/228: creat d2/dd/d15/d1e/f48 x:0 0 0 2026-03-10T08:55:06.864 INFO:tasks.workunit.client.1.vm08.stdout:9/229: dread d2/dd/d15/d1e/d24/f33 [0,4194304] 0 2026-03-10T08:55:06.869 INFO:tasks.workunit.client.1.vm08.stdout:8/245: creat d1/d10/d9/dd/d18/d34/f57 x:0 0 0 2026-03-10T08:55:06.869 INFO:tasks.workunit.client.1.vm08.stdout:8/246: chown d1/d10/d9/dd/d13 63476 1 2026-03-10T08:55:06.869 INFO:tasks.workunit.client.1.vm08.stdout:5/237: dread d0/d11/d27/f2a [4194304,4194304] 0 2026-03-10T08:55:06.872 
INFO:tasks.workunit.client.1.vm08.stdout:1/156: rename d1/f11 to d1/da/d20/f2d 0 2026-03-10T08:55:06.872 INFO:tasks.workunit.client.1.vm08.stdout:4/210: creat d5/d23/d49/f4a x:0 0 0 2026-03-10T08:55:06.875 INFO:tasks.workunit.client.1.vm08.stdout:4/211: write d5/d2f/f45 [2829067,94346] 0 2026-03-10T08:55:06.876 INFO:tasks.workunit.client.1.vm08.stdout:0/181: dread d6/f9 [0,4194304] 0 2026-03-10T08:55:06.877 INFO:tasks.workunit.client.1.vm08.stdout:0/182: dread d6/f16 [0,4194304] 0 2026-03-10T08:55:06.887 INFO:tasks.workunit.client.1.vm08.stdout:9/230: rename d2/dd/d15/d1e/d25 to d2/dd/d15/d1e/d25/d32/d49 22 2026-03-10T08:55:06.888 INFO:tasks.workunit.client.1.vm08.stdout:8/247: fdatasync d1/d10/d9/dd/d13/f46 0 2026-03-10T08:55:06.888 INFO:tasks.workunit.client.1.vm08.stdout:9/231: dread - d2/dd/d15/d1e/f42 zero size 2026-03-10T08:55:06.892 INFO:tasks.workunit.client.1.vm08.stdout:1/157: mknod d1/da/de/c2e 0 2026-03-10T08:55:06.898 INFO:tasks.workunit.client.1.vm08.stdout:4/212: unlink d5/de/c3e 0 2026-03-10T08:55:06.899 INFO:tasks.workunit.client.1.vm08.stdout:8/248: dread d1/d10/f2a [0,4194304] 0 2026-03-10T08:55:06.900 INFO:tasks.workunit.client.1.vm08.stdout:0/183: write d6/dd/f28 [3451924,65745] 0 2026-03-10T08:55:06.900 INFO:tasks.workunit.client.1.vm08.stdout:8/249: write d1/d10/d9/dd/d25/d27/d44/d21/f32 [518456,49559] 0 2026-03-10T08:55:06.901 INFO:tasks.workunit.client.1.vm08.stdout:0/184: read d6/f11 [1851565,31974] 0 2026-03-10T08:55:06.901 INFO:tasks.workunit.client.1.vm08.stdout:8/250: read - d1/d2c/f47 zero size 2026-03-10T08:55:06.904 INFO:tasks.workunit.client.1.vm08.stdout:5/238: creat d0/d46/f51 x:0 0 0 2026-03-10T08:55:06.906 INFO:tasks.workunit.client.1.vm08.stdout:1/158: mknod d1/da/de/d24/c2f 0 2026-03-10T08:55:06.909 INFO:tasks.workunit.client.1.vm08.stdout:4/213: symlink d5/d23/d49/l4b 0 2026-03-10T08:55:06.909 INFO:tasks.workunit.client.1.vm08.stdout:5/239: dwrite d0/d1b/f39 [4194304,4194304] 0 2026-03-10T08:55:06.912 
INFO:tasks.workunit.client.1.vm08.stdout:5/240: chown d0/d11/c17 4696 1 2026-03-10T08:55:06.912 INFO:tasks.workunit.client.1.vm08.stdout:5/241: chown d0/d11 0 1 2026-03-10T08:55:06.918 INFO:tasks.workunit.client.1.vm08.stdout:5/242: dwrite d0/d11/f37 [0,4194304] 0 2026-03-10T08:55:06.921 INFO:tasks.workunit.client.1.vm08.stdout:3/198: write d4/d15/d8/f37 [1032733,17523] 0 2026-03-10T08:55:06.922 INFO:tasks.workunit.client.1.vm08.stdout:7/195: truncate d0/d11/d1f/d29/d3d/d40/ff 2038012 0 2026-03-10T08:55:06.924 INFO:tasks.workunit.client.1.vm08.stdout:1/159: mkdir d1/da/d20/d30 0 2026-03-10T08:55:06.924 INFO:tasks.workunit.client.1.vm08.stdout:7/196: write d0/fe [494276,100885] 0 2026-03-10T08:55:06.926 INFO:tasks.workunit.client.1.vm08.stdout:2/247: truncate d1/da/d10/d2d/f4d 1782448 0 2026-03-10T08:55:06.930 INFO:tasks.workunit.client.1.vm08.stdout:4/214: unlink d5/f28 0 2026-03-10T08:55:06.930 INFO:tasks.workunit.client.1.vm08.stdout:7/197: dwrite d0/d11/d1f/d29/d3d/d40/f24 [4194304,4194304] 0 2026-03-10T08:55:06.940 INFO:tasks.workunit.client.1.vm08.stdout:4/215: dwrite d5/d2f/f3a [0,4194304] 0 2026-03-10T08:55:06.941 INFO:tasks.workunit.client.1.vm08.stdout:4/216: chown d5/d23/d49/f4a 121452467 1 2026-03-10T08:55:06.952 INFO:tasks.workunit.client.1.vm08.stdout:6/204: dwrite d9/d10/d1e/d32/f12 [0,4194304] 0 2026-03-10T08:55:06.964 INFO:tasks.workunit.client.1.vm08.stdout:9/232: rename d2/dd/d15/d1e/d39/l46 to d2/l4a 0 2026-03-10T08:55:06.964 INFO:tasks.workunit.client.1.vm08.stdout:9/233: readlink d2/dd/lf 0 2026-03-10T08:55:06.964 INFO:tasks.workunit.client.1.vm08.stdout:5/243: mkdir d0/d11/d18/d52 0 2026-03-10T08:55:06.964 INFO:tasks.workunit.client.1.vm08.stdout:5/244: chown d0/d11/d18/d52 170078 1 2026-03-10T08:55:06.964 INFO:tasks.workunit.client.1.vm08.stdout:3/199: creat d4/d15/d8/d2c/f42 x:0 0 0 2026-03-10T08:55:06.965 INFO:tasks.workunit.client.1.vm08.stdout:3/200: chown d4/c19 1621530752 1 2026-03-10T08:55:06.965 
INFO:tasks.workunit.client.1.vm08.stdout:3/201: write d4/d15/d8/d1d/f2d [426924,37599] 0 2026-03-10T08:55:06.965 INFO:tasks.workunit.client.1.vm08.stdout:6/205: dwrite d9/d10/d1e/d32/ff [0,4194304] 0 2026-03-10T08:55:06.965 INFO:tasks.workunit.client.1.vm08.stdout:3/202: chown d4/d15/f12 0 1 2026-03-10T08:55:06.966 INFO:tasks.workunit.client.1.vm08.stdout:6/206: chown d9/d10/f25 1400953 1 2026-03-10T08:55:06.967 INFO:tasks.workunit.client.1.vm08.stdout:3/203: write d4/d15/d8/d2c/f32 [825664,90822] 0 2026-03-10T08:55:06.973 INFO:tasks.workunit.client.1.vm08.stdout:4/217: rmdir d5/d23/d36 39 2026-03-10T08:55:06.975 INFO:tasks.workunit.client.1.vm08.stdout:8/251: rename d1/d10/d9/dd/l36 to d1/d10/d9/dd/d25/d27/d44/l58 0 2026-03-10T08:55:06.983 INFO:tasks.workunit.client.1.vm08.stdout:5/245: dwrite d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:06.983 INFO:tasks.workunit.client.1.vm08.stdout:2/248: mknod d1/da/d10/d2d/c59 0 2026-03-10T08:55:06.984 INFO:tasks.workunit.client.1.vm08.stdout:5/246: chown d0/d11/c38 1294 1 2026-03-10T08:55:06.986 INFO:tasks.workunit.client.1.vm08.stdout:7/198: link d0/d1c/c21 d0/d14/d2f/c42 0 2026-03-10T08:55:06.988 INFO:tasks.workunit.client.1.vm08.stdout:7/199: write d0/d11/d1f/d29/d3d/d40/f38 [972253,60080] 0 2026-03-10T08:55:06.988 INFO:tasks.workunit.client.1.vm08.stdout:7/200: fsync d0/f41 0 2026-03-10T08:55:06.994 INFO:tasks.workunit.client.1.vm08.stdout:3/204: dread d4/d15/d17/f3c [0,4194304] 0 2026-03-10T08:55:07.006 INFO:tasks.workunit.client.1.vm08.stdout:9/234: creat d2/dd/d15/d1e/d25/f4b x:0 0 0 2026-03-10T08:55:07.009 INFO:tasks.workunit.client.1.vm08.stdout:8/252: symlink d1/d10/d9/dd/d13/d40/l59 0 2026-03-10T08:55:07.009 INFO:tasks.workunit.client.1.vm08.stdout:1/160: getdents d1/da/de 0 2026-03-10T08:55:07.013 INFO:tasks.workunit.client.1.vm08.stdout:2/249: dwrite d1/da/d10/d42/f58 [0,4194304] 0 2026-03-10T08:55:07.019 INFO:tasks.workunit.client.1.vm08.stdout:1/161: dwrite d1/da/de/f1a [0,4194304] 0 2026-03-10T08:55:07.019 
INFO:tasks.workunit.client.1.vm08.stdout:4/218: symlink d5/d23/d36/l4c 0 2026-03-10T08:55:07.025 INFO:tasks.workunit.client.1.vm08.stdout:5/247: sync 2026-03-10T08:55:07.025 INFO:tasks.workunit.client.1.vm08.stdout:6/207: rename d9/dc/l3e to d9/dc/d11/d23/l4b 0 2026-03-10T08:55:07.025 INFO:tasks.workunit.client.1.vm08.stdout:6/208: chown d9/d10 266753 1 2026-03-10T08:55:07.028 INFO:tasks.workunit.client.1.vm08.stdout:2/250: dread d1/da/d10/d42/f58 [0,4194304] 0 2026-03-10T08:55:07.029 INFO:tasks.workunit.client.1.vm08.stdout:3/205: dread d4/d15/d8/ff [0,4194304] 0 2026-03-10T08:55:07.034 INFO:tasks.workunit.client.1.vm08.stdout:8/253: symlink d1/d10/d9/dd/l5a 0 2026-03-10T08:55:07.037 INFO:tasks.workunit.client.1.vm08.stdout:9/235: mkdir d2/d41/d4c 0 2026-03-10T08:55:07.038 INFO:tasks.workunit.client.1.vm08.stdout:4/219: dread d5/de/f38 [0,4194304] 0 2026-03-10T08:55:07.038 INFO:tasks.workunit.client.1.vm08.stdout:7/201: mkdir d0/d14/d43 0 2026-03-10T08:55:07.040 INFO:tasks.workunit.client.1.vm08.stdout:7/202: dread - d0/d14/f2d zero size 2026-03-10T08:55:07.043 INFO:tasks.workunit.client.1.vm08.stdout:5/248: rename d0/d40/c47 to d0/d11/d18/d52/c53 0 2026-03-10T08:55:07.044 INFO:tasks.workunit.client.1.vm08.stdout:4/220: truncate d5/fb 5117656 0 2026-03-10T08:55:07.045 INFO:tasks.workunit.client.1.vm08.stdout:3/206: dwrite f1 [0,4194304] 0 2026-03-10T08:55:07.046 INFO:tasks.workunit.client.1.vm08.stdout:8/254: dwrite d1/f26 [0,4194304] 0 2026-03-10T08:55:07.053 INFO:tasks.workunit.client.1.vm08.stdout:2/251: unlink d1/da/d10/f39 0 2026-03-10T08:55:07.061 INFO:tasks.workunit.client.1.vm08.stdout:8/255: dread d1/d10/d9/dd/d13/f46 [0,4194304] 0 2026-03-10T08:55:07.061 INFO:tasks.workunit.client.1.vm08.stdout:2/252: fdatasync d1/da/d10/d1b/f14 0 2026-03-10T08:55:07.061 INFO:tasks.workunit.client.1.vm08.stdout:6/209: mkdir d9/d10/d1e/d4c 0 2026-03-10T08:55:07.062 INFO:tasks.workunit.client.1.vm08.stdout:5/249: dwrite d0/d46/f51 [0,4194304] 0 2026-03-10T08:55:07.062 
INFO:tasks.workunit.client.1.vm08.stdout:1/162: symlink d1/l31 0 2026-03-10T08:55:07.062 INFO:tasks.workunit.client.1.vm08.stdout:5/250: dread - d0/d40/f42 zero size 2026-03-10T08:55:07.062 INFO:tasks.workunit.client.1.vm08.stdout:7/203: fsync d0/d1c/f32 0 2026-03-10T08:55:07.062 INFO:tasks.workunit.client.1.vm08.stdout:3/207: rename d4/c6 to d4/d15/d8/d2c/c43 0 2026-03-10T08:55:07.064 INFO:tasks.workunit.client.1.vm08.stdout:6/210: write d9/dc/d11/d23/d2c/f3d [2431059,31785] 0 2026-03-10T08:55:07.064 INFO:tasks.workunit.client.1.vm08.stdout:5/251: fdatasync d0/d11/d18/f23 0 2026-03-10T08:55:07.065 INFO:tasks.workunit.client.1.vm08.stdout:5/252: fdatasync d0/d11/f1e 0 2026-03-10T08:55:07.065 INFO:tasks.workunit.client.1.vm08.stdout:6/211: read - d9/dc/d11/d23/d2c/f43 zero size 2026-03-10T08:55:07.066 INFO:tasks.workunit.client.1.vm08.stdout:1/163: mkdir d1/da/de/d32 0 2026-03-10T08:55:07.067 INFO:tasks.workunit.client.1.vm08.stdout:8/256: rename d1/d2c/f43 to d1/d10/d9/f5b 0 2026-03-10T08:55:07.068 INFO:tasks.workunit.client.1.vm08.stdout:4/221: dread d5/fd [0,4194304] 0 2026-03-10T08:55:07.068 INFO:tasks.workunit.client.1.vm08.stdout:2/253: mknod d1/da/d10/d1b/c5a 0 2026-03-10T08:55:07.069 INFO:tasks.workunit.client.1.vm08.stdout:2/254: write d1/da/d10/d1b/f14 [3415247,90551] 0 2026-03-10T08:55:07.069 INFO:tasks.workunit.client.1.vm08.stdout:4/222: readlink d5/d23/l46 0 2026-03-10T08:55:07.072 INFO:tasks.workunit.client.1.vm08.stdout:5/253: mknod d0/d46/c54 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:7/204: dwrite d0/d11/d1f/d2c/f33 [0,4194304] 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:1/164: symlink d1/da/de/d24/l33 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:1/165: truncate d1/da/f25 510498 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:2/255: mkdir d1/d5b 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:7/205: truncate d0/d14/f2d 1037766 0 
2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:4/223: fsync d5/de/f38 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:5/254: rename d0/f16 to d0/d11/d27/d50/f55 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:8/257: mkdir d1/d10/d9/d4d/d5c 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:7/206: write d0/d11/f27 [1441806,58012] 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:5/255: dread d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:6/212: creat d9/d10/d1e/d32/f4d x:0 0 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:4/224: creat d5/d23/d49/f4d x:0 0 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:6/213: chown d9/d10/d1e/d32/f27 425 1 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:8/258: symlink d1/d10/d9/dd/d18/d34/l5d 0 2026-03-10T08:55:07.089 INFO:tasks.workunit.client.1.vm08.stdout:2/256: mkdir d1/d43/d5c 0 2026-03-10T08:55:07.100 INFO:tasks.workunit.client.1.vm08.stdout:8/259: mknod d1/d10/d9/dd/d25/d27/d44/d21/d51/c5e 0 2026-03-10T08:55:07.100 INFO:tasks.workunit.client.1.vm08.stdout:1/166: dread d1/da/f25 [0,4194304] 0 2026-03-10T08:55:07.100 INFO:tasks.workunit.client.1.vm08.stdout:5/256: creat d0/d40/d4b/d4e/f56 x:0 0 0 2026-03-10T08:55:07.101 INFO:tasks.workunit.client.1.vm08.stdout:5/257: write d0/d11/d27/f3d [11589,31954] 0 2026-03-10T08:55:07.102 INFO:tasks.workunit.client.1.vm08.stdout:4/225: rename d5/d23/l47 to d5/d23/d36/l4e 0 2026-03-10T08:55:07.103 INFO:tasks.workunit.client.1.vm08.stdout:4/226: write d5/f19 [1289447,55483] 0 2026-03-10T08:55:07.105 INFO:tasks.workunit.client.1.vm08.stdout:1/167: mknod d1/da/de/d24/c34 0 2026-03-10T08:55:07.105 INFO:tasks.workunit.client.1.vm08.stdout:7/207: getdents d0/d11/d1f 0 2026-03-10T08:55:07.107 INFO:tasks.workunit.client.1.vm08.stdout:8/260: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d5f 0 2026-03-10T08:55:07.107 
INFO:tasks.workunit.client.1.vm08.stdout:5/258: creat d0/d11/d18/d52/f57 x:0 0 0 2026-03-10T08:55:07.109 INFO:tasks.workunit.client.1.vm08.stdout:2/257: creat d1/d43/f5d x:0 0 0 2026-03-10T08:55:07.110 INFO:tasks.workunit.client.1.vm08.stdout:4/227: symlink d5/d23/d49/l4f 0 2026-03-10T08:55:07.110 INFO:tasks.workunit.client.1.vm08.stdout:4/228: write d5/d23/f2e [787086,35850] 0 2026-03-10T08:55:07.111 INFO:tasks.workunit.client.1.vm08.stdout:1/168: mkdir d1/da/de/d24/d35 0 2026-03-10T08:55:07.116 INFO:tasks.workunit.client.1.vm08.stdout:2/258: creat d1/da/d10/d1b/d1c/f5e x:0 0 0 2026-03-10T08:55:07.119 INFO:tasks.workunit.client.1.vm08.stdout:8/261: mkdir d1/d4f/d60 0 2026-03-10T08:55:07.122 INFO:tasks.workunit.client.1.vm08.stdout:1/169: symlink d1/da/d20/l36 0 2026-03-10T08:55:07.122 INFO:tasks.workunit.client.1.vm08.stdout:1/170: write d1/f1f [1139576,56822] 0 2026-03-10T08:55:07.128 INFO:tasks.workunit.client.1.vm08.stdout:1/171: creat d1/da/de/d24/f37 x:0 0 0 2026-03-10T08:55:07.130 INFO:tasks.workunit.client.1.vm08.stdout:1/172: dread d1/fc [0,4194304] 0 2026-03-10T08:55:07.133 INFO:tasks.workunit.client.1.vm08.stdout:1/173: mknod d1/da/de/d24/c38 0 2026-03-10T08:55:07.139 INFO:tasks.workunit.client.1.vm08.stdout:1/174: dwrite d1/fc [0,4194304] 0 2026-03-10T08:55:07.145 INFO:tasks.workunit.client.1.vm08.stdout:1/175: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:07.151 INFO:tasks.workunit.client.1.vm08.stdout:1/176: readlink d1/da/d20/l36 0 2026-03-10T08:55:07.152 INFO:tasks.workunit.client.1.vm08.stdout:1/177: stat d1/da/de/f19 0 2026-03-10T08:55:07.172 INFO:tasks.workunit.client.1.vm08.stdout:1/178: getdents d1/da/de/d24/d26 0 2026-03-10T08:55:07.180 INFO:tasks.workunit.client.1.vm08.stdout:1/179: getdents d1/da 0 2026-03-10T08:55:07.180 INFO:tasks.workunit.client.1.vm08.stdout:1/180: stat d1/f1f 0 2026-03-10T08:55:07.181 INFO:tasks.workunit.client.1.vm08.stdout:1/181: fdatasync d1/fd 0 2026-03-10T08:55:07.182 INFO:tasks.workunit.client.1.vm08.stdout:1/182: 
read d1/fd [2258829,26494] 0 2026-03-10T08:55:07.182 INFO:tasks.workunit.client.1.vm08.stdout:0/185: truncate d6/dd/f28 975363 0 2026-03-10T08:55:07.183 INFO:tasks.workunit.client.1.vm08.stdout:1/183: fsync d1/da/de/f12 0 2026-03-10T08:55:07.208 INFO:tasks.workunit.client.1.vm08.stdout:2/259: truncate d1/da/d10/d2d/f4c 675540 0 2026-03-10T08:55:07.210 INFO:tasks.workunit.client.1.vm08.stdout:0/186: link d6/dd/d13/d17/f1d d6/dd/f3f 0 2026-03-10T08:55:07.211 INFO:tasks.workunit.client.1.vm08.stdout:2/260: rename d1/f4 to d1/da/d10/d1b/d12/d23/f5f 0 2026-03-10T08:55:07.212 INFO:tasks.workunit.client.1.vm08.stdout:2/261: read d1/da/d10/d1b/d12/d22/f45 [1520807,47485] 0 2026-03-10T08:55:07.215 INFO:tasks.workunit.client.1.vm08.stdout:1/184: dread d1/da/de/f27 [0,4194304] 0 2026-03-10T08:55:07.216 INFO:tasks.workunit.client.1.vm08.stdout:1/185: readlink d1/da/l2b 0 2026-03-10T08:55:07.219 INFO:tasks.workunit.client.1.vm08.stdout:1/186: creat d1/da/f39 x:0 0 0 2026-03-10T08:55:07.219 INFO:tasks.workunit.client.1.vm08.stdout:1/187: chown d1/l31 1169209358 1 2026-03-10T08:55:07.228 INFO:tasks.workunit.client.1.vm08.stdout:2/262: rmdir d1/d43/d4f 39 2026-03-10T08:55:07.234 INFO:tasks.workunit.client.1.vm08.stdout:2/263: mknod d1/d43/c60 0 2026-03-10T08:55:07.237 INFO:tasks.workunit.client.1.vm08.stdout:2/264: write d1/da/d10/d42/f58 [2605094,37839] 0 2026-03-10T08:55:07.238 INFO:tasks.workunit.client.1.vm08.stdout:1/188: sync 2026-03-10T08:55:07.241 INFO:tasks.workunit.client.1.vm08.stdout:2/265: dread d1/da/f50 [0,4194304] 0 2026-03-10T08:55:07.251 INFO:tasks.workunit.client.1.vm08.stdout:1/189: sync 2026-03-10T08:55:07.290 INFO:tasks.workunit.client.1.vm08.stdout:9/236: write d2/f13 [4378952,38068] 0 2026-03-10T08:55:07.342 INFO:tasks.workunit.client.1.vm08.stdout:3/208: truncate d4/f14 3641462 0 2026-03-10T08:55:07.343 INFO:tasks.workunit.client.1.vm08.stdout:3/209: truncate d4/d15/d8/f24 689532 0 2026-03-10T08:55:07.346 INFO:tasks.workunit.client.1.vm08.stdout:3/210: 
creat d4/f44 x:0 0 0 2026-03-10T08:55:07.347 INFO:tasks.workunit.client.1.vm08.stdout:3/211: write d4/d15/d8/d2c/f42 [721499,70606] 0 2026-03-10T08:55:07.354 INFO:tasks.workunit.client.1.vm08.stdout:3/212: link d4/d15/d8/d2c/f42 d4/d15/d8/f45 0 2026-03-10T08:55:07.359 INFO:tasks.workunit.client.1.vm08.stdout:3/213: dwrite d4/d15/d17/f34 [0,4194304] 0 2026-03-10T08:55:07.370 INFO:tasks.workunit.client.1.vm08.stdout:4/229: getdents d5/d23/d36 0 2026-03-10T08:55:07.372 INFO:tasks.workunit.client.1.vm08.stdout:3/214: symlink d4/d15/d8/d1d/l46 0 2026-03-10T08:55:07.372 INFO:tasks.workunit.client.1.vm08.stdout:4/230: creat d5/de/f50 x:0 0 0 2026-03-10T08:55:07.373 INFO:tasks.workunit.client.1.vm08.stdout:4/231: fsync d5/d23/d49/f4a 0 2026-03-10T08:55:07.373 INFO:tasks.workunit.client.1.vm08.stdout:4/232: fdatasync d5/de/f41 0 2026-03-10T08:55:07.375 INFO:tasks.workunit.client.1.vm08.stdout:3/215: creat d4/f47 x:0 0 0 2026-03-10T08:55:07.376 INFO:tasks.workunit.client.1.vm08.stdout:4/233: creat d5/d23/d36/f51 x:0 0 0 2026-03-10T08:55:07.376 INFO:tasks.workunit.client.1.vm08.stdout:7/208: write d0/d14/f12 [6049462,109777] 0 2026-03-10T08:55:07.379 INFO:tasks.workunit.client.1.vm08.stdout:5/259: truncate d0/d1b/f39 5802696 0 2026-03-10T08:55:07.383 INFO:tasks.workunit.client.1.vm08.stdout:8/262: write d1/d10/f2d [2069406,91855] 0 2026-03-10T08:55:07.386 INFO:tasks.workunit.client.1.vm08.stdout:4/234: rename d5/de/l16 to d5/de/l52 0 2026-03-10T08:55:07.392 INFO:tasks.workunit.client.1.vm08.stdout:3/216: link d4/d15/d17/c3a d4/d15/d8/d2c/c48 0 2026-03-10T08:55:07.392 INFO:tasks.workunit.client.1.vm08.stdout:3/217: chown d4/d15/d17/d20 1603 1 2026-03-10T08:55:07.393 INFO:tasks.workunit.client.1.vm08.stdout:3/218: read d4/d15/d8/f24 [324512,18903] 0 2026-03-10T08:55:07.396 INFO:tasks.workunit.client.1.vm08.stdout:3/219: dwrite d4/d15/d8/f37 [0,4194304] 0 2026-03-10T08:55:07.403 INFO:tasks.workunit.client.1.vm08.stdout:7/209: creat d0/f44 x:0 0 0 2026-03-10T08:55:07.403 
INFO:tasks.workunit.client.1.vm08.stdout:7/210: write d0/d1c/f32 [4736230,96570] 0 2026-03-10T08:55:07.414 INFO:tasks.workunit.client.1.vm08.stdout:8/263: symlink d1/d4f/d60/l61 0 2026-03-10T08:55:07.415 INFO:tasks.workunit.client.1.vm08.stdout:8/264: chown d1/d10/d9/dd/d25/d27 0 1 2026-03-10T08:55:07.418 INFO:tasks.workunit.client.1.vm08.stdout:4/235: mknod d5/d23/c53 0 2026-03-10T08:55:07.419 INFO:tasks.workunit.client.1.vm08.stdout:3/220: mknod d4/d15/d17/c49 0 2026-03-10T08:55:07.422 INFO:tasks.workunit.client.1.vm08.stdout:7/211: rmdir d0/d11/d1f/d29/d3d/d40 39 2026-03-10T08:55:07.425 INFO:tasks.workunit.client.1.vm08.stdout:8/265: creat d1/d10/d9/dd/f62 x:0 0 0 2026-03-10T08:55:07.425 INFO:tasks.workunit.client.1.vm08.stdout:7/212: dwrite d0/d11/d1f/d2c/f33 [0,4194304] 0 2026-03-10T08:55:07.436 INFO:tasks.workunit.client.1.vm08.stdout:4/236: creat d5/de/f54 x:0 0 0 2026-03-10T08:55:07.437 INFO:tasks.workunit.client.1.vm08.stdout:3/221: symlink d4/l4a 0 2026-03-10T08:55:07.443 INFO:tasks.workunit.client.1.vm08.stdout:2/266: unlink d1/da/d10/d1b/d12/d23/f5f 0 2026-03-10T08:55:07.444 INFO:tasks.workunit.client.1.vm08.stdout:4/237: unlink d5/fb 0 2026-03-10T08:55:07.445 INFO:tasks.workunit.client.1.vm08.stdout:7/213: dread d0/d11/d1f/d29/d3d/d40/f38 [0,4194304] 0 2026-03-10T08:55:07.446 INFO:tasks.workunit.client.1.vm08.stdout:4/238: dread d5/fd [0,4194304] 0 2026-03-10T08:55:07.446 INFO:tasks.workunit.client.1.vm08.stdout:4/239: truncate d5/d23/d36/f51 965974 0 2026-03-10T08:55:07.451 INFO:tasks.workunit.client.1.vm08.stdout:2/267: mkdir d1/da/d61 0 2026-03-10T08:55:07.451 INFO:tasks.workunit.client.1.vm08.stdout:2/268: chown d1/d43 13020827 1 2026-03-10T08:55:07.452 INFO:tasks.workunit.client.1.vm08.stdout:2/269: chown d1/da/d10/d1b/c5a 262 1 2026-03-10T08:55:07.464 INFO:tasks.workunit.client.1.vm08.stdout:2/270: sync 2026-03-10T08:55:07.465 INFO:tasks.workunit.client.1.vm08.stdout:7/214: unlink d0/d14/f2d 0 2026-03-10T08:55:07.468 
INFO:tasks.workunit.client.1.vm08.stdout:2/271: dwrite d1/da/d10/d1b/d12/f3b [0,4194304] 0 2026-03-10T08:55:07.473 INFO:tasks.workunit.client.1.vm08.stdout:2/272: dread d1/da/d10/d1b/d12/d23/f57 [0,4194304] 0 2026-03-10T08:55:07.475 INFO:tasks.workunit.client.1.vm08.stdout:2/273: truncate d1/da/d10/d1b/d12/d1e/f49 509289 0 2026-03-10T08:55:07.484 INFO:tasks.workunit.client.1.vm08.stdout:4/240: symlink d5/d23/d36/l55 0 2026-03-10T08:55:07.485 INFO:tasks.workunit.client.1.vm08.stdout:4/241: write d5/de/f41 [582079,10108] 0 2026-03-10T08:55:07.488 INFO:tasks.workunit.client.1.vm08.stdout:1/190: chown d1/da/de/f27 25348515 1 2026-03-10T08:55:07.519 INFO:tasks.workunit.client.1.vm08.stdout:2/274: read d1/da/d10/d2d/f3e [1364346,82093] 0 2026-03-10T08:55:07.521 INFO:tasks.workunit.client.1.vm08.stdout:9/237: write d2/f4 [258209,129367] 0 2026-03-10T08:55:07.521 INFO:tasks.workunit.client.1.vm08.stdout:6/214: write f1 [5676784,81459] 0 2026-03-10T08:55:07.528 INFO:tasks.workunit.client.1.vm08.stdout:2/275: dwrite d1/f19 [0,4194304] 0 2026-03-10T08:55:07.531 INFO:tasks.workunit.client.1.vm08.stdout:1/191: mkdir d1/da/d18/d3a 0 2026-03-10T08:55:07.531 INFO:tasks.workunit.client.1.vm08.stdout:1/192: write d1/da/f39 [1012110,118110] 0 2026-03-10T08:55:07.535 INFO:tasks.workunit.client.1.vm08.stdout:9/238: dread d2/dd/d15/f1b [0,4194304] 0 2026-03-10T08:55:07.537 INFO:tasks.workunit.client.1.vm08.stdout:7/215: link d0/d11/d1f/l34 d0/d11/d1f/d29/l45 0 2026-03-10T08:55:07.538 INFO:tasks.workunit.client.1.vm08.stdout:1/193: dwrite d1/da/de/d24/f37 [0,4194304] 0 2026-03-10T08:55:07.541 INFO:tasks.workunit.client.1.vm08.stdout:6/215: mkdir d9/d13/d4e 0 2026-03-10T08:55:07.550 INFO:tasks.workunit.client.1.vm08.stdout:6/216: dwrite d9/d10/d1e/d32/f4d [0,4194304] 0 2026-03-10T08:55:07.552 INFO:tasks.workunit.client.1.vm08.stdout:4/242: creat d5/d23/f56 x:0 0 0 2026-03-10T08:55:07.552 INFO:tasks.workunit.client.1.vm08.stdout:6/217: write d9/d10/d1e/d32/f3a [919454,124237] 0 
2026-03-10T08:55:07.553 INFO:tasks.workunit.client.1.vm08.stdout:2/276: creat d1/da/d10/d1b/d1c/f62 x:0 0 0 2026-03-10T08:55:07.553 INFO:tasks.workunit.client.1.vm08.stdout:6/218: write d9/d13/f15 [88472,8043] 0 2026-03-10T08:55:07.573 INFO:tasks.workunit.client.1.vm08.stdout:7/216: readlink d0/d11/d1f/d29/l45 0 2026-03-10T08:55:07.573 INFO:tasks.workunit.client.1.vm08.stdout:1/194: mkdir d1/da/d18/d3b 0 2026-03-10T08:55:07.574 INFO:tasks.workunit.client.1.vm08.stdout:1/195: dread d1/da/de/f27 [0,4194304] 0 2026-03-10T08:55:07.578 INFO:tasks.workunit.client.1.vm08.stdout:1/196: dwrite d1/f1f [0,4194304] 0 2026-03-10T08:55:07.591 INFO:tasks.workunit.client.1.vm08.stdout:6/219: rename d9/d10/f30 to d9/dc/d11/d23/d2c/f4f 0 2026-03-10T08:55:07.591 INFO:tasks.workunit.client.1.vm08.stdout:6/220: stat d9/d10/d1e/d32/f48 0 2026-03-10T08:55:07.612 INFO:tasks.workunit.client.1.vm08.stdout:9/239: link d2/l19 d2/dd/d15/d1e/d25/d31/l4d 0 2026-03-10T08:55:07.612 INFO:tasks.workunit.client.1.vm08.stdout:9/240: fsync d2/f4 0 2026-03-10T08:55:07.613 INFO:tasks.workunit.client.1.vm08.stdout:9/241: stat d2/l28 0 2026-03-10T08:55:07.622 INFO:tasks.workunit.client.1.vm08.stdout:1/197: creat d1/da/d18/d3a/f3c x:0 0 0 2026-03-10T08:55:07.625 INFO:tasks.workunit.client.1.vm08.stdout:5/260: dwrite d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:07.628 INFO:tasks.workunit.client.1.vm08.stdout:6/221: mkdir d9/d50 0 2026-03-10T08:55:07.628 INFO:tasks.workunit.client.1.vm08.stdout:5/261: truncate d0/d11/f2d 989991 0 2026-03-10T08:55:07.629 INFO:tasks.workunit.client.1.vm08.stdout:6/222: readlink d9/d10/l20 0 2026-03-10T08:55:07.629 INFO:tasks.workunit.client.1.vm08.stdout:6/223: write d9/dc/d11/f31 [517545,48486] 0 2026-03-10T08:55:07.644 INFO:tasks.workunit.client.1.vm08.stdout:9/242: mkdir d2/dd/d15/d1e/d39/d4e 0 2026-03-10T08:55:07.650 INFO:tasks.workunit.client.1.vm08.stdout:1/198: rename d1/da/de/d32 to d1/da/de/d24/d3d 0 2026-03-10T08:55:07.653 
INFO:tasks.workunit.client.1.vm08.stdout:6/224: creat d9/dc/d11/d23/d2c/d41/f51 x:0 0 0 2026-03-10T08:55:07.655 INFO:tasks.workunit.client.1.vm08.stdout:0/187: truncate d6/dd/f28 1820502 0 2026-03-10T08:55:07.661 INFO:tasks.workunit.client.1.vm08.stdout:6/225: symlink d9/d10/d1e/l52 0 2026-03-10T08:55:07.669 INFO:tasks.workunit.client.1.vm08.stdout:0/188: rmdir d6/dd 39 2026-03-10T08:55:07.673 INFO:tasks.workunit.client.1.vm08.stdout:8/266: write d1/d10/d9/fb [1187074,70746] 0 2026-03-10T08:55:07.676 INFO:tasks.workunit.client.1.vm08.stdout:5/262: link d0/l3c d0/d11/d3e/l58 0 2026-03-10T08:55:07.677 INFO:tasks.workunit.client.1.vm08.stdout:1/199: dread d1/da/d20/f21 [0,4194304] 0 2026-03-10T08:55:07.678 INFO:tasks.workunit.client.1.vm08.stdout:6/226: creat d9/d10/f53 x:0 0 0 2026-03-10T08:55:07.678 INFO:tasks.workunit.client.1.vm08.stdout:6/227: fdatasync d9/d10/f53 0 2026-03-10T08:55:07.685 INFO:tasks.workunit.client.1.vm08.stdout:8/267: mknod d1/d10/d9/dd/d25/d27/c63 0 2026-03-10T08:55:07.685 INFO:tasks.workunit.client.1.vm08.stdout:8/268: chown d1/d10/f23 405659 1 2026-03-10T08:55:07.687 INFO:tasks.workunit.client.1.vm08.stdout:4/243: getdents d5/d23/d36 0 2026-03-10T08:55:07.687 INFO:tasks.workunit.client.1.vm08.stdout:4/244: dread - d5/f3d zero size 2026-03-10T08:55:07.688 INFO:tasks.workunit.client.1.vm08.stdout:4/245: dread - d5/de/f54 zero size 2026-03-10T08:55:07.691 INFO:tasks.workunit.client.1.vm08.stdout:6/228: fsync d9/dc/d11/d23/d2c/f49 0 2026-03-10T08:55:07.693 INFO:tasks.workunit.client.1.vm08.stdout:6/229: dread d9/d13/f2f [0,4194304] 0 2026-03-10T08:55:07.700 INFO:tasks.workunit.client.1.vm08.stdout:4/246: creat d5/d23/d36/f57 x:0 0 0 2026-03-10T08:55:07.715 INFO:tasks.workunit.client.1.vm08.stdout:0/189: mkdir d6/dd/d13/d17/d1f/d2d/d39/d40 0 2026-03-10T08:55:07.718 INFO:tasks.workunit.client.1.vm08.stdout:8/269: truncate f0 3614129 0 2026-03-10T08:55:07.719 INFO:tasks.workunit.client.1.vm08.stdout:5/263: link d0/d1b/l2c d0/d40/d4b/d4e/l59 0 
2026-03-10T08:55:07.723 INFO:tasks.workunit.client.1.vm08.stdout:1/200: link d1/da/de/d24/c34 d1/da/de/d24/d26/c3e 0 2026-03-10T08:55:07.723 INFO:tasks.workunit.client.1.vm08.stdout:1/201: stat d1/da/de 0 2026-03-10T08:55:07.723 INFO:tasks.workunit.client.1.vm08.stdout:1/202: stat d1/da/f39 0 2026-03-10T08:55:07.726 INFO:tasks.workunit.client.1.vm08.stdout:3/222: link d4/f14 d4/d15/f4b 0 2026-03-10T08:55:07.726 INFO:tasks.workunit.client.1.vm08.stdout:6/230: mknod d9/d50/c54 0 2026-03-10T08:55:07.726 INFO:tasks.workunit.client.1.vm08.stdout:3/223: write d4/f44 [236226,106720] 0 2026-03-10T08:55:07.730 INFO:tasks.workunit.client.1.vm08.stdout:6/231: dread d9/d13/f15 [0,4194304] 0 2026-03-10T08:55:07.730 INFO:tasks.workunit.client.1.vm08.stdout:8/270: sync 2026-03-10T08:55:07.730 INFO:tasks.workunit.client.1.vm08.stdout:8/271: fsync d1/d2c/f30 0 2026-03-10T08:55:07.731 INFO:tasks.workunit.client.1.vm08.stdout:8/272: chown d1/c1b 123 1 2026-03-10T08:55:07.731 INFO:tasks.workunit.client.1.vm08.stdout:6/232: chown d9/d10/d1e/d32/f42 19 1 2026-03-10T08:55:07.732 INFO:tasks.workunit.client.1.vm08.stdout:8/273: write d1/d10/f3b [1021484,126304] 0 2026-03-10T08:55:07.734 INFO:tasks.workunit.client.1.vm08.stdout:0/190: fdatasync d6/f9 0 2026-03-10T08:55:07.735 INFO:tasks.workunit.client.1.vm08.stdout:6/233: dwrite d9/dc/d11/f47 [0,4194304] 0 2026-03-10T08:55:07.736 INFO:tasks.workunit.client.1.vm08.stdout:2/277: truncate d1/da/d10/d1b/d12/f55 2866908 0 2026-03-10T08:55:07.736 INFO:tasks.workunit.client.1.vm08.stdout:2/278: readlink d1/l5 0 2026-03-10T08:55:07.736 INFO:tasks.workunit.client.1.vm08.stdout:2/279: chown d1/da/d10/d2d 337161 1 2026-03-10T08:55:07.738 INFO:tasks.workunit.client.1.vm08.stdout:5/264: creat d0/d11/d18/f5a x:0 0 0 2026-03-10T08:55:07.740 INFO:tasks.workunit.client.1.vm08.stdout:9/243: rename d2/dd/d15/d1e/d25/d31 to d2/dd/d15/d4f 0 2026-03-10T08:55:07.741 INFO:tasks.workunit.client.1.vm08.stdout:7/217: truncate d0/d14/f12 4672962 0 
2026-03-10T08:55:07.743 INFO:tasks.workunit.client.1.vm08.stdout:1/203: rename d1/da/d20/d30 to d1/da/d20/d3f 0 2026-03-10T08:55:07.751 INFO:tasks.workunit.client.1.vm08.stdout:8/274: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51/d64 0 2026-03-10T08:55:07.754 INFO:tasks.workunit.client.1.vm08.stdout:0/191: rmdir d6/dd/d13/d17/d1f/d20/d2f 39 2026-03-10T08:55:07.755 INFO:tasks.workunit.client.1.vm08.stdout:2/280: creat d1/da/d10/d1b/d1c/f63 x:0 0 0 2026-03-10T08:55:07.767 INFO:tasks.workunit.client.1.vm08.stdout:5/265: creat d0/d11/d3e/d45/f5b x:0 0 0 2026-03-10T08:55:07.767 INFO:tasks.workunit.client.1.vm08.stdout:3/224: rename d4/d15/d8/d2c/l2f to d4/l4c 0 2026-03-10T08:55:07.767 INFO:tasks.workunit.client.1.vm08.stdout:5/266: dwrite d0/d11/d3e/d45/f5b [0,4194304] 0 2026-03-10T08:55:07.767 INFO:tasks.workunit.client.1.vm08.stdout:5/267: stat d0/d11/d27/f2a 0 2026-03-10T08:55:07.781 INFO:tasks.workunit.client.1.vm08.stdout:1/204: fdatasync d1/da/f1e 0 2026-03-10T08:55:07.784 INFO:tasks.workunit.client.1.vm08.stdout:4/247: rmdir d5/d23/d36 39 2026-03-10T08:55:07.787 INFO:tasks.workunit.client.1.vm08.stdout:4/248: dwrite d5/f14 [0,4194304] 0 2026-03-10T08:55:07.800 INFO:tasks.workunit.client.1.vm08.stdout:8/275: rename d1/l1a to d1/d4f/d60/l65 0 2026-03-10T08:55:07.801 INFO:tasks.workunit.client.1.vm08.stdout:3/225: creat d4/d15/d8/d2a/f4d x:0 0 0 2026-03-10T08:55:07.802 INFO:tasks.workunit.client.1.vm08.stdout:2/281: dwrite d1/da/d10/d2d/f4d [0,4194304] 0 2026-03-10T08:55:07.809 INFO:tasks.workunit.client.1.vm08.stdout:7/218: mknod d0/c46 0 2026-03-10T08:55:07.810 INFO:tasks.workunit.client.1.vm08.stdout:9/244: link d2/dd/d15/d1e/d24/f2b d2/dd/d15/d1e/d21/f50 0 2026-03-10T08:55:07.811 INFO:tasks.workunit.client.1.vm08.stdout:5/268: mknod d0/d40/d4b/d4e/c5c 0 2026-03-10T08:55:07.812 INFO:tasks.workunit.client.1.vm08.stdout:5/269: write d0/d1b/f2f [1837445,59812] 0 2026-03-10T08:55:07.812 INFO:tasks.workunit.client.1.vm08.stdout:1/205: mkdir d1/da/de/d24/d3d/d40 0 
2026-03-10T08:55:07.812 INFO:tasks.workunit.client.1.vm08.stdout:0/192: mknod d6/dd/d13/d17/c41 0 2026-03-10T08:55:07.817 INFO:tasks.workunit.client.1.vm08.stdout:8/276: creat d1/d10/d9/dd/d25/d27/d44/d21/f66 x:0 0 0 2026-03-10T08:55:07.820 INFO:tasks.workunit.client.1.vm08.stdout:3/226: fdatasync d4/d15/d8/d1d/f21 0 2026-03-10T08:55:07.820 INFO:tasks.workunit.client.1.vm08.stdout:9/245: rename d2/dd/d15/d1e/d25/l38 to d2/dd/l51 0 2026-03-10T08:55:07.820 INFO:tasks.workunit.client.1.vm08.stdout:7/219: dread d0/d11/d1f/d29/d3d/d40/f38 [0,4194304] 0 2026-03-10T08:55:07.822 INFO:tasks.workunit.client.1.vm08.stdout:4/249: creat d5/d23/d36/f58 x:0 0 0 2026-03-10T08:55:07.822 INFO:tasks.workunit.client.1.vm08.stdout:6/234: getdents d9/dc/d11 0 2026-03-10T08:55:07.823 INFO:tasks.workunit.client.1.vm08.stdout:8/277: mkdir d1/d4f/d60/d67 0 2026-03-10T08:55:07.824 INFO:tasks.workunit.client.1.vm08.stdout:4/250: dread d5/fd [0,4194304] 0 2026-03-10T08:55:07.824 INFO:tasks.workunit.client.1.vm08.stdout:8/278: chown d1/d10/d9/dd/d25/d27/f3a 15 1 2026-03-10T08:55:07.825 INFO:tasks.workunit.client.1.vm08.stdout:9/246: unlink d2/l5 0 2026-03-10T08:55:07.826 INFO:tasks.workunit.client.1.vm08.stdout:7/220: mknod d0/d11/d1f/c47 0 2026-03-10T08:55:07.828 INFO:tasks.workunit.client.1.vm08.stdout:6/235: creat d9/dc/d11/f55 x:0 0 0 2026-03-10T08:55:07.828 INFO:tasks.workunit.client.1.vm08.stdout:8/279: fsync d1/d10/d9/f5b 0 2026-03-10T08:55:07.829 INFO:tasks.workunit.client.1.vm08.stdout:8/280: chown d1/f8 34 1 2026-03-10T08:55:07.829 INFO:tasks.workunit.client.1.vm08.stdout:3/227: dread d4/d15/d8/d2c/f42 [0,4194304] 0 2026-03-10T08:55:07.830 INFO:tasks.workunit.client.1.vm08.stdout:9/247: unlink d2/dd/d15/d1e/f42 0 2026-03-10T08:55:07.832 INFO:tasks.workunit.client.1.vm08.stdout:9/248: readlink d2/l36 0 2026-03-10T08:55:07.832 INFO:tasks.workunit.client.1.vm08.stdout:6/236: creat d9/dc/d11/d23/d2c/d41/f56 x:0 0 0 2026-03-10T08:55:07.833 INFO:tasks.workunit.client.1.vm08.stdout:2/282: 
getdents d1/da/d10/d42 0 2026-03-10T08:55:07.833 INFO:tasks.workunit.client.1.vm08.stdout:6/237: read d9/dc/d11/d23/d2c/f3d [2764669,44670] 0 2026-03-10T08:55:07.834 INFO:tasks.workunit.client.1.vm08.stdout:8/281: creat d1/d10/d9/dd/d13/d40/f68 x:0 0 0 2026-03-10T08:55:07.836 INFO:tasks.workunit.client.1.vm08.stdout:6/238: dwrite f1 [4194304,4194304] 0 2026-03-10T08:55:07.837 INFO:tasks.workunit.client.1.vm08.stdout:6/239: readlink d9/d13/l46 0 2026-03-10T08:55:07.844 INFO:tasks.workunit.client.1.vm08.stdout:4/251: sync 2026-03-10T08:55:07.844 INFO:tasks.workunit.client.1.vm08.stdout:8/282: dwrite d1/d10/d9/dd/d25/d27/d44/f22 [0,4194304] 0 2026-03-10T08:55:07.846 INFO:tasks.workunit.client.1.vm08.stdout:2/283: stat d1/da/d10/d1b/d12/d23/f57 0 2026-03-10T08:55:07.847 INFO:tasks.workunit.client.1.vm08.stdout:2/284: readlink d1/l5 0 2026-03-10T08:55:07.849 INFO:tasks.workunit.client.1.vm08.stdout:9/249: symlink d2/l52 0 2026-03-10T08:55:07.851 INFO:tasks.workunit.client.1.vm08.stdout:0/193: link d6/dd/d13/d17/d1f/d20/d2f/c30 d6/dd/d13/d17/d1f/d20/d2f/c42 0 2026-03-10T08:55:07.856 INFO:tasks.workunit.client.1.vm08.stdout:3/228: creat d4/d15/d8/f4e x:0 0 0 2026-03-10T08:55:07.857 INFO:tasks.workunit.client.1.vm08.stdout:6/240: unlink d9/dc/d11/d23/c3b 0 2026-03-10T08:55:07.857 INFO:tasks.workunit.client.1.vm08.stdout:8/283: dwrite d1/d10/d9/dd/f62 [0,4194304] 0 2026-03-10T08:55:07.857 INFO:tasks.workunit.client.1.vm08.stdout:4/252: rename d5/l17 to d5/d23/d36/l59 0 2026-03-10T08:55:07.861 INFO:tasks.workunit.client.1.vm08.stdout:4/253: chown d5 979 1 2026-03-10T08:55:07.861 INFO:tasks.workunit.client.1.vm08.stdout:9/250: mkdir d2/d41/d53 0 2026-03-10T08:55:07.863 INFO:tasks.workunit.client.1.vm08.stdout:2/285: creat d1/da/f64 x:0 0 0 2026-03-10T08:55:07.870 INFO:tasks.workunit.client.1.vm08.stdout:3/229: mkdir d4/d15/d8/d1d/d4f 0 2026-03-10T08:55:07.870 INFO:tasks.workunit.client.1.vm08.stdout:0/194: rmdir d6/dd 39 2026-03-10T08:55:07.870 
INFO:tasks.workunit.client.1.vm08.stdout:4/254: mkdir d5/d2f/d5a 0 2026-03-10T08:55:07.870 INFO:tasks.workunit.client.1.vm08.stdout:2/286: fsync d1/da/d10/d1b/d1c/f5e 0 2026-03-10T08:55:07.870 INFO:tasks.workunit.client.1.vm08.stdout:4/255: stat d5/d23/f56 0 2026-03-10T08:55:07.870 INFO:tasks.workunit.client.1.vm08.stdout:4/256: write d5/f8 [609442,43879] 0 2026-03-10T08:55:07.870 INFO:tasks.workunit.client.1.vm08.stdout:6/241: creat d9/d13/d4e/f57 x:0 0 0 2026-03-10T08:55:07.871 INFO:tasks.workunit.client.1.vm08.stdout:3/230: mknod d4/d15/d8/d2a/c50 0 2026-03-10T08:55:07.871 INFO:tasks.workunit.client.1.vm08.stdout:8/284: sync 2026-03-10T08:55:07.876 INFO:tasks.workunit.client.1.vm08.stdout:2/287: dwrite d1/da/d10/d2d/f4d [0,4194304] 0 2026-03-10T08:55:07.876 INFO:tasks.workunit.client.1.vm08.stdout:3/231: dwrite d4/d15/d8/d2c/f3d [4194304,4194304] 0 2026-03-10T08:55:07.878 INFO:tasks.workunit.client.1.vm08.stdout:6/242: creat d9/d10/d1e/f58 x:0 0 0 2026-03-10T08:55:07.885 INFO:tasks.workunit.client.1.vm08.stdout:8/285: read d1/d2c/f30 [1234747,77657] 0 2026-03-10T08:55:07.885 INFO:tasks.workunit.client.1.vm08.stdout:6/243: dread d9/d13/f36 [0,4194304] 0 2026-03-10T08:55:07.892 INFO:tasks.workunit.client.1.vm08.stdout:6/244: mknod d9/d10/c59 0 2026-03-10T08:55:07.900 INFO:tasks.workunit.client.1.vm08.stdout:6/245: mknod d9/dc/d11/d23/d2c/d44/c5a 0 2026-03-10T08:55:07.929 INFO:tasks.workunit.client.1.vm08.stdout:5/270: dwrite d0/ff [4194304,4194304] 0 2026-03-10T08:55:07.930 INFO:tasks.workunit.client.1.vm08.stdout:5/271: dread - d0/f43 zero size 2026-03-10T08:55:07.930 INFO:tasks.workunit.client.1.vm08.stdout:5/272: read - d0/d11/d3e/d45/f4a zero size 2026-03-10T08:55:07.931 INFO:tasks.workunit.client.1.vm08.stdout:5/273: truncate d0/d11/d18/f34 1636769 0 2026-03-10T08:55:07.936 INFO:tasks.workunit.client.1.vm08.stdout:5/274: dread d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:07.937 INFO:tasks.workunit.client.1.vm08.stdout:5/275: chown d0/d11/f29 41 1 
2026-03-10T08:55:07.938 INFO:tasks.workunit.client.1.vm08.stdout:1/206: write d1/da/f22 [428428,22444] 0 2026-03-10T08:55:07.940 INFO:tasks.workunit.client.1.vm08.stdout:5/276: symlink d0/d46/l5d 0 2026-03-10T08:55:07.941 INFO:tasks.workunit.client.1.vm08.stdout:5/277: chown d0/fe 12104 1 2026-03-10T08:55:07.942 INFO:tasks.workunit.client.1.vm08.stdout:1/207: mkdir d1/da/d18/d3b/d41 0 2026-03-10T08:55:07.942 INFO:tasks.workunit.client.1.vm08.stdout:7/221: symlink d0/d11/d1f/d29/d3d/d40/l48 0 2026-03-10T08:55:07.944 INFO:tasks.workunit.client.1.vm08.stdout:1/208: creat d1/da/de/d24/d3d/d40/f42 x:0 0 0 2026-03-10T08:55:07.948 INFO:tasks.workunit.client.1.vm08.stdout:1/209: truncate d1/da/d20/f21 1650633 0 2026-03-10T08:55:07.948 INFO:tasks.workunit.client.1.vm08.stdout:1/210: chown d1/f8 698 1 2026-03-10T08:55:07.949 INFO:tasks.workunit.client.1.vm08.stdout:7/222: dwrite d0/f44 [0,4194304] 0 2026-03-10T08:55:07.953 INFO:tasks.workunit.client.1.vm08.stdout:5/278: sync 2026-03-10T08:55:07.959 INFO:tasks.workunit.client.1.vm08.stdout:5/279: dwrite d0/d11/d27/f3b [0,4194304] 0 2026-03-10T08:55:07.959 INFO:tasks.workunit.client.1.vm08.stdout:7/223: chown d0/d11/d1f/l28 2072664 1 2026-03-10T08:55:07.962 INFO:tasks.workunit.client.1.vm08.stdout:5/280: stat d0/d11/d18 0 2026-03-10T08:55:07.971 INFO:tasks.workunit.client.1.vm08.stdout:1/211: dread d1/da/de/f19 [0,4194304] 0 2026-03-10T08:55:07.972 INFO:tasks.workunit.client.1.vm08.stdout:1/212: fdatasync d1/fd 0 2026-03-10T08:55:07.973 INFO:tasks.workunit.client.1.vm08.stdout:7/224: rename d0/d11/d1f/c47 to d0/d11/d1f/c49 0 2026-03-10T08:55:07.973 INFO:tasks.workunit.client.1.vm08.stdout:5/281: dwrite d0/d11/d18/f4f [0,4194304] 0 2026-03-10T08:55:07.978 INFO:tasks.workunit.client.1.vm08.stdout:1/213: mkdir d1/da/de/d24/d35/d43 0 2026-03-10T08:55:07.980 INFO:tasks.workunit.client.1.vm08.stdout:1/214: symlink d1/da/de/l44 0 2026-03-10T08:55:07.980 INFO:tasks.workunit.client.1.vm08.stdout:1/215: chown d1/da/de/d24/d3d/d40 
869109815 1 2026-03-10T08:55:07.982 INFO:tasks.workunit.client.1.vm08.stdout:5/282: truncate d0/d11/d18/f23 3892674 0 2026-03-10T08:55:07.984 INFO:tasks.workunit.client.1.vm08.stdout:5/283: truncate d0/d11/d3e/f4d 517946 0 2026-03-10T08:55:07.985 INFO:tasks.workunit.client.1.vm08.stdout:5/284: fdatasync d0/d11/d3e/d45/f5b 0 2026-03-10T08:55:07.992 INFO:tasks.workunit.client.1.vm08.stdout:5/285: dwrite d0/f43 [0,4194304] 0 2026-03-10T08:55:08.012 INFO:tasks.workunit.client.1.vm08.stdout:6/246: truncate d9/d10/d1e/d32/ff 4331371 0 2026-03-10T08:55:08.016 INFO:tasks.workunit.client.1.vm08.stdout:6/247: symlink d9/d10/l5b 0 2026-03-10T08:55:08.017 INFO:tasks.workunit.client.1.vm08.stdout:4/257: rename d5/d23/d36/l59 to d5/d23/d49/l5b 0 2026-03-10T08:55:08.019 INFO:tasks.workunit.client.1.vm08.stdout:6/248: creat d9/dc/d11/d23/d2c/f5c x:0 0 0 2026-03-10T08:55:08.019 INFO:tasks.workunit.client.1.vm08.stdout:5/286: rename d0 to d0/d5e 22 2026-03-10T08:55:08.027 INFO:tasks.workunit.client.1.vm08.stdout:4/258: rename d5/c39 to d5/d2f/c5c 0 2026-03-10T08:55:08.030 INFO:tasks.workunit.client.1.vm08.stdout:4/259: mkdir d5/d2f/d5d 0 2026-03-10T08:55:08.034 INFO:tasks.workunit.client.1.vm08.stdout:0/195: write d6/f2c [784514,115264] 0 2026-03-10T08:55:08.035 INFO:tasks.workunit.client.1.vm08.stdout:9/251: truncate d2/f35 1386715 0 2026-03-10T08:55:08.036 INFO:tasks.workunit.client.1.vm08.stdout:4/260: creat d5/de/f5e x:0 0 0 2026-03-10T08:55:08.045 INFO:tasks.workunit.client.1.vm08.stdout:5/287: getdents d0 0 2026-03-10T08:55:08.046 INFO:tasks.workunit.client.1.vm08.stdout:9/252: read d2/dd/d15/f17 [980763,43891] 0 2026-03-10T08:55:08.047 INFO:tasks.workunit.client.1.vm08.stdout:9/253: truncate d2/f4 1085924 0 2026-03-10T08:55:08.051 INFO:tasks.workunit.client.1.vm08.stdout:4/261: mkdir d5/d5f 0 2026-03-10T08:55:08.054 INFO:tasks.workunit.client.1.vm08.stdout:4/262: dread - d5/d23/d49/f4a zero size 2026-03-10T08:55:08.054 INFO:tasks.workunit.client.1.vm08.stdout:5/288: rename 
d0/d11/f37 to d0/d46/f5f 0 2026-03-10T08:55:08.056 INFO:tasks.workunit.client.1.vm08.stdout:4/263: dwrite d5/de/f5e [0,4194304] 0 2026-03-10T08:55:08.057 INFO:tasks.workunit.client.1.vm08.stdout:3/232: write d4/d15/d8/d2c/f42 [747443,10302] 0 2026-03-10T08:55:08.058 INFO:tasks.workunit.client.1.vm08.stdout:2/288: truncate d1/da/d10/d42/f58 3977236 0 2026-03-10T08:55:08.062 INFO:tasks.workunit.client.1.vm08.stdout:5/289: creat d0/d11/f60 x:0 0 0 2026-03-10T08:55:08.062 INFO:tasks.workunit.client.1.vm08.stdout:9/254: mkdir d2/d54 0 2026-03-10T08:55:08.064 INFO:tasks.workunit.client.1.vm08.stdout:8/286: dwrite d1/d10/d9/dd/d18/d3c/f4e [0,4194304] 0 2026-03-10T08:55:08.075 INFO:tasks.workunit.client.1.vm08.stdout:4/264: creat d5/d2f/d5d/f60 x:0 0 0 2026-03-10T08:55:08.075 INFO:tasks.workunit.client.1.vm08.stdout:5/290: creat d0/d11/d27/f61 x:0 0 0 2026-03-10T08:55:08.076 INFO:tasks.workunit.client.1.vm08.stdout:5/291: fsync d0/d40/f42 0 2026-03-10T08:55:08.077 INFO:tasks.workunit.client.0.vm05.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T08:55:08.079 INFO:tasks.workunit.client.1.vm08.stdout:5/292: dwrite d0/ff [0,4194304] 0 2026-03-10T08:55:08.080 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T08:55:08.081 INFO:tasks.workunit.client.0.vm05.stderr:+ make 2026-03-10T08:55:08.082 INFO:tasks.workunit.client.1.vm08.stdout:8/287: rmdir d1/d10/d9/dd/d25/d27/d44/d21 39 2026-03-10T08:55:08.087 INFO:tasks.workunit.client.1.vm08.stdout:5/293: dread d0/d11/d27/f3b [0,4194304] 0 2026-03-10T08:55:08.089 INFO:tasks.workunit.client.1.vm08.stdout:8/288: dwrite d1/d10/d9/dd/d13/d40/f68 [0,4194304] 0 2026-03-10T08:55:08.094 INFO:tasks.workunit.client.1.vm08.stdout:2/289: dwrite d1/da/d10/d1b/f14 [0,4194304] 0 2026-03-10T08:55:08.097 INFO:tasks.workunit.client.1.vm08.stdout:4/265: creat d5/d2f/d5d/f61 
x:0 0 0 2026-03-10T08:55:08.098 INFO:tasks.workunit.client.1.vm08.stdout:8/289: chown d1/d10/d9/dd 2109892 1 2026-03-10T08:55:08.098 INFO:tasks.workunit.client.1.vm08.stdout:5/294: write d0/d11/d18/f4f [337400,103522] 0 2026-03-10T08:55:08.103 INFO:tasks.workunit.client.1.vm08.stdout:9/255: creat d2/dd/d15/d1e/d39/d4e/f55 x:0 0 0 2026-03-10T08:55:08.107 INFO:tasks.workunit.client.1.vm08.stdout:2/290: dwrite d1/da/d10/d1b/d1c/f20 [0,4194304] 0 2026-03-10T08:55:08.124 INFO:tasks.workunit.client.1.vm08.stdout:3/233: dread d4/d15/f12 [0,4194304] 0 2026-03-10T08:55:08.130 INFO:tasks.workunit.client.1.vm08.stdout:9/256: symlink d2/dd/d15/d4f/l56 0 2026-03-10T08:55:08.131 INFO:tasks.workunit.client.1.vm08.stdout:8/290: dread d1/d10/d9/dd/d25/d27/d44/d21/f4a [0,4194304] 0 2026-03-10T08:55:08.135 INFO:tasks.workunit.client.1.vm08.stdout:2/291: symlink d1/d43/d5c/l65 0 2026-03-10T08:55:08.136 INFO:tasks.workunit.client.1.vm08.stdout:4/266: rename c3 to d5/d23/c62 0 2026-03-10T08:55:08.138 INFO:tasks.workunit.client.1.vm08.stdout:9/257: dwrite d2/dd/d15/d1e/f48 [0,4194304] 0 2026-03-10T08:55:08.142 INFO:tasks.workunit.client.1.vm08.stdout:8/291: dwrite d1/d10/d9/f5b [0,4194304] 0 2026-03-10T08:55:08.142 INFO:tasks.workunit.client.1.vm08.stdout:3/234: mknod d4/d15/d8/d1d/d4f/c51 0 2026-03-10T08:55:08.142 INFO:tasks.workunit.client.1.vm08.stdout:8/292: write d1/d10/d9/dd/d25/d27/d44/d21/d51/f56 [29621,48173] 0 2026-03-10T08:55:08.143 INFO:tasks.workunit.client.1.vm08.stdout:2/292: dwrite d1/f48 [0,4194304] 0 2026-03-10T08:55:08.155 INFO:tasks.workunit.client.1.vm08.stdout:3/235: write d4/d15/f12 [2013723,60562] 0 2026-03-10T08:55:08.159 INFO:tasks.workunit.client.1.vm08.stdout:4/267: dwrite d5/f1d [0,4194304] 0 2026-03-10T08:55:08.166 INFO:tasks.workunit.client.1.vm08.stdout:4/268: read d5/de/f5e [4006614,92528] 0 2026-03-10T08:55:08.166 INFO:tasks.workunit.client.1.vm08.stdout:8/293: unlink d1/d10/f2d 0 2026-03-10T08:55:08.166 INFO:tasks.workunit.client.1.vm08.stdout:8/294: 
write d1/d10/f2a [1648914,19261] 0 2026-03-10T08:55:08.166 INFO:tasks.workunit.client.1.vm08.stdout:1/216: truncate d1/da/de/f19 460929 0 2026-03-10T08:55:08.166 INFO:tasks.workunit.client.1.vm08.stdout:3/236: read - d4/f18 zero size 2026-03-10T08:55:08.170 INFO:tasks.workunit.client.1.vm08.stdout:8/295: dwrite d1/d10/d9/fb [0,4194304] 0 2026-03-10T08:55:08.171 INFO:tasks.workunit.client.1.vm08.stdout:4/269: dread d5/f1d [0,4194304] 0 2026-03-10T08:55:08.172 INFO:tasks.workunit.client.1.vm08.stdout:1/217: symlink d1/da/d20/d3f/l45 0 2026-03-10T08:55:08.176 INFO:tasks.workunit.client.1.vm08.stdout:2/293: sync 2026-03-10T08:55:08.177 INFO:tasks.workunit.client.1.vm08.stdout:8/296: symlink d1/d10/d9/d4d/l69 0 2026-03-10T08:55:08.177 INFO:tasks.workunit.client.1.vm08.stdout:3/237: dwrite d4/d15/d8/d2c/f3d [0,4194304] 0 2026-03-10T08:55:08.177 INFO:tasks.workunit.client.1.vm08.stdout:4/270: readlink d5/l12 0 2026-03-10T08:55:08.177 INFO:tasks.workunit.client.1.vm08.stdout:8/297: fdatasync d1/d2c/f30 0 2026-03-10T08:55:08.181 INFO:tasks.workunit.client.1.vm08.stdout:1/218: dwrite d1/f1f [0,4194304] 0 2026-03-10T08:55:08.195 INFO:tasks.workunit.client.1.vm08.stdout:4/271: creat d5/de/f63 x:0 0 0 2026-03-10T08:55:08.196 INFO:tasks.workunit.client.1.vm08.stdout:8/298: creat d1/d10/d9/dd/d13/f6a x:0 0 0 2026-03-10T08:55:08.197 INFO:tasks.workunit.client.1.vm08.stdout:2/294: rename d1/da/d10/d1b/d1c to d1/d5b/d66 0 2026-03-10T08:55:08.198 INFO:tasks.workunit.client.1.vm08.stdout:4/272: mknod d5/d2f/d5d/c64 0 2026-03-10T08:55:08.199 INFO:tasks.workunit.client.1.vm08.stdout:8/299: truncate d1/d10/d9/dd/d25/d27/d44/d21/d51/f56 1121533 0 2026-03-10T08:55:08.204 INFO:tasks.workunit.client.1.vm08.stdout:8/300: write d1/d10/d9/dd/d25/d27/f52 [834315,50668] 0 2026-03-10T08:55:08.206 INFO:tasks.workunit.client.1.vm08.stdout:8/301: write d1/d10/d9/dd/d25/d27/f52 [138052,48135] 0 2026-03-10T08:55:08.208 INFO:tasks.workunit.client.1.vm08.stdout:6/249: write d9/dc/f1b [4524762,79233] 0 
2026-03-10T08:55:08.211 INFO:tasks.workunit.client.1.vm08.stdout:8/302: symlink d1/d10/d9/dd/d13/d40/l6b 0 2026-03-10T08:55:08.211 INFO:tasks.workunit.client.1.vm08.stdout:6/250: mkdir d9/dc/d11/d23/d2c/d41/d5d 0 2026-03-10T08:55:08.212 INFO:tasks.workunit.client.1.vm08.stdout:2/295: creat d1/da/d10/d2d/f67 x:0 0 0 2026-03-10T08:55:08.214 INFO:tasks.workunit.client.1.vm08.stdout:6/251: rename d9/d13/f15 to d9/dc/d11/d23/d2c/d41/d5d/f5e 0 2026-03-10T08:55:08.215 INFO:tasks.workunit.client.1.vm08.stdout:2/296: dread d1/da/f50 [0,4194304] 0 2026-03-10T08:55:08.215 INFO:tasks.workunit.client.1.vm08.stdout:2/297: fsync d1/f19 0 2026-03-10T08:55:08.219 INFO:tasks.workunit.client.1.vm08.stdout:6/252: dwrite d9/d10/f53 [0,4194304] 0 2026-03-10T08:55:08.222 INFO:tasks.workunit.client.1.vm08.stdout:6/253: dread d9/d13/f36 [0,4194304] 0 2026-03-10T08:55:08.223 INFO:tasks.workunit.client.1.vm08.stdout:6/254: readlink d9/d10/l1c 0 2026-03-10T08:55:08.225 INFO:tasks.workunit.client.1.vm08.stdout:6/255: link d9/dc/d11/f47 d9/dc/d11/d23/f5f 0 2026-03-10T08:55:08.226 INFO:tasks.workunit.client.1.vm08.stdout:6/256: truncate d9/d13/f4a 230802 0 2026-03-10T08:55:08.262 INFO:tasks.workunit.client.1.vm08.stdout:8/303: fsync d1/d10/d9/dd/d25/d27/d44/d21/f4a 0 2026-03-10T08:55:08.264 INFO:tasks.workunit.client.1.vm08.stdout:8/304: creat d1/d10/f6c x:0 0 0 2026-03-10T08:55:08.299 INFO:tasks.workunit.client.1.vm08.stdout:0/196: write f5 [876202,22484] 0 2026-03-10T08:55:08.334 INFO:tasks.workunit.client.1.vm08.stdout:0/197: sync 2026-03-10T08:55:08.339 INFO:tasks.workunit.client.1.vm08.stdout:0/198: dwrite d6/dd/d13/d32/f34 [0,4194304] 0 2026-03-10T08:55:08.361 INFO:tasks.workunit.client.1.vm08.stdout:0/199: dread f3 [0,4194304] 0 2026-03-10T08:55:08.371 INFO:tasks.workunit.client.1.vm08.stdout:0/200: creat d6/dd/d13/d17/d1f/d20/f43 x:0 0 0 2026-03-10T08:55:08.372 INFO:tasks.workunit.client.1.vm08.stdout:0/201: write d6/dd/f35 [660394,75513] 0 2026-03-10T08:55:08.372 
INFO:tasks.workunit.client.1.vm08.stdout:0/202: chown d6/f2c 29765 1 2026-03-10T08:55:08.378 INFO:tasks.workunit.client.1.vm08.stdout:1/219: dread d1/da/d20/f2d [0,4194304] 0 2026-03-10T08:55:08.387 INFO:tasks.workunit.client.1.vm08.stdout:0/203: mknod d6/dd/d13/c44 0 2026-03-10T08:55:08.388 INFO:tasks.workunit.client.1.vm08.stdout:5/295: write d0/d1b/f39 [2855813,110588] 0 2026-03-10T08:55:08.388 INFO:tasks.workunit.client.1.vm08.stdout:5/296: chown d0/d40/d4b/d4e/c5c 355487 1 2026-03-10T08:55:08.389 INFO:tasks.workunit.client.1.vm08.stdout:0/204: creat d6/dd/d13/d17/d1f/d2d/f45 x:0 0 0 2026-03-10T08:55:08.389 INFO:tasks.workunit.client.1.vm08.stdout:0/205: fsync f4 0 2026-03-10T08:55:08.390 INFO:tasks.workunit.client.1.vm08.stdout:1/220: mknod d1/da/d18/d3b/d41/c46 0 2026-03-10T08:55:08.393 INFO:tasks.workunit.client.1.vm08.stdout:1/221: dwrite d1/fd [0,4194304] 0 2026-03-10T08:55:08.400 INFO:tasks.workunit.client.0.vm05.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T08:55:08.418 INFO:tasks.workunit.client.1.vm08.stdout:1/222: mknod d1/da/de/d24/d35/c47 0 2026-03-10T08:55:08.424 INFO:tasks.workunit.client.1.vm08.stdout:9/258: dwrite f1 [0,4194304] 0 2026-03-10T08:55:08.427 INFO:tasks.workunit.client.1.vm08.stdout:1/223: creat d1/da/d18/f48 x:0 0 0 2026-03-10T08:55:08.437 INFO:tasks.workunit.client.1.vm08.stdout:9/259: sync 2026-03-10T08:55:08.438 INFO:tasks.workunit.client.1.vm08.stdout:9/260: write d2/dd/d15/f44 [100376,84585] 0 2026-03-10T08:55:08.438 INFO:tasks.workunit.client.1.vm08.stdout:0/206: creat d6/dd/d13/d17/d1f/d20/f46 x:0 0 0 2026-03-10T08:55:08.438 INFO:tasks.workunit.client.1.vm08.stdout:7/225: truncate d0/d11/d1f/d2c/f33 1856696 0 2026-03-10T08:55:08.439 INFO:tasks.workunit.client.1.vm08.stdout:7/226: truncate d0/d11/f39 600341 0 
2026-03-10T08:55:08.440 INFO:tasks.workunit.client.1.vm08.stdout:7/227: stat d0/d11/d1f/d2c/l3c 0 2026-03-10T08:55:08.482 INFO:tasks.workunit.client.1.vm08.stdout:1/224: mkdir d1/da/d20/d3f/d49 0 2026-03-10T08:55:08.485 INFO:tasks.workunit.client.1.vm08.stdout:0/207: creat d6/dd/d13/d17/d1f/d2d/d39/f47 x:0 0 0 2026-03-10T08:55:08.485 INFO:tasks.workunit.client.1.vm08.stdout:0/208: readlink d6/dd/d13/l1a 0 2026-03-10T08:55:08.486 INFO:tasks.workunit.client.1.vm08.stdout:0/209: chown d6/l22 430561204 1 2026-03-10T08:55:08.486 INFO:tasks.workunit.client.1.vm08.stdout:0/210: chown d6/dd/d13/d17/f1d 264 1 2026-03-10T08:55:08.488 INFO:tasks.workunit.client.1.vm08.stdout:7/228: readlink d0/d11/d1f/l34 0 2026-03-10T08:55:08.489 INFO:tasks.workunit.client.1.vm08.stdout:3/238: write d4/d15/f4b [2328351,15947] 0 2026-03-10T08:55:08.489 INFO:tasks.workunit.client.1.vm08.stdout:3/239: chown d4/d15/f4b 238 1 2026-03-10T08:55:08.492 INFO:tasks.workunit.client.1.vm08.stdout:1/225: write d1/da/d20/f2d [1652423,59014] 0 2026-03-10T08:55:08.493 INFO:tasks.workunit.client.1.vm08.stdout:4/273: rmdir d5 39 2026-03-10T08:55:08.495 INFO:tasks.workunit.client.1.vm08.stdout:1/226: dread d1/fc [0,4194304] 0 2026-03-10T08:55:08.495 INFO:tasks.workunit.client.1.vm08.stdout:9/261: link d2/dd/f18 d2/dd/d15/d1e/d39/f57 0 2026-03-10T08:55:08.495 INFO:tasks.workunit.client.1.vm08.stdout:9/262: chown d2/dd/d15/d1e 74580 1 2026-03-10T08:55:08.496 INFO:tasks.workunit.client.1.vm08.stdout:1/227: dread - d1/da/d18/d3a/f3c zero size 2026-03-10T08:55:08.499 INFO:tasks.workunit.client.1.vm08.stdout:7/229: mkdir d0/d11/d4a 0 2026-03-10T08:55:08.502 INFO:tasks.workunit.client.1.vm08.stdout:3/240: dread d4/d15/f3f [0,4194304] 0 2026-03-10T08:55:08.505 INFO:tasks.workunit.client.1.vm08.stdout:4/274: chown d5/de/f41 11389 1 2026-03-10T08:55:08.511 INFO:tasks.workunit.client.1.vm08.stdout:7/230: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:08.531 INFO:tasks.workunit.client.1.vm08.stdout:4/275: unlink 
d5/d23/d49/f4a 0 2026-03-10T08:55:08.531 INFO:tasks.workunit.client.1.vm08.stdout:8/305: truncate d1/d10/d9/dd/d25/d27/d44/f22 236487 0 2026-03-10T08:55:08.535 INFO:tasks.workunit.client.1.vm08.stdout:8/306: dread d1/d10/d9/dd/d25/d27/d44/d21/f4a [0,4194304] 0 2026-03-10T08:55:08.536 INFO:tasks.workunit.client.1.vm08.stdout:7/231: unlink d0/d1c/f32 0 2026-03-10T08:55:08.538 INFO:tasks.workunit.client.1.vm08.stdout:9/263: symlink d2/d54/l58 0 2026-03-10T08:55:08.540 INFO:tasks.workunit.client.1.vm08.stdout:2/298: write d1/da/d10/d1b/d12/f55 [1522673,89130] 0 2026-03-10T08:55:08.545 INFO:tasks.workunit.client.1.vm08.stdout:6/257: fsync d9/dc/f1b 0 2026-03-10T08:55:08.555 INFO:tasks.workunit.client.1.vm08.stdout:6/258: creat d9/dc/d11/d23/d2c/f60 x:0 0 0 2026-03-10T08:55:08.556 INFO:tasks.workunit.client.1.vm08.stdout:4/276: link d5/f3d d5/d5f/f65 0 2026-03-10T08:55:08.557 INFO:tasks.workunit.client.1.vm08.stdout:7/232: mknod d0/c4b 0 2026-03-10T08:55:08.558 INFO:tasks.workunit.client.1.vm08.stdout:4/277: creat d5/d2f/d5d/f66 x:0 0 0 2026-03-10T08:55:08.559 INFO:tasks.workunit.client.1.vm08.stdout:7/233: dread d0/d11/f39 [0,4194304] 0 2026-03-10T08:55:08.561 INFO:tasks.workunit.client.1.vm08.stdout:5/297: truncate d0/d11/d27/f3b 1283160 0 2026-03-10T08:55:08.565 INFO:tasks.workunit.client.1.vm08.stdout:7/234: dwrite d0/d11/d1f/d2c/f30 [0,4194304] 0 2026-03-10T08:55:08.567 INFO:tasks.workunit.client.1.vm08.stdout:7/235: read - d0/f41 zero size 2026-03-10T08:55:08.570 INFO:tasks.workunit.client.1.vm08.stdout:4/278: dread d5/de/f38 [0,4194304] 0 2026-03-10T08:55:08.578 INFO:tasks.workunit.client.1.vm08.stdout:5/298: getdents d0/d11/d18 0 2026-03-10T08:55:08.603 INFO:tasks.workunit.client.1.vm08.stdout:7/236: creat d0/d11/d1f/d29/d3b/f4c x:0 0 0 2026-03-10T08:55:08.607 INFO:tasks.workunit.client.1.vm08.stdout:4/279: link d5/d23/d36/f51 d5/d2f/f67 0 2026-03-10T08:55:08.613 INFO:tasks.workunit.client.1.vm08.stdout:4/280: write d5/d2f/f67 [206770,109560] 0 
2026-03-10T08:55:08.617 INFO:tasks.workunit.client.1.vm08.stdout:4/281: creat d5/d23/f68 x:0 0 0 2026-03-10T08:55:08.618 INFO:tasks.workunit.client.1.vm08.stdout:7/237: symlink d0/d14/l4d 0 2026-03-10T08:55:08.621 INFO:tasks.workunit.client.1.vm08.stdout:4/282: dwrite d5/de/f38 [0,4194304] 0 2026-03-10T08:55:08.628 INFO:tasks.workunit.client.1.vm08.stdout:3/241: rename d4/d15/d8/c23 to d4/d15/d17/c52 0 2026-03-10T08:55:08.632 INFO:tasks.workunit.client.1.vm08.stdout:4/283: mkdir d5/d2f/d5a/d69 0 2026-03-10T08:55:08.632 INFO:tasks.workunit.client.1.vm08.stdout:6/259: rename d9/d10/d1e/d32/c24 to d9/d13/d4e/c61 0 2026-03-10T08:55:08.634 INFO:tasks.workunit.client.1.vm08.stdout:6/260: creat d9/dc/d11/d23/d2c/d44/f62 x:0 0 0 2026-03-10T08:55:08.635 INFO:tasks.workunit.client.1.vm08.stdout:6/261: chown d9/d10/d1e/d32/f12 1794674 1 2026-03-10T08:55:08.636 INFO:tasks.workunit.client.1.vm08.stdout:6/262: truncate d9/dc/d11/f31 795461 0 2026-03-10T08:55:08.642 INFO:tasks.workunit.client.1.vm08.stdout:7/238: dread d0/d11/d1f/d29/d3d/d40/f24 [0,4194304] 0 2026-03-10T08:55:08.642 INFO:tasks.workunit.client.1.vm08.stdout:6/263: unlink d9/d10/l3c 0 2026-03-10T08:55:08.646 INFO:tasks.workunit.client.1.vm08.stdout:7/239: dwrite d0/d11/f27 [0,4194304] 0 2026-03-10T08:55:08.651 INFO:tasks.workunit.client.1.vm08.stdout:7/240: dwrite d0/d11/f27 [0,4194304] 0 2026-03-10T08:55:08.656 INFO:tasks.workunit.client.1.vm08.stdout:7/241: dwrite d0/f25 [0,4194304] 0 2026-03-10T08:55:08.671 INFO:tasks.workunit.client.0.vm05.stderr:++ readlink -f fsstress 2026-03-10T08:55:08.673 INFO:tasks.workunit.client.0.vm05.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-10T08:55:08.673 INFO:tasks.workunit.client.0.vm05.stderr:+ popd 2026-03-10T08:55:08.674 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T08:55:08.675 INFO:tasks.workunit.client.0.vm05.stderr:+ 
popd 2026-03-10T08:55:08.675 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-10T08:55:08.675 INFO:tasks.workunit.client.0.vm05.stderr:++ mktemp -d -p . 2026-03-10T08:55:08.684 INFO:tasks.workunit.client.0.vm05.stderr:+ T=./tmp.xqItf6Q9xe 2026-03-10T08:55:08.684 INFO:tasks.workunit.client.0.vm05.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.xqItf6Q9xe -l 1 -n 1000 -p 10 -v 2026-03-10T08:55:08.684 INFO:tasks.workunit.client.1.vm08.stdout:7/242: symlink d0/d11/d1f/d29/d3d/l4e 0 2026-03-10T08:55:08.702 INFO:tasks.workunit.client.0.vm05.stdout:seed = 1772447301 2026-03-10T08:55:08.709 INFO:tasks.workunit.client.0.vm05.stdout:0/0: readlink - no filename 2026-03-10T08:55:08.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:08 vm05.local ceph-mon[49713]: pgmap v143: 65 pgs: 65 active+clean; 562 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 4.5 MiB/s rd, 57 MiB/s wr, 418 op/s 2026-03-10T08:55:08.718 INFO:tasks.workunit.client.0.vm05.stdout:3/0: link - no file 2026-03-10T08:55:08.719 INFO:tasks.workunit.client.0.vm05.stdout:3/1: write - no filename 2026-03-10T08:55:08.719 INFO:tasks.workunit.client.0.vm05.stdout:3/2: stat - no entries 2026-03-10T08:55:08.719 INFO:tasks.workunit.client.0.vm05.stdout:3/3: fsync - no filename 2026-03-10T08:55:08.727 INFO:tasks.workunit.client.1.vm08.stdout:0/211: link d6/dd/d13/d32/f34 d6/dd/d13/d17/d1f/f48 0 2026-03-10T08:55:08.727 INFO:tasks.workunit.client.1.vm08.stdout:0/212: stat d6/dd/d13/l1a 0 2026-03-10T08:55:08.727 INFO:tasks.workunit.client.0.vm05.stdout:0/1: mknod c0 0 2026-03-10T08:55:08.730 INFO:tasks.workunit.client.0.vm05.stdout:4/0: write - no filename 2026-03-10T08:55:08.736 INFO:tasks.workunit.client.1.vm08.stdout:0/213: mknod d6/dd/d13/d17/d1f/d20/c49 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:2/0: mkdir d0 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:2/1: 
truncate - no filename 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:0/2: creat f1 x:0 0 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:0/3: read - f1 zero size 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:3/4: mkdir d0 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:2/2: symlink d0/l1 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.0.vm05.stdout:2/3: fsync - no filename 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.1.vm08.stdout:8/307: symlink d1/d10/d9/l6d 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.1.vm08.stdout:5/299: mknod d0/d11/d18/c62 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.1.vm08.stdout:8/308: dread d1/d10/f3b [0,4194304] 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.1.vm08.stdout:5/300: symlink d0/d40/d4b/l63 0 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.1.vm08.stdout:9/264: rmdir d2/dd/d15/d1e/d25 39 2026-03-10T08:55:08.756 INFO:tasks.workunit.client.1.vm08.stdout:8/309: creat d1/d10/d9/dd/d25/f6e x:0 0 0 2026-03-10T08:55:08.757 INFO:tasks.workunit.client.1.vm08.stdout:3/242: write d4/d15/d8/d1d/f21 [1092530,110683] 0 2026-03-10T08:55:08.757 INFO:tasks.workunit.client.1.vm08.stdout:8/310: dwrite d1/d2c/f47 [0,4194304] 0 2026-03-10T08:55:08.757 INFO:tasks.workunit.client.1.vm08.stdout:4/284: rmdir d5/de 39 2026-03-10T08:55:08.757 INFO:tasks.workunit.client.1.vm08.stdout:8/311: readlink d1/d10/l20 0 2026-03-10T08:55:08.759 INFO:tasks.workunit.client.1.vm08.stdout:5/301: dread d0/d11/d18/f4f [4194304,4194304] 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.1.vm08.stdout:8/312: dwrite d1/d10/d9/dd/d25/d27/f3a [0,4194304] 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.1.vm08.stdout:6/264: write d9/dc/d11/f47 [2294727,124748] 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.1.vm08.stdout:6/265: write d9/dc/f1b [4828222,102234] 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.1.vm08.stdout:5/302: creat d0/d11/d27/f64 x:0 
0 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.1.vm08.stdout:2/299: rmdir d1/da/d10/d2d 39 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:7/0: readlink - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:7/1: write - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:7/2: rename - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:5/0: symlink l0 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:5/1: write - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:2/4: rmdir d0 39 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:2/5: write - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:3/5: rmdir d0 0 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:3/6: write - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:3/7: dwrite - no filename 2026-03-10T08:55:08.777 INFO:tasks.workunit.client.0.vm05.stdout:3/8: write - no filename 2026-03-10T08:55:08.778 INFO:tasks.workunit.client.0.vm05.stdout:6/0: creat f0 x:0 0 0 2026-03-10T08:55:08.781 INFO:tasks.workunit.client.1.vm08.stdout:4/285: dread d5/f1e [0,4194304] 0 2026-03-10T08:55:08.783 INFO:tasks.workunit.client.0.vm05.stdout:6/1: dwrite f0 [0,4194304] 0 2026-03-10T08:55:08.787 INFO:tasks.workunit.client.1.vm08.stdout:8/313: unlink d1/l4 0 2026-03-10T08:55:08.787 INFO:tasks.workunit.client.0.vm05.stdout:9/0: link - no file 2026-03-10T08:55:08.787 INFO:tasks.workunit.client.0.vm05.stdout:9/1: write - no filename 2026-03-10T08:55:08.787 INFO:tasks.workunit.client.0.vm05.stdout:9/2: dwrite - no filename 2026-03-10T08:55:08.796 INFO:tasks.workunit.client.0.vm05.stdout:5/2: mknod c1 0 2026-03-10T08:55:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:08 vm08.local ceph-mon[57559]: pgmap v143: 65 pgs: 65 active+clean; 562 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 4.5 MiB/s rd, 
57 MiB/s wr, 418 op/s 2026-03-10T08:55:08.812 INFO:tasks.workunit.client.0.vm05.stdout:7/3: mknod c0 0 2026-03-10T08:55:08.812 INFO:tasks.workunit.client.0.vm05.stdout:7/4: dwrite - no filename 2026-03-10T08:55:08.812 INFO:tasks.workunit.client.0.vm05.stdout:7/5: dwrite - no filename 2026-03-10T08:55:08.812 INFO:tasks.workunit.client.1.vm08.stdout:6/266: mkdir d9/dc/d11/d23/d2c/d44/d63 0 2026-03-10T08:55:08.813 INFO:tasks.workunit.client.1.vm08.stdout:2/300: unlink d1/da/d10/d1b/d12/d23/c47 0 2026-03-10T08:55:08.813 INFO:tasks.workunit.client.1.vm08.stdout:3/243: creat d4/d15/d8/f53 x:0 0 0 2026-03-10T08:55:08.813 INFO:tasks.workunit.client.1.vm08.stdout:3/244: stat d4/d15/d17/d20/l29 0 2026-03-10T08:55:08.813 INFO:tasks.workunit.client.1.vm08.stdout:3/245: write f1 [2550989,38668] 0 2026-03-10T08:55:08.818 INFO:tasks.workunit.client.1.vm08.stdout:3/246: mknod d4/d15/d8/d1d/c54 0 2026-03-10T08:55:08.820 INFO:tasks.workunit.client.1.vm08.stdout:3/247: mkdir d4/d15/d8/d2c/d55 0 2026-03-10T08:55:08.820 INFO:tasks.workunit.client.1.vm08.stdout:6/267: creat d9/d10/d1e/d32/f64 x:0 0 0 2026-03-10T08:55:08.821 INFO:tasks.workunit.client.1.vm08.stdout:3/248: mknod d4/d15/d8/d2c/c56 0 2026-03-10T08:55:08.821 INFO:tasks.workunit.client.1.vm08.stdout:6/268: readlink d9/dc/d11/d23/d2c/d41/l38 0 2026-03-10T08:55:08.822 INFO:tasks.workunit.client.1.vm08.stdout:6/269: fdatasync d9/d10/d1e/d32/f4d 0 2026-03-10T08:55:08.823 INFO:tasks.workunit.client.1.vm08.stdout:8/314: rename d1/d10/d9/dd/d25/c3f to d1/d10/d9/dd/d25/d27/d44/c6f 0 2026-03-10T08:55:08.825 INFO:tasks.workunit.client.1.vm08.stdout:3/249: dread d4/d15/f12 [0,4194304] 0 2026-03-10T08:55:08.826 INFO:tasks.workunit.client.1.vm08.stdout:6/270: symlink d9/d10/l65 0 2026-03-10T08:55:08.827 INFO:tasks.workunit.client.1.vm08.stdout:4/286: rename d5/de/l1c to d5/d23/d49/l6a 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:8/315: creat d1/d10/d9/dd/f70 x:0 0 0 2026-03-10T08:55:08.857 
INFO:tasks.workunit.client.1.vm08.stdout:6/271: mknod d9/d10/d1e/c66 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/287: creat d5/f6b x:0 0 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:6/272: dwrite d9/d10/d1e/d32/f42 [0,4194304] 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/288: mknod d5/d2f/d5a/c6c 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:6/273: creat d9/d10/f67 x:0 0 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/289: creat d5/de/f6d x:0 0 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:6/274: symlink d9/dc/d11/d23/d2c/l68 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/290: write d5/d23/f68 [428303,56340] 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/291: rename d5/f21 to d5/d2f/d5a/d69/f6e 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/292: chown d5/d23/d36/f51 1378 1 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:6/275: dread f1 [0,4194304] 0 2026-03-10T08:55:08.857 INFO:tasks.workunit.client.1.vm08.stdout:4/293: dwrite d5/de/f63 [0,4194304] 0 2026-03-10T08:55:08.858 INFO:tasks.workunit.client.1.vm08.stdout:4/294: chown d5/d23/d36 2380008 1 2026-03-10T08:55:08.858 INFO:tasks.workunit.client.1.vm08.stdout:6/276: dwrite d9/d10/d1e/d32/f42 [0,4194304] 0 2026-03-10T08:55:08.858 INFO:tasks.workunit.client.1.vm08.stdout:4/295: write d5/d2f/d5d/f61 [40352,105554] 0 2026-03-10T08:55:08.858 INFO:tasks.workunit.client.1.vm08.stdout:6/277: write d9/d13/f36 [4168639,118652] 0 2026-03-10T08:55:08.858 INFO:tasks.workunit.client.1.vm08.stdout:4/296: chown d5/d23/c2c 28736880 1 2026-03-10T08:55:08.861 INFO:tasks.workunit.client.1.vm08.stdout:6/278: read d9/d10/d1e/d32/f12 [40049,51336] 0 2026-03-10T08:55:08.863 INFO:tasks.workunit.client.1.vm08.stdout:6/279: write d9/d10/d1e/d32/f17 [510308,100924] 0 2026-03-10T08:55:08.869 
INFO:tasks.workunit.client.1.vm08.stdout:4/297: dwrite d5/de/f6d [0,4194304] 0 2026-03-10T08:55:08.871 INFO:tasks.workunit.client.1.vm08.stdout:6/280: dread d9/d10/d1e/d32/f4d [0,4194304] 0 2026-03-10T08:55:08.872 INFO:tasks.workunit.client.1.vm08.stdout:6/281: chown d9/d10/d1e/d32/f4d 138092010 1 2026-03-10T08:55:08.873 INFO:tasks.workunit.client.1.vm08.stdout:6/282: read d9/dc/f1b [2563464,54054] 0 2026-03-10T08:55:08.875 INFO:tasks.workunit.client.1.vm08.stdout:6/283: write d9/dc/d11/d23/d2c/f5c [705636,25467] 0 2026-03-10T08:55:08.876 INFO:tasks.workunit.client.1.vm08.stdout:6/284: write d9/dc/d11/d23/d2c/f5c [1622749,8334] 0 2026-03-10T08:55:08.876 INFO:tasks.workunit.client.1.vm08.stdout:3/250: dread d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:08.876 INFO:tasks.workunit.client.1.vm08.stdout:4/298: dwrite d5/de/f50 [0,4194304] 0 2026-03-10T08:55:08.877 INFO:tasks.workunit.client.1.vm08.stdout:3/251: chown d4/d15/c26 0 1 2026-03-10T08:55:08.878 INFO:tasks.workunit.client.0.vm05.stdout:6/2: creat f1 x:0 0 0 2026-03-10T08:55:08.879 INFO:tasks.workunit.client.0.vm05.stdout:6/3: write f1 [826095,79266] 0 2026-03-10T08:55:08.881 INFO:tasks.workunit.client.1.vm08.stdout:3/252: mknod d4/d15/c57 0 2026-03-10T08:55:08.884 INFO:tasks.workunit.client.1.vm08.stdout:3/253: dwrite d4/d15/d8/f45 [0,4194304] 0 2026-03-10T08:55:08.885 INFO:tasks.workunit.client.0.vm05.stdout:6/4: dwrite f0 [0,4194304] 0 2026-03-10T08:55:08.888 INFO:tasks.workunit.client.0.vm05.stdout:6/5: stat f0 0 2026-03-10T08:55:08.894 INFO:tasks.workunit.client.0.vm05.stdout:8/0: dread - no filename 2026-03-10T08:55:08.897 INFO:tasks.workunit.client.0.vm05.stdout:8/1: dwrite - no filename 2026-03-10T08:55:08.897 INFO:tasks.workunit.client.0.vm05.stdout:8/2: dwrite - no filename 2026-03-10T08:55:08.897 INFO:tasks.workunit.client.0.vm05.stdout:8/3: dwrite - no filename 2026-03-10T08:55:08.897 INFO:tasks.workunit.client.0.vm05.stdout:8/4: read - no filename 2026-03-10T08:55:08.897 
INFO:tasks.workunit.client.0.vm05.stdout:8/5: chown . 64491 1 2026-03-10T08:55:08.897 INFO:tasks.workunit.client.0.vm05.stdout:8/6: readlink - no filename 2026-03-10T08:55:08.901 INFO:tasks.workunit.client.1.vm08.stdout:6/285: mkdir d9/d10/d1e/d4c/d69 0 2026-03-10T08:55:08.901 INFO:tasks.workunit.client.1.vm08.stdout:6/286: dread - d9/d10/d1e/d32/f48 zero size 2026-03-10T08:55:08.903 INFO:tasks.workunit.client.1.vm08.stdout:3/254: symlink d4/d15/l58 0 2026-03-10T08:55:08.903 INFO:tasks.workunit.client.1.vm08.stdout:3/255: write d4/d15/d8/f4e [806319,93678] 0 2026-03-10T08:55:08.904 INFO:tasks.workunit.client.1.vm08.stdout:6/287: creat d9/dc/d11/d23/d2c/d44/f6a x:0 0 0 2026-03-10T08:55:08.905 INFO:tasks.workunit.client.0.vm05.stdout:5/3: creat f2 x:0 0 0 2026-03-10T08:55:08.905 INFO:tasks.workunit.client.0.vm05.stdout:5/4: rmdir - no directory 2026-03-10T08:55:08.905 INFO:tasks.workunit.client.0.vm05.stdout:7/6: symlink l1 0 2026-03-10T08:55:08.905 INFO:tasks.workunit.client.0.vm05.stdout:7/7: dread - no filename 2026-03-10T08:55:08.905 INFO:tasks.workunit.client.0.vm05.stdout:7/8: write - no filename 2026-03-10T08:55:08.906 INFO:tasks.workunit.client.0.vm05.stdout:9/3: creat f0 x:0 0 0 2026-03-10T08:55:08.909 INFO:tasks.workunit.client.0.vm05.stdout:5/5: dwrite f2 [0,4194304] 0 2026-03-10T08:55:08.912 INFO:tasks.workunit.client.1.vm08.stdout:2/301: sync 2026-03-10T08:55:08.917 INFO:tasks.workunit.client.0.vm05.stdout:8/7: creat f0 x:0 0 0 2026-03-10T08:55:08.921 INFO:tasks.workunit.client.0.vm05.stdout:1/0: rmdir - no directory 2026-03-10T08:55:08.923 INFO:tasks.workunit.client.0.vm05.stdout:5/6: symlink l3 0 2026-03-10T08:55:08.925 INFO:tasks.workunit.client.0.vm05.stdout:9/4: creat f1 x:0 0 0 2026-03-10T08:55:08.926 INFO:tasks.workunit.client.0.vm05.stdout:5/7: dread f2 [0,4194304] 0 2026-03-10T08:55:08.926 INFO:tasks.workunit.client.0.vm05.stdout:9/5: write f1 [706048,43890] 0 2026-03-10T08:55:08.930 INFO:tasks.workunit.client.0.vm05.stdout:8/8: dwrite f0 
[0,4194304] 0 2026-03-10T08:55:08.931 INFO:tasks.workunit.client.1.vm08.stdout:6/288: rename d9/d10/d1e/d32/f42 to d9/d13/d4e/f6b 0 2026-03-10T08:55:08.936 INFO:tasks.workunit.client.1.vm08.stdout:3/256: read d4/d15/f11 [436674,72839] 0 2026-03-10T08:55:08.936 INFO:tasks.workunit.client.0.vm05.stdout:5/8: mkdir d4 0 2026-03-10T08:55:08.938 INFO:tasks.workunit.client.1.vm08.stdout:2/302: rmdir d1/d43/d4f 39 2026-03-10T08:55:08.938 INFO:tasks.workunit.client.0.vm05.stdout:9/6: unlink f0 0 2026-03-10T08:55:08.940 INFO:tasks.workunit.client.0.vm05.stdout:1/1: symlink l0 0 2026-03-10T08:55:08.940 INFO:tasks.workunit.client.0.vm05.stdout:1/2: dwrite - no filename 2026-03-10T08:55:08.945 INFO:tasks.workunit.client.0.vm05.stdout:9/7: rename f1 to f2 0 2026-03-10T08:55:08.949 INFO:tasks.workunit.client.1.vm08.stdout:2/303: dread d1/da/d10/d1b/d12/f55 [0,4194304] 0 2026-03-10T08:55:08.949 INFO:tasks.workunit.client.0.vm05.stdout:1/3: creat f1 x:0 0 0 2026-03-10T08:55:08.949 INFO:tasks.workunit.client.0.vm05.stdout:1/4: write f1 [892835,63445] 0 2026-03-10T08:55:08.949 INFO:tasks.workunit.client.1.vm08.stdout:2/304: dwrite d1/d5b/d66/f5e [0,4194304] 0 2026-03-10T08:55:08.951 INFO:tasks.workunit.client.1.vm08.stdout:6/289: getdents d9/dc/d11/d23/d2c/d41 0 2026-03-10T08:55:08.952 INFO:tasks.workunit.client.1.vm08.stdout:2/305: readlink d1/da/d10/d1b/l30 0 2026-03-10T08:55:08.952 INFO:tasks.workunit.client.0.vm05.stdout:9/8: write f2 [728966,94741] 0 2026-03-10T08:55:08.953 INFO:tasks.workunit.client.1.vm08.stdout:2/306: readlink d1/da/d10/d42/l51 0 2026-03-10T08:55:08.953 INFO:tasks.workunit.client.1.vm08.stdout:6/290: write d9/dc/d11/d23/f40 [16768,101170] 0 2026-03-10T08:55:08.953 INFO:tasks.workunit.client.0.vm05.stdout:5/9: rmdir d4 0 2026-03-10T08:55:08.955 INFO:tasks.workunit.client.0.vm05.stdout:5/10: mkdir d5 0 2026-03-10T08:55:08.963 INFO:tasks.workunit.client.1.vm08.stdout:6/291: creat d9/d13/f6c x:0 0 0 2026-03-10T08:55:08.963 
INFO:tasks.workunit.client.1.vm08.stdout:2/307: symlink d1/da/d10/d1b/d12/d23/l68 0 2026-03-10T08:55:08.967 INFO:tasks.workunit.client.1.vm08.stdout:2/308: fsync d1/da/d10/d1b/d12/d23/f37 0 2026-03-10T08:55:08.976 INFO:tasks.workunit.client.1.vm08.stdout:2/309: dwrite d1/da/d10/d1b/d12/d23/f31 [0,4194304] 0 2026-03-10T08:55:08.979 INFO:tasks.workunit.client.1.vm08.stdout:2/310: rename d1/d5b/d66/l4a to d1/da/d10/d1b/d12/d23/l69 0 2026-03-10T08:55:08.994 INFO:tasks.workunit.client.1.vm08.stdout:2/311: getdents d1/da/d10/d1b 0 2026-03-10T08:55:08.998 INFO:tasks.workunit.client.1.vm08.stdout:6/292: fdatasync d9/dc/d11/f47 0 2026-03-10T08:55:09.001 INFO:tasks.workunit.client.1.vm08.stdout:6/293: rename d9/d10/l65 to d9/dc/d11/d23/d2c/d44/d63/l6d 0 2026-03-10T08:55:09.002 INFO:tasks.workunit.client.1.vm08.stdout:6/294: chown d9/d50 506255 1 2026-03-10T08:55:09.002 INFO:tasks.workunit.client.1.vm08.stdout:6/295: rmdir d9/dc 39 2026-03-10T08:55:09.153 INFO:tasks.workunit.client.0.vm05.stdout:1/5: dread f1 [0,4194304] 0 2026-03-10T08:55:09.155 INFO:tasks.workunit.client.0.vm05.stdout:1/6: link f1 f2 0 2026-03-10T08:55:09.156 INFO:tasks.workunit.client.0.vm05.stdout:1/7: write f1 [606068,57477] 0 2026-03-10T08:55:09.157 INFO:tasks.workunit.client.0.vm05.stdout:1/8: creat f3 x:0 0 0 2026-03-10T08:55:09.161 INFO:tasks.workunit.client.0.vm05.stdout:1/9: dwrite f1 [0,4194304] 0 2026-03-10T08:55:09.162 INFO:tasks.workunit.client.0.vm05.stdout:1/10: dread - f3 zero size 2026-03-10T08:55:09.163 INFO:tasks.workunit.client.0.vm05.stdout:1/11: mknod c4 0 2026-03-10T08:55:09.163 INFO:tasks.workunit.client.0.vm05.stdout:1/12: write f2 [698494,73196] 0 2026-03-10T08:55:09.164 INFO:tasks.workunit.client.0.vm05.stdout:1/13: write f3 [858441,68742] 0 2026-03-10T08:55:09.366 INFO:tasks.workunit.client.0.vm05.stdout:6/6: fdatasync f1 0 2026-03-10T08:55:09.366 INFO:tasks.workunit.client.0.vm05.stdout:6/7: stat f1 0 2026-03-10T08:55:09.367 INFO:tasks.workunit.client.0.vm05.stdout:6/8: creat f2 
x:0 0 0 2026-03-10T08:55:09.368 INFO:tasks.workunit.client.1.vm08.stdout:8/316: dread d1/d10/d9/dd/d13/f46 [0,4194304] 0 2026-03-10T08:55:09.368 INFO:tasks.workunit.client.0.vm05.stdout:6/9: creat f3 x:0 0 0 2026-03-10T08:55:09.369 INFO:tasks.workunit.client.1.vm08.stdout:8/317: dread - d1/d10/d9/dd/d25/f6e zero size 2026-03-10T08:55:09.369 INFO:tasks.workunit.client.1.vm08.stdout:8/318: write d1/d10/d9/dd/d13/f6a [935658,34621] 0 2026-03-10T08:55:09.370 INFO:tasks.workunit.client.1.vm08.stdout:8/319: chown d1/d10/l20 128 1 2026-03-10T08:55:09.370 INFO:tasks.workunit.client.1.vm08.stdout:8/320: write d1/d10/d9/fb [679775,20102] 0 2026-03-10T08:55:09.371 INFO:tasks.workunit.client.1.vm08.stdout:8/321: write d1/d10/d9/dd/d13/d40/f68 [3991609,14045] 0 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:4/1: sync 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:0/4: sync 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:4/2: truncate - no filename 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:4/3: dwrite - no filename 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:4/4: dwrite - no filename 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:4/5: unlink - no file 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:3/9: sync 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:3/10: dread - no filename 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:3/11: read - no filename 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:7/9: sync 2026-03-10T08:55:09.649 INFO:tasks.workunit.client.0.vm05.stdout:7/10: dread - no filename 2026-03-10T08:55:09.651 INFO:tasks.workunit.client.0.vm05.stdout:0/5: mknod c2 0 2026-03-10T08:55:09.651 INFO:tasks.workunit.client.0.vm05.stdout:0/6: dread - f1 zero size 2026-03-10T08:55:09.659 INFO:tasks.workunit.client.0.vm05.stdout:3/12: creat f1 x:0 0 0 2026-03-10T08:55:09.664 
INFO:tasks.workunit.client.0.vm05.stdout:0/7: dwrite f1 [0,4194304] 0 2026-03-10T08:55:09.665 INFO:tasks.workunit.client.1.vm08.stdout:1/228: dwrite d1/da/de/f19 [0,4194304] 0 2026-03-10T08:55:09.665 INFO:tasks.workunit.client.1.vm08.stdout:7/243: dwrite d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:09.669 INFO:tasks.workunit.client.0.vm05.stdout:2/6: chown d0/l1 376006 1 2026-03-10T08:55:09.669 INFO:tasks.workunit.client.0.vm05.stdout:2/7: fdatasync - no filename 2026-03-10T08:55:09.676 INFO:tasks.workunit.client.0.vm05.stdout:7/11: symlink l2 0 2026-03-10T08:55:09.676 INFO:tasks.workunit.client.1.vm08.stdout:0/214: dwrite f3 [0,4194304] 0 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/12: chown c0 0 1 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/13: stat l1 0 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/14: dread - no filename 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/15: write - no filename 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/16: truncate - no filename 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/17: truncate - no filename 2026-03-10T08:55:09.677 INFO:tasks.workunit.client.0.vm05.stdout:7/18: chown c0 0 1 2026-03-10T08:55:09.680 INFO:tasks.workunit.client.0.vm05.stdout:4/6: mkdir d0 0 2026-03-10T08:55:09.680 INFO:tasks.workunit.client.0.vm05.stdout:4/7: write - no filename 2026-03-10T08:55:09.680 INFO:tasks.workunit.client.0.vm05.stdout:4/8: dwrite - no filename 2026-03-10T08:55:09.682 INFO:tasks.workunit.client.1.vm08.stdout:9/265: truncate f1 2687042 0 2026-03-10T08:55:09.688 INFO:tasks.workunit.client.1.vm08.stdout:7/244: rename d0/d11/f27 to d0/d11/d4a/f4f 0 2026-03-10T08:55:09.688 INFO:tasks.workunit.client.1.vm08.stdout:9/266: dread d2/dd/d15/d1e/d24/f33 [0,4194304] 0 2026-03-10T08:55:09.690 INFO:tasks.workunit.client.1.vm08.stdout:9/267: dwrite d2/dd/d15/d1e/d24/f34 [0,4194304] 0 2026-03-10T08:55:09.695 
INFO:tasks.workunit.client.1.vm08.stdout:6/296: fdatasync d9/dc/d11/d23/f40 0 2026-03-10T08:55:09.695 INFO:tasks.workunit.client.1.vm08.stdout:6/297: write d9/d13/d4e/f57 [954166,88393] 0 2026-03-10T08:55:09.696 INFO:tasks.workunit.client.1.vm08.stdout:0/215: creat d6/dd/d13/d17/d1f/d2d/d39/f4a x:0 0 0 2026-03-10T08:55:09.696 INFO:tasks.workunit.client.1.vm08.stdout:6/298: write d9/dc/d11/d23/d2c/d41/f51 [939419,73133] 0 2026-03-10T08:55:09.696 INFO:tasks.workunit.client.1.vm08.stdout:6/299: chown d9/dc/c34 394783 1 2026-03-10T08:55:09.700 INFO:tasks.workunit.client.0.vm05.stdout:0/8: creat f3 x:0 0 0 2026-03-10T08:55:09.700 INFO:tasks.workunit.client.0.vm05.stdout:0/9: dread - f3 zero size 2026-03-10T08:55:09.700 INFO:tasks.workunit.client.0.vm05.stdout:0/10: readlink - no filename 2026-03-10T08:55:09.700 INFO:tasks.workunit.client.0.vm05.stdout:0/11: write f3 [886263,128731] 0 2026-03-10T08:55:09.706 INFO:tasks.workunit.client.0.vm05.stdout:2/8: creat d0/f2 x:0 0 0 2026-03-10T08:55:09.706 INFO:tasks.workunit.client.0.vm05.stdout:2/9: chown d0 29 1 2026-03-10T08:55:09.709 INFO:tasks.workunit.client.0.vm05.stdout:7/19: creat f3 x:0 0 0 2026-03-10T08:55:09.709 INFO:tasks.workunit.client.0.vm05.stdout:7/20: fdatasync f3 0 2026-03-10T08:55:09.710 INFO:tasks.workunit.client.1.vm08.stdout:5/303: write d0/d11/f25 [388765,63245] 0 2026-03-10T08:55:09.711 INFO:tasks.workunit.client.1.vm08.stdout:7/245: write d0/d11/d4a/f4f [2497511,106562] 0 2026-03-10T08:55:09.713 INFO:tasks.workunit.client.1.vm08.stdout:3/257: getdents d4/d15/d8/d1d 0 2026-03-10T08:55:09.714 INFO:tasks.workunit.client.1.vm08.stdout:3/258: write d4/d15/d8/f41 [805462,92643] 0 2026-03-10T08:55:09.714 INFO:tasks.workunit.client.1.vm08.stdout:3/259: readlink d4/d15/d8/d1d/l46 0 2026-03-10T08:55:09.714 INFO:tasks.workunit.client.0.vm05.stdout:0/12: unlink f3 0 2026-03-10T08:55:09.714 INFO:tasks.workunit.client.1.vm08.stdout:3/260: stat d4/d15/d8/f37 0 2026-03-10T08:55:09.716 
INFO:tasks.workunit.client.0.vm05.stdout:2/10: mknod d0/c3 0 2026-03-10T08:55:09.718 INFO:tasks.workunit.client.1.vm08.stdout:9/268: creat d2/dd/d15/d1e/d37/f59 x:0 0 0 2026-03-10T08:55:09.719 INFO:tasks.workunit.client.0.vm05.stdout:7/21: creat f4 x:0 0 0 2026-03-10T08:55:09.719 INFO:tasks.workunit.client.0.vm05.stdout:7/22: read - f4 zero size 2026-03-10T08:55:09.722 INFO:tasks.workunit.client.0.vm05.stdout:4/9: creat d0/f1 x:0 0 0 2026-03-10T08:55:09.725 INFO:tasks.workunit.client.0.vm05.stdout:0/13: mknod c4 0 2026-03-10T08:55:09.728 INFO:tasks.workunit.client.1.vm08.stdout:5/304: mknod d0/d40/d4b/d4e/c65 0 2026-03-10T08:55:09.728 INFO:tasks.workunit.client.0.vm05.stdout:2/11: creat d0/f4 x:0 0 0 2026-03-10T08:55:09.729 INFO:tasks.workunit.client.0.vm05.stdout:2/12: chown d0/f2 0 1 2026-03-10T08:55:09.730 INFO:tasks.workunit.client.1.vm08.stdout:3/261: chown d4/d15/d8/d2c/c43 2624016 1 2026-03-10T08:55:09.732 INFO:tasks.workunit.client.0.vm05.stdout:0/14: creat f5 x:0 0 0 2026-03-10T08:55:09.734 INFO:tasks.workunit.client.0.vm05.stdout:2/13: symlink d0/l5 0 2026-03-10T08:55:09.734 INFO:tasks.workunit.client.0.vm05.stdout:2/14: fdatasync d0/f4 0 2026-03-10T08:55:09.734 INFO:tasks.workunit.client.0.vm05.stdout:2/15: dread - d0/f4 zero size 2026-03-10T08:55:09.735 INFO:tasks.workunit.client.0.vm05.stdout:2/16: write d0/f4 [6897,35825] 0 2026-03-10T08:55:09.735 INFO:tasks.workunit.client.0.vm05.stdout:2/17: truncate d0/f4 406877 0 2026-03-10T08:55:09.738 INFO:tasks.workunit.client.1.vm08.stdout:7/246: symlink d0/d11/d1f/d29/d3d/l50 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.1.vm08.stdout:9/269: mknod d2/c5a 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.0.vm05.stdout:0/15: creat f6 x:0 0 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.0.vm05.stdout:2/18: mknod d0/c6 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.0.vm05.stdout:2/19: write d0/f2 [385733,32463] 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.0.vm05.stdout:2/20: stat 
d0/c6 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.0.vm05.stdout:0/16: creat f7 x:0 0 0 2026-03-10T08:55:09.745 INFO:tasks.workunit.client.0.vm05.stdout:0/17: write f1 [2078120,12163] 0 2026-03-10T08:55:09.748 INFO:tasks.workunit.client.0.vm05.stdout:0/18: dwrite f1 [0,4194304] 0 2026-03-10T08:55:09.753 INFO:tasks.workunit.client.0.vm05.stdout:2/21: dread d0/f2 [0,4194304] 0 2026-03-10T08:55:09.754 INFO:tasks.workunit.client.0.vm05.stdout:2/22: write d0/f4 [577725,65240] 0 2026-03-10T08:55:09.755 INFO:tasks.workunit.client.0.vm05.stdout:2/23: dread d0/f2 [0,4194304] 0 2026-03-10T08:55:09.758 INFO:tasks.workunit.client.1.vm08.stdout:6/300: rename d9/d10/d1e/c66 to d9/dc/c6e 0 2026-03-10T08:55:09.760 INFO:tasks.workunit.client.1.vm08.stdout:7/247: read d0/d11/f39 [200699,72386] 0 2026-03-10T08:55:09.763 INFO:tasks.workunit.client.1.vm08.stdout:9/270: unlink d2/fa 0 2026-03-10T08:55:09.764 INFO:tasks.workunit.client.1.vm08.stdout:0/216: getdents d6/dd/d13/d17/d1f/d2d 0 2026-03-10T08:55:09.765 INFO:tasks.workunit.client.1.vm08.stdout:0/217: read - d6/dd/d13/d17/d1f/d20/f46 zero size 2026-03-10T08:55:09.766 INFO:tasks.workunit.client.0.vm05.stdout:2/24: fdatasync d0/f2 0 2026-03-10T08:55:09.766 INFO:tasks.workunit.client.0.vm05.stdout:2/25: write d0/f2 [1335779,54560] 0 2026-03-10T08:55:09.769 INFO:tasks.workunit.client.1.vm08.stdout:7/248: dwrite d0/d11/d1f/d29/d3d/d40/f24 [4194304,4194304] 0 2026-03-10T08:55:09.770 INFO:tasks.workunit.client.1.vm08.stdout:7/249: dread - d0/d11/d1f/d29/d3b/f4c zero size 2026-03-10T08:55:09.771 INFO:tasks.workunit.client.1.vm08.stdout:7/250: chown d0/d11/d1f/d29/d3d/d40/f38 76759 1 2026-03-10T08:55:09.809 INFO:tasks.workunit.client.1.vm08.stdout:6/301: rename d9/dc/d11/d23/d2c/d44/f6a to d9/dc/d11/d23/f6f 0 2026-03-10T08:55:09.809 INFO:tasks.workunit.client.1.vm08.stdout:6/302: chown d9/d10/d1e/d32/f17 11 1 2026-03-10T08:55:09.810 INFO:tasks.workunit.client.1.vm08.stdout:0/218: mknod d6/dd/c4b 0 2026-03-10T08:55:09.811 
INFO:tasks.workunit.client.1.vm08.stdout:0/219: truncate f5 1874722 0 2026-03-10T08:55:09.811 INFO:tasks.workunit.client.1.vm08.stdout:0/220: write d6/dd/f35 [1565022,14512] 0 2026-03-10T08:55:09.813 INFO:tasks.workunit.client.1.vm08.stdout:7/251: mkdir d0/d51 0 2026-03-10T08:55:09.814 INFO:tasks.workunit.client.1.vm08.stdout:9/271: rename d2/dd/d15 to d2/dd/d15/d1e/d5b 22 2026-03-10T08:55:09.815 INFO:tasks.workunit.client.1.vm08.stdout:9/272: write d2/dd/f2e [1339026,88144] 0 2026-03-10T08:55:09.817 INFO:tasks.workunit.client.1.vm08.stdout:9/273: truncate d2/dd/d15/d1e/d21/f50 242600 0 2026-03-10T08:55:09.817 INFO:tasks.workunit.client.1.vm08.stdout:7/252: dwrite d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:09.825 INFO:tasks.workunit.client.1.vm08.stdout:6/303: unlink d9/d10/l1c 0 2026-03-10T08:55:09.826 INFO:tasks.workunit.client.1.vm08.stdout:6/304: stat d9/dc/d11/d23/d2c/d44/d63 0 2026-03-10T08:55:09.827 INFO:tasks.workunit.client.1.vm08.stdout:4/299: write d5/d5f/f65 [808553,113489] 0 2026-03-10T08:55:09.828 INFO:tasks.workunit.client.1.vm08.stdout:4/300: write d5/d2f/d5d/f66 [166121,39157] 0 2026-03-10T08:55:09.829 INFO:tasks.workunit.client.1.vm08.stdout:4/301: write d5/d23/d36/f51 [1079326,1354] 0 2026-03-10T08:55:09.831 INFO:tasks.workunit.client.1.vm08.stdout:4/302: write d5/de/f50 [4546647,126287] 0 2026-03-10T08:55:09.833 INFO:tasks.workunit.client.1.vm08.stdout:4/303: dread d5/d23/d36/f44 [0,4194304] 0 2026-03-10T08:55:09.838 INFO:tasks.workunit.client.0.vm05.stdout:5/11: truncate f2 2231577 0 2026-03-10T08:55:09.856 INFO:tasks.workunit.client.1.vm08.stdout:4/304: creat d5/d2f/d5a/d69/f6f x:0 0 0 2026-03-10T08:55:09.856 INFO:tasks.workunit.client.1.vm08.stdout:9/274: mkdir d2/dd/d15/d1e/d25/d32/d5c 0 2026-03-10T08:55:09.856 INFO:tasks.workunit.client.1.vm08.stdout:9/275: dread - d2/dd/d15/d1e/d25/f4b zero size 2026-03-10T08:55:09.857 INFO:tasks.workunit.client.0.vm05.stdout:5/12: rename d5 to d5/d6 22 2026-03-10T08:55:09.857 
INFO:tasks.workunit.client.0.vm05.stdout:8/9: dwrite f0 [4194304,4194304] 0 2026-03-10T08:55:09.857 INFO:tasks.workunit.client.0.vm05.stdout:8/10: readlink - no filename 2026-03-10T08:55:09.857 INFO:tasks.workunit.client.0.vm05.stdout:8/11: write f0 [6492888,40281] 0 2026-03-10T08:55:09.857 INFO:tasks.workunit.client.0.vm05.stdout:8/12: write f0 [6462401,6306] 0 2026-03-10T08:55:09.860 INFO:tasks.workunit.client.1.vm08.stdout:9/276: dwrite d2/dd/d15/d1e/d25/f4b [0,4194304] 0 2026-03-10T08:55:09.868 INFO:tasks.workunit.client.1.vm08.stdout:0/221: sync 2026-03-10T08:55:09.871 INFO:tasks.workunit.client.0.vm05.stdout:9/9: truncate f2 388449 0 2026-03-10T08:55:09.880 INFO:tasks.workunit.client.1.vm08.stdout:2/312: truncate d1/d5b/d66/f20 3865529 0 2026-03-10T08:55:09.881 INFO:tasks.workunit.client.1.vm08.stdout:2/313: write d1/da/d10/d1b/d12/d23/f31 [5102569,86453] 0 2026-03-10T08:55:09.898 INFO:tasks.workunit.client.1.vm08.stdout:2/314: rmdir d1/da/d10/d1b/d12/d22 39 2026-03-10T08:55:09.898 INFO:tasks.workunit.client.1.vm08.stdout:2/315: dread - d1/d43/f5d zero size 2026-03-10T08:55:09.899 INFO:tasks.workunit.client.1.vm08.stdout:2/316: write d1/d5b/d66/f3d [799251,75738] 0 2026-03-10T08:55:09.899 INFO:tasks.workunit.client.0.vm05.stdout:1/14: write f2 [4699190,37408] 0 2026-03-10T08:55:09.902 INFO:tasks.workunit.client.0.vm05.stdout:6/10: getdents . 0 2026-03-10T08:55:09.903 INFO:tasks.workunit.client.1.vm08.stdout:2/317: dwrite d1/da/d10/d1b/d12/d23/f31 [0,4194304] 0 2026-03-10T08:55:09.905 INFO:tasks.workunit.client.1.vm08.stdout:8/322: truncate d1/d10/d9/dd/d13/d40/f68 2856371 0 2026-03-10T08:55:09.905 INFO:tasks.workunit.client.1.vm08.stdout:6/305: link d9/d10/d1e/d32/ff d9/d13/f70 0 2026-03-10T08:55:09.905 INFO:tasks.workunit.client.1.vm08.stdout:4/305: link d5/lc d5/d5f/l70 0 2026-03-10T08:55:09.910 INFO:tasks.workunit.client.0.vm05.stdout:3/13: getdents . 
0 2026-03-10T08:55:09.912 INFO:tasks.workunit.client.1.vm08.stdout:4/306: truncate d5/d23/f56 757043 0 2026-03-10T08:55:09.914 INFO:tasks.workunit.client.1.vm08.stdout:9/277: link d2/dd/l1d d2/dd/l5d 0 2026-03-10T08:55:09.914 INFO:tasks.workunit.client.0.vm05.stdout:7/23: fsync f3 0 2026-03-10T08:55:09.914 INFO:tasks.workunit.client.1.vm08.stdout:2/318: mkdir d1/da/d10/d1b/d6a 0 2026-03-10T08:55:09.918 INFO:tasks.workunit.client.1.vm08.stdout:1/229: dwrite d1/da/f25 [0,4194304] 0 2026-03-10T08:55:09.920 INFO:tasks.workunit.client.1.vm08.stdout:6/306: dwrite d9/d10/d1e/f58 [0,4194304] 0 2026-03-10T08:55:09.924 INFO:tasks.workunit.client.1.vm08.stdout:8/323: rename d1/d10/d9/l6d to d1/d10/d9/dd/d18/d34/l71 0 2026-03-10T08:55:09.926 INFO:tasks.workunit.client.1.vm08.stdout:1/230: read d1/da/f39 [1089518,46964] 0 2026-03-10T08:55:09.927 INFO:tasks.workunit.client.1.vm08.stdout:8/324: stat d1/d10/d9/dd/d18/l31 0 2026-03-10T08:55:09.928 INFO:tasks.workunit.client.0.vm05.stdout:4/10: getdents d0 0 2026-03-10T08:55:09.928 INFO:tasks.workunit.client.0.vm05.stdout:4/11: stat d0 0 2026-03-10T08:55:09.928 INFO:tasks.workunit.client.0.vm05.stdout:4/12: readlink - no filename 2026-03-10T08:55:09.928 INFO:tasks.workunit.client.0.vm05.stdout:4/13: chown d0/f1 3055 1 2026-03-10T08:55:09.929 INFO:tasks.workunit.client.0.vm05.stdout:4/14: rename d0 to d0/d2 22 2026-03-10T08:55:09.929 INFO:tasks.workunit.client.0.vm05.stdout:4/15: dread - d0/f1 zero size 2026-03-10T08:55:09.929 INFO:tasks.workunit.client.1.vm08.stdout:5/305: write d0/d11/f29 [1289512,80664] 0 2026-03-10T08:55:09.930 INFO:tasks.workunit.client.1.vm08.stdout:5/306: read - d0/d11/d27/f61 zero size 2026-03-10T08:55:09.943 INFO:tasks.workunit.client.0.vm05.stdout:0/19: fsync f6 0 2026-03-10T08:55:09.943 INFO:tasks.workunit.client.0.vm05.stdout:2/26: rmdir d0 39 2026-03-10T08:55:09.943 INFO:tasks.workunit.client.1.vm08.stdout:4/307: write d5/f1d [3970141,10002] 0 2026-03-10T08:55:09.943 
INFO:tasks.workunit.client.1.vm08.stdout:4/308: chown d5/de/f63 2905 1 2026-03-10T08:55:09.943 INFO:tasks.workunit.client.1.vm08.stdout:3/262: write d4/d15/d8/f24 [244676,8553] 0 2026-03-10T08:55:09.943 INFO:tasks.workunit.client.1.vm08.stdout:3/263: write f1 [1056055,30771] 0 2026-03-10T08:55:09.943 INFO:tasks.workunit.client.1.vm08.stdout:2/319: mknod d1/da/d10/d42/c6b 0 2026-03-10T08:55:09.950 INFO:tasks.workunit.client.1.vm08.stdout:8/325: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 x:0 0 0 2026-03-10T08:55:09.951 INFO:tasks.workunit.client.1.vm08.stdout:8/326: fdatasync d1/d10/d9/dd/f70 0 2026-03-10T08:55:09.955 INFO:tasks.workunit.client.1.vm08.stdout:4/309: rmdir d5/d23/d49 39 2026-03-10T08:55:09.958 INFO:tasks.workunit.client.1.vm08.stdout:8/327: dread d1/f26 [0,4194304] 0 2026-03-10T08:55:09.960 INFO:tasks.workunit.client.1.vm08.stdout:5/307: read d0/d1b/f2f [2133367,52161] 0 2026-03-10T08:55:09.961 INFO:tasks.workunit.client.1.vm08.stdout:3/264: creat d4/d15/d17/f59 x:0 0 0 2026-03-10T08:55:09.968 INFO:tasks.workunit.client.1.vm08.stdout:6/307: rename d9/dc/c6e to d9/dc/d11/d23/d2c/d44/c71 0 2026-03-10T08:55:09.977 INFO:tasks.workunit.client.1.vm08.stdout:8/328: creat d1/d10/d9/f73 x:0 0 0 2026-03-10T08:55:09.985 INFO:tasks.workunit.client.1.vm08.stdout:7/253: write d0/d14/f12 [3369782,63694] 0 2026-03-10T08:55:09.985 INFO:tasks.workunit.client.1.vm08.stdout:2/320: mknod d1/d43/c6c 0 2026-03-10T08:55:09.985 INFO:tasks.workunit.client.1.vm08.stdout:5/308: fdatasync d0/d11/d18/d52/f57 0 2026-03-10T08:55:09.985 INFO:tasks.workunit.client.1.vm08.stdout:3/265: stat d4/d15/d8/d2c/c48 0 2026-03-10T08:55:09.989 INFO:tasks.workunit.client.1.vm08.stdout:6/308: truncate d9/d10/d1e/f2a 110155 0 2026-03-10T08:55:09.990 INFO:tasks.workunit.client.1.vm08.stdout:0/222: write d6/f9 [1497764,32070] 0 2026-03-10T08:55:09.991 INFO:tasks.workunit.client.1.vm08.stdout:0/223: chown d6/dd/d13/d17/d1f/d20/f21 10859 1 2026-03-10T08:55:09.992 
INFO:tasks.workunit.client.1.vm08.stdout:7/254: rmdir d0/d11 39 2026-03-10T08:55:09.997 INFO:tasks.workunit.client.1.vm08.stdout:2/321: dwrite d1/d43/f4b [0,4194304] 0 2026-03-10T08:55:10.002 INFO:tasks.workunit.client.1.vm08.stdout:3/266: rename d4/d15/f11 to d4/d15/d8/d2c/f5a 0 2026-03-10T08:55:10.002 INFO:tasks.workunit.client.1.vm08.stdout:2/322: write d1/d5b/d66/f3d [1681637,2501] 0 2026-03-10T08:55:10.004 INFO:tasks.workunit.client.1.vm08.stdout:6/309: creat d9/d10/f72 x:0 0 0 2026-03-10T08:55:10.005 INFO:tasks.workunit.client.1.vm08.stdout:8/329: mknod d1/d10/d9/dd/d25/d27/d44/c74 0 2026-03-10T08:55:10.005 INFO:tasks.workunit.client.1.vm08.stdout:0/224: mknod d6/dd/d13/d17/d1f/d2d/c4c 0 2026-03-10T08:55:10.008 INFO:tasks.workunit.client.1.vm08.stdout:0/225: stat d6/dd/d13/d17/d1f/d20 0 2026-03-10T08:55:10.008 INFO:tasks.workunit.client.1.vm08.stdout:0/226: fsync d6/f15 0 2026-03-10T08:55:10.008 INFO:tasks.workunit.client.1.vm08.stdout:3/267: symlink d4/d15/d17/l5b 0 2026-03-10T08:55:10.009 INFO:tasks.workunit.client.1.vm08.stdout:4/310: getdents d5/d23/d36 0 2026-03-10T08:55:10.010 INFO:tasks.workunit.client.1.vm08.stdout:6/310: creat d9/dc/d11/f73 x:0 0 0 2026-03-10T08:55:10.010 INFO:tasks.workunit.client.1.vm08.stdout:8/330: truncate d1/f8 2552101 0 2026-03-10T08:55:10.013 INFO:tasks.workunit.client.1.vm08.stdout:4/311: creat d5/d2f/f71 x:0 0 0 2026-03-10T08:55:10.014 INFO:tasks.workunit.client.1.vm08.stdout:8/331: symlink d1/d10/d9/dd/d18/l75 0 2026-03-10T08:55:10.016 INFO:tasks.workunit.client.1.vm08.stdout:2/323: rename d1/d5b/d66/f3d to d1/d43/f6d 0 2026-03-10T08:55:10.016 INFO:tasks.workunit.client.1.vm08.stdout:6/311: rename d9/dc/d11 to d9/dc/d11/d74 22 2026-03-10T08:55:10.017 INFO:tasks.workunit.client.1.vm08.stdout:6/312: readlink d9/dc/d11/d23/d2c/d41/l38 0 2026-03-10T08:55:10.018 INFO:tasks.workunit.client.1.vm08.stdout:8/332: dwrite d1/d10/d9/fb [0,4194304] 0 2026-03-10T08:55:10.018 INFO:tasks.workunit.client.1.vm08.stdout:6/313: dread - 
d9/d10/d1e/d32/f48 zero size 2026-03-10T08:55:10.023 INFO:tasks.workunit.client.1.vm08.stdout:6/314: dwrite d9/d13/f35 [0,4194304] 0 2026-03-10T08:55:10.037 INFO:tasks.workunit.client.1.vm08.stdout:6/315: readlink d9/d13/l46 0 2026-03-10T08:55:10.037 INFO:tasks.workunit.client.1.vm08.stdout:8/333: dwrite d1/d10/d9/f73 [0,4194304] 0 2026-03-10T08:55:10.037 INFO:tasks.workunit.client.1.vm08.stdout:8/334: fdatasync d1/d10/d9/dd/f62 0 2026-03-10T08:55:10.037 INFO:tasks.workunit.client.1.vm08.stdout:0/227: symlink d6/dd/d13/d17/d1f/d20/d2f/l4d 0 2026-03-10T08:55:10.041 INFO:tasks.workunit.client.0.vm05.stdout:4/16: sync 2026-03-10T08:55:10.042 INFO:tasks.workunit.client.1.vm08.stdout:0/228: dwrite d6/fc [0,4194304] 0 2026-03-10T08:55:10.042 INFO:tasks.workunit.client.1.vm08.stdout:7/255: link d0/d1c/c23 d0/d11/d4a/c52 0 2026-03-10T08:55:10.042 INFO:tasks.workunit.client.1.vm08.stdout:0/229: write d6/f2c [1884887,43110] 0 2026-03-10T08:55:10.047 INFO:tasks.workunit.client.1.vm08.stdout:4/312: write d5/d23/d49/f4d [641887,81488] 0 2026-03-10T08:55:10.061 INFO:tasks.workunit.client.1.vm08.stdout:7/256: dwrite d0/fe [0,4194304] 0 2026-03-10T08:55:10.070 INFO:tasks.workunit.client.1.vm08.stdout:4/313: chown d5/d23/l48 32395 1 2026-03-10T08:55:10.070 INFO:tasks.workunit.client.1.vm08.stdout:0/230: fdatasync d6/fe 0 2026-03-10T08:55:10.073 INFO:tasks.workunit.client.1.vm08.stdout:8/335: rmdir d1/d4f/d60/d67 0 2026-03-10T08:55:10.076 INFO:tasks.workunit.client.1.vm08.stdout:7/257: creat d0/d11/d4a/f53 x:0 0 0 2026-03-10T08:55:10.088 INFO:tasks.workunit.client.1.vm08.stdout:6/316: truncate d9/d13/f70 2469395 0 2026-03-10T08:55:10.088 INFO:tasks.workunit.client.1.vm08.stdout:6/317: fdatasync d9/d10/d1e/d32/f17 0 2026-03-10T08:55:10.088 INFO:tasks.workunit.client.1.vm08.stdout:4/314: dread d5/d2f/f45 [0,4194304] 0 2026-03-10T08:55:10.088 INFO:tasks.workunit.client.1.vm08.stdout:4/315: stat d5/d23/f27 0 2026-03-10T08:55:10.088 INFO:tasks.workunit.client.1.vm08.stdout:8/336: dwrite 
d1/d10/f3b [0,4194304] 0 2026-03-10T08:55:10.093 INFO:tasks.workunit.client.1.vm08.stdout:0/231: mknod d6/c4e 0 2026-03-10T08:55:10.096 INFO:tasks.workunit.client.1.vm08.stdout:8/337: mknod d1/d10/d9/dd/d13/c76 0 2026-03-10T08:55:10.096 INFO:tasks.workunit.client.1.vm08.stdout:4/316: rename d5/d23/f2e to d5/de/f72 0 2026-03-10T08:55:10.103 INFO:tasks.workunit.client.1.vm08.stdout:7/258: creat d0/d11/d1f/d29/d3d/f54 x:0 0 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/259: unlink d0/d11/d1f/d29/l45 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/260: write d0/d14/f12 [2439461,23836] 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/261: fdatasync d0/d11/d1f/d29/d3d/d40/f38 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:4/317: link d5/de/l13 d5/d5f/l73 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/262: symlink d0/d14/d2f/l55 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/263: stat d0/d14/d2f/c42 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/264: symlink d0/d11/d1f/d29/d3b/l56 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/265: getdents d0/d1c 0 2026-03-10T08:55:10.120 INFO:tasks.workunit.client.1.vm08.stdout:7/266: rename d0/d14/c5 to d0/d11/d1f/d29/d36/c57 0 2026-03-10T08:55:10.124 INFO:tasks.workunit.client.1.vm08.stdout:6/318: dread d9/dc/d11/d23/d2c/f49 [4194304,4194304] 0 2026-03-10T08:55:10.146 INFO:tasks.workunit.client.1.vm08.stdout:9/278: write d2/fb [346323,68347] 0 2026-03-10T08:55:10.149 INFO:tasks.workunit.client.1.vm08.stdout:9/279: dwrite d2/dd/d15/d1e/d25/d32/f45 [0,4194304] 0 2026-03-10T08:55:10.156 INFO:tasks.workunit.client.1.vm08.stdout:6/319: sync 2026-03-10T08:55:10.158 INFO:tasks.workunit.client.1.vm08.stdout:1/231: truncate d1/da/de/d24/f37 2095558 0 2026-03-10T08:55:10.162 INFO:tasks.workunit.client.1.vm08.stdout:9/280: creat d2/dd/d15/d1e/d24/f5e x:0 0 0 
2026-03-10T08:55:10.163 INFO:tasks.workunit.client.1.vm08.stdout:6/320: creat d9/d50/f75 x:0 0 0 2026-03-10T08:55:10.166 INFO:tasks.workunit.client.1.vm08.stdout:6/321: dread d9/d13/f35 [0,4194304] 0 2026-03-10T08:55:10.169 INFO:tasks.workunit.client.1.vm08.stdout:9/281: dwrite d2/dd/d15/f44 [0,4194304] 0 2026-03-10T08:55:10.175 INFO:tasks.workunit.client.1.vm08.stdout:6/322: creat d9/dc/d11/d23/d2c/d41/d5d/f76 x:0 0 0 2026-03-10T08:55:10.176 INFO:tasks.workunit.client.1.vm08.stdout:6/323: write d9/d13/f36 [224784,23347] 0 2026-03-10T08:55:10.176 INFO:tasks.workunit.client.1.vm08.stdout:6/324: readlink d9/d13/l46 0 2026-03-10T08:55:10.177 INFO:tasks.workunit.client.1.vm08.stdout:6/325: write d9/dc/d11/f73 [44412,11887] 0 2026-03-10T08:55:10.181 INFO:tasks.workunit.client.1.vm08.stdout:6/326: write d9/dc/d11/d23/d2c/f4f [2469437,108162] 0 2026-03-10T08:55:10.201 INFO:tasks.workunit.client.1.vm08.stdout:9/282: dread d2/dd/d15/f17 [0,4194304] 0 2026-03-10T08:55:10.202 INFO:tasks.workunit.client.1.vm08.stdout:9/283: dread d2/dd/d15/f1b [0,4194304] 0 2026-03-10T08:55:10.202 INFO:tasks.workunit.client.1.vm08.stdout:4/318: fdatasync d5/d23/d49/f4d 0 2026-03-10T08:55:10.205 INFO:tasks.workunit.client.1.vm08.stdout:6/327: dread d9/dc/d11/d23/d2c/d41/d5d/f5e [0,4194304] 0 2026-03-10T08:55:10.209 INFO:tasks.workunit.client.0.vm05.stdout:9/10: dwrite f2 [0,4194304] 0 2026-03-10T08:55:10.218 INFO:tasks.workunit.client.1.vm08.stdout:4/319: write d5/de/f5e [1876741,82002] 0 2026-03-10T08:55:10.219 INFO:tasks.workunit.client.1.vm08.stdout:4/320: chown d5/d23/f27 402 1 2026-03-10T08:55:10.220 INFO:tasks.workunit.client.1.vm08.stdout:6/328: dread d9/dc/d11/f47 [0,4194304] 0 2026-03-10T08:55:10.220 INFO:tasks.workunit.client.0.vm05.stdout:1/15: rename f3 to f5 0 2026-03-10T08:55:10.220 INFO:tasks.workunit.client.0.vm05.stdout:1/16: fdatasync f1 0 2026-03-10T08:55:10.220 INFO:tasks.workunit.client.0.vm05.stdout:3/14: creat f2 x:0 0 0 2026-03-10T08:55:10.220 
INFO:tasks.workunit.client.0.vm05.stdout:7/24: mknod c5 0 2026-03-10T08:55:10.220 INFO:tasks.workunit.client.1.vm08.stdout:9/284: creat d2/dd/d15/d1e/d25/f5f x:0 0 0 2026-03-10T08:55:10.221 INFO:tasks.workunit.client.0.vm05.stdout:0/20: unlink c0 0 2026-03-10T08:55:10.222 INFO:tasks.workunit.client.0.vm05.stdout:0/21: rmdir - no directory 2026-03-10T08:55:10.223 INFO:tasks.workunit.client.1.vm08.stdout:8/338: dread d1/d10/d9/dd/f62 [0,4194304] 0 2026-03-10T08:55:10.227 INFO:tasks.workunit.client.0.vm05.stdout:0/22: dwrite f6 [0,4194304] 0 2026-03-10T08:55:10.227 INFO:tasks.workunit.client.1.vm08.stdout:9/285: creat d2/dd/d15/d1e/d25/d32/f60 x:0 0 0 2026-03-10T08:55:10.229 INFO:tasks.workunit.client.1.vm08.stdout:9/286: write d2/dd/d15/d1e/d39/d4e/f55 [326666,108191] 0 2026-03-10T08:55:10.235 INFO:tasks.workunit.client.1.vm08.stdout:9/287: chown d2/dd/f16 38259399 1 2026-03-10T08:55:10.239 INFO:tasks.workunit.client.0.vm05.stdout:4/17: mknod d0/c3 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.1.vm08.stdout:8/339: mknod d1/d10/c77 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.1.vm08.stdout:9/288: write d2/dd/d15/f22 [1304823,15597] 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.1.vm08.stdout:5/309: truncate d0/d11/d18/f34 545275 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.1.vm08.stdout:8/340: rmdir d1/d4f 39 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.0.vm05.stdout:4/18: dwrite d0/f1 [0,4194304] 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.0.vm05.stdout:4/19: read d0/f1 [3291987,57676] 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.0.vm05.stdout:5/13: link f2 d5/f7 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.0.vm05.stdout:8/13: rename f0 to f1 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.1.vm08.stdout:6/329: creat d9/f77 x:0 0 0 2026-03-10T08:55:10.258 INFO:tasks.workunit.client.1.vm08.stdout:8/341: read - d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 zero size 2026-03-10T08:55:10.258 
INFO:tasks.workunit.client.1.vm08.stdout:9/289: readlink d2/l4a 0 2026-03-10T08:55:10.260 INFO:tasks.workunit.client.1.vm08.stdout:9/290: write d2/dd/d15/d1e/d25/d32/f60 [437820,26126] 0 2026-03-10T08:55:10.260 INFO:tasks.workunit.client.0.vm05.stdout:6/11: getdents . 0 2026-03-10T08:55:10.260 INFO:tasks.workunit.client.1.vm08.stdout:5/310: mknod d0/d11/d18/d52/c66 0 2026-03-10T08:55:10.260 INFO:tasks.workunit.client.0.vm05.stdout:6/12: write f2 [756255,31824] 0 2026-03-10T08:55:10.263 INFO:tasks.workunit.client.0.vm05.stdout:3/15: creat f3 x:0 0 0 2026-03-10T08:55:10.267 INFO:tasks.workunit.client.0.vm05.stdout:3/16: rmdir - no directory 2026-03-10T08:55:10.267 INFO:tasks.workunit.client.0.vm05.stdout:7/25: rename l2 to l6 0 2026-03-10T08:55:10.267 INFO:tasks.workunit.client.1.vm08.stdout:6/330: creat d9/d50/f78 x:0 0 0 2026-03-10T08:55:10.267 INFO:tasks.workunit.client.1.vm08.stdout:8/342: creat d1/d10/d9/dd/d3d/f78 x:0 0 0 2026-03-10T08:55:10.267 INFO:tasks.workunit.client.1.vm08.stdout:6/331: readlink d9/d13/l28 0 2026-03-10T08:55:10.267 INFO:tasks.workunit.client.1.vm08.stdout:3/268: dwrite d4/f18 [0,4194304] 0 2026-03-10T08:55:10.268 INFO:tasks.workunit.client.1.vm08.stdout:4/321: sync 2026-03-10T08:55:10.269 INFO:tasks.workunit.client.0.vm05.stdout:7/26: dwrite f3 [0,4194304] 0 2026-03-10T08:55:10.270 INFO:tasks.workunit.client.1.vm08.stdout:9/291: write d2/dd/d15/d1e/d21/f2d [828041,70398] 0 2026-03-10T08:55:10.270 INFO:tasks.workunit.client.0.vm05.stdout:7/27: truncate f3 4682309 0 2026-03-10T08:55:10.276 INFO:tasks.workunit.client.1.vm08.stdout:4/322: dread d5/d23/f56 [0,4194304] 0 2026-03-10T08:55:10.289 INFO:tasks.workunit.client.1.vm08.stdout:3/269: unlink d4/f14 0 2026-03-10T08:55:10.289 INFO:tasks.workunit.client.0.vm05.stdout:2/27: unlink d0/c3 0 2026-03-10T08:55:10.289 INFO:tasks.workunit.client.0.vm05.stdout:0/23: mknod c8 0 2026-03-10T08:55:10.289 INFO:tasks.workunit.client.0.vm05.stdout:1/17: sync 2026-03-10T08:55:10.304 
INFO:tasks.workunit.client.1.vm08.stdout:4/323: symlink d5/d23/d36/l74 0 2026-03-10T08:55:10.304 INFO:tasks.workunit.client.0.vm05.stdout:9/11: link f2 f3 0 2026-03-10T08:55:10.307 INFO:tasks.workunit.client.0.vm05.stdout:6/13: mkdir d4 0 2026-03-10T08:55:10.307 INFO:tasks.workunit.client.0.vm05.stdout:6/14: write f0 [4320712,47335] 0 2026-03-10T08:55:10.309 INFO:tasks.workunit.client.0.vm05.stdout:9/12: dwrite f3 [0,4194304] 0 2026-03-10T08:55:10.311 INFO:tasks.workunit.client.1.vm08.stdout:8/343: creat d1/d10/d9/d4d/d5c/f79 x:0 0 0 2026-03-10T08:55:10.312 INFO:tasks.workunit.client.0.vm05.stdout:3/17: creat f4 x:0 0 0 2026-03-10T08:55:10.315 INFO:tasks.workunit.client.0.vm05.stdout:3/18: dwrite f3 [0,4194304] 0 2026-03-10T08:55:10.316 INFO:tasks.workunit.client.0.vm05.stdout:3/19: dread - f2 zero size 2026-03-10T08:55:10.316 INFO:tasks.workunit.client.0.vm05.stdout:3/20: chown f3 38196 1 2026-03-10T08:55:10.318 INFO:tasks.workunit.client.1.vm08.stdout:3/270: creat d4/d15/d17/f5c x:0 0 0 2026-03-10T08:55:10.319 INFO:tasks.workunit.client.1.vm08.stdout:3/271: read - d4/d15/d8/d2a/f4d zero size 2026-03-10T08:55:10.319 INFO:tasks.workunit.client.1.vm08.stdout:3/272: fdatasync d4/d15/d8/d2c/f3d 0 2026-03-10T08:55:10.320 INFO:tasks.workunit.client.1.vm08.stdout:3/273: readlink d4/d15/d8/d2a/l39 0 2026-03-10T08:55:10.320 INFO:tasks.workunit.client.0.vm05.stdout:3/21: dwrite f4 [0,4194304] 0 2026-03-10T08:55:10.326 INFO:tasks.workunit.client.1.vm08.stdout:3/274: dwrite d4/d15/d8/d1d/f2d [0,4194304] 0 2026-03-10T08:55:10.338 INFO:tasks.workunit.client.1.vm08.stdout:2/324: truncate d1/da/d10/d1b/f28 1931100 0 2026-03-10T08:55:10.338 INFO:tasks.workunit.client.1.vm08.stdout:4/324: mkdir d5/d2f/d5a/d75 0 2026-03-10T08:55:10.339 INFO:tasks.workunit.client.0.vm05.stdout:5/14: creat d5/f8 x:0 0 0 2026-03-10T08:55:10.344 INFO:tasks.workunit.client.1.vm08.stdout:9/292: dread d2/dd/d15/d1e/d39/d4e/f55 [0,4194304] 0 2026-03-10T08:55:10.348 
INFO:tasks.workunit.client.1.vm08.stdout:9/293: read d2/dd/d15/d1e/d24/f34 [92134,6445] 0 2026-03-10T08:55:10.348 INFO:tasks.workunit.client.1.vm08.stdout:9/294: stat d2/d54 0 2026-03-10T08:55:10.348 INFO:tasks.workunit.client.1.vm08.stdout:9/295: dwrite d2/dd/d15/d1e/f48 [0,4194304] 0 2026-03-10T08:55:10.358 INFO:tasks.workunit.client.1.vm08.stdout:3/275: dread d4/d15/f1a [0,4194304] 0 2026-03-10T08:55:10.359 INFO:tasks.workunit.client.0.vm05.stdout:9/13: dwrite f3 [0,4194304] 0 2026-03-10T08:55:10.368 INFO:tasks.workunit.client.1.vm08.stdout:0/232: truncate d6/fc 1332709 0 2026-03-10T08:55:10.372 INFO:tasks.workunit.client.1.vm08.stdout:9/296: sync 2026-03-10T08:55:10.374 INFO:tasks.workunit.client.0.vm05.stdout:1/18: link f5 f6 0 2026-03-10T08:55:10.374 INFO:tasks.workunit.client.0.vm05.stdout:1/19: truncate f6 1593296 0 2026-03-10T08:55:10.374 INFO:tasks.workunit.client.0.vm05.stdout:1/20: truncate f6 2075778 0 2026-03-10T08:55:10.377 INFO:tasks.workunit.client.0.vm05.stdout:7/28: link c0 c7 0 2026-03-10T08:55:10.379 INFO:tasks.workunit.client.0.vm05.stdout:7/29: dread f3 [0,4194304] 0 2026-03-10T08:55:10.390 INFO:tasks.workunit.client.1.vm08.stdout:0/233: symlink d6/dd/d13/d32/l4f 0 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.1.vm08.stdout:9/297: chown d2/dd/l5d 58797896 1 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:7/30: truncate f4 421622 0 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:7/31: write f4 [1413053,63822] 0 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:7/32: write f4 [949831,56328] 0 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:7/33: chown c5 0 1 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:0/24: link c8 c9 0 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:5/15: link d5/f7 d5/f9 0 2026-03-10T08:55:10.391 INFO:tasks.workunit.client.0.vm05.stdout:5/16: dread d5/f7 [0,4194304] 0 2026-03-10T08:55:10.391 
INFO:tasks.workunit.client.0.vm05.stdout:1/21: dwrite f5 [0,4194304] 0 2026-03-10T08:55:10.392 INFO:tasks.workunit.client.0.vm05.stdout:1/22: readlink l0 0 2026-03-10T08:55:10.401 INFO:tasks.workunit.client.1.vm08.stdout:0/234: mkdir d6/dd/d13/d17/d50 0 2026-03-10T08:55:10.402 INFO:tasks.workunit.client.0.vm05.stdout:9/14: link f2 f4 0 2026-03-10T08:55:10.402 INFO:tasks.workunit.client.1.vm08.stdout:0/235: fdatasync f5 0 2026-03-10T08:55:10.406 INFO:tasks.workunit.client.0.vm05.stdout:7/34: creat f8 x:0 0 0 2026-03-10T08:55:10.410 INFO:tasks.workunit.client.0.vm05.stdout:7/35: dwrite f4 [0,4194304] 0 2026-03-10T08:55:10.415 INFO:tasks.workunit.client.0.vm05.stdout:0/25: unlink f7 0 2026-03-10T08:55:10.424 INFO:tasks.workunit.client.1.vm08.stdout:0/236: rename d6/c31 to d6/dd/d13/d32/c51 0 2026-03-10T08:55:10.424 INFO:tasks.workunit.client.1.vm08.stdout:0/237: fdatasync f5 0 2026-03-10T08:55:10.424 INFO:tasks.workunit.client.1.vm08.stdout:0/238: dwrite d6/f9 [0,4194304] 0 2026-03-10T08:55:10.424 INFO:tasks.workunit.client.0.vm05.stdout:5/17: write d5/f7 [2717489,4930] 0 2026-03-10T08:55:10.424 INFO:tasks.workunit.client.0.vm05.stdout:9/15: creat f5 x:0 0 0 2026-03-10T08:55:10.424 INFO:tasks.workunit.client.0.vm05.stdout:0/26: mknod ca 0 2026-03-10T08:55:10.428 INFO:tasks.workunit.client.1.vm08.stdout:5/311: dread d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:10.430 INFO:tasks.workunit.client.0.vm05.stdout:9/16: mkdir d6 0 2026-03-10T08:55:10.430 INFO:tasks.workunit.client.0.vm05.stdout:9/17: read - f5 zero size 2026-03-10T08:55:10.430 INFO:tasks.workunit.client.0.vm05.stdout:9/18: chown f5 12 1 2026-03-10T08:55:10.436 INFO:tasks.workunit.client.0.vm05.stdout:9/19: dread f4 [0,4194304] 0 2026-03-10T08:55:10.442 INFO:tasks.workunit.client.0.vm05.stdout:9/20: chown f3 47101 1 2026-03-10T08:55:10.442 INFO:tasks.workunit.client.0.vm05.stdout:9/21: stat f4 0 2026-03-10T08:55:10.448 INFO:tasks.workunit.client.0.vm05.stdout:1/23: sync 2026-03-10T08:55:10.448 
INFO:tasks.workunit.client.0.vm05.stdout:1/24: readlink l0 0 2026-03-10T08:55:10.452 INFO:tasks.workunit.client.1.vm08.stdout:7/267: write d0/d11/d1f/d2c/f33 [631087,96973] 0 2026-03-10T08:55:10.456 INFO:tasks.workunit.client.1.vm08.stdout:0/239: truncate f4 4016119 0 2026-03-10T08:55:10.460 INFO:tasks.workunit.client.1.vm08.stdout:5/312: mkdir d0/d1b/d67 0 2026-03-10T08:55:10.460 INFO:tasks.workunit.client.1.vm08.stdout:5/313: chown d0/d40/f42 169 1 2026-03-10T08:55:10.462 INFO:tasks.workunit.client.0.vm05.stdout:1/25: creat f7 x:0 0 0 2026-03-10T08:55:10.474 INFO:tasks.workunit.client.1.vm08.stdout:7/268: chown d0/d1c/c23 2055 1 2026-03-10T08:55:10.474 INFO:tasks.workunit.client.1.vm08.stdout:7/269: readlink d0/d11/d1f/d2c/l3c 0 2026-03-10T08:55:10.474 INFO:tasks.workunit.client.0.vm05.stdout:1/26: write f1 [2123953,74257] 0 2026-03-10T08:55:10.474 INFO:tasks.workunit.client.0.vm05.stdout:1/27: chown f2 73058185 1 2026-03-10T08:55:10.475 INFO:tasks.workunit.client.0.vm05.stdout:1/28: dwrite f2 [0,4194304] 0 2026-03-10T08:55:10.475 INFO:tasks.workunit.client.0.vm05.stdout:9/22: getdents d6 0 2026-03-10T08:55:10.475 INFO:tasks.workunit.client.0.vm05.stdout:1/29: dwrite f1 [0,4194304] 0 2026-03-10T08:55:10.476 INFO:tasks.workunit.client.0.vm05.stdout:6/15: fsync f0 0 2026-03-10T08:55:10.487 INFO:tasks.workunit.client.0.vm05.stdout:6/16: dread f2 [0,4194304] 0 2026-03-10T08:55:10.488 INFO:tasks.workunit.client.0.vm05.stdout:6/17: write f0 [157016,34194] 0 2026-03-10T08:55:10.502 INFO:tasks.workunit.client.0.vm05.stdout:1/30: symlink l8 0 2026-03-10T08:55:10.505 INFO:tasks.workunit.client.0.vm05.stdout:1/31: dwrite f5 [0,4194304] 0 2026-03-10T08:55:10.510 INFO:tasks.workunit.client.1.vm08.stdout:0/240: symlink d6/dd/l52 0 2026-03-10T08:55:10.511 INFO:tasks.workunit.client.1.vm08.stdout:0/241: chown d6/dd/d13/l1a 516618 1 2026-03-10T08:55:10.511 INFO:tasks.workunit.client.1.vm08.stdout:0/242: chown d6/fc 20719194 1 2026-03-10T08:55:10.512 
INFO:tasks.workunit.client.0.vm05.stdout:9/23: creat d6/f7 x:0 0 0 2026-03-10T08:55:10.514 INFO:tasks.workunit.client.1.vm08.stdout:7/270: creat d0/d14/d43/f58 x:0 0 0 2026-03-10T08:55:10.515 INFO:tasks.workunit.client.0.vm05.stdout:1/32: symlink l9 0 2026-03-10T08:55:10.519 INFO:tasks.workunit.client.0.vm05.stdout:1/33: dwrite f7 [0,4194304] 0 2026-03-10T08:55:10.539 INFO:tasks.workunit.client.1.vm08.stdout:5/314: getdents d0/d11 0 2026-03-10T08:55:10.539 INFO:tasks.workunit.client.1.vm08.stdout:5/315: dwrite d0/d11/d18/f5a [0,4194304] 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:1/34: dwrite f5 [0,4194304] 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:9/24: creat d6/f8 x:0 0 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:9/25: stat d6/f7 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:9/26: write d6/f7 [42399,100610] 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:9/27: readlink - no filename 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:6/18: link f0 d4/f5 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:9/28: mkdir d6/d9 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:6/19: rename f1 to d4/f6 0 2026-03-10T08:55:10.540 INFO:tasks.workunit.client.0.vm05.stdout:6/20: chown f3 1548296 1 2026-03-10T08:55:10.546 INFO:tasks.workunit.client.0.vm05.stdout:6/21: write d4/f5 [5327056,94000] 0 2026-03-10T08:55:10.548 INFO:tasks.workunit.client.1.vm08.stdout:5/316: dwrite d0/d1b/f2f [0,4194304] 0 2026-03-10T08:55:10.551 INFO:tasks.workunit.client.1.vm08.stdout:5/317: dread - d0/d11/f60 zero size 2026-03-10T08:55:10.551 INFO:tasks.workunit.client.1.vm08.stdout:5/318: write d0/f43 [2001082,51161] 0 2026-03-10T08:55:10.552 INFO:tasks.workunit.client.0.vm05.stdout:6/22: dwrite d4/f6 [0,4194304] 0 2026-03-10T08:55:10.556 INFO:tasks.workunit.client.0.vm05.stdout:6/23: dwrite d4/f5 [0,4194304] 0 2026-03-10T08:55:10.559 
INFO:tasks.workunit.client.0.vm05.stdout:6/24: write f0 [3068329,78621] 0 2026-03-10T08:55:10.559 INFO:tasks.workunit.client.0.vm05.stdout:6/25: write f3 [300186,127482] 0 2026-03-10T08:55:10.561 INFO:tasks.workunit.client.0.vm05.stdout:6/26: dread f0 [0,4194304] 0 2026-03-10T08:55:10.565 INFO:tasks.workunit.client.0.vm05.stdout:9/29: rmdir d6/d9 0 2026-03-10T08:55:10.577 INFO:tasks.workunit.client.1.vm08.stdout:5/319: mkdir d0/d11/d27/d68 0 2026-03-10T08:55:10.586 INFO:tasks.workunit.client.0.vm05.stdout:6/27: mkdir d4/d7 0 2026-03-10T08:55:10.587 INFO:tasks.workunit.client.1.vm08.stdout:5/320: creat d0/d1b/f69 x:0 0 0 2026-03-10T08:55:10.590 INFO:tasks.workunit.client.0.vm05.stdout:6/28: unlink f3 0 2026-03-10T08:55:10.590 INFO:tasks.workunit.client.0.vm05.stdout:6/29: chown f0 860294963 1 2026-03-10T08:55:10.592 INFO:tasks.workunit.client.0.vm05.stdout:6/30: rmdir d4 39 2026-03-10T08:55:10.704 INFO:tasks.workunit.client.1.vm08.stdout:7/271: dread d0/f44 [0,4194304] 0 2026-03-10T08:55:10.706 INFO:tasks.workunit.client.1.vm08.stdout:7/272: creat d0/d11/d1f/d29/d3d/f59 x:0 0 0 2026-03-10T08:55:10.745 INFO:tasks.workunit.client.0.vm05.stdout:8/14: getdents . 
0 2026-03-10T08:55:10.746 INFO:tasks.workunit.client.0.vm05.stdout:8/15: mkdir d2 0 2026-03-10T08:55:10.748 INFO:tasks.workunit.client.0.vm05.stdout:8/16: mknod d2/c3 0 2026-03-10T08:55:10.751 INFO:tasks.workunit.client.0.vm05.stdout:8/17: symlink d2/l4 0 2026-03-10T08:55:10.807 INFO:tasks.workunit.client.1.vm08.stdout:5/321: fdatasync d0/d11/d18/f5a 0 2026-03-10T08:55:10.811 INFO:tasks.workunit.client.0.vm05.stdout:0/27: write f6 [4162508,46598] 0 2026-03-10T08:55:10.811 INFO:tasks.workunit.client.0.vm05.stdout:2/28: rmdir d0 39 2026-03-10T08:55:10.812 INFO:tasks.workunit.client.0.vm05.stdout:9/30: dread d6/f7 [0,4194304] 0 2026-03-10T08:55:10.812 INFO:tasks.workunit.client.0.vm05.stdout:9/31: readlink - no filename 2026-03-10T08:55:10.814 INFO:tasks.workunit.client.0.vm05.stdout:9/32: write f5 [242515,120976] 0 2026-03-10T08:55:10.816 INFO:tasks.workunit.client.0.vm05.stdout:4/20: dwrite d0/f1 [4194304,4194304] 0 2026-03-10T08:55:10.820 INFO:tasks.workunit.client.0.vm05.stdout:0/28: dwrite f6 [0,4194304] 0 2026-03-10T08:55:10.823 INFO:tasks.workunit.client.0.vm05.stdout:4/21: dwrite d0/f1 [4194304,4194304] 0 2026-03-10T08:55:10.824 INFO:tasks.workunit.client.1.vm08.stdout:5/322: dread d0/d11/d3e/d45/f5b [0,4194304] 0 2026-03-10T08:55:10.832 INFO:tasks.workunit.client.0.vm05.stdout:6/31: stat d4/f6 0 2026-03-10T08:55:10.839 INFO:tasks.workunit.client.0.vm05.stdout:3/22: getdents . 
0 2026-03-10T08:55:10.841 INFO:tasks.workunit.client.1.vm08.stdout:6/332: write f5 [2610786,62550] 0 2026-03-10T08:55:10.845 INFO:tasks.workunit.client.1.vm08.stdout:0/243: read d6/f15 [940950,114977] 0 2026-03-10T08:55:10.846 INFO:tasks.workunit.client.1.vm08.stdout:8/344: write f0 [3359535,75474] 0 2026-03-10T08:55:10.847 INFO:tasks.workunit.client.1.vm08.stdout:2/325: write d1/da/d10/d1b/d12/d23/f57 [212292,129702] 0 2026-03-10T08:55:10.847 INFO:tasks.workunit.client.1.vm08.stdout:4/325: dwrite d5/d23/d36/f44 [0,4194304] 0 2026-03-10T08:55:10.848 INFO:tasks.workunit.client.1.vm08.stdout:3/276: write d4/d15/d8/d2c/f42 [4600477,60489] 0 2026-03-10T08:55:10.851 INFO:tasks.workunit.client.1.vm08.stdout:2/326: read d1/da/d10/d1b/d12/d23/f44 [116383,12634] 0 2026-03-10T08:55:10.855 INFO:tasks.workunit.client.0.vm05.stdout:5/18: fsync d5/f7 0 2026-03-10T08:55:10.857 INFO:tasks.workunit.client.1.vm08.stdout:0/244: write d6/fc [1058242,23643] 0 2026-03-10T08:55:10.858 INFO:tasks.workunit.client.1.vm08.stdout:6/333: dwrite d9/dc/d11/d23/d2c/f49 [0,4194304] 0 2026-03-10T08:55:10.858 INFO:tasks.workunit.client.1.vm08.stdout:6/334: chown d9/dc/d11/d23 636 1 2026-03-10T08:55:10.860 INFO:tasks.workunit.client.0.vm05.stdout:7/36: truncate f3 32390 0 2026-03-10T08:55:10.865 INFO:tasks.workunit.client.0.vm05.stdout:7/37: dwrite f8 [0,4194304] 0 2026-03-10T08:55:10.868 INFO:tasks.workunit.client.1.vm08.stdout:3/277: mknod d4/c5d 0 2026-03-10T08:55:10.875 INFO:tasks.workunit.client.1.vm08.stdout:4/326: mkdir d5/d23/d36/d76 0 2026-03-10T08:55:10.875 INFO:tasks.workunit.client.1.vm08.stdout:4/327: chown d5/d2f/d5a/d69/f6e 958279862 1 2026-03-10T08:55:10.876 INFO:tasks.workunit.client.1.vm08.stdout:4/328: fsync d5/f8 0 2026-03-10T08:55:10.882 INFO:tasks.workunit.client.1.vm08.stdout:5/323: getdents d0/d11/d3e/d45 0 2026-03-10T08:55:10.890 INFO:tasks.workunit.client.1.vm08.stdout:5/324: chown d0/d46/l5d 18132 1 2026-03-10T08:55:10.890 INFO:tasks.workunit.client.1.vm08.stdout:2/327: 
symlink d1/l6e 0 2026-03-10T08:55:10.890 INFO:tasks.workunit.client.1.vm08.stdout:0/245: stat d6/dd/c1e 0 2026-03-10T08:55:10.891 INFO:tasks.workunit.client.1.vm08.stdout:6/335: creat d9/dc/d11/d23/d2c/f79 x:0 0 0 2026-03-10T08:55:10.891 INFO:tasks.workunit.client.1.vm08.stdout:6/336: dread - d9/d10/f72 zero size 2026-03-10T08:55:10.898 INFO:tasks.workunit.client.1.vm08.stdout:9/298: dwrite d2/dd/d15/d1e/d39/f57 [0,4194304] 0 2026-03-10T08:55:10.920 INFO:tasks.workunit.client.0.vm05.stdout:1/35: truncate f6 3865232 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:9/299: chown d2/d41 17 1 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:5/325: fsync d0/fe 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:1/232: dwrite d1/da/de/d24/f37 [0,4194304] 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:0/246: rmdir d6/dd/d13 39 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:6/337: mkdir d9/dc/d11/d23/d2c/d7a 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:5/326: creat d0/d40/f6a x:0 0 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:6/338: chown d9/d10/f72 483070 1 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:5/327: stat d0/d11/d3e/f48 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:6/339: chown d9/d13/d4e/f57 19548340 1 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:2/328: unlink d1/da/d10/d2d/c59 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:2/329: truncate d1/da/d10/d1b/d12/d23/f57 660482 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:2/330: truncate d1/da/d10/d1b/d12/d23/f57 1269233 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:1/233: mkdir d1/da/de/d24/d3d/d4a 0 2026-03-10T08:55:10.921 INFO:tasks.workunit.client.1.vm08.stdout:2/331: dread d1/da/d10/d1b/d12/f55 [0,4194304] 0 2026-03-10T08:55:10.921 
INFO:tasks.workunit.client.1.vm08.stdout:4/329: creat d5/f77 x:0 0 0 2026-03-10T08:55:10.922 INFO:tasks.workunit.client.1.vm08.stdout:7/273: dwrite d0/d11/d1f/d29/d3d/d40/f38 [0,4194304] 0 2026-03-10T08:55:10.926 INFO:tasks.workunit.client.0.vm05.stdout:8/18: dwrite f1 [0,4194304] 0 2026-03-10T08:55:10.930 INFO:tasks.workunit.client.1.vm08.stdout:4/330: dread d5/d2f/d5d/f66 [0,4194304] 0 2026-03-10T08:55:10.930 INFO:tasks.workunit.client.1.vm08.stdout:5/328: symlink d0/d11/d27/l6b 0 2026-03-10T08:55:10.931 INFO:tasks.workunit.client.1.vm08.stdout:6/340: mkdir d9/d10/d1e/d7b 0 2026-03-10T08:55:10.932 INFO:tasks.workunit.client.1.vm08.stdout:6/341: dread - d9/dc/d11/f55 zero size 2026-03-10T08:55:10.933 INFO:tasks.workunit.client.1.vm08.stdout:6/342: fdatasync d9/f77 0 2026-03-10T08:55:10.934 INFO:tasks.workunit.client.1.vm08.stdout:7/274: mknod d0/d14/d43/c5a 0 2026-03-10T08:55:10.935 INFO:tasks.workunit.client.1.vm08.stdout:5/329: rmdir d0/d11/d18/d52 39 2026-03-10T08:55:10.936 INFO:tasks.workunit.client.1.vm08.stdout:4/331: dwrite d5/f19 [0,4194304] 0 2026-03-10T08:55:10.938 INFO:tasks.workunit.client.1.vm08.stdout:0/247: creat d6/dd/d13/d17/d1f/d2d/d38/f53 x:0 0 0 2026-03-10T08:55:10.938 INFO:tasks.workunit.client.1.vm08.stdout:1/234: mkdir d1/da/d4b 0 2026-03-10T08:55:10.946 INFO:tasks.workunit.client.1.vm08.stdout:4/332: readlink d5/d23/d36/l4e 0 2026-03-10T08:55:10.946 INFO:tasks.workunit.client.1.vm08.stdout:2/332: sync 2026-03-10T08:55:10.948 INFO:tasks.workunit.client.1.vm08.stdout:4/333: stat d5/d5f 0 2026-03-10T08:55:10.949 INFO:tasks.workunit.client.1.vm08.stdout:6/343: dwrite d9/dc/d11/d23/d2c/d41/f51 [0,4194304] 0 2026-03-10T08:55:10.950 INFO:tasks.workunit.client.1.vm08.stdout:6/344: stat d9/dc/d11/d23/d2c/d41/c22 0 2026-03-10T08:55:10.952 INFO:tasks.workunit.client.1.vm08.stdout:8/345: dread d1/f8 [0,4194304] 0 2026-03-10T08:55:10.954 INFO:tasks.workunit.client.1.vm08.stdout:0/248: rmdir d6/dd/d13/d17/d1f/d2d/d39 39 2026-03-10T08:55:10.955 
INFO:tasks.workunit.client.1.vm08.stdout:4/334: mknod d5/d2f/d5d/c78 0 2026-03-10T08:55:10.956 INFO:tasks.workunit.client.1.vm08.stdout:6/345: symlink d9/d10/d1e/l7c 0 2026-03-10T08:55:10.957 INFO:tasks.workunit.client.1.vm08.stdout:0/249: dread f3 [0,4194304] 0 2026-03-10T08:55:10.957 INFO:tasks.workunit.client.1.vm08.stdout:1/235: mkdir d1/da/d20/d4c 0 2026-03-10T08:55:10.958 INFO:tasks.workunit.client.1.vm08.stdout:8/346: chown d1/d10/d9/dd/d25/d27/d44/f22 4100495 1 2026-03-10T08:55:10.959 INFO:tasks.workunit.client.1.vm08.stdout:0/250: chown d6/fe 0 1 2026-03-10T08:55:10.960 INFO:tasks.workunit.client.1.vm08.stdout:0/251: write d6/dd/d13/d32/f3d [706153,61048] 0 2026-03-10T08:55:10.962 INFO:tasks.workunit.client.1.vm08.stdout:6/346: rename d9/dc/d11/d23/d2c/d41/c2b to d9/d10/d1e/d7b/c7d 0 2026-03-10T08:55:10.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:10 vm05.local ceph-mon[49713]: pgmap v144: 65 pgs: 65 active+clean; 644 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 6.6 MiB/s rd, 71 MiB/s wr, 341 op/s 2026-03-10T08:55:10.972 INFO:tasks.workunit.client.1.vm08.stdout:8/347: dwrite f0 [0,4194304] 0 2026-03-10T08:55:10.972 INFO:tasks.workunit.client.1.vm08.stdout:6/347: truncate d9/dc/d11/d23/d2c/d44/f62 581143 0 2026-03-10T08:55:10.975 INFO:tasks.workunit.client.1.vm08.stdout:6/348: mkdir d9/d10/d1e/d7e 0 2026-03-10T08:55:10.975 INFO:tasks.workunit.client.1.vm08.stdout:6/349: read f1 [6482763,32518] 0 2026-03-10T08:55:10.977 INFO:tasks.workunit.client.1.vm08.stdout:6/350: dwrite d9/dc/d11/d23/d2c/d44/f62 [0,4194304] 0 2026-03-10T08:55:10.983 INFO:tasks.workunit.client.1.vm08.stdout:2/333: getdents d1/da 0 2026-03-10T08:55:10.983 INFO:tasks.workunit.client.1.vm08.stdout:2/334: chown d1/da/d10/d2d/f4d 6146 1 2026-03-10T08:55:10.992 INFO:tasks.workunit.client.1.vm08.stdout:8/348: symlink d1/d10/d9/dd/d25/d27/d44/l7a 0 2026-03-10T08:55:10.995 INFO:tasks.workunit.client.1.vm08.stdout:6/351: symlink d9/dc/d11/d23/d2c/d44/d63/l7f 0 
2026-03-10T08:55:10.999 INFO:tasks.workunit.client.1.vm08.stdout:6/352: mkdir d9/dc/d11/d23/d2c/d41/d5d/d80 0 2026-03-10T08:55:10.999 INFO:tasks.workunit.client.1.vm08.stdout:6/353: fdatasync d9/dc/d11/d23/f5f 0 2026-03-10T08:55:11.001 INFO:tasks.workunit.client.1.vm08.stdout:8/349: truncate d1/d10/d9/dd/d25/d27/d44/d21/f4a 3324479 0 2026-03-10T08:55:11.002 INFO:tasks.workunit.client.1.vm08.stdout:8/350: dread - d1/d10/d9/dd/d25/d27/d44/d21/f66 zero size 2026-03-10T08:55:11.003 INFO:tasks.workunit.client.1.vm08.stdout:6/354: dwrite d9/dc/d11/f29 [0,4194304] 0 2026-03-10T08:55:11.003 INFO:tasks.workunit.client.1.vm08.stdout:2/335: getdents d1/da/d10/d1b 0 2026-03-10T08:55:11.004 INFO:tasks.workunit.client.1.vm08.stdout:3/278: write d4/d15/d8/f37 [468357,38295] 0 2026-03-10T08:55:11.014 INFO:tasks.workunit.client.1.vm08.stdout:3/279: dwrite d4/f18 [0,4194304] 0 2026-03-10T08:55:11.016 INFO:tasks.workunit.client.1.vm08.stdout:2/336: dread - d1/da/d10/d1b/d12/d23/f37 zero size 2026-03-10T08:55:11.016 INFO:tasks.workunit.client.1.vm08.stdout:0/252: link d6/dd/d13/d17/d1f/d20/d2f/c30 d6/dd/d13/d17/d1f/d20/d2f/c54 0 2026-03-10T08:55:11.016 INFO:tasks.workunit.client.1.vm08.stdout:0/253: fsync d6/f2c 0 2026-03-10T08:55:11.018 INFO:tasks.workunit.client.1.vm08.stdout:8/351: symlink d1/d10/d9/dd/d13/d40/l7b 0 2026-03-10T08:55:11.025 INFO:tasks.workunit.client.1.vm08.stdout:0/254: unlink d6/fc 0 2026-03-10T08:55:11.026 INFO:tasks.workunit.client.1.vm08.stdout:3/280: rmdir d4/d15/d8/d1d 39 2026-03-10T08:55:11.027 INFO:tasks.workunit.client.1.vm08.stdout:2/337: truncate d1/da/d10/d42/f58 4102978 0 2026-03-10T08:55:11.028 INFO:tasks.workunit.client.1.vm08.stdout:0/255: write f3 [49169,11403] 0 2026-03-10T08:55:11.029 INFO:tasks.workunit.client.1.vm08.stdout:8/352: rename d1/d10/d9/dd/d25/d27/d44/d21/d51/c5e to d1/d4f/d60/c7c 0 2026-03-10T08:55:11.029 INFO:tasks.workunit.client.1.vm08.stdout:0/256: dread - d6/dd/d13/d17/d1f/d20/d2f/d24/f37 zero size 2026-03-10T08:55:11.031 
INFO:tasks.workunit.client.1.vm08.stdout:8/353: chown d1/d10/d9/dd/d25/d27/d44/c6f 2509 1 2026-03-10T08:55:11.031 INFO:tasks.workunit.client.1.vm08.stdout:3/281: creat d4/d15/f5e x:0 0 0 2026-03-10T08:55:11.032 INFO:tasks.workunit.client.1.vm08.stdout:8/354: dread - d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 zero size 2026-03-10T08:55:11.032 INFO:tasks.workunit.client.1.vm08.stdout:8/355: readlink d1/d10/d9/dd/l15 0 2026-03-10T08:55:11.033 INFO:tasks.workunit.client.1.vm08.stdout:2/338: mknod d1/d43/d4f/d52/c6f 0 2026-03-10T08:55:11.034 INFO:tasks.workunit.client.1.vm08.stdout:2/339: dread - d1/d5b/d66/f62 zero size 2026-03-10T08:55:11.034 INFO:tasks.workunit.client.1.vm08.stdout:8/356: mkdir d1/d10/d9/d4d/d5c/d7d 0 2026-03-10T08:55:11.036 INFO:tasks.workunit.client.1.vm08.stdout:8/357: chown d1/d2c/l35 20277 1 2026-03-10T08:55:11.038 INFO:tasks.workunit.client.1.vm08.stdout:2/340: dwrite d1/da/d10/d1b/d12/d23/f57 [0,4194304] 0 2026-03-10T08:55:11.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:10 vm08.local ceph-mon[57559]: pgmap v144: 65 pgs: 65 active+clean; 644 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 6.6 MiB/s rd, 71 MiB/s wr, 341 op/s 2026-03-10T08:55:11.063 INFO:tasks.workunit.client.1.vm08.stdout:1/236: dread d1/f8 [0,4194304] 0 2026-03-10T08:55:11.081 INFO:tasks.workunit.client.0.vm05.stdout:9/33: fdatasync d6/f7 0 2026-03-10T08:55:11.082 INFO:tasks.workunit.client.1.vm08.stdout:1/237: dread d1/da/d18/f1d [0,4194304] 0 2026-03-10T08:55:11.083 INFO:tasks.workunit.client.0.vm05.stdout:0/29: chown c8 1 1 2026-03-10T08:55:11.083 INFO:tasks.workunit.client.1.vm08.stdout:1/238: stat d1/l31 0 2026-03-10T08:55:11.085 INFO:tasks.workunit.client.0.vm05.stdout:4/22: creat d0/f4 x:0 0 0 2026-03-10T08:55:11.087 INFO:tasks.workunit.client.1.vm08.stdout:1/239: chown d1/da/f25 239384 1 2026-03-10T08:55:11.088 INFO:tasks.workunit.client.1.vm08.stdout:1/240: write d1/da/d20/f2d [485920,90254] 0 2026-03-10T08:55:11.088 
INFO:tasks.workunit.client.0.vm05.stdout:6/32: dwrite f0 [0,4194304] 0 2026-03-10T08:55:11.099 INFO:tasks.workunit.client.0.vm05.stdout:3/23: creat f5 x:0 0 0 2026-03-10T08:55:11.107 INFO:tasks.workunit.client.1.vm08.stdout:1/241: write d1/da/de/d24/d3d/d40/f42 [793713,113105] 0 2026-03-10T08:55:11.107 INFO:tasks.workunit.client.1.vm08.stdout:1/242: dwrite d1/da/de/f19 [0,4194304] 0 2026-03-10T08:55:11.107 INFO:tasks.workunit.client.0.vm05.stdout:5/19: rename c1 to d5/ca 0 2026-03-10T08:55:11.107 INFO:tasks.workunit.client.0.vm05.stdout:8/19: creat d2/f5 x:0 0 0 2026-03-10T08:55:11.107 INFO:tasks.workunit.client.0.vm05.stdout:8/20: chown d2/f5 22898892 1 2026-03-10T08:55:11.107 INFO:tasks.workunit.client.0.vm05.stdout:8/21: dwrite d2/f5 [0,4194304] 0 2026-03-10T08:55:11.110 INFO:tasks.workunit.client.0.vm05.stdout:2/29: creat d0/f7 x:0 0 0 2026-03-10T08:55:11.110 INFO:tasks.workunit.client.0.vm05.stdout:4/23: symlink d0/l5 0 2026-03-10T08:55:11.110 INFO:tasks.workunit.client.0.vm05.stdout:6/33: creat d4/f8 x:0 0 0 2026-03-10T08:55:11.111 INFO:tasks.workunit.client.0.vm05.stdout:4/24: write d0/f4 [667662,24658] 0 2026-03-10T08:55:11.111 INFO:tasks.workunit.client.0.vm05.stdout:3/24: rename f3 to f6 0 2026-03-10T08:55:11.116 INFO:tasks.workunit.client.0.vm05.stdout:5/20: mknod d5/cb 0 2026-03-10T08:55:11.116 INFO:tasks.workunit.client.0.vm05.stdout:5/21: dread - d5/f8 zero size 2026-03-10T08:55:11.118 INFO:tasks.workunit.client.0.vm05.stdout:8/22: symlink d2/l6 0 2026-03-10T08:55:11.124 INFO:tasks.workunit.client.0.vm05.stdout:2/30: rename d0/f7 to d0/f8 0 2026-03-10T08:55:11.124 INFO:tasks.workunit.client.0.vm05.stdout:6/34: truncate f2 1520334 0 2026-03-10T08:55:11.125 INFO:tasks.workunit.client.0.vm05.stdout:4/25: symlink d0/l6 0 2026-03-10T08:55:11.125 INFO:tasks.workunit.client.0.vm05.stdout:4/26: write d0/f1 [8745131,7396] 0 2026-03-10T08:55:11.125 INFO:tasks.workunit.client.0.vm05.stdout:5/22: creat d5/fc x:0 0 0 2026-03-10T08:55:11.133 
INFO:tasks.workunit.client.0.vm05.stdout:2/31: dwrite d0/f8 [0,4194304] 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:2/32: chown d0/f8 7751319 1 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:6/35: creat d4/f9 x:0 0 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:7/38: getdents . 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:7/39: chown f4 56349 1 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:9/34: getdents d6 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:0/30: getdents . 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:0/31: stat ca 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:9/35: mknod d6/ca 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:9/36: read f5 [191155,42296] 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:0/32: rename f1 to fb 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:6/36: creat d4/d7/fa x:0 0 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:6/37: chown f2 1685 1 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:6/38: stat d4/d7/fa 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:0/33: dwrite fb [0,4194304] 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:6/39: rename d4/f6 to d4/fb 0 2026-03-10T08:55:11.156 INFO:tasks.workunit.client.0.vm05.stdout:6/40: truncate d4/d7/fa 44057 0 2026-03-10T08:55:11.160 INFO:tasks.workunit.client.0.vm05.stdout:0/34: creat fc x:0 0 0 2026-03-10T08:55:11.175 INFO:tasks.workunit.client.0.vm05.stdout:0/35: write f5 [477794,110578] 0 2026-03-10T08:55:11.175 INFO:tasks.workunit.client.0.vm05.stdout:0/36: dwrite fb [0,4194304] 0 2026-03-10T08:55:11.175 INFO:tasks.workunit.client.0.vm05.stdout:5/23: dread f2 [0,4194304] 0 2026-03-10T08:55:11.175 INFO:tasks.workunit.client.0.vm05.stdout:5/24: dread d5/f9 [0,4194304] 0 
2026-03-10T08:55:11.175 INFO:tasks.workunit.client.0.vm05.stdout:5/25: write d5/fc [877365,81468] 0 2026-03-10T08:55:11.254 INFO:tasks.workunit.client.0.vm05.stdout:3/25: sync 2026-03-10T08:55:11.254 INFO:tasks.workunit.client.0.vm05.stdout:2/33: sync 2026-03-10T08:55:11.257 INFO:tasks.workunit.client.0.vm05.stdout:3/26: dread f6 [0,4194304] 0 2026-03-10T08:55:11.263 INFO:tasks.workunit.client.0.vm05.stdout:3/27: rename f4 to f7 0 2026-03-10T08:55:11.266 INFO:tasks.workunit.client.0.vm05.stdout:3/28: dread - f5 zero size 2026-03-10T08:55:11.267 INFO:tasks.workunit.client.0.vm05.stdout:3/29: dwrite f5 [0,4194304] 0 2026-03-10T08:55:11.267 INFO:tasks.workunit.client.0.vm05.stdout:3/30: dread - f2 zero size 2026-03-10T08:55:11.274 INFO:tasks.workunit.client.0.vm05.stdout:3/31: write f7 [485215,62257] 0 2026-03-10T08:55:11.274 INFO:tasks.workunit.client.0.vm05.stdout:3/32: chown f2 0 1 2026-03-10T08:55:11.275 INFO:tasks.workunit.client.0.vm05.stdout:3/33: chown f7 0 1 2026-03-10T08:55:11.279 INFO:tasks.workunit.client.0.vm05.stdout:3/34: dwrite f1 [0,4194304] 0 2026-03-10T08:55:11.285 INFO:tasks.workunit.client.0.vm05.stdout:3/35: dwrite f1 [0,4194304] 0 2026-03-10T08:55:11.285 INFO:tasks.workunit.client.0.vm05.stdout:3/36: read - f2 zero size 2026-03-10T08:55:11.286 INFO:tasks.workunit.client.0.vm05.stdout:3/37: write f5 [2006286,98836] 0 2026-03-10T08:55:11.287 INFO:tasks.workunit.client.0.vm05.stdout:3/38: dread - f2 zero size 2026-03-10T08:55:11.299 INFO:tasks.workunit.client.0.vm05.stdout:1/36: dwrite f6 [0,4194304] 0 2026-03-10T08:55:11.307 INFO:tasks.workunit.client.0.vm05.stdout:1/37: creat fa x:0 0 0 2026-03-10T08:55:11.307 INFO:tasks.workunit.client.0.vm05.stdout:1/38: readlink l0 0 2026-03-10T08:55:11.310 INFO:tasks.workunit.client.0.vm05.stdout:1/39: creat fb x:0 0 0 2026-03-10T08:55:11.310 INFO:tasks.workunit.client.0.vm05.stdout:1/40: fdatasync f2 0 2026-03-10T08:55:11.311 INFO:tasks.workunit.client.0.vm05.stdout:1/41: creat fc x:0 0 0 
2026-03-10T08:55:11.311 INFO:tasks.workunit.client.0.vm05.stdout:1/42: stat l9 0 2026-03-10T08:55:11.312 INFO:tasks.workunit.client.0.vm05.stdout:1/43: mkdir dd 0 2026-03-10T08:55:11.313 INFO:tasks.workunit.client.0.vm05.stdout:1/44: symlink dd/le 0 2026-03-10T08:55:11.313 INFO:tasks.workunit.client.0.vm05.stdout:1/45: truncate f6 4744505 0 2026-03-10T08:55:11.314 INFO:tasks.workunit.client.0.vm05.stdout:1/46: write f2 [675165,75106] 0 2026-03-10T08:55:11.319 INFO:tasks.workunit.client.0.vm05.stdout:1/47: dwrite f2 [0,4194304] 0 2026-03-10T08:55:11.328 INFO:tasks.workunit.client.0.vm05.stdout:4/27: fsync d0/f1 0 2026-03-10T08:55:11.328 INFO:tasks.workunit.client.0.vm05.stdout:4/28: chown d0/l5 2812577 1 2026-03-10T08:55:11.334 INFO:tasks.workunit.client.0.vm05.stdout:4/29: dwrite d0/f1 [0,4194304] 0 2026-03-10T08:55:11.335 INFO:tasks.workunit.client.0.vm05.stdout:4/30: rename d0 to d0/d7 22 2026-03-10T08:55:11.338 INFO:tasks.workunit.client.0.vm05.stdout:4/31: creat d0/f8 x:0 0 0 2026-03-10T08:55:11.360 INFO:tasks.workunit.client.0.vm05.stdout:1/48: sync 2026-03-10T08:55:11.363 INFO:tasks.workunit.client.0.vm05.stdout:1/49: dwrite f5 [0,4194304] 0 2026-03-10T08:55:11.366 INFO:tasks.workunit.client.0.vm05.stdout:1/50: mknod dd/cf 0 2026-03-10T08:55:11.371 INFO:tasks.workunit.client.0.vm05.stdout:1/51: dwrite f7 [4194304,4194304] 0 2026-03-10T08:55:11.377 INFO:tasks.workunit.client.0.vm05.stdout:1/52: unlink f5 0 2026-03-10T08:55:11.382 INFO:tasks.workunit.client.0.vm05.stdout:1/53: dwrite fc [0,4194304] 0 2026-03-10T08:55:11.398 INFO:tasks.workunit.client.0.vm05.stdout:1/54: write fc [3089359,123675] 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/55: dwrite fc [0,4194304] 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/56: mkdir dd/d10 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/57: truncate fb 443714 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/58: creat dd/f11 x:0 0 0 
2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/59: rename l0 to dd/l12 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/60: mkdir dd/d13 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/61: readlink dd/le 0 2026-03-10T08:55:11.399 INFO:tasks.workunit.client.0.vm05.stdout:1/62: write f6 [3925357,4124] 0 2026-03-10T08:55:11.404 INFO:tasks.workunit.client.0.vm05.stdout:1/63: mknod dd/d10/c14 0 2026-03-10T08:55:11.407 INFO:tasks.workunit.client.0.vm05.stdout:1/64: symlink dd/d13/l15 0 2026-03-10T08:55:11.409 INFO:tasks.workunit.client.0.vm05.stdout:1/65: dwrite f7 [4194304,4194304] 0 2026-03-10T08:55:11.412 INFO:tasks.workunit.client.0.vm05.stdout:1/66: creat dd/f16 x:0 0 0 2026-03-10T08:55:11.412 INFO:tasks.workunit.client.0.vm05.stdout:1/67: mknod dd/c17 0 2026-03-10T08:55:11.412 INFO:tasks.workunit.client.0.vm05.stdout:1/68: dread - fa zero size 2026-03-10T08:55:11.420 INFO:tasks.workunit.client.1.vm08.stdout:9/300: truncate d2/dd/d15/d1e/f48 1976178 0 2026-03-10T08:55:11.428 INFO:tasks.workunit.client.1.vm08.stdout:9/301: getdents d2/dd/d15/d1e/d24 0 2026-03-10T08:55:11.429 INFO:tasks.workunit.client.1.vm08.stdout:9/302: mkdir d2/dd/d61 0 2026-03-10T08:55:11.434 INFO:tasks.workunit.client.1.vm08.stdout:7/275: truncate d0/d11/d4a/f4f 1867999 0 2026-03-10T08:55:11.437 INFO:tasks.workunit.client.1.vm08.stdout:4/335: dwrite d5/de/f1b [0,4194304] 0 2026-03-10T08:55:11.439 INFO:tasks.workunit.client.1.vm08.stdout:5/330: dwrite d0/d11/d27/f3b [0,4194304] 0 2026-03-10T08:55:11.439 INFO:tasks.workunit.client.1.vm08.stdout:5/331: read - d0/d1b/f69 zero size 2026-03-10T08:55:11.441 INFO:tasks.workunit.client.1.vm08.stdout:5/332: write d0/d11/f2d [1527609,114738] 0 2026-03-10T08:55:11.441 INFO:tasks.workunit.client.1.vm08.stdout:5/333: dread - d0/d11/f60 zero size 2026-03-10T08:55:11.445 INFO:tasks.workunit.client.1.vm08.stdout:4/336: rename d5/d5f/l70 to d5/d5f/l79 0 2026-03-10T08:55:11.447 
INFO:tasks.workunit.client.1.vm08.stdout:5/334: dread d0/d11/d27/f3b [0,4194304] 0 2026-03-10T08:55:11.448 INFO:tasks.workunit.client.1.vm08.stdout:6/355: rename d9/dc/d11/d23/d2c/d44 to d9/dc/d11/d23/d2c/d81 0 2026-03-10T08:55:11.448 INFO:tasks.workunit.client.1.vm08.stdout:5/335: readlink d0/d11/l1f 0 2026-03-10T08:55:11.448 INFO:tasks.workunit.client.1.vm08.stdout:6/356: dread - d9/d10/d1e/d32/f48 zero size 2026-03-10T08:55:11.452 INFO:tasks.workunit.client.1.vm08.stdout:5/336: read - d0/d11/d18/d52/f57 zero size 2026-03-10T08:55:11.468 INFO:tasks.workunit.client.1.vm08.stdout:3/282: chown d4/d15/f4b 100 1 2026-03-10T08:55:11.468 INFO:tasks.workunit.client.1.vm08.stdout:0/257: write d6/dd/f28 [1907324,52359] 0 2026-03-10T08:55:11.468 INFO:tasks.workunit.client.1.vm08.stdout:3/283: chown d4/d15/f3f 9427 1 2026-03-10T08:55:11.468 INFO:tasks.workunit.client.1.vm08.stdout:3/284: chown f1 3792513 1 2026-03-10T08:55:11.468 INFO:tasks.workunit.client.1.vm08.stdout:3/285: mknod d4/c5f 0 2026-03-10T08:55:11.468 INFO:tasks.workunit.client.1.vm08.stdout:3/286: fdatasync d4/f47 0 2026-03-10T08:55:11.469 INFO:tasks.workunit.client.1.vm08.stdout:3/287: dread - d4/d15/d17/f5c zero size 2026-03-10T08:55:11.476 INFO:tasks.workunit.client.1.vm08.stdout:5/337: read d0/d11/d18/f23 [2995355,31961] 0 2026-03-10T08:55:11.494 INFO:tasks.workunit.client.1.vm08.stdout:8/358: write d1/f8 [639103,69687] 0 2026-03-10T08:55:11.497 INFO:tasks.workunit.client.1.vm08.stdout:8/359: dwrite d1/d10/d9/dd/d25/d27/f52 [0,4194304] 0 2026-03-10T08:55:11.497 INFO:tasks.workunit.client.1.vm08.stdout:6/357: sync 2026-03-10T08:55:11.502 INFO:tasks.workunit.client.1.vm08.stdout:2/341: write d1/da/d10/d1b/d12/d22/f45 [510138,128866] 0 2026-03-10T08:55:11.504 INFO:tasks.workunit.client.1.vm08.stdout:8/360: symlink d1/d10/d9/d4d/d5c/d7d/l7e 0 2026-03-10T08:55:11.507 INFO:tasks.workunit.client.1.vm08.stdout:4/337: dread d5/de/f5e [0,4194304] 0 2026-03-10T08:55:11.507 
INFO:tasks.workunit.client.1.vm08.stdout:4/338: chown d5/d2f/d5d 41135 1 2026-03-10T08:55:11.508 INFO:tasks.workunit.client.1.vm08.stdout:8/361: getdents d1/d10/d9/dd/d25/d27/d44/d21/d51/d64 0 2026-03-10T08:55:11.509 INFO:tasks.workunit.client.1.vm08.stdout:4/339: symlink d5/d23/d36/d76/l7a 0 2026-03-10T08:55:11.510 INFO:tasks.workunit.client.1.vm08.stdout:8/362: rename d1/d10/lc to d1/d10/d9/dd/d25/d27/d44/d21/l7f 0 2026-03-10T08:55:11.523 INFO:tasks.workunit.client.1.vm08.stdout:5/338: dread d0/d11/d18/f23 [0,4194304] 0 2026-03-10T08:55:11.523 INFO:tasks.workunit.client.1.vm08.stdout:5/339: chown d0/d40/f42 108 1 2026-03-10T08:55:11.528 INFO:tasks.workunit.client.1.vm08.stdout:4/340: sync 2026-03-10T08:55:11.528 INFO:tasks.workunit.client.1.vm08.stdout:8/363: sync 2026-03-10T08:55:11.528 INFO:tasks.workunit.client.1.vm08.stdout:8/364: chown d1/d10/d9/dd/d25/f6e 181915 1 2026-03-10T08:55:11.528 INFO:tasks.workunit.client.1.vm08.stdout:4/341: dread - d5/d2f/f71 zero size 2026-03-10T08:55:11.529 INFO:tasks.workunit.client.1.vm08.stdout:4/342: stat d5/de 0 2026-03-10T08:55:11.530 INFO:tasks.workunit.client.1.vm08.stdout:5/340: creat d0/f6c x:0 0 0 2026-03-10T08:55:11.530 INFO:tasks.workunit.client.1.vm08.stdout:5/341: stat d0/d11/f60 0 2026-03-10T08:55:11.531 INFO:tasks.workunit.client.1.vm08.stdout:5/342: fsync d0/d11/d27/f3d 0 2026-03-10T08:55:11.531 INFO:tasks.workunit.client.1.vm08.stdout:5/343: readlink d0/lc 0 2026-03-10T08:55:11.532 INFO:tasks.workunit.client.1.vm08.stdout:4/343: creat d5/d23/d36/d76/f7b x:0 0 0 2026-03-10T08:55:11.532 INFO:tasks.workunit.client.1.vm08.stdout:8/365: rename d1/d10/d9/dd/d13/f46 to d1/d10/d9/dd/d18/f80 0 2026-03-10T08:55:11.534 INFO:tasks.workunit.client.1.vm08.stdout:4/344: unlink d5/d23/d36/d76/f7b 0 2026-03-10T08:55:11.535 INFO:tasks.workunit.client.1.vm08.stdout:8/366: fdatasync d1/d10/d9/dd/f41 0 2026-03-10T08:55:11.536 INFO:tasks.workunit.client.1.vm08.stdout:4/345: mknod d5/d2f/c7c 0 2026-03-10T08:55:11.537 
INFO:tasks.workunit.client.1.vm08.stdout:4/346: fdatasync d5/d2f/d5d/f61 0 2026-03-10T08:55:11.539 INFO:tasks.workunit.client.1.vm08.stdout:4/347: truncate d5/d5f/f65 1778219 0 2026-03-10T08:55:11.540 INFO:tasks.workunit.client.1.vm08.stdout:4/348: write d5/d2f/f71 [931753,88220] 0 2026-03-10T08:55:11.540 INFO:tasks.workunit.client.1.vm08.stdout:4/349: chown d5/f1d 7979 1 2026-03-10T08:55:11.540 INFO:tasks.workunit.client.1.vm08.stdout:4/350: chown d5/d2f/d5a/d75 31259572 1 2026-03-10T08:55:11.541 INFO:tasks.workunit.client.1.vm08.stdout:4/351: fdatasync d5/d23/d49/f4d 0 2026-03-10T08:55:11.542 INFO:tasks.workunit.client.1.vm08.stdout:8/367: truncate d1/d10/d9/dd/d25/d27/d44/f22 917424 0 2026-03-10T08:55:11.543 INFO:tasks.workunit.client.1.vm08.stdout:4/352: creat d5/d23/d36/f7d x:0 0 0 2026-03-10T08:55:11.546 INFO:tasks.workunit.client.1.vm08.stdout:8/368: mknod d1/d10/d9/dd/d25/d27/d44/c81 0 2026-03-10T08:55:11.547 INFO:tasks.workunit.client.1.vm08.stdout:4/353: dread d5/d23/d36/f51 [0,4194304] 0 2026-03-10T08:55:11.548 INFO:tasks.workunit.client.1.vm08.stdout:8/369: creat d1/d10/d9/d4d/d5c/d7d/f82 x:0 0 0 2026-03-10T08:55:11.549 INFO:tasks.workunit.client.1.vm08.stdout:8/370: read d1/d10/d9/dd/d13/f6a [274403,35659] 0 2026-03-10T08:55:11.550 INFO:tasks.workunit.client.1.vm08.stdout:8/371: write d1/d10/d9/dd/d25/d27/f3a [734500,47860] 0 2026-03-10T08:55:11.551 INFO:tasks.workunit.client.1.vm08.stdout:4/354: creat d5/f7e x:0 0 0 2026-03-10T08:55:11.563 INFO:tasks.workunit.client.1.vm08.stdout:8/372: rename d1/d10/f6c to d1/d10/d9/dd/d18/d3c/f83 0 2026-03-10T08:55:11.563 INFO:tasks.workunit.client.1.vm08.stdout:8/373: unlink d1/d10/l19 0 2026-03-10T08:55:11.563 INFO:tasks.workunit.client.1.vm08.stdout:8/374: mknod d1/d10/d9/dd/d18/d34/c84 0 2026-03-10T08:55:11.563 INFO:tasks.workunit.client.1.vm08.stdout:8/375: dwrite d1/d10/d9/dd/f70 [0,4194304] 0 2026-03-10T08:55:11.563 INFO:tasks.workunit.client.1.vm08.stdout:8/376: mknod d1/d10/d9/dd/d18/d34/c85 0 
2026-03-10T08:55:11.563 INFO:tasks.workunit.client.1.vm08.stdout:8/377: symlink d1/d10/d9/dd/d18/d3c/l86 0 2026-03-10T08:55:11.574 INFO:tasks.workunit.client.1.vm08.stdout:8/378: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 [0,4194304] 0 2026-03-10T08:55:11.576 INFO:tasks.workunit.client.1.vm08.stdout:8/379: write d1/d10/d9/dd/d25/d27/d44/d21/f66 [93447,18337] 0 2026-03-10T08:55:11.577 INFO:tasks.workunit.client.1.vm08.stdout:8/380: chown d1/d10/d9/dd/d25/d27/d44 3243204 1 2026-03-10T08:55:11.584 INFO:tasks.workunit.client.1.vm08.stdout:4/355: dread d5/f8 [0,4194304] 0 2026-03-10T08:55:11.587 INFO:tasks.workunit.client.1.vm08.stdout:8/381: rename d1/d10/d9/l2f to d1/l87 0 2026-03-10T08:55:11.587 INFO:tasks.workunit.client.1.vm08.stdout:4/356: unlink d5/d23/d49/l5b 0 2026-03-10T08:55:11.587 INFO:tasks.workunit.client.1.vm08.stdout:4/357: write d5/f42 [52368,108844] 0 2026-03-10T08:55:11.589 INFO:tasks.workunit.client.1.vm08.stdout:8/382: mkdir d1/d4f/d60/d88 0 2026-03-10T08:55:11.590 INFO:tasks.workunit.client.1.vm08.stdout:4/358: rename d5/f42 to d5/d2f/d5a/d75/f7f 0 2026-03-10T08:55:11.591 INFO:tasks.workunit.client.1.vm08.stdout:4/359: mkdir d5/d2f/d80 0 2026-03-10T08:55:11.601 INFO:tasks.workunit.client.1.vm08.stdout:8/383: mkdir d1/d10/d9/dd/d25/d27/d44/d89 0 2026-03-10T08:55:11.601 INFO:tasks.workunit.client.1.vm08.stdout:8/384: mkdir d1/d10/d9/d8a 0 2026-03-10T08:55:11.601 INFO:tasks.workunit.client.1.vm08.stdout:4/360: link d5/l22 d5/d2f/d5a/d75/l81 0 2026-03-10T08:55:11.601 INFO:tasks.workunit.client.1.vm08.stdout:4/361: truncate d5/d23/d36/f51 1892526 0 2026-03-10T08:55:11.601 INFO:tasks.workunit.client.1.vm08.stdout:4/362: dwrite d5/de/f72 [0,4194304] 0 2026-03-10T08:55:11.601 INFO:tasks.workunit.client.1.vm08.stdout:4/363: truncate d5/d2f/d5d/f60 1022482 0 2026-03-10T08:55:11.605 INFO:tasks.workunit.client.1.vm08.stdout:4/364: dread - d5/d23/d36/f7d zero size 2026-03-10T08:55:11.611 INFO:tasks.workunit.client.1.vm08.stdout:4/365: dwrite d5/f3d [0,4194304] 
0 2026-03-10T08:55:11.620 INFO:tasks.workunit.client.1.vm08.stdout:4/366: dread d5/de/f1f [0,4194304] 0 2026-03-10T08:55:11.626 INFO:tasks.workunit.client.1.vm08.stdout:4/367: chown d5/d23/d36/l43 190164822 1 2026-03-10T08:55:11.626 INFO:tasks.workunit.client.1.vm08.stdout:4/368: write d5/d2f/f67 [2286144,124694] 0 2026-03-10T08:55:11.626 INFO:tasks.workunit.client.1.vm08.stdout:4/369: fsync d5/d23/d36/f57 0 2026-03-10T08:55:11.632 INFO:tasks.workunit.client.0.vm05.stdout:5/26: unlink d5/ca 0 2026-03-10T08:55:11.935 INFO:tasks.workunit.client.0.vm05.stdout:3/39: fsync f6 0 2026-03-10T08:55:11.935 INFO:tasks.workunit.client.0.vm05.stdout:3/40: write f7 [155619,92930] 0 2026-03-10T08:55:11.937 INFO:tasks.workunit.client.0.vm05.stdout:3/41: link f5 f8 0 2026-03-10T08:55:11.938 INFO:tasks.workunit.client.0.vm05.stdout:3/42: mkdir d9 0 2026-03-10T08:55:12.076 INFO:tasks.workunit.client.0.vm05.stdout:5/27: dread d5/fc [0,4194304] 0 2026-03-10T08:55:12.080 INFO:tasks.workunit.client.0.vm05.stdout:5/28: dwrite d5/f8 [0,4194304] 0 2026-03-10T08:55:12.085 INFO:tasks.workunit.client.0.vm05.stdout:5/29: creat d5/fd x:0 0 0 2026-03-10T08:55:12.096 INFO:tasks.workunit.client.0.vm05.stdout:5/30: stat l0 0 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/31: creat d5/fe x:0 0 0 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/32: write d5/fd [679472,25916] 0 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/33: mkdir d5/df 0 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/34: chown d5/fe 589 1 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/35: chown d5/f8 145 1 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/36: write d5/f8 [1407579,85625] 0 2026-03-10T08:55:12.097 INFO:tasks.workunit.client.0.vm05.stdout:5/37: read d5/fc [103739,51328] 0 2026-03-10T08:55:12.133 INFO:tasks.workunit.client.0.vm05.stdout:3/43: sync 2026-03-10T08:55:12.133 
INFO:tasks.workunit.client.0.vm05.stdout:3/44: write f7 [4990398,66875] 0 2026-03-10T08:55:12.134 INFO:tasks.workunit.client.0.vm05.stdout:3/45: dread - f2 zero size 2026-03-10T08:55:12.134 INFO:tasks.workunit.client.0.vm05.stdout:3/46: write f6 [3329248,1459] 0 2026-03-10T08:55:12.134 INFO:tasks.workunit.client.0.vm05.stdout:3/47: stat f5 0 2026-03-10T08:55:12.134 INFO:tasks.workunit.client.0.vm05.stdout:3/48: readlink - no filename 2026-03-10T08:55:12.137 INFO:tasks.workunit.client.0.vm05.stdout:3/49: unlink f5 0 2026-03-10T08:55:12.138 INFO:tasks.workunit.client.0.vm05.stdout:3/50: creat d9/fa x:0 0 0 2026-03-10T08:55:12.139 INFO:tasks.workunit.client.0.vm05.stdout:3/51: stat f1 0 2026-03-10T08:55:12.204 INFO:tasks.workunit.client.0.vm05.stdout:8/23: getdents d2 0 2026-03-10T08:55:12.206 INFO:tasks.workunit.client.1.vm08.stdout:1/243: dwrite d1/da/f39 [0,4194304] 0 2026-03-10T08:55:12.208 INFO:tasks.workunit.client.0.vm05.stdout:0/37: unlink fb 0 2026-03-10T08:55:12.213 INFO:tasks.workunit.client.0.vm05.stdout:7/40: truncate f8 2746646 0 2026-03-10T08:55:12.213 INFO:tasks.workunit.client.0.vm05.stdout:7/41: fsync f4 0 2026-03-10T08:55:12.213 INFO:tasks.workunit.client.0.vm05.stdout:9/37: truncate f4 3514065 0 2026-03-10T08:55:12.217 INFO:tasks.workunit.client.0.vm05.stdout:6/41: rename d4/fb to d4/fc 0 2026-03-10T08:55:12.217 INFO:tasks.workunit.client.0.vm05.stdout:2/34: truncate d0/f2 449008 0 2026-03-10T08:55:12.218 INFO:tasks.workunit.client.0.vm05.stdout:2/35: chown d0/f8 11778517 1 2026-03-10T08:55:12.218 INFO:tasks.workunit.client.0.vm05.stdout:6/42: write d4/d7/fa [830849,21436] 0 2026-03-10T08:55:12.219 INFO:tasks.workunit.client.0.vm05.stdout:2/36: write d0/f4 [1388993,42255] 0 2026-03-10T08:55:12.222 INFO:tasks.workunit.client.0.vm05.stdout:6/43: dwrite d4/f5 [0,4194304] 0 2026-03-10T08:55:12.224 INFO:tasks.workunit.client.0.vm05.stdout:6/44: stat d4/f8 0 2026-03-10T08:55:12.230 INFO:tasks.workunit.client.0.vm05.stdout:8/24: mknod d2/c7 0 
2026-03-10T08:55:12.231 INFO:tasks.workunit.client.0.vm05.stdout:4/32: rmdir d0 39 2026-03-10T08:55:12.233 INFO:tasks.workunit.client.0.vm05.stdout:8/25: dwrite f1 [0,4194304] 0 2026-03-10T08:55:12.238 INFO:tasks.workunit.client.1.vm08.stdout:1/244: creat d1/da/de/d24/d26/f4d x:0 0 0 2026-03-10T08:55:12.239 INFO:tasks.workunit.client.0.vm05.stdout:5/38: sync 2026-03-10T08:55:12.273 INFO:tasks.workunit.client.1.vm08.stdout:1/245: mkdir d1/da/d4b/d4e 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.1.vm08.stdout:1/246: write d1/da/de/f1a [1756327,65538] 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.1.vm08.stdout:9/303: truncate d2/dd/d15/f44 4161213 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.1.vm08.stdout:1/247: getdents d1/da/de/d24/d35 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.1.vm08.stdout:1/248: dwrite d1/da/d20/f2d [0,4194304] 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:1/69: getdents dd 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:6/45: rename d4/f8 to d4/fd 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:4/33: write d0/f8 [191726,93563] 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:8/26: fsync d2/f5 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:8/27: dwrite d2/f5 [4194304,4194304] 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:8/28: write f1 [7208400,47117] 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.0.vm05.stdout:0/38: mknod cd 0 2026-03-10T08:55:12.274 INFO:tasks.workunit.client.1.vm08.stdout:1/249: dwrite d1/da/de/d24/d3d/d40/f42 [0,4194304] 0 2026-03-10T08:55:12.276 INFO:tasks.workunit.client.1.vm08.stdout:9/304: read d2/f35 [343344,73656] 0 2026-03-10T08:55:12.277 INFO:tasks.workunit.client.0.vm05.stdout:4/34: rmdir d0 39 2026-03-10T08:55:12.278 INFO:tasks.workunit.client.1.vm08.stdout:1/250: chown d1/da/l2b 189 1 2026-03-10T08:55:12.279 INFO:tasks.workunit.client.1.vm08.stdout:1/251: chown 
d1/da/d18/d3b/d41 28 1 2026-03-10T08:55:12.284 INFO:tasks.workunit.client.1.vm08.stdout:9/305: creat d2/d41/d4c/f62 x:0 0 0 2026-03-10T08:55:12.286 INFO:tasks.workunit.client.1.vm08.stdout:1/252: link d1/f1f d1/da/d4b/f4f 0 2026-03-10T08:55:12.287 INFO:tasks.workunit.client.0.vm05.stdout:1/70: mkdir dd/d10/d18 0 2026-03-10T08:55:12.287 INFO:tasks.workunit.client.0.vm05.stdout:4/35: chown d0 3113618 1 2026-03-10T08:55:12.288 INFO:tasks.workunit.client.0.vm05.stdout:8/29: symlink d2/l8 0 2026-03-10T08:55:12.289 INFO:tasks.workunit.client.1.vm08.stdout:1/253: rename d1/da/de/d24/d35/c47 to d1/da/d20/d4c/c50 0 2026-03-10T08:55:12.290 INFO:tasks.workunit.client.1.vm08.stdout:1/254: readlink d1/da/d20/l36 0 2026-03-10T08:55:12.290 INFO:tasks.workunit.client.0.vm05.stdout:1/71: mkdir dd/d10/d19 0 2026-03-10T08:55:12.291 INFO:tasks.workunit.client.1.vm08.stdout:1/255: rmdir d1/da/d18/d3a 39 2026-03-10T08:55:12.292 INFO:tasks.workunit.client.0.vm05.stdout:0/39: creat fe x:0 0 0 2026-03-10T08:55:12.293 INFO:tasks.workunit.client.1.vm08.stdout:1/256: chown d1/da/de/c17 5230347 1 2026-03-10T08:55:12.295 INFO:tasks.workunit.client.0.vm05.stdout:1/72: dwrite f2 [0,4194304] 0 2026-03-10T08:55:12.296 INFO:tasks.workunit.client.0.vm05.stdout:8/30: mknod d2/c9 0 2026-03-10T08:55:12.297 INFO:tasks.workunit.client.1.vm08.stdout:1/257: creat d1/da/d4b/d4e/f51 x:0 0 0 2026-03-10T08:55:12.302 INFO:tasks.workunit.client.0.vm05.stdout:1/73: rename dd/cf to dd/c1a 0 2026-03-10T08:55:12.303 INFO:tasks.workunit.client.0.vm05.stdout:1/74: chown dd/d10 4 1 2026-03-10T08:55:12.304 INFO:tasks.workunit.client.0.vm05.stdout:0/40: mkdir df 0 2026-03-10T08:55:12.306 INFO:tasks.workunit.client.0.vm05.stdout:8/31: creat d2/fa x:0 0 0 2026-03-10T08:55:12.306 INFO:tasks.workunit.client.0.vm05.stdout:8/32: readlink d2/l8 0 2026-03-10T08:55:12.307 INFO:tasks.workunit.client.0.vm05.stdout:0/41: mknod df/c10 0 2026-03-10T08:55:12.308 INFO:tasks.workunit.client.0.vm05.stdout:0/42: creat df/f11 x:0 0 0 
2026-03-10T08:55:12.310 INFO:tasks.workunit.client.0.vm05.stdout:8/33: mkdir d2/db 0 2026-03-10T08:55:12.311 INFO:tasks.workunit.client.0.vm05.stdout:0/43: dwrite df/f11 [0,4194304] 0 2026-03-10T08:55:12.319 INFO:tasks.workunit.client.0.vm05.stdout:8/34: dwrite d2/fa [0,4194304] 0 2026-03-10T08:55:12.320 INFO:tasks.workunit.client.0.vm05.stdout:8/35: chown d2/db 0 1 2026-03-10T08:55:12.324 INFO:tasks.workunit.client.0.vm05.stdout:8/36: unlink f1 0 2026-03-10T08:55:12.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:12 vm08.local ceph-mon[57559]: pgmap v145: 65 pgs: 65 active+clean; 780 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 10 MiB/s rd, 87 MiB/s wr, 340 op/s 2026-03-10T08:55:12.603 INFO:tasks.workunit.client.0.vm05.stdout:5/39: dread d5/fd [0,4194304] 0 2026-03-10T08:55:12.606 INFO:tasks.workunit.client.0.vm05.stdout:5/40: dwrite f2 [0,4194304] 0 2026-03-10T08:55:12.607 INFO:tasks.workunit.client.0.vm05.stdout:5/41: dread d5/fd [0,4194304] 0 2026-03-10T08:55:12.611 INFO:tasks.workunit.client.0.vm05.stdout:5/42: dwrite d5/f7 [0,4194304] 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/43: symlink d5/l10 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/44: getdents d5/df 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/45: chown d5/f8 861439781 1 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/46: write d5/f7 [2388177,14703] 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/47: mknod d5/c11 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/48: mkdir d5/df/d12 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/49: write d5/f7 [2072951,30157] 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/50: read d5/f8 [2004622,90428] 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/51: dread d5/fd [0,4194304] 0 2026-03-10T08:55:12.622 INFO:tasks.workunit.client.0.vm05.stdout:5/52: creat 
d5/df/d12/f13 x:0 0 0 2026-03-10T08:55:12.623 INFO:tasks.workunit.client.0.vm05.stdout:5/53: dread d5/f7 [0,4194304] 0 2026-03-10T08:55:12.624 INFO:tasks.workunit.client.0.vm05.stdout:5/54: unlink d5/f7 0 2026-03-10T08:55:12.625 INFO:tasks.workunit.client.0.vm05.stdout:5/55: mknod d5/c14 0 2026-03-10T08:55:12.625 INFO:tasks.workunit.client.0.vm05.stdout:5/56: truncate d5/fe 725076 0 2026-03-10T08:55:12.629 INFO:tasks.workunit.client.0.vm05.stdout:5/57: dwrite f2 [0,4194304] 0 2026-03-10T08:55:12.633 INFO:tasks.workunit.client.0.vm05.stdout:5/58: creat d5/df/d12/f15 x:0 0 0 2026-03-10T08:55:12.657 INFO:tasks.workunit.client.1.vm08.stdout:7/276: truncate d0/d11/d1f/d29/d3d/d40/f38 2381496 0 2026-03-10T08:55:12.659 INFO:tasks.workunit.client.1.vm08.stdout:7/277: write d0/d11/d1f/d29/d3d/f59 [716465,90110] 0 2026-03-10T08:55:12.659 INFO:tasks.workunit.client.1.vm08.stdout:3/288: truncate d4/f44 151877 0 2026-03-10T08:55:12.660 INFO:tasks.workunit.client.1.vm08.stdout:7/278: truncate d0/f41 948461 0 2026-03-10T08:55:12.661 INFO:tasks.workunit.client.1.vm08.stdout:3/289: write d4/d15/d8/d2c/f3d [8988715,104092] 0 2026-03-10T08:55:12.662 INFO:tasks.workunit.client.1.vm08.stdout:6/358: truncate f1 2828009 0 2026-03-10T08:55:12.662 INFO:tasks.workunit.client.1.vm08.stdout:7/279: write d0/d14/d43/f58 [651489,89150] 0 2026-03-10T08:55:12.663 INFO:tasks.workunit.client.1.vm08.stdout:6/359: readlink d9/d10/d1e/l7c 0 2026-03-10T08:55:12.663 INFO:tasks.workunit.client.1.vm08.stdout:2/342: truncate d1/d5b/d66/f5e 2466171 0 2026-03-10T08:55:12.664 INFO:tasks.workunit.client.1.vm08.stdout:0/258: dwrite d6/dd/d13/d17/d1f/d2d/d38/f53 [0,4194304] 0 2026-03-10T08:55:12.666 INFO:tasks.workunit.client.1.vm08.stdout:3/290: fsync d4/d15/f3f 0 2026-03-10T08:55:12.667 INFO:tasks.workunit.client.1.vm08.stdout:5/344: truncate d0/d1b/f39 1933629 0 2026-03-10T08:55:12.669 INFO:tasks.workunit.client.1.vm08.stdout:2/343: creat d1/da/d10/d1b/d12/d23/f70 x:0 0 0 2026-03-10T08:55:12.673 
INFO:tasks.workunit.client.1.vm08.stdout:0/259: dread d6/dd/d13/d17/d1f/d2d/d38/f53 [0,4194304] 0 2026-03-10T08:55:12.678 INFO:tasks.workunit.client.1.vm08.stdout:6/360: rename d9/dc/d11/d23/d2c/l68 to d9/d10/d1e/l82 0 2026-03-10T08:55:12.679 INFO:tasks.workunit.client.1.vm08.stdout:6/361: chown d9/d13/c18 0 1 2026-03-10T08:55:12.679 INFO:tasks.workunit.client.1.vm08.stdout:6/362: write f5 [2201891,43197] 0 2026-03-10T08:55:12.679 INFO:tasks.workunit.client.1.vm08.stdout:2/344: truncate d1/da/d10/d1b/d12/d1e/f1f 975717 0 2026-03-10T08:55:12.679 INFO:tasks.workunit.client.1.vm08.stdout:0/260: read d6/dd/d13/d17/d1f/f48 [3860214,34011] 0 2026-03-10T08:55:12.682 INFO:tasks.workunit.client.0.vm05.stdout:6/46: fsync d4/f5 0 2026-03-10T08:55:12.683 INFO:tasks.workunit.client.1.vm08.stdout:0/261: write d6/dd/f3f [1080649,2986] 0 2026-03-10T08:55:12.689 INFO:tasks.workunit.client.1.vm08.stdout:9/306: dread d2/dd/d15/d1e/d21/f3a [0,4194304] 0 2026-03-10T08:55:12.691 INFO:tasks.workunit.client.1.vm08.stdout:0/262: readlink d6/dd/d13/d17/d1f/d20/d2f/d26/l3c 0 2026-03-10T08:55:12.692 INFO:tasks.workunit.client.1.vm08.stdout:0/263: write d6/dd/d13/d17/f1d [2772593,55366] 0 2026-03-10T08:55:12.692 INFO:tasks.workunit.client.1.vm08.stdout:7/280: getdents d0/d11/d4a 0 2026-03-10T08:55:12.693 INFO:tasks.workunit.client.1.vm08.stdout:6/363: mknod d9/dc/d11/d23/d2c/d41/d5d/d80/c83 0 2026-03-10T08:55:12.693 INFO:tasks.workunit.client.1.vm08.stdout:9/307: dwrite d2/dd/d15/d1e/d25/f5f [0,4194304] 0 2026-03-10T08:55:12.697 INFO:tasks.workunit.client.1.vm08.stdout:3/291: getdents d4/d15/d8 0 2026-03-10T08:55:12.698 INFO:tasks.workunit.client.1.vm08.stdout:5/345: getdents d0/d11/d27/d50 0 2026-03-10T08:55:12.701 INFO:tasks.workunit.client.1.vm08.stdout:3/292: dwrite d4/d15/d8/f4e [0,4194304] 0 2026-03-10T08:55:12.701 INFO:tasks.workunit.client.1.vm08.stdout:9/308: mknod d2/d41/c63 0 2026-03-10T08:55:12.709 INFO:tasks.workunit.client.1.vm08.stdout:2/345: getdents d1/d43/d5c 0 
2026-03-10T08:55:12.712 INFO:tasks.workunit.client.1.vm08.stdout:0/264: write d6/dd/d13/d17/d1f/d2d/d39/f3b [959130,57215] 0 2026-03-10T08:55:12.726 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:12 vm05.local ceph-mon[49713]: pgmap v145: 65 pgs: 65 active+clean; 780 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 10 MiB/s rd, 87 MiB/s wr, 340 op/s 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:0/265: readlink d6/dd/d13/l1a 0 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:7/281: dwrite d0/d11/d1f/d2c/f30 [0,4194304] 0 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:0/266: chown d6/dd/d13/d17/d1f/d20/d2f/d24 5 1 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:0/267: dread - d6/dd/d13/d17/d1f/d20/f46 zero size 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:9/309: mknod d2/dd/d15/d1e/d25/c64 0 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:5/346: symlink d0/d11/d27/d68/l6d 0 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:9/310: write d2/dd/d15/d1e/d39/f57 [3637444,589] 0 2026-03-10T08:55:12.732 INFO:tasks.workunit.client.1.vm08.stdout:7/282: readlink d0/d11/d1f/l28 0 2026-03-10T08:55:12.733 INFO:tasks.workunit.client.1.vm08.stdout:9/311: link d2/dd/d15/d1e/d21/c3c d2/d41/c65 0 2026-03-10T08:55:12.733 INFO:tasks.workunit.client.1.vm08.stdout:9/312: stat d2/dd/d15/d1e/d25/f4b 0 2026-03-10T08:55:12.736 INFO:tasks.workunit.client.0.vm05.stdout:5/59: fsync d5/f8 0 2026-03-10T08:55:12.736 INFO:tasks.workunit.client.1.vm08.stdout:7/283: dwrite d0/d11/d1f/d29/d3d/f59 [0,4194304] 0 2026-03-10T08:55:12.736 INFO:tasks.workunit.client.1.vm08.stdout:7/284: stat d0/d11/d4a/c52 0 2026-03-10T08:55:12.743 INFO:tasks.workunit.client.1.vm08.stdout:9/313: mkdir d2/d41/d4c/d66 0 2026-03-10T08:55:12.756 INFO:tasks.workunit.client.1.vm08.stdout:9/314: unlink d2/dd/d15/d1e/d37/l40 0 2026-03-10T08:55:12.756 INFO:tasks.workunit.client.1.vm08.stdout:9/315: creat 
d2/dd/d61/f67 x:0 0 0 2026-03-10T08:55:12.756 INFO:tasks.workunit.client.1.vm08.stdout:9/316: stat d2/d41/d4c 0 2026-03-10T08:55:12.756 INFO:tasks.workunit.client.1.vm08.stdout:9/317: dwrite d2/dd/d15/d1e/d25/f4b [0,4194304] 0 2026-03-10T08:55:12.760 INFO:tasks.workunit.client.1.vm08.stdout:9/318: mkdir d2/d41/d68 0 2026-03-10T08:55:12.761 INFO:tasks.workunit.client.1.vm08.stdout:9/319: chown d2/dd/d15/d4f/l56 0 1 2026-03-10T08:55:12.766 INFO:tasks.workunit.client.1.vm08.stdout:9/320: dread d2/fb [0,4194304] 0 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:7/285: dread d0/d14/f7 [0,4194304] 0 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:9/321: dwrite d2/dd/d15/f22 [0,4194304] 0 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:9/322: chown d2/dd/d15/d1e/d25/f4b 2 1 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:9/323: chown d2/f13 6191116 1 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:7/286: fsync d0/f44 0 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:7/287: dread - d0/d11/d4a/f53 zero size 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:7/288: truncate d0/d11/d1f/d29/d3d/f54 621754 0 2026-03-10T08:55:12.781 INFO:tasks.workunit.client.1.vm08.stdout:7/289: dwrite d0/fe [4194304,4194304] 0 2026-03-10T08:55:12.787 INFO:tasks.workunit.client.1.vm08.stdout:9/324: rmdir d2/d41/d68 0 2026-03-10T08:55:12.790 INFO:tasks.workunit.client.1.vm08.stdout:7/290: rename d0/d11/d1f/d29/d3d/f54 to d0/d11/d1f/d29/d3b/f5b 0 2026-03-10T08:55:12.792 INFO:tasks.workunit.client.1.vm08.stdout:7/291: creat d0/d11/d4a/f5c x:0 0 0 2026-03-10T08:55:12.802 INFO:tasks.workunit.client.1.vm08.stdout:0/268: sync 2026-03-10T08:55:12.806 INFO:tasks.workunit.client.1.vm08.stdout:0/269: dwrite d6/f2c [0,4194304] 0 2026-03-10T08:55:12.809 INFO:tasks.workunit.client.1.vm08.stdout:0/270: mknod d6/dd/d13/d17/d1f/d20/d2f/d26/c55 0 2026-03-10T08:55:12.810 
INFO:tasks.workunit.client.1.vm08.stdout:0/271: mkdir d6/dd/d13/d17/d1f/d20/d2f/d26/d56 0 2026-03-10T08:55:12.819 INFO:tasks.workunit.client.0.vm05.stdout:5/60: sync 2026-03-10T08:55:12.819 INFO:tasks.workunit.client.1.vm08.stdout:9/325: sync 2026-03-10T08:55:12.820 INFO:tasks.workunit.client.0.vm05.stdout:5/61: write d5/df/d12/f15 [669114,87318] 0 2026-03-10T08:55:12.823 INFO:tasks.workunit.client.1.vm08.stdout:9/326: mkdir d2/dd/d15/d1e/d39/d69 0 2026-03-10T08:55:12.824 INFO:tasks.workunit.client.0.vm05.stdout:5/62: dwrite d5/fe [0,4194304] 0 2026-03-10T08:55:12.826 INFO:tasks.workunit.client.1.vm08.stdout:9/327: creat d2/dd/d15/d1e/d37/f6a x:0 0 0 2026-03-10T08:55:12.827 INFO:tasks.workunit.client.1.vm08.stdout:9/328: write d2/dd/d15/d1e/d25/d32/f60 [540868,93006] 0 2026-03-10T08:55:12.829 INFO:tasks.workunit.client.1.vm08.stdout:9/329: unlink d2/dd/l43 0 2026-03-10T08:55:12.832 INFO:tasks.workunit.client.1.vm08.stdout:9/330: fsync d2/dd/d15/f1b 0 2026-03-10T08:55:12.837 INFO:tasks.workunit.client.1.vm08.stdout:9/331: symlink d2/dd/d15/d1e/d25/l6b 0 2026-03-10T08:55:12.839 INFO:tasks.workunit.client.1.vm08.stdout:9/332: symlink d2/d41/d4c/d66/l6c 0 2026-03-10T08:55:12.858 INFO:tasks.workunit.client.0.vm05.stdout:5/63: sync 2026-03-10T08:55:12.859 INFO:tasks.workunit.client.1.vm08.stdout:9/333: sync 2026-03-10T08:55:12.863 INFO:tasks.workunit.client.0.vm05.stdout:5/64: dwrite d5/f9 [4194304,4194304] 0 2026-03-10T08:55:12.870 INFO:tasks.workunit.client.0.vm05.stdout:5/65: fsync d5/f8 0 2026-03-10T08:55:12.870 INFO:tasks.workunit.client.0.vm05.stdout:5/66: truncate d5/fd 1316220 0 2026-03-10T08:55:12.878 INFO:tasks.workunit.client.1.vm08.stdout:9/334: dwrite d2/f13 [0,4194304] 0 2026-03-10T08:55:12.887 INFO:tasks.workunit.client.1.vm08.stdout:9/335: creat d2/d41/d53/f6d x:0 0 0 2026-03-10T08:55:12.888 INFO:tasks.workunit.client.1.vm08.stdout:4/370: truncate d5/de/f50 1254090 0 2026-03-10T08:55:12.890 INFO:tasks.workunit.client.1.vm08.stdout:9/336: mknod 
d2/dd/d61/c6e 0 2026-03-10T08:55:12.890 INFO:tasks.workunit.client.1.vm08.stdout:9/337: readlink d2/dd/l1d 0 2026-03-10T08:55:12.892 INFO:tasks.workunit.client.1.vm08.stdout:9/338: chown d2/dd/d15/d1e/d24/c2c 41494560 1 2026-03-10T08:55:12.893 INFO:tasks.workunit.client.1.vm08.stdout:4/371: rename d5/d2f/f67 to d5/d23/d36/d76/f82 0 2026-03-10T08:55:12.902 INFO:tasks.workunit.client.1.vm08.stdout:9/339: dwrite d2/dd/d15/d1e/d39/d4e/f55 [0,4194304] 0 2026-03-10T08:55:12.934 INFO:tasks.workunit.client.1.vm08.stdout:9/340: dread d2/f4 [0,4194304] 0 2026-03-10T08:55:12.939 INFO:tasks.workunit.client.0.vm05.stdout:5/67: rmdir d5/df 39 2026-03-10T08:55:12.940 INFO:tasks.workunit.client.0.vm05.stdout:5/68: symlink d5/l16 0 2026-03-10T08:55:12.942 INFO:tasks.workunit.client.0.vm05.stdout:5/69: mkdir d5/d17 0 2026-03-10T08:55:12.944 INFO:tasks.workunit.client.0.vm05.stdout:5/70: dwrite d5/f8 [0,4194304] 0 2026-03-10T08:55:12.979 INFO:tasks.workunit.client.0.vm05.stdout:3/52: truncate f7 4955559 0 2026-03-10T08:55:12.986 INFO:tasks.workunit.client.0.vm05.stdout:3/53: creat d9/fb x:0 0 0 2026-03-10T08:55:12.993 INFO:tasks.workunit.client.0.vm05.stdout:5/71: sync 2026-03-10T08:55:12.993 INFO:tasks.workunit.client.0.vm05.stdout:3/54: creat d9/fc x:0 0 0 2026-03-10T08:55:12.996 INFO:tasks.workunit.client.0.vm05.stdout:5/72: rename d5/df/d12 to d5/df/d12/d18 22 2026-03-10T08:55:13.037 INFO:tasks.workunit.client.0.vm05.stdout:6/47: unlink d4/fd 0 2026-03-10T08:55:13.038 INFO:tasks.workunit.client.0.vm05.stdout:6/48: symlink d4/d7/le 0 2026-03-10T08:55:13.039 INFO:tasks.workunit.client.0.vm05.stdout:6/49: creat d4/d7/ff x:0 0 0 2026-03-10T08:55:13.041 INFO:tasks.workunit.client.0.vm05.stdout:6/50: fsync f2 0 2026-03-10T08:55:13.043 INFO:tasks.workunit.client.0.vm05.stdout:6/51: mkdir d4/d7/d10 0 2026-03-10T08:55:13.046 INFO:tasks.workunit.client.0.vm05.stdout:6/52: dread f0 [0,4194304] 0 2026-03-10T08:55:13.047 INFO:tasks.workunit.client.0.vm05.stdout:8/37: write d2/f5 
[9350932,111491] 0 2026-03-10T08:55:13.048 INFO:tasks.workunit.client.0.vm05.stdout:7/42: dwrite f3 [0,4194304] 0 2026-03-10T08:55:13.052 INFO:tasks.workunit.client.0.vm05.stdout:6/53: dread d4/d7/fa [0,4194304] 0 2026-03-10T08:55:13.053 INFO:tasks.workunit.client.0.vm05.stdout:6/54: dread - d4/d7/ff zero size 2026-03-10T08:55:13.058 INFO:tasks.workunit.client.0.vm05.stdout:8/38: mknod d2/db/cc 0 2026-03-10T08:55:13.065 INFO:tasks.workunit.client.0.vm05.stdout:6/55: dwrite d4/fc [4194304,4194304] 0 2026-03-10T08:55:13.066 INFO:tasks.workunit.client.1.vm08.stdout:1/258: fsync d1/da/d4b/f4f 0 2026-03-10T08:55:13.077 INFO:tasks.workunit.client.0.vm05.stdout:6/56: creat d4/f11 x:0 0 0 2026-03-10T08:55:13.081 INFO:tasks.workunit.client.0.vm05.stdout:6/57: dwrite d4/d7/ff [0,4194304] 0 2026-03-10T08:55:13.081 INFO:tasks.workunit.client.0.vm05.stdout:6/58: write d4/fc [2610318,117882] 0 2026-03-10T08:55:13.085 INFO:tasks.workunit.client.0.vm05.stdout:4/36: truncate d0/f4 406395 0 2026-03-10T08:55:13.093 INFO:tasks.workunit.client.0.vm05.stdout:1/75: truncate f2 940967 0 2026-03-10T08:55:13.094 INFO:tasks.workunit.client.0.vm05.stdout:4/37: chown d0/f4 51160029 1 2026-03-10T08:55:13.097 INFO:tasks.workunit.client.0.vm05.stdout:1/76: dwrite dd/f16 [0,4194304] 0 2026-03-10T08:55:13.102 INFO:tasks.workunit.client.0.vm05.stdout:0/44: rmdir df 39 2026-03-10T08:55:13.102 INFO:tasks.workunit.client.0.vm05.stdout:0/45: fsync f5 0 2026-03-10T08:55:13.107 INFO:tasks.workunit.client.0.vm05.stdout:4/38: creat d0/f9 x:0 0 0 2026-03-10T08:55:13.109 INFO:tasks.workunit.client.0.vm05.stdout:1/77: symlink dd/d10/l1b 0 2026-03-10T08:55:13.112 INFO:tasks.workunit.client.0.vm05.stdout:4/39: symlink d0/la 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.1.vm08.stdout:1/259: unlink d1/da/de/d24/c38 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.1.vm08.stdout:1/260: chown d1/da/d20/d3f 0 1 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.1.vm08.stdout:1/261: mknod d1/da/d18/c52 0 
2026-03-10T08:55:13.128 INFO:tasks.workunit.client.1.vm08.stdout:1/262: chown d1/da/d18/d3a/f3c 1 1 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.1.vm08.stdout:1/263: dwrite d1/da/f25 [4194304,4194304] 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:4/40: creat d0/fb x:0 0 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:4/41: stat d0/c3 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:1/78: truncate f6 393851 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:1/79: creat dd/f1c x:0 0 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:0/46: getdents df 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:1/80: getdents dd/d10/d18 0 2026-03-10T08:55:13.128 INFO:tasks.workunit.client.0.vm05.stdout:1/81: creat dd/d10/d19/f1d x:0 0 0 2026-03-10T08:55:13.136 INFO:tasks.workunit.client.1.vm08.stdout:1/264: mkdir d1/da/d18/d53 0 2026-03-10T08:55:13.139 INFO:tasks.workunit.client.1.vm08.stdout:1/265: dwrite d1/da/d4b/d4e/f51 [0,4194304] 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/266: creat d1/da/d20/f54 x:0 0 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/267: readlink d1/da/d20/d3f/l45 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/268: unlink d1/da/de/d24/f37 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/269: mknod d1/da/d20/d4c/c55 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/270: rename d1/da/d18/d3b/d41 to d1/da/de/d24/d3d/d40/d56 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/271: readlink d1/da/l2b 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/272: dwrite d1/da/de/f27 [0,4194304] 0 2026-03-10T08:55:13.160 INFO:tasks.workunit.client.1.vm08.stdout:1/273: dwrite d1/da/de/f1a [4194304,4194304] 0 2026-03-10T08:55:13.165 INFO:tasks.workunit.client.1.vm08.stdout:1/274: creat d1/da/d18/d3a/f57 x:0 0 0 
2026-03-10T08:55:13.169 INFO:tasks.workunit.client.1.vm08.stdout:1/275: chown d1/fc 9250 1 2026-03-10T08:55:13.169 INFO:tasks.workunit.client.1.vm08.stdout:1/276: symlink d1/da/de/d24/d3d/d40/l58 0 2026-03-10T08:55:13.169 INFO:tasks.workunit.client.1.vm08.stdout:1/277: chown d1/da/d4b 36 1 2026-03-10T08:55:13.169 INFO:tasks.workunit.client.1.vm08.stdout:1/278: mknod d1/da/d20/c59 0 2026-03-10T08:55:13.170 INFO:tasks.workunit.client.1.vm08.stdout:1/279: truncate d1/da/f22 1075721 0 2026-03-10T08:55:13.172 INFO:tasks.workunit.client.1.vm08.stdout:1/280: read d1/da/de/f12 [55919,91084] 0 2026-03-10T08:55:13.174 INFO:tasks.workunit.client.0.vm05.stdout:1/82: sync 2026-03-10T08:55:13.191 INFO:tasks.workunit.client.0.vm05.stdout:0/47: sync 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.1.vm08.stdout:1/281: dwrite d1/da/d20/f54 [0,4194304] 0 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.1.vm08.stdout:1/282: dwrite d1/da/d20/f2d [0,4194304] 0 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.0.vm05.stdout:0/48: fdatasync fc 0 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.0.vm05.stdout:0/49: chown c9 14533 1 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.0.vm05.stdout:0/50: creat df/f12 x:0 0 0 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.0.vm05.stdout:0/51: dwrite fe [0,4194304] 0 2026-03-10T08:55:13.192 INFO:tasks.workunit.client.0.vm05.stdout:0/52: dread - fc zero size 2026-03-10T08:55:13.199 INFO:tasks.workunit.client.1.vm08.stdout:1/283: dread d1/da/d4b/f4f [0,4194304] 0 2026-03-10T08:55:13.200 INFO:tasks.workunit.client.1.vm08.stdout:1/284: fdatasync d1/da/d18/d3a/f57 0 2026-03-10T08:55:13.200 INFO:tasks.workunit.client.1.vm08.stdout:1/285: stat d1/da/d20/l36 0 2026-03-10T08:55:13.201 INFO:tasks.workunit.client.1.vm08.stdout:1/286: chown d1/da/de/d24/l33 5 1 2026-03-10T08:55:13.204 INFO:tasks.workunit.client.1.vm08.stdout:1/287: mknod d1/da/de/d24/d3d/d40/d56/c5a 0 2026-03-10T08:55:13.211 INFO:tasks.workunit.client.1.vm08.stdout:1/288: mkdir 
d1/da/de/d24/d3d/d40/d5b 0 2026-03-10T08:55:13.215 INFO:tasks.workunit.client.0.vm05.stdout:1/83: sync 2026-03-10T08:55:13.219 INFO:tasks.workunit.client.0.vm05.stdout:1/84: dwrite dd/f1c [0,4194304] 0 2026-03-10T08:55:13.236 INFO:tasks.workunit.client.1.vm08.stdout:1/289: sync 2026-03-10T08:55:13.239 INFO:tasks.workunit.client.1.vm08.stdout:6/364: chown d9/d10/d1e/l82 5003683 1 2026-03-10T08:55:13.241 INFO:tasks.workunit.client.1.vm08.stdout:6/365: write d9/dc/d11/d23/f5f [2170397,26086] 0 2026-03-10T08:55:13.242 INFO:tasks.workunit.client.1.vm08.stdout:8/385: dwrite d1/d10/d9/dd/d25/d27/d44/f22 [0,4194304] 0 2026-03-10T08:55:13.251 INFO:tasks.workunit.client.1.vm08.stdout:6/366: dwrite d9/dc/d11/f31 [0,4194304] 0 2026-03-10T08:55:13.251 INFO:tasks.workunit.client.1.vm08.stdout:1/290: mkdir d1/da/de/d5c 0 2026-03-10T08:55:13.251 INFO:tasks.workunit.client.1.vm08.stdout:8/386: mknod d1/d10/d9/dd/d25/d27/d44/d21/d51/c8b 0 2026-03-10T08:55:13.251 INFO:tasks.workunit.client.1.vm08.stdout:1/291: mkdir d1/da/de/d24/d26/d5d 0 2026-03-10T08:55:13.252 INFO:tasks.workunit.client.1.vm08.stdout:8/387: symlink d1/d10/d9/dd/d25/d27/d44/d89/l8c 0 2026-03-10T08:55:13.253 INFO:tasks.workunit.client.1.vm08.stdout:1/292: chown d1/da/d20/f21 596 1 2026-03-10T08:55:13.255 INFO:tasks.workunit.client.1.vm08.stdout:1/293: creat d1/da/d18/d3b/f5e x:0 0 0 2026-03-10T08:55:13.255 INFO:tasks.workunit.client.1.vm08.stdout:1/294: stat d1/fc 0 2026-03-10T08:55:13.270 INFO:tasks.workunit.client.0.vm05.stdout:1/85: sync 2026-03-10T08:55:13.272 INFO:tasks.workunit.client.1.vm08.stdout:1/295: dread d1/da/de/f12 [0,4194304] 0 2026-03-10T08:55:13.276 INFO:tasks.workunit.client.0.vm05.stdout:1/86: stat fb 0 2026-03-10T08:55:13.282 INFO:tasks.workunit.client.0.vm05.stdout:1/87: mkdir dd/d10/d18/d1e 0 2026-03-10T08:55:13.282 INFO:tasks.workunit.client.1.vm08.stdout:1/296: symlink d1/da/d18/d3a/l5f 0 2026-03-10T08:55:13.282 INFO:tasks.workunit.client.1.vm08.stdout:1/297: dread d1/da/de/f12 [0,4194304] 0 
2026-03-10T08:55:13.282 INFO:tasks.workunit.client.1.vm08.stdout:1/298: symlink d1/da/d4b/d4e/l60 0 2026-03-10T08:55:13.283 INFO:tasks.workunit.client.1.vm08.stdout:1/299: dwrite d1/da/f25 [4194304,4194304] 0 2026-03-10T08:55:13.286 INFO:tasks.workunit.client.1.vm08.stdout:1/300: creat d1/da/d20/d3f/d49/f61 x:0 0 0 2026-03-10T08:55:13.290 INFO:tasks.workunit.client.1.vm08.stdout:1/301: mkdir d1/da/d18/d3b/d62 0 2026-03-10T08:55:13.291 INFO:tasks.workunit.client.1.vm08.stdout:1/302: mkdir d1/da/d20/d3f/d49/d63 0 2026-03-10T08:55:13.292 INFO:tasks.workunit.client.1.vm08.stdout:1/303: creat d1/da/de/d24/d35/f64 x:0 0 0 2026-03-10T08:55:13.296 INFO:tasks.workunit.client.1.vm08.stdout:1/304: chown d1/fd 46 1 2026-03-10T08:55:13.298 INFO:tasks.workunit.client.1.vm08.stdout:1/305: unlink d1/l14 0 2026-03-10T08:55:13.299 INFO:tasks.workunit.client.1.vm08.stdout:6/367: rename d9/dc/d11/d23/d2c/d41/d5d to d9/dc/d84 0 2026-03-10T08:55:13.300 INFO:tasks.workunit.client.1.vm08.stdout:6/368: chown d9/d10/d1e/l7c 1625153 1 2026-03-10T08:55:13.301 INFO:tasks.workunit.client.1.vm08.stdout:2/346: write d1/da/d10/d42/f58 [4100925,42105] 0 2026-03-10T08:55:13.302 INFO:tasks.workunit.client.1.vm08.stdout:3/293: write d4/d15/d8/d2c/f5a [1285423,87543] 0 2026-03-10T08:55:13.306 INFO:tasks.workunit.client.1.vm08.stdout:1/306: creat d1/f65 x:0 0 0 2026-03-10T08:55:13.309 INFO:tasks.workunit.client.1.vm08.stdout:1/307: symlink d1/da/d18/d3a/l66 0 2026-03-10T08:55:13.309 INFO:tasks.workunit.client.1.vm08.stdout:1/308: creat d1/da/d20/f67 x:0 0 0 2026-03-10T08:55:13.309 INFO:tasks.workunit.client.1.vm08.stdout:1/309: write d1/da/de/f1a [713074,106692] 0 2026-03-10T08:55:13.314 INFO:tasks.workunit.client.1.vm08.stdout:1/310: mkdir d1/da/d20/d3f/d49/d68 0 2026-03-10T08:55:13.316 INFO:tasks.workunit.client.1.vm08.stdout:1/311: dread d1/f1f [0,4194304] 0 2026-03-10T08:55:13.316 INFO:tasks.workunit.client.1.vm08.stdout:1/312: fsync d1/da/d18/f48 0 2026-03-10T08:55:13.321 
INFO:tasks.workunit.client.1.vm08.stdout:1/313: mknod d1/da/de/d24/d3d/c69 0 2026-03-10T08:55:13.323 INFO:tasks.workunit.client.1.vm08.stdout:1/314: mknod d1/da/d18/c6a 0 2026-03-10T08:55:13.327 INFO:tasks.workunit.client.1.vm08.stdout:8/388: dread d1/d10/f3b [0,4194304] 0 2026-03-10T08:55:13.332 INFO:tasks.workunit.client.1.vm08.stdout:8/389: readlink d1/d2c/l35 0 2026-03-10T08:55:13.332 INFO:tasks.workunit.client.1.vm08.stdout:8/390: mknod d1/d10/d9/d8a/c8d 0 2026-03-10T08:55:13.332 INFO:tasks.workunit.client.1.vm08.stdout:8/391: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51/d8e 0 2026-03-10T08:55:13.332 INFO:tasks.workunit.client.1.vm08.stdout:8/392: stat d1/l39 0 2026-03-10T08:55:13.338 INFO:tasks.workunit.client.1.vm08.stdout:1/315: sync 2026-03-10T08:55:13.338 INFO:tasks.workunit.client.1.vm08.stdout:1/316: fdatasync d1/da/d20/f67 0 2026-03-10T08:55:13.346 INFO:tasks.workunit.client.1.vm08.stdout:2/347: dread d1/da/d10/f18 [0,4194304] 0 2026-03-10T08:55:13.346 INFO:tasks.workunit.client.1.vm08.stdout:2/348: chown d1/da/d10/d1b/d12/d23/l68 145 1 2026-03-10T08:55:13.347 INFO:tasks.workunit.client.1.vm08.stdout:2/349: chown d1/da/d10/d1b/d12/d23 1110038 1 2026-03-10T08:55:13.347 INFO:tasks.workunit.client.1.vm08.stdout:2/350: readlink d1/da/d10/d1b/l30 0 2026-03-10T08:55:13.354 INFO:tasks.workunit.client.1.vm08.stdout:2/351: fsync d1/da/d10/d1b/d12/d23/f44 0 2026-03-10T08:55:13.354 INFO:tasks.workunit.client.1.vm08.stdout:8/393: dread d1/d2c/f47 [0,4194304] 0 2026-03-10T08:55:13.356 INFO:tasks.workunit.client.1.vm08.stdout:8/394: creat d1/d10/d9/dd/f8f x:0 0 0 2026-03-10T08:55:13.357 INFO:tasks.workunit.client.1.vm08.stdout:2/352: creat d1/da/d10/d1b/d6a/f71 x:0 0 0 2026-03-10T08:55:13.357 INFO:tasks.workunit.client.1.vm08.stdout:8/395: read d1/f26 [1624046,22307] 0 2026-03-10T08:55:13.357 INFO:tasks.workunit.client.1.vm08.stdout:2/353: dread - d1/da/d10/d2d/f67 zero size 2026-03-10T08:55:13.360 INFO:tasks.workunit.client.1.vm08.stdout:8/396: rename d1/l39 to 
d1/d10/d9/d8a/l90 0 2026-03-10T08:55:13.363 INFO:tasks.workunit.client.1.vm08.stdout:2/354: creat d1/da/d10/d1b/f72 x:0 0 0 2026-03-10T08:55:13.363 INFO:tasks.workunit.client.1.vm08.stdout:8/397: creat d1/d10/d9/dd/f91 x:0 0 0 2026-03-10T08:55:13.373 INFO:tasks.workunit.client.1.vm08.stdout:7/292: truncate d0/f25 3796447 0 2026-03-10T08:55:13.373 INFO:tasks.workunit.client.1.vm08.stdout:8/398: dread d1/d10/d9/f73 [0,4194304] 0 2026-03-10T08:55:13.375 INFO:tasks.workunit.client.1.vm08.stdout:7/293: creat d0/d51/f5d x:0 0 0 2026-03-10T08:55:13.375 INFO:tasks.workunit.client.1.vm08.stdout:8/399: creat d1/d10/d9/dd/d13/f92 x:0 0 0 2026-03-10T08:55:13.375 INFO:tasks.workunit.client.1.vm08.stdout:8/400: stat d1/d10/d9/f5b 0 2026-03-10T08:55:13.377 INFO:tasks.workunit.client.0.vm05.stdout:5/73: fsync d5/f8 0 2026-03-10T08:55:13.378 INFO:tasks.workunit.client.1.vm08.stdout:8/401: dread - d1/d10/d9/dd/d18/d3c/f83 zero size 2026-03-10T08:55:13.378 INFO:tasks.workunit.client.1.vm08.stdout:7/294: rmdir d0/d14 39 2026-03-10T08:55:13.379 INFO:tasks.workunit.client.1.vm08.stdout:7/295: chown d0/d11/d1f/d2c/f33 154717080 1 2026-03-10T08:55:13.380 INFO:tasks.workunit.client.1.vm08.stdout:8/402: creat d1/d10/d9/dd/d25/f93 x:0 0 0 2026-03-10T08:55:13.382 INFO:tasks.workunit.client.0.vm05.stdout:5/74: rename d5/f8 to d5/df/f19 0 2026-03-10T08:55:13.386 INFO:tasks.workunit.client.1.vm08.stdout:8/403: creat d1/d10/d9/f94 x:0 0 0 2026-03-10T08:55:13.388 INFO:tasks.workunit.client.0.vm05.stdout:5/75: write d5/fd [661009,25880] 0 2026-03-10T08:55:13.388 INFO:tasks.workunit.client.0.vm05.stdout:5/76: chown d5 61824 1 2026-03-10T08:55:13.391 INFO:tasks.workunit.client.1.vm08.stdout:7/296: dread d0/d14/d43/f58 [0,4194304] 0 2026-03-10T08:55:13.391 INFO:tasks.workunit.client.1.vm08.stdout:7/297: stat d0/d14/d2f/c42 0 2026-03-10T08:55:13.394 INFO:tasks.workunit.client.1.vm08.stdout:8/404: dread - d1/d10/d9/dd/d18/d34/f57 zero size 2026-03-10T08:55:13.395 
INFO:tasks.workunit.client.0.vm05.stdout:5/77: write d5/df/f19 [1742890,65994] 0 2026-03-10T08:55:13.396 INFO:tasks.workunit.client.0.vm05.stdout:5/78: write d5/fe [2736600,34667] 0 2026-03-10T08:55:13.400 INFO:tasks.workunit.client.0.vm05.stdout:5/79: dwrite d5/fd [0,4194304] 0 2026-03-10T08:55:13.400 INFO:tasks.workunit.client.0.vm05.stdout:5/80: chown d5 648795968 1 2026-03-10T08:55:13.403 INFO:tasks.workunit.client.1.vm08.stdout:0/272: write f4 [4103716,64996] 0 2026-03-10T08:55:13.406 INFO:tasks.workunit.client.0.vm05.stdout:5/81: dread d5/df/d12/f15 [0,4194304] 0 2026-03-10T08:55:13.412 INFO:tasks.workunit.client.1.vm08.stdout:7/298: mkdir d0/d11/d4a/d5e 0 2026-03-10T08:55:13.421 INFO:tasks.workunit.client.0.vm05.stdout:5/82: creat d5/df/d12/f1a x:0 0 0 2026-03-10T08:55:13.421 INFO:tasks.workunit.client.0.vm05.stdout:5/83: truncate d5/fd 4205727 0 2026-03-10T08:55:13.422 INFO:tasks.workunit.client.1.vm08.stdout:0/273: mkdir d6/dd/d13/d17/d1f/d20/d2f/d57 0 2026-03-10T08:55:13.423 INFO:tasks.workunit.client.1.vm08.stdout:0/274: chown d6 58498 1 2026-03-10T08:55:13.423 INFO:tasks.workunit.client.1.vm08.stdout:0/275: write d6/dd/f35 [1346811,24831] 0 2026-03-10T08:55:13.425 INFO:tasks.workunit.client.1.vm08.stdout:7/299: mkdir d0/d11/d4a/d5e/d5f 0 2026-03-10T08:55:13.427 INFO:tasks.workunit.client.1.vm08.stdout:8/405: dread d1/d10/f23 [0,4194304] 0 2026-03-10T08:55:13.429 INFO:tasks.workunit.client.1.vm08.stdout:0/276: rename f4 to d6/dd/d13/d17/d1f/d20/d2f/d57/f58 0 2026-03-10T08:55:13.429 INFO:tasks.workunit.client.1.vm08.stdout:0/277: read - d6/dd/d13/d17/d1f/d2d/f45 zero size 2026-03-10T08:55:13.430 INFO:tasks.workunit.client.0.vm05.stdout:5/84: link d5/df/f19 d5/df/d12/f1b 0 2026-03-10T08:55:13.431 INFO:tasks.workunit.client.1.vm08.stdout:7/300: mknod d0/d14/d2f/c60 0 2026-03-10T08:55:13.432 INFO:tasks.workunit.client.0.vm05.stdout:5/85: unlink f2 0 2026-03-10T08:55:13.432 INFO:tasks.workunit.client.1.vm08.stdout:7/301: dread - d0/d11/d1f/d29/d3b/f4c zero 
size 2026-03-10T08:55:13.433 INFO:tasks.workunit.client.1.vm08.stdout:7/302: chown d0/c2e 58440 1 2026-03-10T08:55:13.433 INFO:tasks.workunit.client.1.vm08.stdout:7/303: chown d0/d14 805 1 2026-03-10T08:55:13.433 INFO:tasks.workunit.client.1.vm08.stdout:7/304: fdatasync d0/d11/d1f/d29/d3b/f4c 0 2026-03-10T08:55:13.435 INFO:tasks.workunit.client.0.vm05.stdout:5/86: creat d5/df/f1c x:0 0 0 2026-03-10T08:55:13.435 INFO:tasks.workunit.client.0.vm05.stdout:5/87: chown d5/fe 16492201 1 2026-03-10T08:55:13.443 INFO:tasks.workunit.client.0.vm05.stdout:5/88: dwrite d5/df/d12/f15 [0,4194304] 0 2026-03-10T08:55:13.443 INFO:tasks.workunit.client.0.vm05.stdout:5/89: dread - d5/df/d12/f1a zero size 2026-03-10T08:55:13.474 INFO:tasks.workunit.client.0.vm05.stdout:5/90: rename d5/l10 to d5/df/l1d 0 2026-03-10T08:55:13.477 INFO:tasks.workunit.client.0.vm05.stdout:5/91: dread d5/df/f19 [0,4194304] 0 2026-03-10T08:55:13.487 INFO:tasks.workunit.client.0.vm05.stdout:5/92: link d5/l16 d5/df/d12/l1e 0 2026-03-10T08:55:13.542 INFO:tasks.workunit.client.0.vm05.stdout:9/38: dwrite f2 [0,4194304] 0 2026-03-10T08:55:13.544 INFO:tasks.workunit.client.1.vm08.stdout:9/341: truncate d2/dd/d15/f44 973415 0 2026-03-10T08:55:13.545 INFO:tasks.workunit.client.1.vm08.stdout:9/342: chown d2/dd/l1d 55 1 2026-03-10T08:55:13.549 INFO:tasks.workunit.client.0.vm05.stdout:2/37: dwrite d0/f2 [0,4194304] 0 2026-03-10T08:55:13.559 INFO:tasks.workunit.client.0.vm05.stdout:3/55: rmdir d9 39 2026-03-10T08:55:13.559 INFO:tasks.workunit.client.0.vm05.stdout:9/39: rmdir d6 39 2026-03-10T08:55:13.561 INFO:tasks.workunit.client.0.vm05.stdout:3/56: stat d9 0 2026-03-10T08:55:13.562 INFO:tasks.workunit.client.0.vm05.stdout:9/40: chown d6/ca 55893443 1 2026-03-10T08:55:13.563 INFO:tasks.workunit.client.0.vm05.stdout:3/57: symlink d9/ld 0 2026-03-10T08:55:13.564 INFO:tasks.workunit.client.0.vm05.stdout:3/58: write d9/fa [665409,68366] 0 2026-03-10T08:55:13.564 INFO:tasks.workunit.client.0.vm05.stdout:3/59: chown d9 13563 1 
2026-03-10T08:55:13.566 INFO:tasks.workunit.client.0.vm05.stdout:3/60: symlink d9/le 0 2026-03-10T08:55:13.567 INFO:tasks.workunit.client.0.vm05.stdout:8/39: truncate d2/f5 1517151 0 2026-03-10T08:55:13.570 INFO:tasks.workunit.client.0.vm05.stdout:8/40: chown d2/l6 120910235 1 2026-03-10T08:55:13.586 INFO:tasks.workunit.client.0.vm05.stdout:8/41: dwrite d2/fa [0,4194304] 0 2026-03-10T08:55:13.588 INFO:tasks.workunit.client.0.vm05.stdout:8/42: mkdir d2/dd 0 2026-03-10T08:55:13.591 INFO:tasks.workunit.client.0.vm05.stdout:6/59: dwrite f2 [0,4194304] 0 2026-03-10T08:55:13.593 INFO:tasks.workunit.client.0.vm05.stdout:8/43: rename d2/db/cc to d2/dd/ce 0 2026-03-10T08:55:13.598 INFO:tasks.workunit.client.1.vm08.stdout:1/317: fsync d1/da/f1e 0 2026-03-10T08:55:13.601 INFO:tasks.workunit.client.1.vm08.stdout:6/369: truncate d9/dc/d11/d23/d2c/d41/f56 191685 0 2026-03-10T08:55:13.602 INFO:tasks.workunit.client.0.vm05.stdout:6/60: rename d4/d7/fa to d4/d7/d10/f12 0 2026-03-10T08:55:13.606 INFO:tasks.workunit.client.1.vm08.stdout:1/318: mkdir d1/da/de/d24/d3d/d40/d56/d6b 0 2026-03-10T08:55:13.619 INFO:tasks.workunit.client.1.vm08.stdout:6/370: rename d9/d10/d1e/d32/f17 to d9/dc/d11/d23/d2c/d81/f85 0 2026-03-10T08:55:13.619 INFO:tasks.workunit.client.1.vm08.stdout:6/371: write d9/dc/d11/d23/d2c/f3d [130976,29987] 0 2026-03-10T08:55:13.619 INFO:tasks.workunit.client.1.vm08.stdout:1/319: dwrite d1/f8 [4194304,4194304] 0 2026-03-10T08:55:13.620 INFO:tasks.workunit.client.0.vm05.stdout:1/88: link f1 dd/d10/d19/f1f 0 2026-03-10T08:55:13.620 INFO:tasks.workunit.client.0.vm05.stdout:1/89: dread - dd/f11 zero size 2026-03-10T08:55:13.620 INFO:tasks.workunit.client.0.vm05.stdout:1/90: fdatasync fb 0 2026-03-10T08:55:13.620 INFO:tasks.workunit.client.0.vm05.stdout:1/91: chown dd 28 1 2026-03-10T08:55:13.620 INFO:tasks.workunit.client.0.vm05.stdout:1/92: dread dd/d10/d19/f1f [0,4194304] 0 2026-03-10T08:55:13.620 INFO:tasks.workunit.client.0.vm05.stdout:6/61: unlink f0 0 
2026-03-10T08:55:13.623 INFO:tasks.workunit.client.0.vm05.stdout:2/38: fsync d0/f2 0 2026-03-10T08:55:13.634 INFO:tasks.workunit.client.1.vm08.stdout:6/372: rename d9/dc/d11/d23/d2c/d81/c5a to d9/d10/d1e/d4c/c86 0 2026-03-10T08:55:13.635 INFO:tasks.workunit.client.1.vm08.stdout:6/373: fdatasync d9/d50/f75 0 2026-03-10T08:55:13.640 INFO:tasks.workunit.client.0.vm05.stdout:1/93: chown fc 26 1 2026-03-10T08:55:13.645 INFO:tasks.workunit.client.0.vm05.stdout:1/94: rename dd/d10/d18/d1e to dd/d10/d18/d20 0 2026-03-10T08:55:13.645 INFO:tasks.workunit.client.0.vm05.stdout:1/95: read dd/f16 [2480172,50414] 0 2026-03-10T08:55:13.646 INFO:tasks.workunit.client.0.vm05.stdout:1/96: dread f1 [0,4194304] 0 2026-03-10T08:55:13.651 INFO:tasks.workunit.client.0.vm05.stdout:4/42: dread d0/f4 [0,4194304] 0 2026-03-10T08:55:13.651 INFO:tasks.workunit.client.0.vm05.stdout:4/43: fdatasync d0/f8 0 2026-03-10T08:55:13.654 INFO:tasks.workunit.client.0.vm05.stdout:4/44: dread d0/f1 [4194304,4194304] 0 2026-03-10T08:55:13.665 INFO:tasks.workunit.client.0.vm05.stdout:4/45: creat d0/fc x:0 0 0 2026-03-10T08:55:13.666 INFO:tasks.workunit.client.0.vm05.stdout:4/46: write d0/f9 [596937,77353] 0 2026-03-10T08:55:13.672 INFO:tasks.workunit.client.0.vm05.stdout:4/47: write d0/f1 [3980700,40772] 0 2026-03-10T08:55:13.672 INFO:tasks.workunit.client.0.vm05.stdout:4/48: truncate d0/fb 194123 0 2026-03-10T08:55:13.673 INFO:tasks.workunit.client.0.vm05.stdout:0/53: truncate df/f11 660929 0 2026-03-10T08:55:13.678 INFO:tasks.workunit.client.0.vm05.stdout:4/49: creat d0/fd x:0 0 0 2026-03-10T08:55:13.678 INFO:tasks.workunit.client.1.vm08.stdout:5/347: dwrite d0/d1b/f39 [0,4194304] 0 2026-03-10T08:55:13.680 INFO:tasks.workunit.client.1.vm08.stdout:5/348: stat d0/d11/d27/f3d 0 2026-03-10T08:55:13.689 INFO:tasks.workunit.client.0.vm05.stdout:4/50: rename d0/fd to d0/fe 0 2026-03-10T08:55:13.689 INFO:tasks.workunit.client.0.vm05.stdout:4/51: fdatasync d0/fb 0 2026-03-10T08:55:13.689 
INFO:tasks.workunit.client.0.vm05.stdout:4/52: dread - d0/fe zero size 2026-03-10T08:55:13.691 INFO:tasks.workunit.client.0.vm05.stdout:0/54: link fc df/f13 0 2026-03-10T08:55:13.694 INFO:tasks.workunit.client.0.vm05.stdout:4/53: mknod d0/cf 0 2026-03-10T08:55:13.698 INFO:tasks.workunit.client.0.vm05.stdout:4/54: dwrite d0/fc [0,4194304] 0 2026-03-10T08:55:13.699 INFO:tasks.workunit.client.0.vm05.stdout:4/55: readlink d0/l5 0 2026-03-10T08:55:13.699 INFO:tasks.workunit.client.0.vm05.stdout:4/56: write d0/f1 [5672061,15725] 0 2026-03-10T08:55:13.699 INFO:tasks.workunit.client.0.vm05.stdout:0/55: stat c8 0 2026-03-10T08:55:13.699 INFO:tasks.workunit.client.0.vm05.stdout:0/56: fdatasync fe 0 2026-03-10T08:55:13.700 INFO:tasks.workunit.client.0.vm05.stdout:4/57: stat d0/l6 0 2026-03-10T08:55:13.700 INFO:tasks.workunit.client.0.vm05.stdout:4/58: fsync d0/f9 0 2026-03-10T08:55:13.711 INFO:tasks.workunit.client.0.vm05.stdout:0/57: mknod df/c14 0 2026-03-10T08:55:13.712 INFO:tasks.workunit.client.0.vm05.stdout:4/59: creat d0/f10 x:0 0 0 2026-03-10T08:55:13.714 INFO:tasks.workunit.client.0.vm05.stdout:4/60: symlink d0/l11 0 2026-03-10T08:55:13.715 INFO:tasks.workunit.client.0.vm05.stdout:4/61: write d0/f9 [1024262,90635] 0 2026-03-10T08:55:13.715 INFO:tasks.workunit.client.1.vm08.stdout:3/294: dwrite d4/d15/d8/ff [0,4194304] 0 2026-03-10T08:55:13.717 INFO:tasks.workunit.client.1.vm08.stdout:3/295: dread - d4/d15/d8/d2a/f4d zero size 2026-03-10T08:55:13.725 INFO:tasks.workunit.client.0.vm05.stdout:0/58: fdatasync df/f11 0 2026-03-10T08:55:13.728 INFO:tasks.workunit.client.0.vm05.stdout:0/59: write fc [816292,94249] 0 2026-03-10T08:55:13.729 INFO:tasks.workunit.client.0.vm05.stdout:0/60: write fe [1768029,87142] 0 2026-03-10T08:55:13.736 INFO:tasks.workunit.client.1.vm08.stdout:3/296: creat d4/d15/d8/d2c/d55/f60 x:0 0 0 2026-03-10T08:55:13.740 INFO:tasks.workunit.client.1.vm08.stdout:3/297: dwrite d4/d15/d8/d2a/f4d [0,4194304] 0 2026-03-10T08:55:13.741 
INFO:tasks.workunit.client.1.vm08.stdout:2/355: rename d1/da/fb to d1/da/d10/d1b/d6a/f73 0 2026-03-10T08:55:13.746 INFO:tasks.workunit.client.1.vm08.stdout:3/298: truncate d4/d15/f4b 4243833 0 2026-03-10T08:55:13.754 INFO:tasks.workunit.client.1.vm08.stdout:3/299: rename d4/d15/f5e to d4/d15/d8/d2c/d55/f61 0 2026-03-10T08:55:13.756 INFO:tasks.workunit.client.1.vm08.stdout:2/356: rename d1/da/d10/d1b/d12/d23/f57 to d1/d43/d4f/f74 0 2026-03-10T08:55:13.757 INFO:tasks.workunit.client.1.vm08.stdout:7/305: getdents d0/d11/d4a 0 2026-03-10T08:55:13.759 INFO:tasks.workunit.client.1.vm08.stdout:7/306: mknod d0/d14/d43/c61 0 2026-03-10T08:55:13.760 INFO:tasks.workunit.client.1.vm08.stdout:7/307: dread d0/d11/f39 [0,4194304] 0 2026-03-10T08:55:13.762 INFO:tasks.workunit.client.1.vm08.stdout:7/308: mkdir d0/d14/d43/d62 0 2026-03-10T08:55:13.767 INFO:tasks.workunit.client.1.vm08.stdout:2/357: dread d1/da/d10/d1b/f28 [0,4194304] 0 2026-03-10T08:55:13.767 INFO:tasks.workunit.client.1.vm08.stdout:7/309: chown d0/d11/d1f/l34 111 1 2026-03-10T08:55:13.768 INFO:tasks.workunit.client.1.vm08.stdout:2/358: chown d1/da/f64 1 1 2026-03-10T08:55:13.769 INFO:tasks.workunit.client.1.vm08.stdout:8/406: dwrite d1/d10/d9/dd/d18/d3c/f4e [0,4194304] 0 2026-03-10T08:55:13.779 INFO:tasks.workunit.client.1.vm08.stdout:8/407: fsync d1/d10/d9/dd/d25/d27/d44/d21/f32 0 2026-03-10T08:55:13.781 INFO:tasks.workunit.client.1.vm08.stdout:2/359: unlink d1/da/d10/d1b/l46 0 2026-03-10T08:55:13.781 INFO:tasks.workunit.client.1.vm08.stdout:8/408: read d1/d10/d9/dd/d25/d27/f52 [1127596,108467] 0 2026-03-10T08:55:13.782 INFO:tasks.workunit.client.0.vm05.stdout:5/93: stat d5/df/l1d 0 2026-03-10T08:55:13.783 INFO:tasks.workunit.client.1.vm08.stdout:8/409: creat d1/d10/d9/d8a/f95 x:0 0 0 2026-03-10T08:55:13.785 INFO:tasks.workunit.client.1.vm08.stdout:8/410: symlink d1/d10/d9/dd/d13/d40/l96 0 2026-03-10T08:55:13.786 INFO:tasks.workunit.client.1.vm08.stdout:4/372: write d5/de/f50 [2197017,76495] 0 
2026-03-10T08:55:13.789 INFO:tasks.workunit.client.1.vm08.stdout:8/411: getdents d1/d10/d9/dd/d18/d34 0 2026-03-10T08:55:13.790 INFO:tasks.workunit.client.1.vm08.stdout:8/412: readlink d1/d10/d9/dd/d25/d27/d44/l4b 0 2026-03-10T08:55:13.790 INFO:tasks.workunit.client.1.vm08.stdout:4/373: dread d5/d23/d36/f51 [0,4194304] 0 2026-03-10T08:55:13.790 INFO:tasks.workunit.client.1.vm08.stdout:9/343: truncate d2/f4 2112887 0 2026-03-10T08:55:13.792 INFO:tasks.workunit.client.1.vm08.stdout:9/344: mknod d2/dd/d15/d1e/d25/d32/c6f 0 2026-03-10T08:55:13.792 INFO:tasks.workunit.client.1.vm08.stdout:9/345: fsync d2/f35 0 2026-03-10T08:55:13.793 INFO:tasks.workunit.client.1.vm08.stdout:8/413: getdents d1/d10/d9/dd 0 2026-03-10T08:55:13.794 INFO:tasks.workunit.client.0.vm05.stdout:7/43: truncate f8 401548 0 2026-03-10T08:55:13.799 INFO:tasks.workunit.client.1.vm08.stdout:9/346: fdatasync d2/f4 0 2026-03-10T08:55:13.800 INFO:tasks.workunit.client.1.vm08.stdout:5/349: symlink d0/d40/d4b/l6e 0 2026-03-10T08:55:13.800 INFO:tasks.workunit.client.0.vm05.stdout:9/41: write f5 [535755,93007] 0 2026-03-10T08:55:13.801 INFO:tasks.workunit.client.1.vm08.stdout:5/350: truncate d0/d11/f25 1136368 0 2026-03-10T08:55:13.802 INFO:tasks.workunit.client.1.vm08.stdout:8/414: dwrite d1/d10/d9/dd/f70 [0,4194304] 0 2026-03-10T08:55:13.804 INFO:tasks.workunit.client.0.vm05.stdout:9/42: creat d6/fb x:0 0 0 2026-03-10T08:55:13.804 INFO:tasks.workunit.client.1.vm08.stdout:9/347: creat d2/dd/d15/d1e/d25/d32/d5c/f70 x:0 0 0 2026-03-10T08:55:13.805 INFO:tasks.workunit.client.0.vm05.stdout:9/43: dread - d6/fb zero size 2026-03-10T08:55:13.806 INFO:tasks.workunit.client.0.vm05.stdout:3/61: truncate d9/fa 400245 0 2026-03-10T08:55:13.807 INFO:tasks.workunit.client.0.vm05.stdout:4/62: sync 2026-03-10T08:55:13.809 INFO:tasks.workunit.client.0.vm05.stdout:3/62: write d9/fc [758998,118904] 0 2026-03-10T08:55:13.822 INFO:tasks.workunit.client.0.vm05.stdout:9/44: mknod d6/cc 0 2026-03-10T08:55:13.828 
INFO:tasks.workunit.client.0.vm05.stdout:9/45: dwrite f4 [0,4194304] 0 2026-03-10T08:55:13.829 INFO:tasks.workunit.client.0.vm05.stdout:8/44: getdents d2/dd 0 2026-03-10T08:55:13.830 INFO:tasks.workunit.client.0.vm05.stdout:9/46: write f5 [643205,7809] 0 2026-03-10T08:55:13.845 INFO:tasks.workunit.client.0.vm05.stdout:1/97: fsync f2 0 2026-03-10T08:55:13.852 INFO:tasks.workunit.client.0.vm05.stdout:4/63: sync 2026-03-10T08:55:13.852 INFO:tasks.workunit.client.0.vm05.stdout:4/64: chown d0/fb 28 1 2026-03-10T08:55:13.853 INFO:tasks.workunit.client.0.vm05.stdout:4/65: chown d0/f1 3 1 2026-03-10T08:55:13.855 INFO:tasks.workunit.client.0.vm05.stdout:4/66: dread d0/fc [0,4194304] 0 2026-03-10T08:55:13.857 INFO:tasks.workunit.client.0.vm05.stdout:2/39: truncate d0/f4 766006 0 2026-03-10T08:55:13.858 INFO:tasks.workunit.client.0.vm05.stdout:2/40: chown d0/c6 1640664 1 2026-03-10T08:55:13.860 INFO:tasks.workunit.client.0.vm05.stdout:2/41: dread d0/f2 [0,4194304] 0 2026-03-10T08:55:13.862 INFO:tasks.workunit.client.0.vm05.stdout:3/63: rename f6 to d9/ff 0 2026-03-10T08:55:13.868 INFO:tasks.workunit.client.0.vm05.stdout:3/64: chown d9/ff 14870 1 2026-03-10T08:55:13.868 INFO:tasks.workunit.client.0.vm05.stdout:3/65: rename d9 to d9/d10 22 2026-03-10T08:55:13.868 INFO:tasks.workunit.client.0.vm05.stdout:9/47: dread d6/f7 [0,4194304] 0 2026-03-10T08:55:13.868 INFO:tasks.workunit.client.0.vm05.stdout:1/98: mkdir dd/d21 0 2026-03-10T08:55:13.871 INFO:tasks.workunit.client.0.vm05.stdout:4/67: unlink d0/cf 0 2026-03-10T08:55:13.873 INFO:tasks.workunit.client.0.vm05.stdout:0/61: dread df/f11 [0,4194304] 0 2026-03-10T08:55:13.877 INFO:tasks.workunit.client.0.vm05.stdout:0/62: dwrite df/f13 [0,4194304] 0 2026-03-10T08:55:13.877 INFO:tasks.workunit.client.0.vm05.stdout:0/63: readlink - no filename 2026-03-10T08:55:13.884 INFO:tasks.workunit.client.0.vm05.stdout:0/64: dwrite f5 [0,4194304] 0 2026-03-10T08:55:13.890 INFO:tasks.workunit.client.0.vm05.stdout:0/65: dwrite df/f12 [0,4194304] 
0 2026-03-10T08:55:13.908 INFO:tasks.workunit.client.0.vm05.stdout:2/42: mkdir d0/d9 0 2026-03-10T08:55:13.909 INFO:tasks.workunit.client.0.vm05.stdout:2/43: read d0/f8 [3654221,113569] 0 2026-03-10T08:55:13.912 INFO:tasks.workunit.client.0.vm05.stdout:3/66: dwrite d9/ff [0,4194304] 0 2026-03-10T08:55:13.927 INFO:tasks.workunit.client.0.vm05.stdout:1/99: creat dd/d10/f22 x:0 0 0 2026-03-10T08:55:13.927 INFO:tasks.workunit.client.0.vm05.stdout:1/100: truncate dd/d10/f22 910926 0 2026-03-10T08:55:13.927 INFO:tasks.workunit.client.0.vm05.stdout:4/68: creat d0/f12 x:0 0 0 2026-03-10T08:55:13.934 INFO:tasks.workunit.client.0.vm05.stdout:3/67: read - f2 zero size 2026-03-10T08:55:13.935 INFO:tasks.workunit.client.0.vm05.stdout:8/45: creat d2/ff x:0 0 0 2026-03-10T08:55:13.938 INFO:tasks.workunit.client.0.vm05.stdout:3/68: dwrite d9/fc [0,4194304] 0 2026-03-10T08:55:13.939 INFO:tasks.workunit.client.1.vm08.stdout:1/320: rmdir d1/da/de/d24/d3d/d40 39 2026-03-10T08:55:13.944 INFO:tasks.workunit.client.0.vm05.stdout:9/48: symlink d6/ld 0 2026-03-10T08:55:13.944 INFO:tasks.workunit.client.0.vm05.stdout:1/101: write f1 [163522,7414] 0 2026-03-10T08:55:13.944 INFO:tasks.workunit.client.0.vm05.stdout:9/49: write d6/f7 [688680,20872] 0 2026-03-10T08:55:13.949 INFO:tasks.workunit.client.0.vm05.stdout:1/102: dwrite dd/d10/d19/f1f [0,4194304] 0 2026-03-10T08:55:13.960 INFO:tasks.workunit.client.1.vm08.stdout:1/321: mknod d1/da/de/d24/d35/d43/c6c 0 2026-03-10T08:55:13.961 INFO:tasks.workunit.client.0.vm05.stdout:4/69: unlink d0/f4 0 2026-03-10T08:55:13.966 INFO:tasks.workunit.client.1.vm08.stdout:3/300: write d4/d15/d8/d2c/d55/f60 [146232,5569] 0 2026-03-10T08:55:13.968 INFO:tasks.workunit.client.0.vm05.stdout:8/46: mknod d2/db/c10 0 2026-03-10T08:55:13.969 INFO:tasks.workunit.client.0.vm05.stdout:3/69: unlink d9/fb 0 2026-03-10T08:55:13.972 INFO:tasks.workunit.client.0.vm05.stdout:1/103: creat dd/d10/d18/f23 x:0 0 0 2026-03-10T08:55:13.973 
INFO:tasks.workunit.client.0.vm05.stdout:4/70: rename d0/f12 to d0/f13 0 2026-03-10T08:55:13.976 INFO:tasks.workunit.client.0.vm05.stdout:8/47: mknod d2/db/c11 0 2026-03-10T08:55:13.978 INFO:tasks.workunit.client.0.vm05.stdout:3/70: creat d9/f11 x:0 0 0 2026-03-10T08:55:13.979 INFO:tasks.workunit.client.1.vm08.stdout:0/278: creat d6/dd/d13/d17/d1f/d20/d2f/f59 x:0 0 0 2026-03-10T08:55:13.983 INFO:tasks.workunit.client.0.vm05.stdout:1/104: creat dd/d10/d19/f24 x:0 0 0 2026-03-10T08:55:13.983 INFO:tasks.workunit.client.0.vm05.stdout:1/105: chown dd/d10/d19/f1f 81106 1 2026-03-10T08:55:13.983 INFO:tasks.workunit.client.0.vm05.stdout:1/106: stat dd/d10/l1b 0 2026-03-10T08:55:13.983 INFO:tasks.workunit.client.1.vm08.stdout:0/279: chown d6/dd/d13/d17/f29 4 1 2026-03-10T08:55:13.983 INFO:tasks.workunit.client.1.vm08.stdout:0/280: chown d6/dd/d13/d17/d1f/d20/f46 7620 1 2026-03-10T08:55:13.983 INFO:tasks.workunit.client.1.vm08.stdout:0/281: chown f5 366 1 2026-03-10T08:55:13.985 INFO:tasks.workunit.client.0.vm05.stdout:8/48: mknod d2/dd/c12 0 2026-03-10T08:55:13.985 INFO:tasks.workunit.client.0.vm05.stdout:3/71: creat d9/f12 x:0 0 0 2026-03-10T08:55:13.987 INFO:tasks.workunit.client.0.vm05.stdout:9/50: rename f5 to d6/fe 0 2026-03-10T08:55:13.988 INFO:tasks.workunit.client.0.vm05.stdout:4/71: dwrite d0/fc [0,4194304] 0 2026-03-10T08:55:13.989 INFO:tasks.workunit.client.0.vm05.stdout:8/49: dread d2/fa [0,4194304] 0 2026-03-10T08:55:13.989 INFO:tasks.workunit.client.0.vm05.stdout:1/107: symlink dd/d13/l25 0 2026-03-10T08:55:13.991 INFO:tasks.workunit.client.0.vm05.stdout:1/108: write dd/d10/d18/f23 [678029,114512] 0 2026-03-10T08:55:13.994 INFO:tasks.workunit.client.1.vm08.stdout:8/415: dwrite d1/d10/d9/dd/d25/d27/d44/d21/f32 [0,4194304] 0 2026-03-10T08:55:13.995 INFO:tasks.workunit.client.0.vm05.stdout:4/72: dread d0/fc [0,4194304] 0 2026-03-10T08:55:13.996 INFO:tasks.workunit.client.0.vm05.stdout:1/109: dwrite dd/f11 [0,4194304] 0 2026-03-10T08:55:13.998 
INFO:tasks.workunit.client.0.vm05.stdout:1/110: write f2 [3953022,9117] 0 2026-03-10T08:55:14.003 INFO:tasks.workunit.client.1.vm08.stdout:2/360: write d1/da/f64 [319346,127373] 0 2026-03-10T08:55:14.007 INFO:tasks.workunit.client.0.vm05.stdout:3/72: fdatasync f1 0 2026-03-10T08:55:14.012 INFO:tasks.workunit.client.0.vm05.stdout:9/51: dwrite d6/fe [0,4194304] 0 2026-03-10T08:55:14.013 INFO:tasks.workunit.client.0.vm05.stdout:9/52: dread - d6/fb zero size 2026-03-10T08:55:14.019 INFO:tasks.workunit.client.0.vm05.stdout:4/73: dread d0/f9 [0,4194304] 0 2026-03-10T08:55:14.031 INFO:tasks.workunit.client.0.vm05.stdout:3/73: creat d9/f13 x:0 0 0 2026-03-10T08:55:14.032 INFO:tasks.workunit.client.0.vm05.stdout:4/74: mknod d0/c14 0 2026-03-10T08:55:14.033 INFO:tasks.workunit.client.0.vm05.stdout:1/111: creat dd/d21/f26 x:0 0 0 2026-03-10T08:55:14.036 INFO:tasks.workunit.client.0.vm05.stdout:3/74: symlink d9/l14 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.1.vm08.stdout:2/361: dread d1/fd [0,4194304] 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.1.vm08.stdout:2/362: fdatasync d1/d43/f5d 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.1.vm08.stdout:2/363: dread - d1/d5b/d66/f62 zero size 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.0.vm05.stdout:9/53: symlink d6/lf 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.0.vm05.stdout:1/112: mkdir dd/d10/d19/d27 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.0.vm05.stdout:4/75: dwrite d0/fb [0,4194304] 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.0.vm05.stdout:3/75: rename d9/ld to d9/l15 0 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.0.vm05.stdout:1/113: dread - fa zero size 2026-03-10T08:55:14.043 INFO:tasks.workunit.client.0.vm05.stdout:4/76: truncate d0/f13 584489 0 2026-03-10T08:55:14.044 INFO:tasks.workunit.client.1.vm08.stdout:2/364: readlink d1/da/d10/d1b/d12/d22/l33 0 2026-03-10T08:55:14.045 INFO:tasks.workunit.client.1.vm08.stdout:2/365: stat d1/da/d10/d1b/d12/d1e/c32 0 
2026-03-10T08:55:14.051 INFO:tasks.workunit.client.0.vm05.stdout:1/114: dwrite dd/f11 [0,4194304] 0 2026-03-10T08:55:14.059 INFO:tasks.workunit.client.0.vm05.stdout:4/77: read d0/f9 [710726,111174] 0 2026-03-10T08:55:14.072 INFO:tasks.workunit.client.0.vm05.stdout:4/78: dread - d0/f10 zero size 2026-03-10T08:55:14.072 INFO:tasks.workunit.client.0.vm05.stdout:4/79: mkdir d0/d15 0 2026-03-10T08:55:14.072 INFO:tasks.workunit.client.0.vm05.stdout:1/115: rename dd/c1a to dd/d10/c28 0 2026-03-10T08:55:14.072 INFO:tasks.workunit.client.0.vm05.stdout:1/116: mknod dd/d10/c29 0 2026-03-10T08:55:14.072 INFO:tasks.workunit.client.0.vm05.stdout:1/117: dread f7 [0,4194304] 0 2026-03-10T08:55:14.077 INFO:tasks.workunit.client.1.vm08.stdout:3/301: dread d4/d15/d8/d2c/d55/f60 [0,4194304] 0 2026-03-10T08:55:14.078 INFO:tasks.workunit.client.0.vm05.stdout:1/118: creat dd/d10/d19/d27/f2a x:0 0 0 2026-03-10T08:55:14.081 INFO:tasks.workunit.client.0.vm05.stdout:1/119: rename dd/d13/l15 to dd/d10/d19/d27/l2b 0 2026-03-10T08:55:14.081 INFO:tasks.workunit.client.0.vm05.stdout:1/120: readlink dd/l12 0 2026-03-10T08:55:14.087 INFO:tasks.workunit.client.0.vm05.stdout:1/121: rename l9 to dd/d10/d19/d27/l2c 0 2026-03-10T08:55:14.088 INFO:tasks.workunit.client.0.vm05.stdout:1/122: mkdir dd/d10/d18/d2d 0 2026-03-10T08:55:14.089 INFO:tasks.workunit.client.0.vm05.stdout:1/123: creat dd/d10/d19/f2e x:0 0 0 2026-03-10T08:55:14.092 INFO:tasks.workunit.client.1.vm08.stdout:2/366: symlink d1/da/d10/d1b/d12/l75 0 2026-03-10T08:55:14.092 INFO:tasks.workunit.client.1.vm08.stdout:3/302: creat d4/d15/d8/d1d/f62 x:0 0 0 2026-03-10T08:55:14.093 INFO:tasks.workunit.client.1.vm08.stdout:3/303: fdatasync d4/f18 0 2026-03-10T08:55:14.093 INFO:tasks.workunit.client.1.vm08.stdout:3/304: stat d4/d15 0 2026-03-10T08:55:14.094 INFO:tasks.workunit.client.1.vm08.stdout:2/367: fsync d1/da/d10/d1b/d12/d23/f70 0 2026-03-10T08:55:14.094 INFO:tasks.workunit.client.1.vm08.stdout:3/305: dread - d4/d15/d8/f53 zero size 
2026-03-10T08:55:14.094 INFO:tasks.workunit.client.1.vm08.stdout:2/368: stat d1/d43 0 2026-03-10T08:55:14.095 INFO:tasks.workunit.client.1.vm08.stdout:3/306: chown d4/d15/d17/c49 988 1 2026-03-10T08:55:14.096 INFO:tasks.workunit.client.1.vm08.stdout:3/307: chown d4/d15/fc 11683 1 2026-03-10T08:55:14.099 INFO:tasks.workunit.client.1.vm08.stdout:3/308: dwrite d4/d15/d8/d1d/f2d [0,4194304] 0 2026-03-10T08:55:14.100 INFO:tasks.workunit.client.1.vm08.stdout:3/309: fsync d4/d15/d8/f37 0 2026-03-10T08:55:14.104 INFO:tasks.workunit.client.1.vm08.stdout:3/310: dwrite d4/d15/d17/f5c [0,4194304] 0 2026-03-10T08:55:14.107 INFO:tasks.workunit.client.1.vm08.stdout:3/311: dread - d4/f47 zero size 2026-03-10T08:55:14.120 INFO:tasks.workunit.client.1.vm08.stdout:3/312: creat d4/d15/d8/d2a/f63 x:0 0 0 2026-03-10T08:55:14.121 INFO:tasks.workunit.client.1.vm08.stdout:3/313: rmdir d4/d15/d8/d2c/d55 39 2026-03-10T08:55:14.121 INFO:tasks.workunit.client.1.vm08.stdout:3/314: fsync d4/d15/d17/f59 0 2026-03-10T08:55:14.123 INFO:tasks.workunit.client.1.vm08.stdout:3/315: mknod d4/d15/d8/d1d/c64 0 2026-03-10T08:55:14.123 INFO:tasks.workunit.client.1.vm08.stdout:3/316: chown d4/d15/d8/d1d/f2d 43970 1 2026-03-10T08:55:14.132 INFO:tasks.workunit.client.1.vm08.stdout:3/317: fsync d4/d15/fa 0 2026-03-10T08:55:14.140 INFO:tasks.workunit.client.1.vm08.stdout:5/351: link d0/d11/d27/f3b d0/d40/f6f 0 2026-03-10T08:55:14.155 INFO:tasks.workunit.client.1.vm08.stdout:5/352: mknod d0/d11/d27/d68/c70 0 2026-03-10T08:55:14.158 INFO:tasks.workunit.client.1.vm08.stdout:5/353: dread d0/d11/f25 [0,4194304] 0 2026-03-10T08:55:14.161 INFO:tasks.workunit.client.0.vm05.stdout:2/44: rmdir d0 39 2026-03-10T08:55:14.162 INFO:tasks.workunit.client.0.vm05.stdout:6/62: write d4/f5 [3713289,102951] 0 2026-03-10T08:55:14.166 INFO:tasks.workunit.client.1.vm08.stdout:5/354: creat d0/d40/d4b/d4e/f71 x:0 0 0 2026-03-10T08:55:14.167 INFO:tasks.workunit.client.1.vm08.stdout:5/355: fdatasync d0/d11/d3e/d45/f4a 0 
2026-03-10T08:55:14.170 INFO:tasks.workunit.client.1.vm08.stdout:5/356: dwrite d0/d1b/f39 [0,4194304] 0
2026-03-10T08:55:14.173 INFO:tasks.workunit.client.1.vm08.stdout:5/357: truncate d0/d11/f60 910829 0
2026-03-10T08:55:14.176 INFO:tasks.workunit.client.1.vm08.stdout:5/358: creat d0/d11/d18/f72 x:0 0 0
2026-03-10T08:55:14.181 INFO:tasks.workunit.client.1.vm08.stdout:5/359: dwrite d0/d40/d4b/d4e/f71 [0,4194304] 0
2026-03-10T08:55:14.182 INFO:tasks.workunit.client.1.vm08.stdout:5/360: stat d0/d1b 0
2026-03-10T08:55:14.188 INFO:tasks.workunit.client.1.vm08.stdout:5/361: creat d0/d11/d3e/f73 x:0 0 0
2026-03-10T08:55:14.197 INFO:tasks.workunit.client.1.vm08.stdout:6/374: mknod d9/c87 0
2026-03-10T08:55:14.198 INFO:tasks.workunit.client.1.vm08.stdout:6/375: write d9/dc/d11/f47 [3167907,25562] 0
2026-03-10T08:55:14.199 INFO:tasks.workunit.client.1.vm08.stdout:6/376: fdatasync d9/d13/d4e/f6b 0
2026-03-10T08:55:14.203 INFO:tasks.workunit.client.1.vm08.stdout:1/322: rmdir d1/da/d18/d53 0
2026-03-10T08:55:14.205 INFO:tasks.workunit.client.0.vm05.stdout:6/63: truncate d4/d7/d10/f12 1713653 0
2026-03-10T08:55:14.207 INFO:tasks.workunit.client.1.vm08.stdout:1/323: mkdir d1/da/de/d24/d35/d6d 0
2026-03-10T08:55:14.207 INFO:tasks.workunit.client.0.vm05.stdout:0/66: truncate df/f13 2837359 0
2026-03-10T08:55:14.209 INFO:tasks.workunit.client.0.vm05.stdout:0/67: write f5 [606175,100979] 0
2026-03-10T08:55:14.211 INFO:tasks.workunit.client.0.vm05.stdout:6/64: mknod d4/d7/c13 0
2026-03-10T08:55:14.213 INFO:tasks.workunit.client.0.vm05.stdout:0/68: creat df/f15 x:0 0 0
2026-03-10T08:55:14.217 INFO:tasks.workunit.client.1.vm08.stdout:1/324: mknod d1/da/de/d24/d3d/d40/d56/c6e 0
2026-03-10T08:55:14.217 INFO:tasks.workunit.client.0.vm05.stdout:0/69: mknod df/c16 0
2026-03-10T08:55:14.217 INFO:tasks.workunit.client.1.vm08.stdout:1/325: chown d1/da/de/d24/d35 93 1
2026-03-10T08:55:14.220 INFO:tasks.workunit.client.1.vm08.stdout:1/326: fsync d1/fd 0
2026-03-10T08:55:14.221 INFO:tasks.workunit.client.1.vm08.stdout:6/377: dread d9/dc/d11/d23/d2c/f3d [0,4194304] 0
2026-03-10T08:55:14.223 INFO:tasks.workunit.client.1.vm08.stdout:7/310: rename d0/d11/c15 to d0/d11/d1f/d29/d3d/d40/c63 0
2026-03-10T08:55:14.223 INFO:tasks.workunit.client.0.vm05.stdout:4/80: unlink d0/f13 0
2026-03-10T08:55:14.224 INFO:tasks.workunit.client.1.vm08.stdout:1/327: rmdir d1/da/d20/d4c 39
2026-03-10T08:55:14.225 INFO:tasks.workunit.client.0.vm05.stdout:4/81: creat d0/f16 x:0 0 0
2026-03-10T08:55:14.227 INFO:tasks.workunit.client.0.vm05.stdout:8/50: getdents d2/db 0
2026-03-10T08:55:14.227 INFO:tasks.workunit.client.1.vm08.stdout:4/374: rename d5/d2f/d5a/d75 to d5/d23/d49/d83 0
2026-03-10T08:55:14.227 INFO:tasks.workunit.client.0.vm05.stdout:8/51: stat d2/db/c11 0
2026-03-10T08:55:14.228 INFO:tasks.workunit.client.1.vm08.stdout:7/311: write d0/d14/f7 [3949536,88080] 0
2026-03-10T08:55:14.229 INFO:tasks.workunit.client.0.vm05.stdout:8/52: symlink d2/dd/l13 0
2026-03-10T08:55:14.230 INFO:tasks.workunit.client.1.vm08.stdout:7/312: write d0/d14/f7 [4627390,79191] 0
2026-03-10T08:55:14.231 INFO:tasks.workunit.client.0.vm05.stdout:4/82: link d0/fe d0/f17 0
2026-03-10T08:55:14.232 INFO:tasks.workunit.client.0.vm05.stdout:4/83: write d0/f16 [441578,25073] 0
2026-03-10T08:55:14.234 INFO:tasks.workunit.client.1.vm08.stdout:1/328: dwrite d1/da/f39 [4194304,4194304] 0
2026-03-10T08:55:14.236 INFO:tasks.workunit.client.1.vm08.stdout:9/348: rename d2/dd/d15/d1e/d24/f33 to d2/dd/d15/d1e/d39/d4e/f71 0
2026-03-10T08:55:14.239 INFO:tasks.workunit.client.1.vm08.stdout:7/313: dwrite d0/d11/d1f/d29/d3b/f4c [0,4194304] 0
2026-03-10T08:55:14.252 INFO:tasks.workunit.client.0.vm05.stdout:8/53: mknod d2/c14 0
2026-03-10T08:55:14.256 INFO:tasks.workunit.client.0.vm05.stdout:4/84: link d0/fc d0/f18 0
2026-03-10T08:55:14.257 INFO:tasks.workunit.client.0.vm05.stdout:4/85: link d0/l6 d0/l19 0
2026-03-10T08:55:14.257 INFO:tasks.workunit.client.1.vm08.stdout:1/329: write d1/da/f22 [1014997,89149] 0
2026-03-10T08:55:14.257 INFO:tasks.workunit.client.1.vm08.stdout:9/349: chown d2/l19 3521 1
2026-03-10T08:55:14.257 INFO:tasks.workunit.client.1.vm08.stdout:9/350: chown d2/d54/l58 27918038 1
2026-03-10T08:55:14.257 INFO:tasks.workunit.client.1.vm08.stdout:1/330: truncate d1/da/de/d24/d26/f4d 714204 0
2026-03-10T08:55:14.257 INFO:tasks.workunit.client.1.vm08.stdout:9/351: write d2/dd/d15/d1e/d39/f57 [4142594,80371] 0
2026-03-10T08:55:14.261 INFO:tasks.workunit.client.1.vm08.stdout:0/282: rename d6/f15 to d6/dd/d13/d17/d1f/d20/d2f/d26/f5a 0
2026-03-10T08:55:14.269 INFO:tasks.workunit.client.0.vm05.stdout:4/86: dwrite d0/fb [0,4194304] 0
2026-03-10T08:55:14.269 INFO:tasks.workunit.client.0.vm05.stdout:4/87: write d0/f10 [342928,100384] 0
2026-03-10T08:55:14.269 INFO:tasks.workunit.client.1.vm08.stdout:7/314: truncate d0/d11/d1f/d29/d3d/d40/ff 97760 0
2026-03-10T08:55:14.269 INFO:tasks.workunit.client.1.vm08.stdout:1/331: dwrite d1/da/d18/d3a/f57 [0,4194304] 0
2026-03-10T08:55:14.272 INFO:tasks.workunit.client.1.vm08.stdout:1/332: dread d1/da/d20/f54 [0,4194304] 0
2026-03-10T08:55:14.272 INFO:tasks.workunit.client.1.vm08.stdout:9/352: dread d2/dd/f16 [0,4194304] 0
2026-03-10T08:55:14.272 INFO:tasks.workunit.client.1.vm08.stdout:9/353: chown d2/l36 2588031 1
2026-03-10T08:55:14.277 INFO:tasks.workunit.client.0.vm05.stdout:4/88: readlink d0/l6 0
2026-03-10T08:55:14.278 INFO:tasks.workunit.client.1.vm08.stdout:4/375: rmdir d5/d2f/d80 0
2026-03-10T08:55:14.278 INFO:tasks.workunit.client.0.vm05.stdout:4/89: chown d0/f16 16248 1
2026-03-10T08:55:14.279 INFO:tasks.workunit.client.1.vm08.stdout:7/315: mknod d0/d11/d1f/d29/d36/c64 0
2026-03-10T08:55:14.280 INFO:tasks.workunit.client.1.vm08.stdout:7/316: readlink d0/d11/d1f/d29/d3d/l50 0
2026-03-10T08:55:14.283 INFO:tasks.workunit.client.1.vm08.stdout:6/378: sync
2026-03-10T08:55:14.289 INFO:tasks.workunit.client.0.vm05.stdout:4/90: mkdir d0/d1a 0
2026-03-10T08:55:14.290 INFO:tasks.workunit.client.1.vm08.stdout:9/354: dwrite d2/dd/d15/f22 [0,4194304] 0
2026-03-10T08:55:14.290 INFO:tasks.workunit.client.1.vm08.stdout:4/376: dread d5/fd [0,4194304] 0
2026-03-10T08:55:14.291 INFO:tasks.workunit.client.0.vm05.stdout:4/91: dread d0/f9 [0,4194304] 0
2026-03-10T08:55:14.291 INFO:tasks.workunit.client.1.vm08.stdout:1/333: dread d1/f8 [4194304,4194304] 0
2026-03-10T08:55:14.292 INFO:tasks.workunit.client.0.vm05.stdout:4/92: read d0/f8 [152109,125609] 0
2026-03-10T08:55:14.293 INFO:tasks.workunit.client.0.vm05.stdout:4/93: chown d0/fe 105730299 1
2026-03-10T08:55:14.296 INFO:tasks.workunit.client.0.vm05.stdout:4/94: mknod d0/c1b 0
2026-03-10T08:55:14.298 INFO:tasks.workunit.client.0.vm05.stdout:4/95: creat d0/d15/f1c x:0 0 0
2026-03-10T08:55:14.300 INFO:tasks.workunit.client.1.vm08.stdout:1/334: dread d1/da/d18/d3a/f57 [0,4194304] 0
2026-03-10T08:55:14.301 INFO:tasks.workunit.client.0.vm05.stdout:5/94: dwrite d5/df/f19 [4194304,4194304] 0
2026-03-10T08:55:14.303 INFO:tasks.workunit.client.0.vm05.stdout:5/95: truncate d5/df/d12/f1a 837285 0
2026-03-10T08:55:14.307 INFO:tasks.workunit.client.0.vm05.stdout:4/96: dwrite d0/f1 [4194304,4194304] 0
2026-03-10T08:55:14.320 INFO:tasks.workunit.client.1.vm08.stdout:4/377: creat d5/d2f/f84 x:0 0 0
2026-03-10T08:55:14.320 INFO:tasks.workunit.client.0.vm05.stdout:9/54: getdents d6 0
2026-03-10T08:55:14.321 INFO:tasks.workunit.client.0.vm05.stdout:9/55: chown f4 26 1
2026-03-10T08:55:14.323 INFO:tasks.workunit.client.0.vm05.stdout:9/56: dread d6/fe [0,4194304] 0
2026-03-10T08:55:14.329 INFO:tasks.workunit.client.0.vm05.stdout:3/76: dwrite d9/fc [4194304,4194304] 0
2026-03-10T08:55:14.340 INFO:tasks.workunit.client.1.vm08.stdout:9/355: rmdir d2/dd/d15/d1e/d25/d32/d5c 39
2026-03-10T08:55:14.340 INFO:tasks.workunit.client.0.vm05.stdout:1/124: getdents dd 0
2026-03-10T08:55:14.340 INFO:tasks.workunit.client.0.vm05.stdout:9/57: dread f2 [0,4194304] 0
2026-03-10T08:55:14.340 INFO:tasks.workunit.client.0.vm05.stdout:9/58: dread - d6/fb zero size
2026-03-10T08:55:14.341 INFO:tasks.workunit.client.0.vm05.stdout:4/97: readlink d0/l6 0
2026-03-10T08:55:14.346 INFO:tasks.workunit.client.1.vm08.stdout:8/416: rename d1/d10/d9/d4d/d5c to d1/d10/d9/dd/d25/d27/d44/d97 0
2026-03-10T08:55:14.347 INFO:tasks.workunit.client.0.vm05.stdout:7/44: write f8 [842619,14073] 0
2026-03-10T08:55:14.348 INFO:tasks.workunit.client.1.vm08.stdout:9/356: dwrite d2/dd/d15/d1e/d24/f30 [0,4194304] 0
2026-03-10T08:55:14.349 INFO:tasks.workunit.client.1.vm08.stdout:8/417: truncate d1/d2c/f47 5238138 0
2026-03-10T08:55:14.351 INFO:tasks.workunit.client.1.vm08.stdout:0/283: rmdir d6/dd/d13/d17/d1f/d2d/d39/d40 0
2026-03-10T08:55:14.352 INFO:tasks.workunit.client.1.vm08.stdout:9/357: write d2/dd/d15/d1e/d39/d4e/f55 [3458585,119138] 0
2026-03-10T08:55:14.352 INFO:tasks.workunit.client.1.vm08.stdout:6/379: truncate d9/d10/d1e/f2a 762391 0
2026-03-10T08:55:14.353 INFO:tasks.workunit.client.1.vm08.stdout:9/358: read d2/dd/d15/d1e/d21/f3a [2922583,100969] 0
2026-03-10T08:55:14.356 INFO:tasks.workunit.client.0.vm05.stdout:3/77: dread f8 [0,4194304] 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.1.vm08.stdout:1/335: dread - d1/da/d18/d3a/f3c zero size
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.1.vm08.stdout:1/336: write d1/da/de/d24/d3d/d40/f42 [1362291,12754] 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.1.vm08.stdout:5/362: dwrite d0/d11/d3e/d45/f4a [0,4194304] 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.1.vm08.stdout:7/317: link d0/d14/d43/f58 d0/d11/d1f/d29/d3b/f65 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.0.vm05.stdout:3/78: write d9/f11 [400318,108194] 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.0.vm05.stdout:3/79: chown d9/f11 254 1
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.0.vm05.stdout:3/80: write f2 [176178,95005] 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.0.vm05.stdout:1/125: mknod dd/d21/c2f 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.0.vm05.stdout:9/59: rename d6/lf to d6/l10 0
2026-03-10T08:55:14.364 INFO:tasks.workunit.client.1.vm08.stdout:7/318: fdatasync d0/d51/f5d 0
2026-03-10T08:55:14.369 INFO:tasks.workunit.client.1.vm08.stdout:1/337: dread d1/f1f [0,4194304] 0
2026-03-10T08:55:14.371 INFO:tasks.workunit.client.1.vm08.stdout:1/338: read - d1/da/d20/f67 zero size
2026-03-10T08:55:14.378 INFO:tasks.workunit.client.0.vm05.stdout:7/45: creat f9 x:0 0 0
2026-03-10T08:55:14.378 INFO:tasks.workunit.client.1.vm08.stdout:1/339: dwrite d1/da/d20/f2d [0,4194304] 0
2026-03-10T08:55:14.381 INFO:tasks.workunit.client.1.vm08.stdout:8/418: dwrite d1/d10/d9/dd/d18/f80 [0,4194304] 0
2026-03-10T08:55:14.381 INFO:tasks.workunit.client.1.vm08.stdout:1/340: readlink d1/da/de/d24/d3d/d40/l58 0
2026-03-10T08:55:14.388 INFO:tasks.workunit.client.0.vm05.stdout:3/81: dwrite d9/fa [0,4194304] 0
2026-03-10T08:55:14.398 INFO:tasks.workunit.client.0.vm05.stdout:1/126: symlink dd/d13/l30 0
2026-03-10T08:55:14.398 INFO:tasks.workunit.client.1.vm08.stdout:9/359: creat d2/dd/d15/d4f/f72 x:0 0 0
2026-03-10T08:55:14.398 INFO:tasks.workunit.client.1.vm08.stdout:5/363: mknod d0/d40/d4b/d4e/c74 0
2026-03-10T08:55:14.398 INFO:tasks.workunit.client.0.vm05.stdout:5/96: creat d5/d17/f1f x:0 0 0
2026-03-10T08:55:14.399 INFO:tasks.workunit.client.1.vm08.stdout:5/364: chown d0/fe 443997158 1
2026-03-10T08:55:14.401 INFO:tasks.workunit.client.1.vm08.stdout:2/369: rename d1/da/d10/d1b/c5a to d1/d5b/d66/c76 0
2026-03-10T08:55:14.405 INFO:tasks.workunit.client.0.vm05.stdout:1/127: creat dd/d10/d19/d27/f31 x:0 0 0
2026-03-10T08:55:14.406 INFO:tasks.workunit.client.0.vm05.stdout:1/128: dread - dd/d10/d19/f1d zero size
2026-03-10T08:55:14.418 INFO:tasks.workunit.client.1.vm08.stdout:5/365: creat d0/d40/f75 x:0 0 0
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:1/129: rename dd/d10/d19 to dd/d10/d19/d32 22
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:5/97: creat d5/df/d12/f20 x:0 0 0
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:5/98: read - d5/df/f1c zero size
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:5/99: fdatasync d5/df/d12/f13 0
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:4/98: rmdir d0/d1a 0
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:1/130: creat dd/d13/f33 x:0 0 0
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.0.vm05.stdout:1/131: write dd/d10/d19/f24 [117825,66869] 0
2026-03-10T08:55:14.419 INFO:tasks.workunit.client.1.vm08.stdout:4/378: creat d5/f85 x:0 0 0
2026-03-10T08:55:14.420 INFO:tasks.workunit.client.1.vm08.stdout:4/379: write d5/f77 [791809,80023] 0
2026-03-10T08:55:14.422 INFO:tasks.workunit.client.1.vm08.stdout:3/318: rename d4 to d4/d15/d8/d2c/d65 22
2026-03-10T08:55:14.422 INFO:tasks.workunit.client.1.vm08.stdout:9/360: dread d2/dd/d15/f44 [0,4194304] 0
2026-03-10T08:55:14.423 INFO:tasks.workunit.client.1.vm08.stdout:9/361: dread - d2/dd/d15/d4f/f72 zero size
2026-03-10T08:55:14.424 INFO:tasks.workunit.client.0.vm05.stdout:1/132: creat dd/d10/d18/d20/f34 x:0 0 0
2026-03-10T08:55:14.425 INFO:tasks.workunit.client.0.vm05.stdout:4/99: chown d0/l19 12834 1
2026-03-10T08:55:14.431 INFO:tasks.workunit.client.1.vm08.stdout:0/284: creat d6/dd/d13/d17/d1f/d2d/f5b x:0 0 0
2026-03-10T08:55:14.433 INFO:tasks.workunit.client.0.vm05.stdout:4/100: mkdir d0/d1d 0
2026-03-10T08:55:14.434 INFO:tasks.workunit.client.0.vm05.stdout:4/101: truncate d0/fe 713903 0
2026-03-10T08:55:14.438 INFO:tasks.workunit.client.0.vm05.stdout:1/133: link dd/f1c dd/d10/d19/f35 0
2026-03-10T08:55:14.439 INFO:tasks.workunit.client.0.vm05.stdout:4/102: dwrite d0/f10 [0,4194304] 0
2026-03-10T08:55:14.445 INFO:tasks.workunit.client.1.vm08.stdout:4/380: rename d5/d23/d36/l43 to d5/d23/d49/d83/l86 0
2026-03-10T08:55:14.454 INFO:tasks.workunit.client.1.vm08.stdout:4/381: dwrite d5/d2f/d5d/f61 [0,4194304] 0
2026-03-10T08:55:14.457 INFO:tasks.workunit.client.1.vm08.stdout:4/382: write d5/d2f/d5d/f60 [280987,43400] 0
2026-03-10T08:55:14.465 INFO:tasks.workunit.client.1.vm08.stdout:4/383: dwrite d5/de/f1b [0,4194304] 0
2026-03-10T08:55:14.466 INFO:tasks.workunit.client.0.vm05.stdout:1/134: creat dd/d10/d18/f36 x:0 0 0
2026-03-10T08:55:14.471 INFO:tasks.workunit.client.0.vm05.stdout:4/103: creat d0/f1e x:0 0 0
2026-03-10T08:55:14.472 INFO:tasks.workunit.client.1.vm08.stdout:2/370: symlink d1/da/d61/l77 0
2026-03-10T08:55:14.473 INFO:tasks.workunit.client.1.vm08.stdout:9/362: dread d2/f4 [0,4194304] 0
2026-03-10T08:55:14.473 INFO:tasks.workunit.client.0.vm05.stdout:1/135: mkdir dd/d21/d37 0
2026-03-10T08:55:14.473 INFO:tasks.workunit.client.0.vm05.stdout:1/136: dread - dd/d10/d19/d27/f31 zero size
2026-03-10T08:55:14.477 INFO:tasks.workunit.client.0.vm05.stdout:4/104: mkdir d0/d1f 0
2026-03-10T08:55:14.482 INFO:tasks.workunit.client.0.vm05.stdout:4/105: dwrite d0/f8 [0,4194304] 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.0.vm05.stdout:1/137: mknod dd/c38 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.0.vm05.stdout:1/138: truncate dd/d10/d19/f35 4315255 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.0.vm05.stdout:1/139: creat dd/d21/d37/f39 x:0 0 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:5/366: unlink d0/d11/d18/l32 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:3/319: symlink d4/d15/d8/d2c/d55/l66 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:5/367: chown d0/d11/d27/d68 617 1
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:4/384: rename d5/de/f63 to d5/d2f/d5a/f87 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:2/371: fsync d1/da/f50 0
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:2/372: dread - d1/da/d10/d1b/d6a/f71 zero size
2026-03-10T08:55:14.492 INFO:tasks.workunit.client.1.vm08.stdout:4/385: dwrite d5/f7e [0,4194304] 0
2026-03-10T08:55:14.493 INFO:tasks.workunit.client.0.vm05.stdout:1/140: write fc [971928,84384] 0
2026-03-10T08:55:14.496 INFO:tasks.workunit.client.0.vm05.stdout:1/141: dwrite f1 [4194304,4194304] 0
2026-03-10T08:55:14.501 INFO:tasks.workunit.client.0.vm05.stdout:1/142: dread dd/d10/f22 [0,4194304] 0
2026-03-10T08:55:14.505 INFO:tasks.workunit.client.1.vm08.stdout:2/373: truncate d1/f48 4195574 0
2026-03-10T08:55:14.509 INFO:tasks.workunit.client.0.vm05.stdout:1/143: dwrite dd/f11 [0,4194304] 0
2026-03-10T08:55:14.509 INFO:tasks.workunit.client.1.vm08.stdout:2/374: fsync d1/d5b/d66/f62 0
2026-03-10T08:55:14.520 INFO:tasks.workunit.client.0.vm05.stdout:1/144: dread dd/d10/d19/f1f [0,4194304] 0
2026-03-10T08:55:14.522 INFO:tasks.workunit.client.0.vm05.stdout:1/145: write dd/d10/d19/d27/f2a [702157,110139] 0
2026-03-10T08:55:14.523 INFO:tasks.workunit.client.0.vm05.stdout:1/146: readlink dd/d13/l30 0
2026-03-10T08:55:14.523 INFO:tasks.workunit.client.0.vm05.stdout:1/147: dread - dd/d10/d19/f2e zero size
2026-03-10T08:55:14.523 INFO:tasks.workunit.client.0.vm05.stdout:1/148: chown fa 264774143 1
2026-03-10T08:55:14.529 INFO:tasks.workunit.client.1.vm08.stdout:9/363: mknod d2/c73 0
2026-03-10T08:55:14.542 INFO:tasks.workunit.client.1.vm08.stdout:4/386: dwrite d5/d2f/d5a/d69/f6e [0,4194304] 0
2026-03-10T08:55:14.556 INFO:tasks.workunit.client.1.vm08.stdout:3/320: creat d4/d15/d8/d2c/f67 x:0 0 0
2026-03-10T08:55:14.562 INFO:tasks.workunit.client.1.vm08.stdout:0/285: dread d6/f18 [0,4194304] 0
2026-03-10T08:55:14.563 INFO:tasks.workunit.client.1.vm08.stdout:9/364: rename d2/dd/d15/d1e/d37 to d2/d41/d74 0
2026-03-10T08:55:14.566 INFO:tasks.workunit.client.1.vm08.stdout:3/321: fsync d4/d15/d8/f53 0
2026-03-10T08:55:14.575 INFO:tasks.workunit.client.1.vm08.stdout:3/322: rename d4/d15/d8/f45 to d4/d15/d8/f68 0
2026-03-10T08:55:14.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:14 vm05.local ceph-mon[49713]: pgmap v146: 65 pgs: 65 active+clean; 857 MiB data, 3.7 GiB used, 116 GiB / 120 GiB avail; 12 MiB/s rd, 94 MiB/s wr, 350 op/s
2026-03-10T08:55:14.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:14 vm08.local ceph-mon[57559]: pgmap v146: 65 pgs: 65 active+clean; 857 MiB data, 3.7 GiB used, 116 GiB / 120 GiB avail; 12 MiB/s rd, 94 MiB/s wr, 350 op/s
2026-03-10T08:55:14.969 INFO:tasks.workunit.client.0.vm05.stdout:6/65: getdents d4/d7 0
2026-03-10T08:55:14.971 INFO:tasks.workunit.client.0.vm05.stdout:0/70: dwrite fc [0,4194304] 0
2026-03-10T08:55:14.971 INFO:tasks.workunit.client.0.vm05.stdout:6/66: fsync d4/f11 0
2026-03-10T08:55:14.973 INFO:tasks.workunit.client.0.vm05.stdout:6/67: fdatasync d4/d7/d10/f12 0
2026-03-10T08:55:14.974 INFO:tasks.workunit.client.0.vm05.stdout:0/71: truncate df/f11 258803 0
2026-03-10T08:55:14.975 INFO:tasks.workunit.client.0.vm05.stdout:0/72: creat df/f17 x:0 0 0
2026-03-10T08:55:14.976 INFO:tasks.workunit.client.0.vm05.stdout:6/68: dread d4/f5 [0,4194304] 0
2026-03-10T08:55:14.980 INFO:tasks.workunit.client.0.vm05.stdout:6/69: dread d4/fc [4194304,4194304] 0
2026-03-10T08:55:14.983 INFO:tasks.workunit.client.0.vm05.stdout:6/70: write d4/f5 [667748,30343] 0
2026-03-10T08:55:14.984 INFO:tasks.workunit.client.0.vm05.stdout:6/71: creat d4/d7/f14 x:0 0 0
2026-03-10T08:55:14.986 INFO:tasks.workunit.client.0.vm05.stdout:6/72: truncate d4/d7/d10/f12 2391453 0
2026-03-10T08:55:14.986 INFO:tasks.workunit.client.0.vm05.stdout:6/73: stat d4/d7/c13 0
2026-03-10T08:55:14.986 INFO:tasks.workunit.client.0.vm05.stdout:3/82: fdatasync d9/fa 0
2026-03-10T08:55:14.986 INFO:tasks.workunit.client.0.vm05.stdout:3/83: stat f2 0
2026-03-10T08:55:14.986 INFO:tasks.workunit.client.0.vm05.stdout:3/84: fsync f2 0
2026-03-10T08:55:14.987 INFO:tasks.workunit.client.0.vm05.stdout:3/85: truncate d9/f11 1039465 0
2026-03-10T08:55:14.987 INFO:tasks.workunit.client.0.vm05.stdout:6/74: mkdir d4/d7/d10/d15 0
2026-03-10T08:55:14.989 INFO:tasks.workunit.client.0.vm05.stdout:3/86: mknod d9/c16 0
2026-03-10T08:55:14.993 INFO:tasks.workunit.client.0.vm05.stdout:3/87: rename d9/c16 to d9/c17 0
2026-03-10T08:55:14.993 INFO:tasks.workunit.client.0.vm05.stdout:3/88: fsync f2 0
2026-03-10T08:55:14.995 INFO:tasks.workunit.client.0.vm05.stdout:3/89: dread f8 [0,4194304] 0
2026-03-10T08:55:14.998 INFO:tasks.workunit.client.0.vm05.stdout:6/75: creat d4/d7/d10/d15/f16 x:0 0 0
2026-03-10T08:55:15.001 INFO:tasks.workunit.client.0.vm05.stdout:6/76: dread d4/d7/d10/f12 [0,4194304] 0
2026-03-10T08:55:15.004 INFO:tasks.workunit.client.0.vm05.stdout:3/90: symlink d9/l18 0
2026-03-10T08:55:15.009 INFO:tasks.workunit.client.0.vm05.stdout:6/77: creat d4/d7/d10/d15/f17 x:0 0 0
2026-03-10T08:55:15.010 INFO:tasks.workunit.client.0.vm05.stdout:6/78: symlink d4/d7/l18 0
2026-03-10T08:55:15.013 INFO:tasks.workunit.client.0.vm05.stdout:6/79: write d4/f11 [856447,8864] 0
2026-03-10T08:55:15.018 INFO:tasks.workunit.client.0.vm05.stdout:8/54: rmdir d2 39
2026-03-10T08:55:15.020 INFO:tasks.workunit.client.0.vm05.stdout:8/55: dread - d2/ff zero size
2026-03-10T08:55:15.026 INFO:tasks.workunit.client.0.vm05.stdout:8/56: mknod d2/db/c15 0
2026-03-10T08:55:15.026 INFO:tasks.workunit.client.0.vm05.stdout:8/57: symlink d2/l16 0
2026-03-10T08:55:15.026 INFO:tasks.workunit.client.0.vm05.stdout:8/58: mknod d2/c17 0
2026-03-10T08:55:15.026 INFO:tasks.workunit.client.0.vm05.stdout:8/59: chown d2/l16 47233 1
2026-03-10T08:55:15.026 INFO:tasks.workunit.client.0.vm05.stdout:8/60: chown d2/l6 73 1
2026-03-10T08:55:15.030 INFO:tasks.workunit.client.0.vm05.stdout:8/61: dread d2/fa [0,4194304] 0
2026-03-10T08:55:15.069 INFO:tasks.workunit.client.0.vm05.stdout:7/46: dread f8 [0,4194304] 0
2026-03-10T08:55:15.073 INFO:tasks.workunit.client.0.vm05.stdout:7/47: dwrite f4 [0,4194304] 0
2026-03-10T08:55:15.084 INFO:tasks.workunit.client.0.vm05.stdout:7/48: chown c7 198460 1
2026-03-10T08:55:15.084 INFO:tasks.workunit.client.0.vm05.stdout:9/60: chown d6/fb 18693 1
2026-03-10T08:55:15.086 INFO:tasks.workunit.client.0.vm05.stdout:5/100: rename d5/d17 to d5/df/d12/d21 0
2026-03-10T08:55:15.086 INFO:tasks.workunit.client.0.vm05.stdout:4/106: rename d0/d1f to d0/d1f/d20 22
2026-03-10T08:55:15.089 INFO:tasks.workunit.client.0.vm05.stdout:7/49: dwrite f4 [0,4194304] 0
2026-03-10T08:55:15.092 INFO:tasks.workunit.client.0.vm05.stdout:4/107: dwrite d0/f1 [8388608,4194304] 0
2026-03-10T08:55:15.092 INFO:tasks.workunit.client.1.vm08.stdout:6/380: dwrite d9/d13/f4a [0,4194304] 0
2026-03-10T08:55:15.096 INFO:tasks.workunit.client.1.vm08.stdout:8/419: dwrite d1/d10/f2a [0,4194304] 0
2026-03-10T08:55:15.097 INFO:tasks.workunit.client.1.vm08.stdout:1/341: write d1/da/d20/f21 [1333752,91611] 0
2026-03-10T08:55:15.097 INFO:tasks.workunit.client.1.vm08.stdout:1/342: write d1/da/f22 [75455,25091] 0
2026-03-10T08:55:15.102 INFO:tasks.workunit.client.1.vm08.stdout:1/343: chown d1/da/de/c17 92462 1
2026-03-10T08:55:15.102 INFO:tasks.workunit.client.1.vm08.stdout:6/381: creat d9/d13/f88 x:0 0 0
2026-03-10T08:55:15.104 INFO:tasks.workunit.client.1.vm08.stdout:7/319: write d0/f25 [1377637,56026] 0
2026-03-10T08:55:15.106 INFO:tasks.workunit.client.1.vm08.stdout:7/320: dread d0/d14/d43/f58 [0,4194304] 0
2026-03-10T08:55:15.107 INFO:tasks.workunit.client.1.vm08.stdout:7/321: chown d0/d14/l17 906 1
2026-03-10T08:55:15.107 INFO:tasks.workunit.client.0.vm05.stdout:3/91: rename f8 to d9/f19 0
2026-03-10T08:55:15.108 INFO:tasks.workunit.client.0.vm05.stdout:7/50: fdatasync f8 0
2026-03-10T08:55:15.108 INFO:tasks.workunit.client.0.vm05.stdout:7/51: rmdir - no directory
2026-03-10T08:55:15.109 INFO:tasks.workunit.client.0.vm05.stdout:9/61: symlink d6/l11 0
2026-03-10T08:55:15.110 INFO:tasks.workunit.client.0.vm05.stdout:4/108: mknod d0/c21 0
2026-03-10T08:55:15.116 INFO:tasks.workunit.client.1.vm08.stdout:6/382: creat d9/dc/d84/f89 x:0 0 0
2026-03-10T08:55:15.117 INFO:tasks.workunit.client.0.vm05.stdout:4/109: write d0/f1 [7518941,87058] 0
2026-03-10T08:55:15.117 INFO:tasks.workunit.client.0.vm05.stdout:3/92: creat d9/f1a x:0 0 0
2026-03-10T08:55:15.117 INFO:tasks.workunit.client.0.vm05.stdout:3/93: dread - d9/f12 zero size
2026-03-10T08:55:15.117 INFO:tasks.workunit.client.0.vm05.stdout:6/80: sync
2026-03-10T08:55:15.117 INFO:tasks.workunit.client.0.vm05.stdout:3/94: dwrite d9/ff [0,4194304] 0
2026-03-10T08:55:15.125 INFO:tasks.workunit.client.1.vm08.stdout:7/322: creat d0/d11/f66 x:0 0 0
2026-03-10T08:55:15.126 INFO:tasks.workunit.client.1.vm08.stdout:8/420: symlink d1/d10/d9/dd/d25/d27/d44/l98 0
2026-03-10T08:55:15.126 INFO:tasks.workunit.client.0.vm05.stdout:4/110: rename d0/f17 to d0/d1d/f22 0
2026-03-10T08:55:15.126 INFO:tasks.workunit.client.1.vm08.stdout:1/344: symlink d1/da/l6f 0
2026-03-10T08:55:15.129 INFO:tasks.workunit.client.0.vm05.stdout:3/95: mknod d9/c1b 0
2026-03-10T08:55:15.129 INFO:tasks.workunit.client.1.vm08.stdout:7/323: dwrite d0/f44 [4194304,4194304] 0
2026-03-10T08:55:15.129 INFO:tasks.workunit.client.0.vm05.stdout:3/96: stat d9/fa 0
2026-03-10T08:55:15.130 INFO:tasks.workunit.client.0.vm05.stdout:3/97: rename d9 to d9/d1c 22
2026-03-10T08:55:15.130 INFO:tasks.workunit.client.0.vm05.stdout:6/81: rename d4/d7/l18 to d4/d7/l19 0
2026-03-10T08:55:15.132 INFO:tasks.workunit.client.1.vm08.stdout:7/324: dread d0/f44 [4194304,4194304] 0
2026-03-10T08:55:15.135 INFO:tasks.workunit.client.0.vm05.stdout:6/82: dwrite d4/d7/f14 [0,4194304] 0
2026-03-10T08:55:15.143 INFO:tasks.workunit.client.1.vm08.stdout:1/345: symlink d1/da/d20/d3f/l70 0
2026-03-10T08:55:15.144 INFO:tasks.workunit.client.1.vm08.stdout:8/421: creat d1/d10/d9/d8a/f99 x:0 0 0
2026-03-10T08:55:15.144 INFO:tasks.workunit.client.1.vm08.stdout:1/346: chown d1/da/d18/f1d 1091954 1
2026-03-10T08:55:15.145 INFO:tasks.workunit.client.0.vm05.stdout:9/62: mkdir d6/d12 0
2026-03-10T08:55:15.148 INFO:tasks.workunit.client.0.vm05.stdout:6/83: mkdir d4/d7/d10/d1a 0
2026-03-10T08:55:15.148 INFO:tasks.workunit.client.0.vm05.stdout:6/84: write d4/f11 [986991,97019] 0
2026-03-10T08:55:15.149 INFO:tasks.workunit.client.1.vm08.stdout:8/422: mkdir d1/d10/d9/dd/d9a 0
2026-03-10T08:55:15.150 INFO:tasks.workunit.client.1.vm08.stdout:8/423: chown d1/d10/d9/dd/d25/d27/d44/d89/l8c 144644374 1
2026-03-10T08:55:15.181 INFO:tasks.workunit.client.0.vm05.stdout:9/63: write d6/f8 [46054,7950] 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:6/85: fsync d4/fc 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:3/98: link d9/c1b d9/c1d 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:3/99: fdatasync d9/f13 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:6/86: mkdir d4/d7/d10/d15/d1b 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:6/87: truncate d4/d7/d10/d15/f17 765973 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:9/64: fsync d6/fe 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:6/88: symlink d4/d7/d10/d1a/l1c 0
2026-03-10T08:55:15.182 INFO:tasks.workunit.client.0.vm05.stdout:6/89: chown d4/fc 57802 1
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:8/424: read - d1/d10/d9/dd/f8f zero size
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:1/347: rename d1/da/de/d24/d26/f4d to d1/da/d20/d3f/d49/f71 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:8/425: mknod d1/d10/d9/d8a/c9b 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:8/426: creat d1/d10/d9/dd/d25/d27/d44/d97/f9c x:0 0 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:1/348: creat d1/da/d18/f72 x:0 0 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:1/349: read - d1/da/de/d24/d35/f64 zero size
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:8/427: creat d1/d10/d9/dd/d9a/f9d x:0 0 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:8/428: read - d1/d10/d9/dd/d3d/f78 zero size
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:1/350: chown d1/da/de/d24/d35/d43/c6c 1327591862 1
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:8/429: unlink d1/d10/c77 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:1/351: creat d1/da/de/d24/d3d/d40/d56/f73 x:0 0 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.1.vm08.stdout:1/352: read d1/f1f [3642021,51713] 0
2026-03-10T08:55:15.183 INFO:tasks.workunit.client.0.vm05.stdout:4/111: sync
2026-03-10T08:55:15.184 INFO:tasks.workunit.client.1.vm08.stdout:6/383: dread d9/dc/f1b [0,4194304] 0
2026-03-10T08:55:15.184 INFO:tasks.workunit.client.0.vm05.stdout:4/112: getdents d0/d15 0
2026-03-10T08:55:15.184 INFO:tasks.workunit.client.1.vm08.stdout:6/384: read - d9/d50/f75 zero size
2026-03-10T08:55:15.185 INFO:tasks.workunit.client.0.vm05.stdout:4/113: creat d0/f23 x:0 0 0
2026-03-10T08:55:15.187 INFO:tasks.workunit.client.0.vm05.stdout:4/114: dread d0/fc [0,4194304] 0
2026-03-10T08:55:15.190 INFO:tasks.workunit.client.1.vm08.stdout:6/385: dwrite d9/dc/f1b [0,4194304] 0
2026-03-10T08:55:15.236 INFO:tasks.workunit.client.1.vm08.stdout:7/325: sync
2026-03-10T08:55:15.236 INFO:tasks.workunit.client.1.vm08.stdout:1/353: sync
2026-03-10T08:55:15.241 INFO:tasks.workunit.client.1.vm08.stdout:1/354: dwrite d1/da/f39 [8388608,4194304] 0
2026-03-10T08:55:15.243 INFO:tasks.workunit.client.1.vm08.stdout:7/326: truncate d0/d11/d4a/f4f 2331047 0
2026-03-10T08:55:15.244 INFO:tasks.workunit.client.1.vm08.stdout:7/327: truncate d0/d11/f66 733136 0
2026-03-10T08:55:15.244 INFO:tasks.workunit.client.1.vm08.stdout:1/355: write d1/f65 [220438,124355] 0
2026-03-10T08:55:15.252 INFO:tasks.workunit.client.1.vm08.stdout:7/328: unlink d0/f41 0
2026-03-10T08:55:15.252 INFO:tasks.workunit.client.1.vm08.stdout:7/329: write d0/d11/d1f/d29/d3b/f4c [2399580,44317] 0
2026-03-10T08:55:15.253 INFO:tasks.workunit.client.1.vm08.stdout:1/356: symlink d1/da/de/d24/d3d/d4a/l74 0
2026-03-10T08:55:15.255 INFO:tasks.workunit.client.1.vm08.stdout:7/330: creat d0/d1c/f67 x:0 0 0
2026-03-10T08:55:15.257 INFO:tasks.workunit.client.1.vm08.stdout:1/357: symlink d1/da/de/d24/d3d/d40/d56/d6b/l75 0
2026-03-10T08:55:15.258 INFO:tasks.workunit.client.1.vm08.stdout:7/331: fsync d0/d11/d1f/d29/d3d/d40/f24 0
2026-03-10T08:55:15.263 INFO:tasks.workunit.client.1.vm08.stdout:7/332: creat d0/d14/f68 x:0 0 0
2026-03-10T08:55:15.271 INFO:tasks.workunit.client.1.vm08.stdout:7/333: rename d0/c46 to d0/d14/c69 0
2026-03-10T08:55:15.274 INFO:tasks.workunit.client.1.vm08.stdout:7/334: truncate d0/d11/d1f/d29/d3d/d40/f38 3410775 0
2026-03-10T08:55:15.274 INFO:tasks.workunit.client.1.vm08.stdout:7/335: chown d0/d11/d1f/d29/d3d/l50 1 1
2026-03-10T08:55:15.274 INFO:tasks.workunit.client.1.vm08.stdout:7/336: chown d0/f25 100 1
2026-03-10T08:55:15.275 INFO:tasks.workunit.client.1.vm08.stdout:7/337: read - d0/d11/d4a/f5c zero size
2026-03-10T08:55:15.277 INFO:tasks.workunit.client.1.vm08.stdout:7/338: creat d0/d11/f6a x:0 0 0
2026-03-10T08:55:15.279 INFO:tasks.workunit.client.1.vm08.stdout:7/339: symlink d0/d11/d4a/l6b 0
2026-03-10T08:55:15.280 INFO:tasks.workunit.client.1.vm08.stdout:7/340: creat d0/d11/d1f/d2c/f6c x:0 0 0
2026-03-10T08:55:15.290 INFO:tasks.workunit.client.0.vm05.stdout:1/149: rmdir dd/d10/d18/d20 39
2026-03-10T08:55:15.294 INFO:tasks.workunit.client.1.vm08.stdout:4/387: getdents d5/d23/d36 0
2026-03-10T08:55:15.308 INFO:tasks.workunit.client.0.vm05.stdout:1/150: creat dd/d21/f3a x:0 0 0
2026-03-10T08:55:15.308 INFO:tasks.workunit.client.0.vm05.stdout:1/151: readlink dd/d10/d19/d27/l2b 0
2026-03-10T08:55:15.308 INFO:tasks.workunit.client.0.vm05.stdout:1/152: mknod dd/d10/d18/d2d/c3b 0
2026-03-10T08:55:15.308 INFO:tasks.workunit.client.0.vm05.stdout:1/153: dread - dd/d13/f33 zero size
2026-03-10T08:55:15.308 INFO:tasks.workunit.client.1.vm08.stdout:2/375: getdents d1/da/d61 0
2026-03-10T08:55:15.308 INFO:tasks.workunit.client.1.vm08.stdout:2/376: write d1/da/d10/d1b/d6a/f71 [920604,104975] 0
2026-03-10T08:55:15.309 INFO:tasks.workunit.client.1.vm08.stdout:4/388: mknod d5/d23/d36/c88 0
2026-03-10T08:55:15.309 INFO:tasks.workunit.client.1.vm08.stdout:4/389: stat d5/de/f5e 0
2026-03-10T08:55:15.309 INFO:tasks.workunit.client.1.vm08.stdout:2/377: rename d1/da/d61 to d1/da/d78 0
2026-03-10T08:55:15.309 INFO:tasks.workunit.client.1.vm08.stdout:5/368: write d0/d11/d18/f34 [537384,8077] 0
2026-03-10T08:55:15.311 INFO:tasks.workunit.client.0.vm05.stdout:6/90: dread d4/f11 [0,4194304] 0
2026-03-10T08:55:15.317 INFO:tasks.workunit.client.1.vm08.stdout:9/365: write d2/dd/d15/f1b [1504236,56929] 0
2026-03-10T08:55:15.318 INFO:tasks.workunit.client.0.vm05.stdout:1/154: mknod dd/d13/c3c 0
2026-03-10T08:55:15.318 INFO:tasks.workunit.client.0.vm05.stdout:1/155: dread f6 [0,4194304] 0
2026-03-10T08:55:15.318 INFO:tasks.workunit.client.0.vm05.stdout:1/156: readlink dd/l12 0
2026-03-10T08:55:15.318 INFO:tasks.workunit.client.0.vm05.stdout:6/91: mknod d4/d7/d10/d15/d1b/c1d 0
2026-03-10T08:55:15.319 INFO:tasks.workunit.client.1.vm08.stdout:0/286: truncate d6/dd/d13/d17/d1f/d2d/d38/f53 3452254 0
2026-03-10T08:55:15.319 INFO:tasks.workunit.client.1.vm08.stdout:5/369: dwrite d0/d11/d18/f4f [4194304,4194304] 0
2026-03-10T08:55:15.320 INFO:tasks.workunit.client.0.vm05.stdout:1/157: truncate dd/f16 3803400 0
2026-03-10T08:55:15.321 INFO:tasks.workunit.client.1.vm08.stdout:4/390: rename d5/lf to d5/l89 0
2026-03-10T08:55:15.332 INFO:tasks.workunit.client.1.vm08.stdout:3/323: unlink d4/d15/d8/f53 0
2026-03-10T08:55:15.333 INFO:tasks.workunit.client.0.vm05.stdout:1/158: symlink dd/d21/d37/l3d 0
2026-03-10T08:55:15.334 INFO:tasks.workunit.client.0.vm05.stdout:6/92: dwrite d4/f11 [0,4194304] 0
2026-03-10T08:55:15.335 INFO:tasks.workunit.client.0.vm05.stdout:6/93: write d4/fc [7121227,112608] 0
2026-03-10T08:55:15.339 INFO:tasks.workunit.client.1.vm08.stdout:0/287: creat d6/dd/d13/d17/d1f/d20/d2f/d57/f5c x:0 0 0
2026-03-10T08:55:15.350 INFO:tasks.workunit.client.0.vm05.stdout:3/100: rename d9/c17 to d9/c1e 0
2026-03-10T08:55:15.352 INFO:tasks.workunit.client.0.vm05.stdout:6/94: unlink d4/f9 0
2026-03-10T08:55:15.353 INFO:tasks.workunit.client.1.vm08.stdout:5/370: link d0/l8 d0/d1b/d67/l76 0
2026-03-10T08:55:15.354 INFO:tasks.workunit.client.1.vm08.stdout:4/391: creat d5/f8a x:0 0 0
2026-03-10T08:55:15.358 INFO:tasks.workunit.client.1.vm08.stdout:5/371: unlink d0/d11/d18/f72 0
2026-03-10T08:55:15.364 INFO:tasks.workunit.client.1.vm08.stdout:3/324: rename d4/d15/d8/d1d/d4f/c51 to d4/d15/d8/d2c/c69 0
2026-03-10T08:55:15.368 INFO:tasks.workunit.client.1.vm08.stdout:3/325: link d4/d15/d8/d2c/f42 d4/d15/d8/d2c/f6a 0
2026-03-10T08:55:15.378 INFO:tasks.workunit.client.0.vm05.stdout:8/62: dwrite d2/fa [0,4194304] 0
2026-03-10T08:55:15.380 INFO:tasks.workunit.client.1.vm08.stdout:0/288: sync
2026-03-10T08:55:15.380 INFO:tasks.workunit.client.1.vm08.stdout:4/392: sync
2026-03-10T08:55:15.387 INFO:tasks.workunit.client.0.vm05.stdout:2/45: write d0/f4 [1580100,88809] 0
2026-03-10T08:55:15.389 INFO:tasks.workunit.client.1.vm08.stdout:0/289: chown d6/f11 450 1
2026-03-10T08:55:15.390 INFO:tasks.workunit.client.1.vm08.stdout:4/393: truncate d5/d2f/d5d/f66 481269 0
2026-03-10T08:55:15.393 INFO:tasks.workunit.client.0.vm05.stdout:8/63: creat d2/f18 x:0 0 0
2026-03-10T08:55:15.394 INFO:tasks.workunit.client.1.vm08.stdout:4/394: rename d5/d2f/c5c to d5/d2f/d5a/c8b 0
2026-03-10T08:55:15.397 INFO:tasks.workunit.client.0.vm05.stdout:8/64: creat d2/db/f19 x:0 0 0
2026-03-10T08:55:15.397 INFO:tasks.workunit.client.0.vm05.stdout:8/65: dread - d2/db/f19 zero size
2026-03-10T08:55:15.397 INFO:tasks.workunit.client.1.vm08.stdout:4/395: dread d5/d23/d36/f51 [0,4194304] 0
2026-03-10T08:55:15.397 INFO:tasks.workunit.client.1.vm08.stdout:0/290: getdents d6/dd/d13/d17/d1f/d20 0
2026-03-10T08:55:15.397 INFO:tasks.workunit.client.1.vm08.stdout:4/396: write d5/f85 [210786,17175] 0
2026-03-10T08:55:15.397 INFO:tasks.workunit.client.1.vm08.stdout:0/291: chown d6/dd/d13/c44 127 1
2026-03-10T08:55:15.398 INFO:tasks.workunit.client.1.vm08.stdout:0/292: read - d6/dd/d13/d17/d1f/d20/d2f/d24/f37 zero size
2026-03-10T08:55:15.399 INFO:tasks.workunit.client.0.vm05.stdout:8/66: dwrite d2/f5 [0,4194304] 0
2026-03-10T08:55:15.399 INFO:tasks.workunit.client.1.vm08.stdout:0/293: readlink d6/dd/d13/d17/d1f/d20/d2f/l33 0
2026-03-10T08:55:15.402 INFO:tasks.workunit.client.0.vm05.stdout:8/67: creat d2/dd/f1a x:0 0 0
2026-03-10T08:55:15.402 INFO:tasks.workunit.client.1.vm08.stdout:0/294: chown d6/dd/d13/d17/c41 164221 1
2026-03-10T08:55:15.403 INFO:tasks.workunit.client.0.vm05.stdout:8/68: readlink d2/l4 0
2026-03-10T08:55:15.403 INFO:tasks.workunit.client.1.vm08.stdout:0/295: dread - d6/dd/d13/d17/d1f/d2d/f5b zero size
2026-03-10T08:55:15.403 INFO:tasks.workunit.client.1.vm08.stdout:0/296: fdatasync d6/dd/d13/d17/d1f/d2d/f5b 0
2026-03-10T08:55:15.404 INFO:tasks.workunit.client.0.vm05.stdout:8/69: unlink d2/l8 0
2026-03-10T08:55:15.409 INFO:tasks.workunit.client.1.vm08.stdout:4/397: dread d5/f8 [0,4194304] 0
2026-03-10T08:55:15.413 INFO:tasks.workunit.client.0.vm05.stdout:8/70: dwrite d2/db/f19 [0,4194304] 0
2026-03-10T08:55:15.413 INFO:tasks.workunit.client.1.vm08.stdout:4/398: write d5/f85 [1150547,61155] 0
2026-03-10T08:55:15.417 INFO:tasks.workunit.client.0.vm05.stdout:8/71: dwrite d2/db/f19 [0,4194304] 0
2026-03-10T08:55:15.421 INFO:tasks.workunit.client.1.vm08.stdout:4/399: sync
2026-03-10T08:55:15.426 INFO:tasks.workunit.client.0.vm05.stdout:2/46: dread d0/f4 [0,4194304] 0
2026-03-10T08:55:15.433 INFO:tasks.workunit.client.0.vm05.stdout:8/72: unlink d2/f18 0
2026-03-10T08:55:15.437 INFO:tasks.workunit.client.1.vm08.stdout:4/400: write d5/d23/d36/f51 [2960589,47310] 0
2026-03-10T08:55:15.437 INFO:tasks.workunit.client.0.vm05.stdout:8/73: write d2/fa [3158354,18855] 0
2026-03-10T08:55:15.438 INFO:tasks.workunit.client.0.vm05.stdout:8/74: dwrite d2/f5 [0,4194304] 0
2026-03-10T08:55:15.442 INFO:tasks.workunit.client.0.vm05.stdout:2/47: creat d0/fa x:0 0 0
2026-03-10T08:55:15.442 INFO:tasks.workunit.client.0.vm05.stdout:2/48: chown d0/d9 58 1 2026-03-10T08:55:15.447 INFO:tasks.workunit.client.0.vm05.stdout:8/75: dwrite d2/ff [0,4194304] 0 2026-03-10T08:55:15.452 INFO:tasks.workunit.client.0.vm05.stdout:8/76: creat d2/db/f1b x:0 0 0 2026-03-10T08:55:15.459 INFO:tasks.workunit.client.0.vm05.stdout:9/65: fdatasync d6/f8 0 2026-03-10T08:55:15.463 INFO:tasks.workunit.client.0.vm05.stdout:9/66: dread d6/f7 [0,4194304] 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.1.vm08.stdout:4/401: dread d5/f1e [0,4194304] 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.1.vm08.stdout:4/402: truncate d5/d23/d36/d76/f82 3551591 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.1.vm08.stdout:4/403: creat d5/d2f/d5a/d69/f8c x:0 0 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.1.vm08.stdout:4/404: rename d5/d23/l48 to d5/d2f/l8d 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.1.vm08.stdout:4/405: rename d5/d23/l24 to d5/de/l8e 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.0.vm05.stdout:9/67: stat d6/f7 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.0.vm05.stdout:9/68: symlink d6/d12/l13 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.0.vm05.stdout:9/69: creat d6/d12/f14 x:0 0 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.0.vm05.stdout:9/70: chown d6/fb 7451 1 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.0.vm05.stdout:9/71: mkdir d6/d15 0 2026-03-10T08:55:15.480 INFO:tasks.workunit.client.0.vm05.stdout:9/72: getdents d6/d15 0 2026-03-10T08:55:15.485 INFO:tasks.workunit.client.1.vm08.stdout:6/386: getdents d9/d13 0 2026-03-10T08:55:15.486 INFO:tasks.workunit.client.1.vm08.stdout:6/387: stat d9/d10/d1e/l7c 0 2026-03-10T08:55:15.486 INFO:tasks.workunit.client.0.vm05.stdout:5/101: write d5/f9 [9290726,55300] 0 2026-03-10T08:55:15.487 INFO:tasks.workunit.client.0.vm05.stdout:5/102: truncate d5/df/d12/f13 605671 0 2026-03-10T08:55:15.487 INFO:tasks.workunit.client.0.vm05.stdout:5/103: 
fdatasync d5/df/d12/f15 0 2026-03-10T08:55:15.488 INFO:tasks.workunit.client.0.vm05.stdout:5/104: write d5/df/f1c [492016,17719] 0 2026-03-10T08:55:15.490 INFO:tasks.workunit.client.1.vm08.stdout:6/388: unlink d9/dc/d11/d23/d2c/f60 0 2026-03-10T08:55:15.501 INFO:tasks.workunit.client.0.vm05.stdout:5/105: mknod d5/df/d12/c22 0 2026-03-10T08:55:15.501 INFO:tasks.workunit.client.0.vm05.stdout:5/106: chown d5/df/d12 40822 1 2026-03-10T08:55:15.501 INFO:tasks.workunit.client.1.vm08.stdout:6/389: write d9/d13/f36 [3543353,101139] 0 2026-03-10T08:55:15.501 INFO:tasks.workunit.client.1.vm08.stdout:6/390: link d9/fa d9/dc/d11/d23/f8a 0 2026-03-10T08:55:15.502 INFO:tasks.workunit.client.1.vm08.stdout:6/391: dread d9/dc/d11/d23/d2c/f49 [0,4194304] 0 2026-03-10T08:55:15.506 INFO:tasks.workunit.client.1.vm08.stdout:6/392: creat d9/dc/d11/d23/f8b x:0 0 0 2026-03-10T08:55:15.506 INFO:tasks.workunit.client.1.vm08.stdout:6/393: chown d9/d10/d1e/f58 2 1 2026-03-10T08:55:15.511 INFO:tasks.workunit.client.1.vm08.stdout:6/394: fdatasync d9/dc/d11/d23/f8a 0 2026-03-10T08:55:15.512 INFO:tasks.workunit.client.1.vm08.stdout:6/395: creat d9/d10/f8c x:0 0 0 2026-03-10T08:55:15.514 INFO:tasks.workunit.client.1.vm08.stdout:6/396: creat d9/dc/d11/f8d x:0 0 0 2026-03-10T08:55:15.516 INFO:tasks.workunit.client.1.vm08.stdout:6/397: truncate d9/d10/d1e/d32/f3a 1214658 0 2026-03-10T08:55:15.519 INFO:tasks.workunit.client.1.vm08.stdout:6/398: creat d9/dc/d11/d23/d2c/f8e x:0 0 0 2026-03-10T08:55:15.520 INFO:tasks.workunit.client.1.vm08.stdout:6/399: write d9/d13/f70 [979054,4166] 0 2026-03-10T08:55:15.521 INFO:tasks.workunit.client.1.vm08.stdout:6/400: creat d9/d10/f8f x:0 0 0 2026-03-10T08:55:15.522 INFO:tasks.workunit.client.1.vm08.stdout:6/401: chown d9/dc/d11/d23/d2c/f4f 28 1 2026-03-10T08:55:15.522 INFO:tasks.workunit.client.1.vm08.stdout:6/402: truncate d9/d10/f72 957980 0 2026-03-10T08:55:15.529 INFO:tasks.workunit.client.1.vm08.stdout:6/403: dread d9/d13/f2f [0,4194304] 0 
2026-03-10T08:55:15.536 INFO:tasks.workunit.client.0.vm05.stdout:5/107: dread d5/df/f1c [0,4194304] 0 2026-03-10T08:55:15.537 INFO:tasks.workunit.client.1.vm08.stdout:6/404: creat d9/dc/d11/d23/d2c/d81/d63/f90 x:0 0 0 2026-03-10T08:55:15.537 INFO:tasks.workunit.client.1.vm08.stdout:6/405: chown d9/d50/f75 1104 1 2026-03-10T08:55:15.538 INFO:tasks.workunit.client.1.vm08.stdout:6/406: dread - d9/dc/d11/d23/f6f zero size 2026-03-10T08:55:15.539 INFO:tasks.workunit.client.0.vm05.stdout:5/108: dread d5/df/d12/f15 [0,4194304] 0 2026-03-10T08:55:15.541 INFO:tasks.workunit.client.1.vm08.stdout:6/407: dread d9/dc/d11/d23/d2c/d81/f85 [0,4194304] 0 2026-03-10T08:55:15.542 INFO:tasks.workunit.client.0.vm05.stdout:6/95: getdents d4/d7 0 2026-03-10T08:55:15.542 INFO:tasks.workunit.client.1.vm08.stdout:6/408: write d9/dc/d11/d23/d2c/d81/d63/f90 [283482,89565] 0 2026-03-10T08:55:15.543 INFO:tasks.workunit.client.1.vm08.stdout:6/409: chown d9/f77 26774712 1 2026-03-10T08:55:15.543 INFO:tasks.workunit.client.0.vm05.stdout:7/52: truncate f3 2841745 0 2026-03-10T08:55:15.543 INFO:tasks.workunit.client.0.vm05.stdout:7/53: write f8 [779509,48692] 0 2026-03-10T08:55:15.552 INFO:tasks.workunit.client.0.vm05.stdout:6/96: creat d4/d7/d10/d1a/f1e x:0 0 0 2026-03-10T08:55:15.552 INFO:tasks.workunit.client.1.vm08.stdout:8/430: write d1/d10/d9/dd/d13/f24 [58917,129730] 0 2026-03-10T08:55:15.552 INFO:tasks.workunit.client.0.vm05.stdout:4/115: truncate d0/f16 304016 0 2026-03-10T08:55:15.554 INFO:tasks.workunit.client.1.vm08.stdout:6/410: dread d9/dc/d11/d23/d2c/d41/f56 [0,4194304] 0 2026-03-10T08:55:15.554 INFO:tasks.workunit.client.0.vm05.stdout:5/109: link d5/df/f19 d5/f23 0 2026-03-10T08:55:15.555 INFO:tasks.workunit.client.0.vm05.stdout:6/97: read d4/fc [2831806,80977] 0 2026-03-10T08:55:15.559 INFO:tasks.workunit.client.0.vm05.stdout:4/116: fsync d0/fc 0 2026-03-10T08:55:15.559 INFO:tasks.workunit.client.0.vm05.stdout:4/117: stat d0/d1f 0 2026-03-10T08:55:15.560 
INFO:tasks.workunit.client.1.vm08.stdout:1/358: dwrite d1/da/d20/f54 [4194304,4194304] 0 2026-03-10T08:55:15.560 INFO:tasks.workunit.client.0.vm05.stdout:4/118: write d0/f1 [6050436,64340] 0 2026-03-10T08:55:15.567 INFO:tasks.workunit.client.0.vm05.stdout:5/110: mkdir d5/df/d12/d24 0 2026-03-10T08:55:15.569 INFO:tasks.workunit.client.1.vm08.stdout:7/341: dread d0/d11/d1f/d29/d3d/d40/f38 [0,4194304] 0 2026-03-10T08:55:15.569 INFO:tasks.workunit.client.0.vm05.stdout:6/98: mkdir d4/d7/d10/d1a/d1f 0 2026-03-10T08:55:15.573 INFO:tasks.workunit.client.1.vm08.stdout:2/378: truncate d1/d43/f56 1456514 0 2026-03-10T08:55:15.585 INFO:tasks.workunit.client.0.vm05.stdout:4/119: creat d0/d1d/f24 x:0 0 0 2026-03-10T08:55:15.587 INFO:tasks.workunit.client.0.vm05.stdout:4/120: fdatasync d0/f10 0 2026-03-10T08:55:15.587 INFO:tasks.workunit.client.0.vm05.stdout:4/121: write d0/f1 [1820717,100445] 0 2026-03-10T08:55:15.587 INFO:tasks.workunit.client.0.vm05.stdout:4/122: dwrite d0/f1e [0,4194304] 0 2026-03-10T08:55:15.590 INFO:tasks.workunit.client.0.vm05.stdout:0/73: dwrite df/f11 [0,4194304] 0 2026-03-10T08:55:15.593 INFO:tasks.workunit.client.0.vm05.stdout:0/74: dread df/f11 [0,4194304] 0 2026-03-10T08:55:15.593 INFO:tasks.workunit.client.0.vm05.stdout:6/99: mkdir d4/d7/d10/d15/d20 0 2026-03-10T08:55:15.594 INFO:tasks.workunit.client.0.vm05.stdout:6/100: write d4/d7/d10/d15/f16 [307168,50283] 0 2026-03-10T08:55:15.595 INFO:tasks.workunit.client.0.vm05.stdout:6/101: chown d4/f5 212629619 1 2026-03-10T08:55:15.603 INFO:tasks.workunit.client.1.vm08.stdout:1/359: creat d1/da/d18/d3b/d62/f76 x:0 0 0 2026-03-10T08:55:15.604 INFO:tasks.workunit.client.0.vm05.stdout:7/54: sync 2026-03-10T08:55:15.604 INFO:tasks.workunit.client.0.vm05.stdout:7/55: readlink l6 0 2026-03-10T08:55:15.606 INFO:tasks.workunit.client.1.vm08.stdout:9/366: truncate d2/dd/d15/f22 3817053 0 2026-03-10T08:55:15.607 INFO:tasks.workunit.client.1.vm08.stdout:1/360: dwrite d1/da/d18/d3b/d62/f76 [0,4194304] 0 
2026-03-10T08:55:15.610 INFO:tasks.workunit.client.0.vm05.stdout:1/159: dwrite dd/d10/d19/f35 [4194304,4194304] 0 2026-03-10T08:55:15.617 INFO:tasks.workunit.client.0.vm05.stdout:3/101: getdents d9 0 2026-03-10T08:55:15.617 INFO:tasks.workunit.client.0.vm05.stdout:3/102: write d9/fc [8389548,77337] 0 2026-03-10T08:55:15.621 INFO:tasks.workunit.client.0.vm05.stdout:3/103: dwrite d9/f11 [0,4194304] 0 2026-03-10T08:55:15.628 INFO:tasks.workunit.client.0.vm05.stdout:5/111: creat d5/df/d12/d24/f25 x:0 0 0 2026-03-10T08:55:15.633 INFO:tasks.workunit.client.1.vm08.stdout:8/431: getdents d1/d10/d9/dd/d25/d27/d44/d97 0 2026-03-10T08:55:15.633 INFO:tasks.workunit.client.1.vm08.stdout:5/372: truncate d0/d11/f29 835137 0 2026-03-10T08:55:15.635 INFO:tasks.workunit.client.1.vm08.stdout:9/367: creat d2/dd/d15/d1e/d21/f75 x:0 0 0 2026-03-10T08:55:15.635 INFO:tasks.workunit.client.0.vm05.stdout:0/75: rmdir df 39 2026-03-10T08:55:15.637 INFO:tasks.workunit.client.1.vm08.stdout:1/361: mkdir d1/da/d18/d3a/d77 0 2026-03-10T08:55:15.637 INFO:tasks.workunit.client.1.vm08.stdout:3/326: write d4/d15/d8/d2c/f32 [1019416,4038] 0 2026-03-10T08:55:15.637 INFO:tasks.workunit.client.1.vm08.stdout:7/342: link d0/d11/d1f/d29/d3d/l4e d0/d51/l6d 0 2026-03-10T08:55:15.639 INFO:tasks.workunit.client.0.vm05.stdout:7/56: rename c7 to ca 0 2026-03-10T08:55:15.640 INFO:tasks.workunit.client.1.vm08.stdout:1/362: dread d1/da/f22 [0,4194304] 0 2026-03-10T08:55:15.641 INFO:tasks.workunit.client.0.vm05.stdout:1/160: rename dd/d10/d18/f23 to dd/d21/f3e 0 2026-03-10T08:55:15.642 INFO:tasks.workunit.client.1.vm08.stdout:5/373: creat d0/d1b/f77 x:0 0 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.1.vm08.stdout:8/432: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.1.vm08.stdout:8/433: write d1/d10/d9/dd/d25/d27/d44/d21/f32 [387126,20616] 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.1.vm08.stdout:4/406: getdents d5/d23 0 2026-03-10T08:55:15.660 
INFO:tasks.workunit.client.1.vm08.stdout:1/363: creat d1/da/d20/d3f/d49/f78 x:0 0 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.1.vm08.stdout:1/364: write d1/da/d20/f2d [3763054,4791] 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:2/49: rmdir d0 39 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/77: getdents d2/db 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/78: write d2/f5 [2205425,6882] 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/79: truncate d2/dd/f1a 981530 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/80: truncate d2/db/f1b 59098 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/81: write d2/db/f19 [3109243,63731] 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/82: dread d2/f5 [0,4194304] 0 2026-03-10T08:55:15.660 INFO:tasks.workunit.client.0.vm05.stdout:8/83: write d2/ff [2091911,9501] 0 2026-03-10T08:55:15.661 INFO:tasks.workunit.client.0.vm05.stdout:9/73: dwrite f4 [0,4194304] 0 2026-03-10T08:55:15.670 INFO:tasks.workunit.client.0.vm05.stdout:1/161: dwrite dd/d10/d18/d20/f34 [0,4194304] 0 2026-03-10T08:55:15.675 INFO:tasks.workunit.client.0.vm05.stdout:5/112: sync 2026-03-10T08:55:15.677 INFO:tasks.workunit.client.1.vm08.stdout:8/434: dread d1/d10/d9/dd/d13/d40/f68 [0,4194304] 0 2026-03-10T08:55:15.677 INFO:tasks.workunit.client.0.vm05.stdout:3/104: write f7 [3356747,69909] 0 2026-03-10T08:55:15.681 INFO:tasks.workunit.client.1.vm08.stdout:7/343: truncate d0/d11/d1f/d29/d3d/d40/f24 5088363 0 2026-03-10T08:55:15.681 INFO:tasks.workunit.client.1.vm08.stdout:7/344: chown d0/d11/d1f/d29/d3b 8146 1 2026-03-10T08:55:15.681 INFO:tasks.workunit.client.1.vm08.stdout:7/345: chown d0/d11/f66 238208 1 2026-03-10T08:55:15.682 INFO:tasks.workunit.client.1.vm08.stdout:7/346: readlink d0/d11/d1f/d29/d3d/l50 0 2026-03-10T08:55:15.683 INFO:tasks.workunit.client.1.vm08.stdout:5/374: sync 2026-03-10T08:55:15.685 
INFO:tasks.workunit.client.1.vm08.stdout:5/375: write d0/d40/d4b/d4e/f71 [195251,77315] 0 2026-03-10T08:55:15.685 INFO:tasks.workunit.client.1.vm08.stdout:5/376: stat d0/d11/f2d 0 2026-03-10T08:55:15.687 INFO:tasks.workunit.client.1.vm08.stdout:1/365: rmdir d1/da/de/d24/d3d/d40/d56/d6b 39 2026-03-10T08:55:15.693 INFO:tasks.workunit.client.0.vm05.stdout:0/76: mkdir df/d18 0 2026-03-10T08:55:15.701 INFO:tasks.workunit.client.1.vm08.stdout:8/435: mkdir d1/d10/d9/d4d/d9f 0 2026-03-10T08:55:15.701 INFO:tasks.workunit.client.0.vm05.stdout:0/77: dread fc [0,4194304] 0 2026-03-10T08:55:15.702 INFO:tasks.workunit.client.0.vm05.stdout:0/78: fsync df/f13 0 2026-03-10T08:55:15.702 INFO:tasks.workunit.client.0.vm05.stdout:6/102: creat d4/f21 x:0 0 0 2026-03-10T08:55:15.702 INFO:tasks.workunit.client.0.vm05.stdout:7/57: rename c0 to cb 0 2026-03-10T08:55:15.706 INFO:tasks.workunit.client.0.vm05.stdout:1/162: fdatasync f2 0 2026-03-10T08:55:15.706 INFO:tasks.workunit.client.1.vm08.stdout:6/411: write d9/d13/f2f [2621005,42446] 0 2026-03-10T08:55:15.713 INFO:tasks.workunit.client.0.vm05.stdout:2/50: dread d0/f8 [0,4194304] 0 2026-03-10T08:55:15.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:15 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:15.717 INFO:tasks.workunit.client.0.vm05.stdout:2/51: dwrite d0/fa [0,4194304] 0 2026-03-10T08:55:15.719 INFO:tasks.workunit.client.0.vm05.stdout:2/52: chown d0/f4 161545903 1 2026-03-10T08:55:15.720 INFO:tasks.workunit.client.0.vm05.stdout:2/53: dread d0/f4 [0,4194304] 0 2026-03-10T08:55:15.727 INFO:tasks.workunit.client.1.vm08.stdout:7/347: dread d0/d11/d1f/d29/d3d/f59 [0,4194304] 0 2026-03-10T08:55:15.729 INFO:tasks.workunit.client.1.vm08.stdout:1/366: creat d1/da/de/f79 x:0 0 0 2026-03-10T08:55:15.729 INFO:tasks.workunit.client.1.vm08.stdout:1/367: dread - d1/da/de/f79 zero size 2026-03-10T08:55:15.732 
INFO:tasks.workunit.client.1.vm08.stdout:9/368: getdents d2/dd/d15/d1e/d25 0 2026-03-10T08:55:15.732 INFO:tasks.workunit.client.1.vm08.stdout:0/297: write d6/dd/d13/d17/d1f/d2d/d38/f53 [2288373,43530] 0 2026-03-10T08:55:15.734 INFO:tasks.workunit.client.1.vm08.stdout:1/368: dread d1/da/d18/d3a/f57 [0,4194304] 0 2026-03-10T08:55:15.734 INFO:tasks.workunit.client.1.vm08.stdout:9/369: dread d2/dd/d15/f44 [0,4194304] 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.0.vm05.stdout:8/84: mknod d2/c1c 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.0.vm05.stdout:6/103: mkdir d4/d7/d10/d15/d1b/d22 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.0.vm05.stdout:7/58: mknod cc 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.0.vm05.stdout:1/163: dread f7 [4194304,4194304] 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.0.vm05.stdout:2/54: fsync d0/f8 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.1.vm08.stdout:7/348: creat d0/d14/d43/f6e x:0 0 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.1.vm08.stdout:7/349: chown d0/d11/d1f/d29/d3d/l50 1 1 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.1.vm08.stdout:8/436: dread d1/d10/d9/dd/d25/d27/f52 [0,4194304] 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.1.vm08.stdout:9/370: truncate d2/dd/d15/d1e/d24/f2b 938549 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.1.vm08.stdout:6/412: link d9/d10/f72 d9/d10/d1e/f91 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.1.vm08.stdout:9/371: write d2/fb [1746488,68482] 0 2026-03-10T08:55:15.744 INFO:tasks.workunit.client.0.vm05.stdout:5/113: rename d5/cb to d5/df/c26 0 2026-03-10T08:55:15.746 INFO:tasks.workunit.client.0.vm05.stdout:3/105: getdents d9 0 2026-03-10T08:55:15.747 INFO:tasks.workunit.client.0.vm05.stdout:3/106: dread f2 [0,4194304] 0 2026-03-10T08:55:15.750 INFO:tasks.workunit.client.1.vm08.stdout:8/437: mknod d1/d10/d9/d4d/ca0 0 2026-03-10T08:55:15.751 INFO:tasks.workunit.client.0.vm05.stdout:3/107: dwrite f1 [0,4194304] 0 
2026-03-10T08:55:15.752 INFO:tasks.workunit.client.0.vm05.stdout:0/79: mkdir df/d18/d19 0 2026-03-10T08:55:15.753 INFO:tasks.workunit.client.0.vm05.stdout:6/104: creat d4/d7/d10/d15/d1b/f23 x:0 0 0 2026-03-10T08:55:15.754 INFO:tasks.workunit.client.1.vm08.stdout:8/438: dwrite d1/d10/d9/dd/f70 [0,4194304] 0 2026-03-10T08:55:15.764 INFO:tasks.workunit.client.1.vm08.stdout:6/413: mkdir d9/d10/d1e/d92 0 2026-03-10T08:55:15.764 INFO:tasks.workunit.client.1.vm08.stdout:6/414: truncate d9/dc/d11/f55 155174 0 2026-03-10T08:55:15.764 INFO:tasks.workunit.client.1.vm08.stdout:6/415: write d9/d13/f88 [381410,77466] 0 2026-03-10T08:55:15.764 INFO:tasks.workunit.client.1.vm08.stdout:6/416: stat d9/d10/f67 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.1.vm08.stdout:6/417: truncate d9/dc/d11/d23/d2c/f79 727341 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.0.vm05.stdout:6/105: write d4/d7/ff [3550938,18454] 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.0.vm05.stdout:7/59: creat fd x:0 0 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.0.vm05.stdout:0/80: dwrite df/f15 [0,4194304] 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.0.vm05.stdout:9/74: creat d6/f16 x:0 0 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.0.vm05.stdout:2/55: unlink d0/l5 0 2026-03-10T08:55:15.765 INFO:tasks.workunit.client.0.vm05.stdout:6/106: truncate d4/d7/d10/d15/f16 423382 0 2026-03-10T08:55:15.781 INFO:tasks.workunit.client.0.vm05.stdout:8/85: symlink d2/l1d 0 2026-03-10T08:55:15.785 INFO:tasks.workunit.client.0.vm05.stdout:8/86: dwrite d2/db/f1b [0,4194304] 0 2026-03-10T08:55:15.787 INFO:tasks.workunit.client.1.vm08.stdout:0/298: getdents d6/dd/d13/d17/d1f/d2d/d39 0 2026-03-10T08:55:15.787 INFO:tasks.workunit.client.1.vm08.stdout:1/369: getdents d1/da/d4b 0 2026-03-10T08:55:15.787 INFO:tasks.workunit.client.1.vm08.stdout:0/299: write f5 [150157,111083] 0 2026-03-10T08:55:15.787 INFO:tasks.workunit.client.0.vm05.stdout:0/81: creat df/f1a x:0 0 0 2026-03-10T08:55:15.788 
INFO:tasks.workunit.client.1.vm08.stdout:1/370: readlink d1/da/d18/d3a/l66 0 2026-03-10T08:55:15.789 INFO:tasks.workunit.client.0.vm05.stdout:1/164: truncate f2 1732489 0 2026-03-10T08:55:15.791 INFO:tasks.workunit.client.0.vm05.stdout:2/56: creat d0/fb x:0 0 0 2026-03-10T08:55:15.791 INFO:tasks.workunit.client.0.vm05.stdout:8/87: dread d2/fa [0,4194304] 0 2026-03-10T08:55:15.792 INFO:tasks.workunit.client.1.vm08.stdout:0/300: dwrite d6/dd/d13/d17/d1f/d2d/d39/f3b [0,4194304] 0 2026-03-10T08:55:15.793 INFO:tasks.workunit.client.0.vm05.stdout:6/107: creat d4/d7/d10/d15/d1b/f24 x:0 0 0 2026-03-10T08:55:15.797 INFO:tasks.workunit.client.0.vm05.stdout:3/108: getdents d9 0 2026-03-10T08:55:15.802 INFO:tasks.workunit.client.0.vm05.stdout:8/88: dwrite d2/fa [4194304,4194304] 0 2026-03-10T08:55:15.814 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:15 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:0/301: unlink d6/f18 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:1/371: mkdir d1/da/de/d24/d3d/d40/d56/d7a 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:8/439: getdents d1/d10/d9/d8a 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:0/302: write d6/dd/d13/d17/d1f/d20/d2f/d57/f5c [576768,17274] 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:1/372: read d1/da/de/f12 [35738,8825] 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:6/418: dread d9/dc/d11/d23/d2c/d41/f51 [0,4194304] 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.0.vm05.stdout:5/114: dwrite d5/fc [0,4194304] 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.0.vm05.stdout:8/89: readlink d2/l6 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.0.vm05.stdout:0/82: creat df/f1b x:0 0 0 2026-03-10T08:55:15.815 
INFO:tasks.workunit.client.0.vm05.stdout:2/57: rename d0/c6 to d0/cc 0 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.0.vm05.stdout:2/58: chown d0/fa 142715 1 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.0.vm05.stdout:2/59: rename d0 to d0/dd 22 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.0.vm05.stdout:2/60: chown d0/fa 256 1 2026-03-10T08:55:15.815 INFO:tasks.workunit.client.1.vm08.stdout:6/419: chown d9/d13/d4e/f6b 79396 1 2026-03-10T08:55:15.817 INFO:tasks.workunit.client.0.vm05.stdout:2/61: dread d0/fa [0,4194304] 0 2026-03-10T08:55:15.820 INFO:tasks.workunit.client.0.vm05.stdout:6/108: creat d4/d7/d10/d1a/f25 x:0 0 0 2026-03-10T08:55:15.821 INFO:tasks.workunit.client.0.vm05.stdout:6/109: write d4/d7/d10/d15/d1b/f23 [77894,23302] 0 2026-03-10T08:55:15.822 INFO:tasks.workunit.client.1.vm08.stdout:8/440: symlink d1/d4f/d60/la1 0 2026-03-10T08:55:15.823 INFO:tasks.workunit.client.1.vm08.stdout:8/441: chown d1/d10/d9/dd/d25 30283649 1 2026-03-10T08:55:15.826 INFO:tasks.workunit.client.0.vm05.stdout:6/110: dwrite d4/f11 [0,4194304] 0 2026-03-10T08:55:15.835 INFO:tasks.workunit.client.1.vm08.stdout:0/303: dwrite d6/f16 [0,4194304] 0 2026-03-10T08:55:15.840 INFO:tasks.workunit.client.1.vm08.stdout:8/442: dwrite d1/d10/d9/dd/d25/d27/d44/d21/f4a [0,4194304] 0 2026-03-10T08:55:15.840 INFO:tasks.workunit.client.1.vm08.stdout:1/373: creat d1/da/de/d24/d3d/d4a/f7b x:0 0 0 2026-03-10T08:55:15.840 INFO:tasks.workunit.client.0.vm05.stdout:3/109: dread d9/f19 [0,4194304] 0 2026-03-10T08:55:15.840 INFO:tasks.workunit.client.0.vm05.stdout:5/115: write d5/fe [2361006,60093] 0 2026-03-10T08:55:15.840 INFO:tasks.workunit.client.0.vm05.stdout:5/116: dread d5/df/d12/f15 [0,4194304] 0 2026-03-10T08:55:15.854 INFO:tasks.workunit.client.1.vm08.stdout:2/379: dwrite d1/d5b/d66/f63 [0,4194304] 0 2026-03-10T08:55:15.856 INFO:tasks.workunit.client.0.vm05.stdout:4/123: dwrite d0/f1e [4194304,4194304] 0 2026-03-10T08:55:15.857 INFO:tasks.workunit.client.0.vm05.stdout:8/90: 
symlink d2/dd/l1e 0 2026-03-10T08:55:15.871 INFO:tasks.workunit.client.1.vm08.stdout:3/327: write d4/d15/d8/f68 [412731,6090] 0 2026-03-10T08:55:15.872 INFO:tasks.workunit.client.1.vm08.stdout:8/443: mknod d1/d10/d9/dd/d25/d27/ca2 0 2026-03-10T08:55:15.876 INFO:tasks.workunit.client.1.vm08.stdout:8/444: dwrite d1/d10/d9/dd/d25/d27/d44/d21/f32 [0,4194304] 0 2026-03-10T08:55:15.880 INFO:tasks.workunit.client.1.vm08.stdout:0/304: mknod d6/dd/d13/d17/d1f/d2d/d38/c5d 0 2026-03-10T08:55:15.885 INFO:tasks.workunit.client.1.vm08.stdout:1/374: fdatasync d1/da/de/f12 0 2026-03-10T08:55:15.889 INFO:tasks.workunit.client.1.vm08.stdout:2/380: creat d1/da/d10/d42/f79 x:0 0 0 2026-03-10T08:55:15.890 INFO:tasks.workunit.client.1.vm08.stdout:0/305: creat d6/dd/d13/f5e x:0 0 0 2026-03-10T08:55:15.892 INFO:tasks.workunit.client.0.vm05.stdout:3/110: creat d9/f1f x:0 0 0 2026-03-10T08:55:15.897 INFO:tasks.workunit.client.0.vm05.stdout:3/111: readlink d9/le 0 2026-03-10T08:55:15.897 INFO:tasks.workunit.client.1.vm08.stdout:2/381: unlink d1/da/d10/l1a 0 2026-03-10T08:55:15.897 INFO:tasks.workunit.client.1.vm08.stdout:2/382: chown d1/da/d10/d42/f58 6998688 1 2026-03-10T08:55:15.897 INFO:tasks.workunit.client.1.vm08.stdout:4/407: write d5/de/f41 [808063,98901] 0 2026-03-10T08:55:15.897 INFO:tasks.workunit.client.1.vm08.stdout:3/328: rmdir d4/d15/d8/d2a 39 2026-03-10T08:55:15.900 INFO:tasks.workunit.client.1.vm08.stdout:5/377: dread d0/d46/f5f [0,4194304] 0 2026-03-10T08:55:15.901 INFO:tasks.workunit.client.1.vm08.stdout:5/378: write d0/f6c [428163,76982] 0 2026-03-10T08:55:15.901 INFO:tasks.workunit.client.1.vm08.stdout:2/383: dwrite d1/da/d10/d1b/d12/d23/f70 [0,4194304] 0 2026-03-10T08:55:15.901 INFO:tasks.workunit.client.0.vm05.stdout:5/117: fsync d5/df/d12/f1b 0 2026-03-10T08:55:15.902 INFO:tasks.workunit.client.0.vm05.stdout:5/118: write d5/df/d12/f20 [888759,130728] 0 2026-03-10T08:55:15.913 INFO:tasks.workunit.client.0.vm05.stdout:5/119: dread d5/df/f1c [0,4194304] 0 
2026-03-10T08:55:15.914 INFO:tasks.workunit.client.0.vm05.stdout:4/124: mkdir d0/d15/d25 0 2026-03-10T08:55:15.915 INFO:tasks.workunit.client.0.vm05.stdout:4/125: dread - d0/d15/f1c zero size 2026-03-10T08:55:15.915 INFO:tasks.workunit.client.0.vm05.stdout:4/126: write d0/f23 [876622,95632] 0 2026-03-10T08:55:15.915 INFO:tasks.workunit.client.0.vm05.stdout:8/91: mkdir d2/db/d1f 0 2026-03-10T08:55:15.915 INFO:tasks.workunit.client.1.vm08.stdout:4/408: rmdir d5/d23/d36/d76 39 2026-03-10T08:55:15.915 INFO:tasks.workunit.client.0.vm05.stdout:7/60: unlink ca 0 2026-03-10T08:55:15.916 INFO:tasks.workunit.client.0.vm05.stdout:7/61: dread f8 [0,4194304] 0 2026-03-10T08:55:15.917 INFO:tasks.workunit.client.0.vm05.stdout:0/83: creat df/d18/d19/f1c x:0 0 0 2026-03-10T08:55:15.919 INFO:tasks.workunit.client.0.vm05.stdout:0/84: dread df/f12 [0,4194304] 0 2026-03-10T08:55:15.920 INFO:tasks.workunit.client.0.vm05.stdout:2/62: mknod d0/d9/ce 0 2026-03-10T08:55:15.920 INFO:tasks.workunit.client.0.vm05.stdout:2/63: truncate d0/fb 778042 0 2026-03-10T08:55:15.921 INFO:tasks.workunit.client.0.vm05.stdout:6/111: symlink d4/d7/d10/d15/d1b/d22/l26 0 2026-03-10T08:55:15.925 INFO:tasks.workunit.client.0.vm05.stdout:5/120: dread d5/fd [0,4194304] 0 2026-03-10T08:55:15.926 INFO:tasks.workunit.client.0.vm05.stdout:5/121: fsync d5/fd 0 2026-03-10T08:55:15.929 INFO:tasks.workunit.client.0.vm05.stdout:7/62: mknod ce 0 2026-03-10T08:55:15.932 INFO:tasks.workunit.client.0.vm05.stdout:0/85: creat df/f1d x:0 0 0 2026-03-10T08:55:15.932 INFO:tasks.workunit.client.0.vm05.stdout:0/86: chown fe 68 1 2026-03-10T08:55:15.932 INFO:tasks.workunit.client.0.vm05.stdout:6/112: symlink d4/d7/d10/d15/d1b/d22/l27 0 2026-03-10T08:55:15.932 INFO:tasks.workunit.client.0.vm05.stdout:3/112: link d9/f1f d9/f20 0 2026-03-10T08:55:15.933 INFO:tasks.workunit.client.0.vm05.stdout:3/113: fdatasync d9/fc 0 2026-03-10T08:55:15.934 INFO:tasks.workunit.client.0.vm05.stdout:3/114: write d9/fa [1260287,78341] 0 
2026-03-10T08:55:15.935 INFO:tasks.workunit.client.1.vm08.stdout:2/384: rename d1/d43/c6c to d1/da/d10/d1b/d12/c7a 0 2026-03-10T08:55:15.936 INFO:tasks.workunit.client.0.vm05.stdout:3/115: dread f1 [0,4194304] 0 2026-03-10T08:55:15.936 INFO:tasks.workunit.client.0.vm05.stdout:4/127: creat d0/d1f/f26 x:0 0 0 2026-03-10T08:55:15.936 INFO:tasks.workunit.client.1.vm08.stdout:3/329: getdents d4/d15/d8/d1d 0 2026-03-10T08:55:15.936 INFO:tasks.workunit.client.1.vm08.stdout:8/445: sync 2026-03-10T08:55:15.936 INFO:tasks.workunit.client.1.vm08.stdout:1/375: sync 2026-03-10T08:55:15.937 INFO:tasks.workunit.client.1.vm08.stdout:3/330: rename d4 to d4/d15/d8/d2c/d55/d6b 22 2026-03-10T08:55:15.937 INFO:tasks.workunit.client.0.vm05.stdout:4/128: read d0/fb [10004,10592] 0 2026-03-10T08:55:15.937 INFO:tasks.workunit.client.0.vm05.stdout:5/122: chown d5/l16 1770482366 1 2026-03-10T08:55:15.937 INFO:tasks.workunit.client.1.vm08.stdout:3/331: stat d4/d15/d8/d1d 0 2026-03-10T08:55:15.938 INFO:tasks.workunit.client.1.vm08.stdout:8/446: write d1/d10/d9/dd/d3d/f78 [627422,46804] 0 2026-03-10T08:55:15.939 INFO:tasks.workunit.client.1.vm08.stdout:8/447: fsync d1/d10/d9/dd/d25/d27/f3a 0 2026-03-10T08:55:15.940 INFO:tasks.workunit.client.1.vm08.stdout:1/376: creat d1/da/de/d24/d26/f7c x:0 0 0 2026-03-10T08:55:15.941 INFO:tasks.workunit.client.0.vm05.stdout:3/116: dwrite f1 [0,4194304] 0 2026-03-10T08:55:15.944 INFO:tasks.workunit.client.1.vm08.stdout:3/332: readlink d4/d15/l58 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:3/117: dread d9/fa [0,4194304] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:3/118: dwrite d9/f1a [0,4194304] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:6/113: mknod d4/d7/d10/d15/d1b/c28 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:7/63: dwrite f8 [0,4194304] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:7/64: read f4 [789347,32485] 0 2026-03-10T08:55:15.960 
INFO:tasks.workunit.client.0.vm05.stdout:7/65: rmdir - no directory 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:7/66: rmdir - no directory 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.1.vm08.stdout:2/385: dwrite d1/da/f64 [0,4194304] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.1.vm08.stdout:2/386: stat d1/d43/c60 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.1.vm08.stdout:1/377: dwrite d1/da/de/d24/d3d/d4a/f7b [0,4194304] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.1.vm08.stdout:1/378: write d1/da/d20/d3f/d49/f78 [257530,34704] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.1.vm08.stdout:3/333: mknod d4/c6c 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.0.vm05.stdout:7/67: dread f8 [0,4194304] 0 2026-03-10T08:55:15.960 INFO:tasks.workunit.client.1.vm08.stdout:2/387: write d1/d43/f6d [641672,102981] 0 2026-03-10T08:55:15.961 INFO:tasks.workunit.client.1.vm08.stdout:3/334: write d4/d15/d8/d2c/f67 [82714,67647] 0 2026-03-10T08:55:15.973 INFO:tasks.workunit.client.1.vm08.stdout:1/379: creat d1/da/de/d24/d35/d43/f7d x:0 0 0 2026-03-10T08:55:15.974 INFO:tasks.workunit.client.1.vm08.stdout:1/380: chown d1/da/d18/f1d 32177167 1 2026-03-10T08:55:15.980 INFO:tasks.workunit.client.1.vm08.stdout:2/388: mkdir d1/da/d10/d1b/d12/d1e/d7b 0 2026-03-10T08:55:15.981 INFO:tasks.workunit.client.1.vm08.stdout:1/381: fdatasync d1/da/f1e 0 2026-03-10T08:55:15.982 INFO:tasks.workunit.client.1.vm08.stdout:1/382: write d1/da/d4b/d4e/f51 [908584,105902] 0 2026-03-10T08:55:15.983 INFO:tasks.workunit.client.1.vm08.stdout:1/383: rename d1 to d1/da/de/d24/d26/d7e 22 2026-03-10T08:55:15.984 INFO:tasks.workunit.client.0.vm05.stdout:0/87: rename c4 to df/c1e 0 2026-03-10T08:55:15.985 INFO:tasks.workunit.client.0.vm05.stdout:3/119: dwrite d9/f1f [0,4194304] 0 2026-03-10T08:55:15.987 INFO:tasks.workunit.client.1.vm08.stdout:8/448: link d1/d10/d9/dd/d18/d3c/l86 d1/la3 0 2026-03-10T08:55:15.988 
INFO:tasks.workunit.client.1.vm08.stdout:1/384: fdatasync d1/da/d4b/f4f 0 2026-03-10T08:55:15.988 INFO:tasks.workunit.client.1.vm08.stdout:1/385: chown d1/da/d18/d3a/d77 5465313 1 2026-03-10T08:55:15.995 INFO:tasks.workunit.client.0.vm05.stdout:7/68: rename l6 to lf 0 2026-03-10T08:55:16.000 INFO:tasks.workunit.client.0.vm05.stdout:7/69: dread - fd zero size 2026-03-10T08:55:16.001 INFO:tasks.workunit.client.0.vm05.stdout:7/70: dread f8 [0,4194304] 0 2026-03-10T08:55:16.006 INFO:tasks.workunit.client.0.vm05.stdout:4/129: rename d0/l6 to d0/d1f/l27 0 2026-03-10T08:55:16.007 INFO:tasks.workunit.client.0.vm05.stdout:0/88: unlink c2 0 2026-03-10T08:55:16.008 INFO:tasks.workunit.client.0.vm05.stdout:6/114: link d4/fc d4/d7/d10/d15/d20/f29 0 2026-03-10T08:55:16.008 INFO:tasks.workunit.client.0.vm05.stdout:6/115: read - d4/d7/d10/d15/d1b/f24 zero size 2026-03-10T08:55:16.009 INFO:tasks.workunit.client.0.vm05.stdout:6/116: write d4/d7/d10/d15/d1b/f23 [1112140,65500] 0 2026-03-10T08:55:16.009 INFO:tasks.workunit.client.0.vm05.stdout:4/130: creat d0/d15/f28 x:0 0 0 2026-03-10T08:55:16.079 INFO:tasks.workunit.client.1.vm08.stdout:3/335: dread d4/d15/f3f [0,4194304] 0 2026-03-10T08:55:16.082 INFO:tasks.workunit.client.1.vm08.stdout:8/449: dread d1/d10/d9/dd/d13/f24 [0,4194304] 0 2026-03-10T08:55:16.082 INFO:tasks.workunit.client.1.vm08.stdout:3/336: dwrite d4/d15/d8/d1d/f2d [0,4194304] 0 2026-03-10T08:55:16.088 INFO:tasks.workunit.client.1.vm08.stdout:3/337: write d4/d15/d8/d2a/f4d [2525689,68039] 0 2026-03-10T08:55:16.090 INFO:tasks.workunit.client.1.vm08.stdout:8/450: creat d1/d10/d9/dd/d13/fa4 x:0 0 0 2026-03-10T08:55:16.091 INFO:tasks.workunit.client.1.vm08.stdout:3/338: dread - d4/d15/d8/d2a/f63 zero size 2026-03-10T08:55:16.092 INFO:tasks.workunit.client.1.vm08.stdout:8/451: stat d1/d10/d9/dd/d25/d27/d44/l58 0 2026-03-10T08:55:16.094 INFO:tasks.workunit.client.0.vm05.stdout:3/120: sync 2026-03-10T08:55:16.094 INFO:tasks.workunit.client.1.vm08.stdout:8/452: rmdir 
d1/d10/d9/d8a 39 2026-03-10T08:55:16.095 INFO:tasks.workunit.client.0.vm05.stdout:3/121: write d9/fa [180659,5112] 0 2026-03-10T08:55:16.095 INFO:tasks.workunit.client.1.vm08.stdout:8/453: chown d1/d10/d9/dd/d25/d27/f3a 24 1 2026-03-10T08:55:16.097 INFO:tasks.workunit.client.1.vm08.stdout:8/454: symlink d1/d10/d9/dd/d25/d27/d44/d97/la5 0 2026-03-10T08:55:16.097 INFO:tasks.workunit.client.0.vm05.stdout:3/122: link d9/c1e d9/c21 0 2026-03-10T08:55:16.098 INFO:tasks.workunit.client.0.vm05.stdout:3/123: unlink d9/fc 0 2026-03-10T08:55:16.102 INFO:tasks.workunit.client.0.vm05.stdout:3/124: chown d9/l14 300 1 2026-03-10T08:55:16.102 INFO:tasks.workunit.client.0.vm05.stdout:3/125: write d9/f1f [3932328,127274] 0 2026-03-10T08:55:16.102 INFO:tasks.workunit.client.0.vm05.stdout:3/126: fsync d9/f13 0 2026-03-10T08:55:16.102 INFO:tasks.workunit.client.0.vm05.stdout:6/117: sync 2026-03-10T08:55:16.103 INFO:tasks.workunit.client.0.vm05.stdout:3/127: dwrite f7 [0,4194304] 0 2026-03-10T08:55:16.105 INFO:tasks.workunit.client.0.vm05.stdout:6/118: creat d4/d7/d10/d15/f2a x:0 0 0 2026-03-10T08:55:16.130 INFO:tasks.workunit.client.1.vm08.stdout:3/339: sync 2026-03-10T08:55:16.130 INFO:tasks.workunit.client.1.vm08.stdout:3/340: chown d4/f18 80343 1 2026-03-10T08:55:16.131 INFO:tasks.workunit.client.1.vm08.stdout:3/341: read - d4/d15/d8/d1d/f62 zero size 2026-03-10T08:55:16.131 INFO:tasks.workunit.client.1.vm08.stdout:3/342: stat d4/d15/d17/f3c 0 2026-03-10T08:55:16.166 INFO:tasks.workunit.client.1.vm08.stdout:3/343: sync 2026-03-10T08:55:16.166 INFO:tasks.workunit.client.1.vm08.stdout:3/344: readlink d4/d15/d17/l5b 0 2026-03-10T08:55:16.171 INFO:tasks.workunit.client.1.vm08.stdout:3/345: dwrite d4/d15/d8/d2c/f42 [4194304,4194304] 0 2026-03-10T08:55:16.174 INFO:tasks.workunit.client.1.vm08.stdout:3/346: write d4/d15/d8/d2c/d55/f60 [278013,44005] 0 2026-03-10T08:55:16.175 INFO:tasks.workunit.client.1.vm08.stdout:3/347: chown d4/d15/d17/f34 1870 1 2026-03-10T08:55:16.181 
INFO:tasks.workunit.client.1.vm08.stdout:3/348: mkdir d4/d15/d8/d2c/d6d 0 2026-03-10T08:55:16.183 INFO:tasks.workunit.client.1.vm08.stdout:3/349: creat d4/d15/d8/d1d/f6e x:0 0 0 2026-03-10T08:55:16.185 INFO:tasks.workunit.client.1.vm08.stdout:3/350: mkdir d4/d6f 0 2026-03-10T08:55:16.186 INFO:tasks.workunit.client.1.vm08.stdout:3/351: mknod d4/d15/d8/d1d/c70 0 2026-03-10T08:55:16.187 INFO:tasks.workunit.client.1.vm08.stdout:3/352: mkdir d4/d15/d8/d71 0 2026-03-10T08:55:16.192 INFO:tasks.workunit.client.1.vm08.stdout:3/353: dwrite d4/d15/d8/f4e [0,4194304] 0 2026-03-10T08:55:16.195 INFO:tasks.workunit.client.1.vm08.stdout:3/354: symlink d4/d15/d17/d20/l72 0 2026-03-10T08:55:16.199 INFO:tasks.workunit.client.1.vm08.stdout:3/355: creat d4/d15/d8/d1d/f73 x:0 0 0 2026-03-10T08:55:16.199 INFO:tasks.workunit.client.1.vm08.stdout:7/350: dread d0/d11/d4a/f4f [0,4194304] 0 2026-03-10T08:55:16.200 INFO:tasks.workunit.client.1.vm08.stdout:3/356: rename d4/d15/l58 to d4/l74 0 2026-03-10T08:55:16.201 INFO:tasks.workunit.client.1.vm08.stdout:7/351: stat d0/d11/d1f/d29/d3d/d40/c3a 0 2026-03-10T08:55:16.201 INFO:tasks.workunit.client.1.vm08.stdout:7/352: stat d0/cb 0 2026-03-10T08:55:16.206 INFO:tasks.workunit.client.1.vm08.stdout:7/353: write d0/d14/f12 [4134459,116460] 0 2026-03-10T08:55:16.212 INFO:tasks.workunit.client.1.vm08.stdout:7/354: sync 2026-03-10T08:55:16.214 INFO:tasks.workunit.client.1.vm08.stdout:7/355: unlink d0/d11/c1d 0 2026-03-10T08:55:16.214 INFO:tasks.workunit.client.1.vm08.stdout:7/356: chown d0/d1c 6385 1 2026-03-10T08:55:16.215 INFO:tasks.workunit.client.1.vm08.stdout:7/357: chown d0/c35 99 1 2026-03-10T08:55:16.215 INFO:tasks.workunit.client.1.vm08.stdout:7/358: readlink d0/d11/d1f/d2c/l3c 0 2026-03-10T08:55:16.221 INFO:tasks.workunit.client.1.vm08.stdout:7/359: link d0/d11/d4a/l6b d0/d11/d1f/d29/l6f 0 2026-03-10T08:55:16.237 INFO:tasks.workunit.client.1.vm08.stdout:7/360: getdents d0/d11/d1f/d29/d3d 0 2026-03-10T08:55:16.237 
INFO:tasks.workunit.client.1.vm08.stdout:7/361: symlink d0/d1c/l70 0 2026-03-10T08:55:16.237 INFO:tasks.workunit.client.1.vm08.stdout:7/362: dread d0/f44 [4194304,4194304] 0 2026-03-10T08:55:16.237 INFO:tasks.workunit.client.1.vm08.stdout:7/363: mknod d0/d14/c71 0 2026-03-10T08:55:16.237 INFO:tasks.workunit.client.1.vm08.stdout:7/364: dwrite d0/f44 [0,4194304] 0 2026-03-10T08:55:16.237 INFO:tasks.workunit.client.1.vm08.stdout:7/365: chown d0/d11/c1b 16927 1 2026-03-10T08:55:16.247 INFO:tasks.workunit.client.1.vm08.stdout:7/366: sync 2026-03-10T08:55:16.249 INFO:tasks.workunit.client.1.vm08.stdout:7/367: write d0/d11/f6a [462931,113540] 0 2026-03-10T08:55:16.250 INFO:tasks.workunit.client.1.vm08.stdout:7/368: write d0/d14/f68 [981352,92741] 0 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.331+0000 7f3a25156700 1 -- 192.168.123.105:0/2690134531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 msgr2=0x7f3a200ff950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.331+0000 7f3a25156700 1 --2- 192.168.123.105:0/2690134531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a200ff950 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f3a10009b00 tx=0x7f3a10009e10 comp rx=0 tx=0).stop 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.331+0000 7f3a25156700 1 -- 192.168.123.105:0/2690134531 shutdown_connections 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.331+0000 7f3a25156700 1 --2- 192.168.123.105:0/2690134531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a200ffe90 0x7f3a20100310 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.331+0000 7f3a25156700 1 --2- 
192.168.123.105:0/2690134531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a200ff950 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.331+0000 7f3a25156700 1 -- 192.168.123.105:0/2690134531 >> 192.168.123.105:0/2690134531 conn(0x7f3a200fb110 msgr2=0x7f3a200fd590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.332+0000 7f3a25156700 1 -- 192.168.123.105:0/2690134531 shutdown_connections 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.332+0000 7f3a25156700 1 -- 192.168.123.105:0/2690134531 wait complete. 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.332+0000 7f3a25156700 1 Processor -- start 2026-03-10T08:55:16.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.332+0000 7f3a25156700 1 -- start start 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a25156700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a201987a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a25156700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a200ffe90 0x7f3a20198ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a25156700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a20199300 con 0x7f3a200ff530 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a25156700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a20199440 con 0x7f3a200ffe90 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a1ed9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a201987a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a1ed9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a201987a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46548/0 (socket says 192.168.123.105:46548) 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a1ed9d700 1 -- 192.168.123.105:0/1048372276 learned_addr learned my addr 192.168.123.105:0/1048372276 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a163ff700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a200ffe90 0x7f3a20198ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a1ed9d700 1 -- 192.168.123.105:0/1048372276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a200ffe90 msgr2=0x7f3a20198ce0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a1ed9d700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f3a200ffe90 0x7f3a20198ce0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a1ed9d700 1 -- 192.168.123.105:0/1048372276 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a100097e0 con 0x7f3a200ff530 2026-03-10T08:55:16.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.333+0000 7f3a163ff700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a200ffe90 0x7f3a20198ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:55:16.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.334+0000 7f3a1ed9d700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a201987a0 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f3a10005850 tx=0x7f3a10004a40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:16.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.334+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a1001d070 con 0x7f3a200ff530 2026-03-10T08:55:16.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.334+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3a1000bc50 con 0x7f3a200ff530 2026-03-10T08:55:16.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.334+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a1000f740 con 0x7f3a200ff530 2026-03-10T08:55:16.335 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.334+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a2019de90 con 0x7f3a200ff530 2026-03-10T08:55:16.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.334+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a20101740 con 0x7f3a200ff530 2026-03-10T08:55:16.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.336+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a2004ea90 con 0x7f3a200ff530 2026-03-10T08:55:16.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.337+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3a10022470 con 0x7f3a200ff530 2026-03-10T08:55:16.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.337+0000 7f3a1cd99700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a0006c600 0x7f3a0006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.337+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f3a1008d0f0 con 0x7f3a200ff530 2026-03-10T08:55:16.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.339+0000 7f3a163ff700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a0006c600 0x7f3a0006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.341+0000 7f3a163ff700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a0006c600 0x7f3a0006eac0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f3a08005fd0 tx=0x7f3a08005ee0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:16.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.341+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3a1005b420 con 0x7f3a200ff530 2026-03-10T08:55:16.365 INFO:tasks.workunit.client.1.vm08.stdout:9/372: fsync d2/dd/d15/d1e/d21/f50 0 2026-03-10T08:55:16.369 INFO:tasks.workunit.client.0.vm05.stdout:0/89: getdents df/d18 0 2026-03-10T08:55:16.369 INFO:tasks.workunit.client.1.vm08.stdout:9/373: sync 2026-03-10T08:55:16.370 INFO:tasks.workunit.client.1.vm08.stdout:9/374: write d2/dd/d15/f1b [1268568,108422] 0 2026-03-10T08:55:16.371 INFO:tasks.workunit.client.0.vm05.stdout:9/75: getdents d6 0 2026-03-10T08:55:16.371 INFO:tasks.workunit.client.1.vm08.stdout:9/375: read d2/dd/d15/d1e/d24/f2b [936091,104050] 0 2026-03-10T08:55:16.374 INFO:tasks.workunit.client.0.vm05.stdout:9/76: dread f3 [0,4194304] 0 2026-03-10T08:55:16.374 INFO:tasks.workunit.client.0.vm05.stdout:9/77: chown d6/fb 14 1 2026-03-10T08:55:16.379 INFO:tasks.workunit.client.0.vm05.stdout:9/78: dwrite f2 [0,4194304] 0 2026-03-10T08:55:16.381 INFO:tasks.workunit.client.0.vm05.stdout:9/79: stat d6/f7 0 2026-03-10T08:55:16.381 INFO:tasks.workunit.client.0.vm05.stdout:9/80: dread - d6/d12/f14 zero size 2026-03-10T08:55:16.382 INFO:tasks.workunit.client.0.vm05.stdout:9/81: truncate f2 4316205 0 2026-03-10T08:55:16.383 INFO:tasks.workunit.client.0.vm05.stdout:9/82: chown 
d6/f7 51587330 1 2026-03-10T08:55:16.388 INFO:tasks.workunit.client.0.vm05.stdout:9/83: dwrite f3 [4194304,4194304] 0 2026-03-10T08:55:16.394 INFO:tasks.workunit.client.0.vm05.stdout:0/90: dread fc [0,4194304] 0 2026-03-10T08:55:16.394 INFO:tasks.workunit.client.0.vm05.stdout:0/91: write df/f15 [3183869,115974] 0 2026-03-10T08:55:16.413 INFO:tasks.workunit.client.1.vm08.stdout:9/376: mknod d2/d41/c76 0 2026-03-10T08:55:16.414 INFO:tasks.workunit.client.0.vm05.stdout:9/84: mknod d6/d12/c17 0 2026-03-10T08:55:16.424 INFO:tasks.workunit.client.0.vm05.stdout:0/92: mkdir df/d1f 0 2026-03-10T08:55:16.424 INFO:tasks.workunit.client.0.vm05.stdout:0/93: chown df/f1b 2 1 2026-03-10T08:55:16.424 INFO:tasks.workunit.client.0.vm05.stdout:0/94: read df/f13 [3065803,57866] 0 2026-03-10T08:55:16.425 INFO:tasks.workunit.client.0.vm05.stdout:0/95: dread - df/f17 zero size 2026-03-10T08:55:16.431 INFO:tasks.workunit.client.0.vm05.stdout:0/96: dwrite df/f1a [0,4194304] 0 2026-03-10T08:55:16.437 INFO:tasks.workunit.client.1.vm08.stdout:9/377: creat d2/f77 x:0 0 0 2026-03-10T08:55:16.438 INFO:tasks.workunit.client.1.vm08.stdout:9/378: chown d2/d41/d53 2939 1 2026-03-10T08:55:16.438 INFO:tasks.workunit.client.1.vm08.stdout:9/379: readlink d2/d41/d4c/d66/l6c 0 2026-03-10T08:55:16.439 INFO:tasks.workunit.client.1.vm08.stdout:9/380: dread d2/f4 [0,4194304] 0 2026-03-10T08:55:16.441 INFO:tasks.workunit.client.1.vm08.stdout:9/381: dread f1 [0,4194304] 0 2026-03-10T08:55:16.449 INFO:tasks.workunit.client.0.vm05.stdout:1/165: write dd/d10/f22 [36044,98043] 0 2026-03-10T08:55:16.450 INFO:tasks.workunit.client.0.vm05.stdout:1/166: chown dd/d21/d37/l3d 421185222 1 2026-03-10T08:55:16.454 INFO:tasks.workunit.client.0.vm05.stdout:9/85: creat d6/d15/f18 x:0 0 0 2026-03-10T08:55:16.454 INFO:tasks.workunit.client.0.vm05.stdout:9/86: chown d6/d12/f14 20290 1 2026-03-10T08:55:16.457 INFO:tasks.workunit.client.0.vm05.stdout:9/87: dread d6/fe [0,4194304] 0 2026-03-10T08:55:16.458 
INFO:tasks.workunit.client.0.vm05.stdout:2/64: stat d0/cc 0 2026-03-10T08:55:16.462 INFO:tasks.workunit.client.0.vm05.stdout:2/65: dwrite d0/fb [0,4194304] 0 2026-03-10T08:55:16.473 INFO:tasks.workunit.client.1.vm08.stdout:9/382: creat d2/dd/d15/d1e/d39/d4e/f78 x:0 0 0 2026-03-10T08:55:16.474 INFO:tasks.workunit.client.1.vm08.stdout:9/383: write d2/d41/d74/f6a [215038,103897] 0 2026-03-10T08:55:16.474 INFO:tasks.workunit.client.0.vm05.stdout:0/97: write df/f11 [2325313,109947] 0 2026-03-10T08:55:16.488 INFO:tasks.workunit.client.0.vm05.stdout:1/167: mkdir dd/d21/d3f 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:9/88: mkdir d6/d19 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:2/66: creat d0/ff x:0 0 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:0/98: rename df/f1b to df/d1f/f20 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:0/99: readlink - no filename 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:1/168: creat dd/d13/f40 x:0 0 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:1/169: chown dd/d10/d19 243808750 1 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.1.vm08.stdout:6/420: write d9/d10/d1e/d32/f64 [771403,14193] 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.1.vm08.stdout:6/421: chown d9/dc/d11/d23/c37 188491 1 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.1.vm08.stdout:9/384: mkdir d2/dd/d15/d1e/d25/d32/d79 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.1.vm08.stdout:9/385: readlink d2/d54/l58 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.1.vm08.stdout:9/386: stat d2/dd 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.1.vm08.stdout:1/386: write d1/da/de/f12 [1199530,19825] 0 2026-03-10T08:55:16.501 INFO:tasks.workunit.client.0.vm05.stdout:1/170: write dd/f11 [271213,28058] 0 2026-03-10T08:55:16.502 INFO:tasks.workunit.client.0.vm05.stdout:1/171: truncate dd/d10/d19/f2e 28462 0 2026-03-10T08:55:16.502 
INFO:tasks.workunit.client.0.vm05.stdout:1/172: dread - dd/d13/f33 zero size 2026-03-10T08:55:16.503 INFO:tasks.workunit.client.1.vm08.stdout:6/422: dread d9/d10/f25 [0,4194304] 0 2026-03-10T08:55:16.504 INFO:tasks.workunit.client.0.vm05.stdout:5/123: write d5/df/d12/f1b [2456328,127870] 0 2026-03-10T08:55:16.508 INFO:tasks.workunit.client.1.vm08.stdout:9/387: sync 2026-03-10T08:55:16.508 INFO:tasks.workunit.client.1.vm08.stdout:9/388: chown d2/c73 23650078 1 2026-03-10T08:55:16.509 INFO:tasks.workunit.client.1.vm08.stdout:9/389: dread - d2/dd/d61/f67 zero size 2026-03-10T08:55:16.511 INFO:tasks.workunit.client.0.vm05.stdout:2/67: truncate d0/f8 3315650 0 2026-03-10T08:55:16.511 INFO:tasks.workunit.client.0.vm05.stdout:2/68: chown d0/f4 7730 1 2026-03-10T08:55:16.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.514+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3a20109280 con 0x7f3a0006c600 2026-03-10T08:55:16.517 INFO:tasks.workunit.client.1.vm08.stdout:6/423: symlink d9/dc/d11/d23/d2c/l93 0 2026-03-10T08:55:16.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.516+0000 7f3a1cd99700 1 -- 192.168.123.105:0/1048372276 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f3a20109280 con 0x7f3a0006c600 2026-03-10T08:55:16.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.519+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a0006c600 msgr2=0x7f3a0006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.519+0000 7f3a25156700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a0006c600 0x7f3a0006eac0 
secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f3a08005fd0 tx=0x7f3a08005ee0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.519+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 msgr2=0x7f3a201987a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.519+0000 7f3a25156700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a201987a0 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f3a10005850 tx=0x7f3a10004a40 comp rx=0 tx=0).stop 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 shutdown_connections 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3a0006c600 0x7f3a0006eac0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a200ff530 0x7f3a201987a0 unknown :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 --2- 192.168.123.105:0/1048372276 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a200ffe90 0x7f3a20198ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 -- 
192.168.123.105:0/1048372276 >> 192.168.123.105:0/1048372276 conn(0x7f3a200fb110 msgr2=0x7f3a20107b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 shutdown_connections 2026-03-10T08:55:16.520 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.520+0000 7f3a25156700 1 -- 192.168.123.105:0/1048372276 wait complete. 2026-03-10T08:55:16.526 INFO:tasks.workunit.client.1.vm08.stdout:1/387: mkdir d1/da/d20/d3f/d49/d68/d7f 0 2026-03-10T08:55:16.527 INFO:tasks.workunit.client.1.vm08.stdout:0/306: truncate d6/dd/d13/d17/d1f/d20/f3e 1414209 0 2026-03-10T08:55:16.529 INFO:tasks.workunit.client.0.vm05.stdout:0/100: creat df/d1f/f21 x:0 0 0 2026-03-10T08:55:16.530 INFO:tasks.workunit.client.0.vm05.stdout:0/101: write df/f17 [580235,30605] 0 2026-03-10T08:55:16.530 INFO:tasks.workunit.client.1.vm08.stdout:5/379: dwrite d0/d11/d27/f61 [0,4194304] 0 2026-03-10T08:55:16.531 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:55:16.532 INFO:tasks.workunit.client.1.vm08.stdout:4/409: dwrite d5/d23/d36/f58 [0,4194304] 0 2026-03-10T08:55:16.538 INFO:tasks.workunit.client.1.vm08.stdout:4/410: dread d5/d23/d36/f51 [0,4194304] 0 2026-03-10T08:55:16.551 INFO:tasks.workunit.client.1.vm08.stdout:6/424: rmdir d9/dc/d11/d23/d2c/d81/d63 39 2026-03-10T08:55:16.553 INFO:tasks.workunit.client.0.vm05.stdout:1/173: rmdir dd/d10/d18 39 2026-03-10T08:55:16.556 INFO:tasks.workunit.client.0.vm05.stdout:5/124: symlink d5/df/d12/d24/l27 0 2026-03-10T08:55:16.565 INFO:tasks.workunit.client.1.vm08.stdout:9/390: symlink d2/dd/d15/d1e/d25/d32/d79/l7a 0 2026-03-10T08:55:16.565 INFO:tasks.workunit.client.1.vm08.stdout:9/391: fdatasync d2/dd/d15/f1b 0 2026-03-10T08:55:16.565 INFO:tasks.workunit.client.0.vm05.stdout:8/92: truncate d2/db/f19 3859507 0 2026-03-10T08:55:16.565 INFO:tasks.workunit.client.0.vm05.stdout:8/93: chown d2/l1d 57 1 2026-03-10T08:55:16.565 
INFO:tasks.workunit.client.0.vm05.stdout:2/69: creat d0/f10 x:0 0 0 2026-03-10T08:55:16.565 INFO:tasks.workunit.client.0.vm05.stdout:2/70: readlink d0/l1 0 2026-03-10T08:55:16.565 INFO:tasks.workunit.client.0.vm05.stdout:2/71: chown d0/l1 55231224 1 2026-03-10T08:55:16.566 INFO:tasks.workunit.client.0.vm05.stdout:0/102: symlink df/d18/d19/l22 0 2026-03-10T08:55:16.569 INFO:tasks.workunit.client.1.vm08.stdout:5/380: mknod d0/d40/d4b/d4e/c78 0 2026-03-10T08:55:16.569 INFO:tasks.workunit.client.1.vm08.stdout:5/381: fdatasync d0/d11/d3e/d45/f4a 0 2026-03-10T08:55:16.574 INFO:tasks.workunit.client.0.vm05.stdout:5/125: dread d5/f23 [4194304,4194304] 0 2026-03-10T08:55:16.574 INFO:tasks.workunit.client.0.vm05.stdout:5/126: fsync d5/df/d12/d24/f25 0 2026-03-10T08:55:16.577 INFO:tasks.workunit.client.0.vm05.stdout:5/127: dwrite d5/df/d12/f13 [0,4194304] 0 2026-03-10T08:55:16.592 INFO:tasks.workunit.client.1.vm08.stdout:6/425: creat d9/dc/d84/d80/f94 x:0 0 0 2026-03-10T08:55:16.596 INFO:tasks.workunit.client.1.vm08.stdout:2/389: write d1/da/d10/d1b/f14 [4201646,120610] 0 2026-03-10T08:55:16.601 INFO:tasks.workunit.client.0.vm05.stdout:0/103: rename fc to df/d18/d19/f23 0 2026-03-10T08:55:16.601 INFO:tasks.workunit.client.0.vm05.stdout:0/104: read df/f15 [2035049,73565] 0 2026-03-10T08:55:16.605 INFO:tasks.workunit.client.1.vm08.stdout:1/388: dwrite d1/da/de/f12 [0,4194304] 0 2026-03-10T08:55:16.607 INFO:tasks.workunit.client.1.vm08.stdout:5/382: sync 2026-03-10T08:55:16.607 INFO:tasks.workunit.client.1.vm08.stdout:5/383: readlink d0/d11/d3e/l41 0 2026-03-10T08:55:16.617 INFO:tasks.workunit.client.0.vm05.stdout:6/119: getdents d4/d7/d10/d15/d20 0 2026-03-10T08:55:16.617 INFO:tasks.workunit.client.0.vm05.stdout:6/120: readlink d4/d7/d10/d15/d1b/d22/l26 0 2026-03-10T08:55:16.618 INFO:tasks.workunit.client.0.vm05.stdout:6/121: write d4/f5 [3376100,65319] 0 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.624+0000 7f02fb8eb700 1 -- 
192.168.123.105:0/1199429567 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4072b50 msgr2=0x7f02f4072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.624+0000 7f02fb8eb700 1 --2- 192.168.123.105:0/1199429567 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4072b50 0x7f02f4072f70 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7f02f0009a60 tx=0x7f02f0009d70 comp rx=0 tx=0).stop 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.624+0000 7f02fb8eb700 1 -- 192.168.123.105:0/1199429567 shutdown_connections 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.624+0000 7f02fb8eb700 1 --2- 192.168.123.105:0/1199429567 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4075a40 0x7f02f4077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.624+0000 7f02fb8eb700 1 --2- 192.168.123.105:0/1199429567 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4072b50 0x7f02f4072f70 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.624+0000 7f02fb8eb700 1 -- 192.168.123.105:0/1199429567 >> 192.168.123.105:0/1199429567 conn(0x7f02f406dae0 msgr2=0x7f02f406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:16.625 INFO:tasks.workunit.client.1.vm08.stdout:0/307: creat d6/f5f x:0 0 0 2026-03-10T08:55:16.625 INFO:tasks.workunit.client.1.vm08.stdout:0/308: dread - d6/dd/d13/f5e zero size 2026-03-10T08:55:16.625 INFO:tasks.workunit.client.1.vm08.stdout:0/309: dread - d6/f5f zero size 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 -- 
192.168.123.105:0/1199429567 shutdown_connections 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 -- 192.168.123.105:0/1199429567 wait complete. 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 Processor -- start 2026-03-10T08:55:16.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 -- start start 2026-03-10T08:55:16.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4075a40 0x7f02f40830d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 0x7f02f41b30e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f02f4083b50 con 0x7f02f4075a40 2026-03-10T08:55:16.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.625+0000 7f02fb8eb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f02f4083cc0 con 0x7f02f4083610 2026-03-10T08:55:16.626 INFO:tasks.workunit.client.1.vm08.stdout:0/310: chown d6/dd/d13/d32/l4f 1987 1 2026-03-10T08:55:16.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.627+0000 7f02f8e86700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 0x7f02f41b30e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.628 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f8e86700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 0x7f02f41b30e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48598/0 (socket says 192.168.123.105:48598) 2026-03-10T08:55:16.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f8e86700 1 -- 192.168.123.105:0/2332509110 learned_addr learned my addr 192.168.123.105:0/2332509110 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:16.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f9687700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4075a40 0x7f02f40830d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f8e86700 1 -- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4075a40 msgr2=0x7f02f40830d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f8e86700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4075a40 0x7f02f40830d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f8e86700 1 -- 192.168.123.105:0/2332509110 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f02f0009710 con 0x7f02f4083610 2026-03-10T08:55:16.628 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02f8e86700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 0x7f02f41b30e0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f02ec0060b0 tx=0x7f02ec00b6e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:16.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f02ec004bb0 con 0x7f02f4083610 2026-03-10T08:55:16.629 INFO:tasks.workunit.client.0.vm05.stdout:1/174: mkdir dd/d21/d3f/d41 0 2026-03-10T08:55:16.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02fb8eb700 1 -- 192.168.123.105:0/2332509110 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f02f41b3740 con 0x7f02f4083610 2026-03-10T08:55:16.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.628+0000 7f02fb8eb700 1 -- 192.168.123.105:0/2332509110 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f02f41b3c10 con 0x7f02f4083610 2026-03-10T08:55:16.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.629+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f02ec004d10 con 0x7f02f4083610 2026-03-10T08:55:16.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.629+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f02ec0056a0 con 0x7f02f4083610 2026-03-10T08:55:16.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.630+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mon.1 v2:192.168.123.108:3300/0 4 
==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f02ec0058e0 con 0x7f02f4083610 2026-03-10T08:55:16.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.631+0000 7f02ea7fc700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f02e006c530 0x7f02e006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.631 INFO:tasks.workunit.client.1.vm08.stdout:9/392: mkdir d2/dd/d15/d1e/d25/d32/d5c/d7b 0 2026-03-10T08:55:16.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.632+0000 7f02f9687700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f02e006c530 0x7f02e006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.632+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f02ec013070 con 0x7f02f4083610 2026-03-10T08:55:16.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.632+0000 7f02f9687700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f02e006c530 0x7f02e006e9f0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f02f000b5c0 tx=0x7f02f0011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:16.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.632+0000 7f02fb8eb700 1 -- 192.168.123.105:0/2332509110 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f02d8005320 con 0x7f02f4083610 2026-03-10T08:55:16.636 INFO:tasks.workunit.client.0.vm05.stdout:2/72: sync 2026-03-10T08:55:16.637 
INFO:tasks.workunit.client.0.vm05.stdout:0/105: sync 2026-03-10T08:55:16.637 INFO:tasks.workunit.client.0.vm05.stdout:0/106: chown df/d1f 6229 1 2026-03-10T08:55:16.637 INFO:tasks.workunit.client.0.vm05.stdout:2/73: truncate d0/ff 1010243 0 2026-03-10T08:55:16.639 INFO:tasks.workunit.client.1.vm08.stdout:5/384: unlink d0/d11/d18/f34 0 2026-03-10T08:55:16.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.636+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f02ec0573b0 con 0x7f02f4083610 2026-03-10T08:55:16.643 INFO:tasks.workunit.client.0.vm05.stdout:3/128: rmdir d9 39 2026-03-10T08:55:16.648 INFO:tasks.workunit.client.1.vm08.stdout:8/455: truncate d1/d10/d9/dd/d18/d3c/f4e 325844 0 2026-03-10T08:55:16.650 INFO:tasks.workunit.client.0.vm05.stdout:7/71: link f3 f10 0 2026-03-10T08:55:16.651 INFO:tasks.workunit.client.0.vm05.stdout:7/72: write f9 [652439,16012] 0 2026-03-10T08:55:16.656 INFO:tasks.workunit.client.1.vm08.stdout:8/456: dread d1/d10/f23 [4194304,4194304] 0 2026-03-10T08:55:16.669 INFO:tasks.workunit.client.0.vm05.stdout:6/122: symlink d4/d7/l2b 0 2026-03-10T08:55:16.671 INFO:tasks.workunit.client.1.vm08.stdout:1/389: mknod d1/da/d18/d3a/d77/c80 0 2026-03-10T08:55:16.678 INFO:tasks.workunit.client.0.vm05.stdout:1/175: rename dd/d10/d19/d27/f2a to dd/d13/f42 0 2026-03-10T08:55:16.679 INFO:tasks.workunit.client.1.vm08.stdout:1/390: dread d1/da/de/f19 [0,4194304] 0 2026-03-10T08:55:16.681 INFO:tasks.workunit.client.0.vm05.stdout:2/74: mknod d0/c11 0 2026-03-10T08:55:16.684 INFO:tasks.workunit.client.0.vm05.stdout:2/75: dwrite d0/fb [0,4194304] 0 2026-03-10T08:55:16.685 INFO:tasks.workunit.client.1.vm08.stdout:5/385: dread d0/fb [0,4194304] 0 2026-03-10T08:55:16.686 INFO:tasks.workunit.client.1.vm08.stdout:5/386: chown d0/d40/d4b/l6e 8240584 1 2026-03-10T08:55:16.693 
INFO:tasks.workunit.client.1.vm08.stdout:8/457: mkdir d1/d10/d9/dd/d9a/da6 0 2026-03-10T08:55:16.696 INFO:tasks.workunit.client.1.vm08.stdout:7/369: dwrite d0/d11/d1f/d29/d3d/d40/f38 [4194304,4194304] 0 2026-03-10T08:55:16.698 INFO:tasks.workunit.client.1.vm08.stdout:2/390: rename d1/da/d10/d1b/d12/d1e/c32 to d1/da/d10/c7c 0 2026-03-10T08:55:16.708 INFO:tasks.workunit.client.0.vm05.stdout:3/129: dwrite d9/fa [0,4194304] 0 2026-03-10T08:55:16.709 INFO:tasks.workunit.client.0.vm05.stdout:3/130: write f7 [5907067,29516] 0 2026-03-10T08:55:16.721 INFO:tasks.workunit.client.1.vm08.stdout:9/393: link d2/f77 d2/d41/d4c/f7c 0 2026-03-10T08:55:16.725 INFO:tasks.workunit.client.0.vm05.stdout:9/89: getdents d6/d15 0 2026-03-10T08:55:16.728 INFO:tasks.workunit.client.1.vm08.stdout:3/357: stat d4/d15/d8/f24 0 2026-03-10T08:55:16.729 INFO:tasks.workunit.client.1.vm08.stdout:3/358: dread d4/d15/d8/d2c/f67 [0,4194304] 0 2026-03-10T08:55:16.730 INFO:tasks.workunit.client.0.vm05.stdout:5/128: creat d5/f28 x:0 0 0 2026-03-10T08:55:16.733 INFO:tasks.workunit.client.0.vm05.stdout:5/129: dwrite d5/f28 [0,4194304] 0 2026-03-10T08:55:16.739 INFO:tasks.workunit.client.0.vm05.stdout:5/130: dwrite d5/fe [4194304,4194304] 0 2026-03-10T08:55:16.769 INFO:tasks.workunit.client.0.vm05.stdout:3/131: mknod d9/c22 0 2026-03-10T08:55:16.782 INFO:tasks.workunit.client.0.vm05.stdout:4/131: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:16.785 INFO:tasks.workunit.client.1.vm08.stdout:4/411: write d5/f6b [107867,35872] 0 2026-03-10T08:55:16.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.787+0000 7f02fb8eb700 1 -- 192.168.123.105:0/2332509110 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f02d8000bf0 con 0x7f02e006c530 2026-03-10T08:55:16.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:16 vm05.local ceph-mon[49713]: pgmap v147: 65 pgs: 65 active+clean; 965 MiB data, 3.9 GiB 
used, 116 GiB / 120 GiB avail; 16 MiB/s rd, 99 MiB/s wr, 308 op/s 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.790+0000 7f02ea7fc700 1 -- 192.168.123.105:0/2332509110 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f02d8000bf0 con 0x7f02e006c530 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 -- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f02e006c530 msgr2=0x7f02e006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f02e006c530 0x7f02e006e9f0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f02f000b5c0 tx=0x7f02f0011040 comp rx=0 tx=0).stop 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 -- 192.168.123.105:0/2332509110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 msgr2=0x7f02f41b30e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 0x7f02f41b30e0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f02ec0060b0 tx=0x7f02ec00b6e0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 -- 192.168.123.105:0/2332509110 shutdown_connections 2026-03-10T08:55:16.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f02e006c530 0x7f02e006e9f0 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f02f4075a40 0x7f02f40830d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 --2- 192.168.123.105:0/2332509110 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f02f4083610 0x7f02f41b30e0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.793+0000 7f02dffff700 1 -- 192.168.123.105:0/2332509110 >> 192.168.123.105:0/2332509110 conn(0x7f02f406dae0 msgr2=0x7f02f406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:16.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.794+0000 7f02dffff700 1 -- 192.168.123.105:0/2332509110 shutdown_connections 2026-03-10T08:55:16.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.794+0000 7f02dffff700 1 -- 192.168.123.105:0/2332509110 wait complete. 
2026-03-10T08:55:16.802 INFO:tasks.workunit.client.0.vm05.stdout:6/123: mkdir d4/d2c 0 2026-03-10T08:55:16.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:16 vm08.local ceph-mon[57559]: pgmap v147: 65 pgs: 65 active+clean; 965 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 16 MiB/s rd, 99 MiB/s wr, 308 op/s 2026-03-10T08:55:16.806 INFO:tasks.workunit.client.0.vm05.stdout:5/131: mknod d5/df/d12/c29 0 2026-03-10T08:55:16.807 INFO:tasks.workunit.client.0.vm05.stdout:5/132: dread d5/df/f1c [0,4194304] 0 2026-03-10T08:55:16.809 INFO:tasks.workunit.client.0.vm05.stdout:1/176: symlink dd/d10/d18/d2d/l43 0 2026-03-10T08:55:16.810 INFO:tasks.workunit.client.0.vm05.stdout:1/177: truncate dd/d21/f26 234879 0 2026-03-10T08:55:16.814 INFO:tasks.workunit.client.1.vm08.stdout:6/426: write d9/fa [316106,102257] 0 2026-03-10T08:55:16.814 INFO:tasks.workunit.client.0.vm05.stdout:3/132: creat d9/f23 x:0 0 0 2026-03-10T08:55:16.817 INFO:tasks.workunit.client.1.vm08.stdout:2/391: truncate d1/f19 2825659 0 2026-03-10T08:55:16.818 INFO:tasks.workunit.client.1.vm08.stdout:0/311: write d6/f11 [2627637,11006] 0 2026-03-10T08:55:16.819 INFO:tasks.workunit.client.1.vm08.stdout:9/394: mknod d2/dd/d15/d1e/c7d 0 2026-03-10T08:55:16.821 INFO:tasks.workunit.client.0.vm05.stdout:9/90: creat d6/d19/f1a x:0 0 0 2026-03-10T08:55:16.826 INFO:tasks.workunit.client.1.vm08.stdout:3/359: rename d4/d15/d8/f37 to d4/d15/d8/d2c/d55/f75 0 2026-03-10T08:55:16.827 INFO:tasks.workunit.client.0.vm05.stdout:2/76: rmdir d0 39 2026-03-10T08:55:16.829 INFO:tasks.workunit.client.1.vm08.stdout:4/412: mkdir d5/d23/d49/d8f 0 2026-03-10T08:55:16.829 INFO:tasks.workunit.client.1.vm08.stdout:4/413: chown d5/de/l52 512243 1 2026-03-10T08:55:16.829 INFO:tasks.workunit.client.1.vm08.stdout:4/414: stat d5/d23/f68 0 2026-03-10T08:55:16.830 INFO:tasks.workunit.client.1.vm08.stdout:4/415: dread - d5/d2f/f84 zero size 2026-03-10T08:55:16.831 INFO:tasks.workunit.client.0.vm05.stdout:0/107: rename df/d18/d19/f23 to 
df/d18/f24 0 2026-03-10T08:55:16.834 INFO:tasks.workunit.client.1.vm08.stdout:8/458: creat d1/d10/d9/dd/d25/d27/d44/fa7 x:0 0 0 2026-03-10T08:55:16.837 INFO:tasks.workunit.client.0.vm05.stdout:5/133: creat d5/df/d12/f2a x:0 0 0 2026-03-10T08:55:16.838 INFO:tasks.workunit.client.1.vm08.stdout:5/387: dwrite d0/d40/f6f [0,4194304] 0 2026-03-10T08:55:16.840 INFO:tasks.workunit.client.0.vm05.stdout:5/134: dwrite d5/df/d12/f13 [0,4194304] 0 2026-03-10T08:55:16.854 INFO:tasks.workunit.client.0.vm05.stdout:1/178: creat dd/f44 x:0 0 0 2026-03-10T08:55:16.854 INFO:tasks.workunit.client.0.vm05.stdout:1/179: dread - dd/d21/d37/f39 zero size 2026-03-10T08:55:16.855 INFO:tasks.workunit.client.0.vm05.stdout:1/180: chown dd/d13/f40 480754360 1 2026-03-10T08:55:16.860 INFO:tasks.workunit.client.0.vm05.stdout:3/133: rename d9/l14 to d9/l24 0 2026-03-10T08:55:16.861 INFO:tasks.workunit.client.0.vm05.stdout:8/94: write d2/db/f19 [3329112,16041] 0 2026-03-10T08:55:16.861 INFO:tasks.workunit.client.1.vm08.stdout:6/427: dread d9/d13/f70 [0,4194304] 0 2026-03-10T08:55:16.864 INFO:tasks.workunit.client.1.vm08.stdout:2/392: rmdir d1/d43/d4f/d52 39 2026-03-10T08:55:16.865 INFO:tasks.workunit.client.0.vm05.stdout:7/73: write f10 [2208553,30939] 0 2026-03-10T08:55:16.874 INFO:tasks.workunit.client.1.vm08.stdout:1/391: truncate d1/da/de/d24/d3d/d4a/f7b 1993864 0 2026-03-10T08:55:16.878 INFO:tasks.workunit.client.0.vm05.stdout:2/77: dread d0/f4 [0,4194304] 0 2026-03-10T08:55:16.881 INFO:tasks.workunit.client.0.vm05.stdout:4/132: link d0/f16 d0/d15/f29 0 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.1.vm08.stdout:3/360: symlink d4/d15/d8/d2c/d55/l76 0 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.1.vm08.stdout:3/361: readlink d4/l4a 0 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.1.vm08.stdout:3/362: dwrite d4/f47 [0,4194304] 0 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.0.vm05.stdout:4/133: stat d0/d15/d25 0 2026-03-10T08:55:16.895 
INFO:tasks.workunit.client.0.vm05.stdout:0/108: rename df/d1f/f20 to df/d1f/f25 0 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.0.vm05.stdout:0/109: read - df/d18/d19/f1c zero size 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.0.vm05.stdout:0/110: write df/f11 [4144114,24411] 0 2026-03-10T08:55:16.895 INFO:tasks.workunit.client.0.vm05.stdout:0/111: write df/d1f/f21 [1021246,9867] 0 2026-03-10T08:55:16.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.896+0000 7f912d519700 1 -- 192.168.123.105:0/1024207770 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128075a40 msgr2=0x7f9128077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.896+0000 7f912d519700 1 --2- 192.168.123.105:0/1024207770 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128075a40 0x7f9128077ed0 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7f912000d3f0 tx=0x7f912000d700 comp rx=0 tx=0).stop 2026-03-10T08:55:16.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.897+0000 7f912d519700 1 -- 192.168.123.105:0/1024207770 shutdown_connections 2026-03-10T08:55:16.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.897+0000 7f912d519700 1 --2- 192.168.123.105:0/1024207770 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128075a40 0x7f9128077ed0 unknown :-1 s=CLOSED pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.897+0000 7f912d519700 1 --2- 192.168.123.105:0/1024207770 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9128072b50 0x7f9128072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.897+0000 7f912d519700 1 -- 192.168.123.105:0/1024207770 >> 
192.168.123.105:0/1024207770 conn(0x7f912806dae0 msgr2=0x7f912806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:16.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.897+0000 7f912d519700 1 -- 192.168.123.105:0/1024207770 shutdown_connections 2026-03-10T08:55:16.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.897+0000 7f912d519700 1 -- 192.168.123.105:0/1024207770 wait complete. 2026-03-10T08:55:16.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f912d519700 1 Processor -- start 2026-03-10T08:55:16.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f912d519700 1 -- start start 2026-03-10T08:55:16.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f912d519700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128072b50 0x7f9128082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f912d519700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 0x7f9128083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f912d519700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f912812e700 con 0x7f9128072b50 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f912d519700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f912812e870 con 0x7f91280834a0 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f91267fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 0x7f9128083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f91267fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 0x7f9128083920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48612/0 (socket says 192.168.123.105:48612) 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.898+0000 7f91267fc700 1 -- 192.168.123.105:0/3150427189 learned_addr learned my addr 192.168.123.105:0/3150427189 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.899+0000 7f9126ffd700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128072b50 0x7f9128082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.899+0000 7f91267fc700 1 -- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128072b50 msgr2=0x7f9128082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:16.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.899+0000 7f91267fc700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128072b50 0x7f9128082f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.899+0000 7f91267fc700 1 -- 192.168.123.105:0/3150427189 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9120007ed0 con 
0x7f91280834a0 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.899+0000 7f91267fc700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 0x7f9128083920 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f9120003c60 tx=0x7f9120003c90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.899+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f912001c070 con 0x7f91280834a0 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.900+0000 7f912d519700 1 -- 192.168.123.105:0/3150427189 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f912812eaf0 con 0x7f91280834a0 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.900+0000 7f912d519700 1 -- 192.168.123.105:0/3150427189 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f912812efe0 con 0x7f91280834a0 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.900+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9120004280 con 0x7f91280834a0 2026-03-10T08:55:16.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.901+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9120021820 con 0x7f91280834a0 2026-03-10T08:55:16.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.902+0000 7f912d519700 1 -- 192.168.123.105:0/3150427189 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f9114005320 con 0x7f91280834a0 2026-03-10T08:55:16.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.902+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f912000f660 con 0x7f91280834a0 2026-03-10T08:55:16.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.902+0000 7f910ffff700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f911006c330 0x7f911006e7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:16.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.902+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f9120013070 con 0x7f91280834a0 2026-03-10T08:55:16.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.902+0000 7f9126ffd700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f911006c330 0x7f911006e7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:16.905 INFO:tasks.workunit.client.1.vm08.stdout:4/416: fsync d5/d23/d36/f44 0 2026-03-10T08:55:16.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.908+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f912005b3f0 con 0x7f91280834a0 2026-03-10T08:55:16.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:16.908+0000 7f9126ffd700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f911006c330 0x7f911006e7f0 secure :-1 s=READY pgs=145 cs=0 l=1 
rev1=1 crypto rx=0x7f9118005950 tx=0x7f911800b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:16.916 INFO:tasks.workunit.client.1.vm08.stdout:5/388: symlink d0/d11/d27/l79 0 2026-03-10T08:55:16.924 INFO:tasks.workunit.client.1.vm08.stdout:8/459: write d1/d10/d9/dd/d25/f6e [736515,2635] 0 2026-03-10T08:55:16.925 INFO:tasks.workunit.client.1.vm08.stdout:7/370: getdents d0/d11/d1f/d29/d3d/d40 0 2026-03-10T08:55:16.926 INFO:tasks.workunit.client.1.vm08.stdout:9/395: dwrite d2/dd/d15/d1e/d21/f3a [0,4194304] 0 2026-03-10T08:55:16.928 INFO:tasks.workunit.client.1.vm08.stdout:6/428: dread - d9/d50/f78 zero size 2026-03-10T08:55:16.928 INFO:tasks.workunit.client.0.vm05.stdout:1/181: mkdir dd/d21/d37/d45 0 2026-03-10T08:55:16.928 INFO:tasks.workunit.client.0.vm05.stdout:3/134: mknod d9/c25 0 2026-03-10T08:55:16.931 INFO:tasks.workunit.client.1.vm08.stdout:6/429: dwrite d9/d13/f36 [0,4194304] 0 2026-03-10T08:55:16.943 INFO:tasks.workunit.client.0.vm05.stdout:7/74: mknod c11 0 2026-03-10T08:55:16.944 INFO:tasks.workunit.client.0.vm05.stdout:7/75: write f9 [1369480,111193] 0 2026-03-10T08:55:16.946 INFO:tasks.workunit.client.0.vm05.stdout:7/76: dwrite f10 [0,4194304] 0 2026-03-10T08:55:16.947 INFO:tasks.workunit.client.1.vm08.stdout:1/392: mkdir d1/da/de/d24/d81 0 2026-03-10T08:55:16.970 INFO:tasks.workunit.client.0.vm05.stdout:0/112: rmdir df 39 2026-03-10T08:55:16.982 INFO:tasks.workunit.client.1.vm08.stdout:4/417: creat d5/d2f/d5a/f90 x:0 0 0 2026-03-10T08:55:16.982 INFO:tasks.workunit.client.0.vm05.stdout:1/182: mknod dd/d13/c46 0 2026-03-10T08:55:16.984 INFO:tasks.workunit.client.0.vm05.stdout:3/135: creat d9/f26 x:0 0 0 2026-03-10T08:55:16.986 INFO:tasks.workunit.client.0.vm05.stdout:1/183: dwrite dd/d10/f22 [0,4194304] 0 2026-03-10T08:55:16.987 INFO:tasks.workunit.client.1.vm08.stdout:9/396: rmdir d2/d41 39 2026-03-10T08:55:16.991 INFO:tasks.workunit.client.1.vm08.stdout:4/418: dread d5/de/f5e [0,4194304] 0 
2026-03-10T08:55:16.992 INFO:tasks.workunit.client.0.vm05.stdout:7/77: mknod c12 0 2026-03-10T08:55:16.993 INFO:tasks.workunit.client.1.vm08.stdout:6/430: mkdir d9/d50/d95 0 2026-03-10T08:55:16.995 INFO:tasks.workunit.client.0.vm05.stdout:7/78: dwrite f9 [0,4194304] 0 2026-03-10T08:55:16.999 INFO:tasks.workunit.client.1.vm08.stdout:2/393: mknod d1/da/d10/d1b/c7d 0 2026-03-10T08:55:16.999 INFO:tasks.workunit.client.0.vm05.stdout:0/113: fdatasync df/f1a 0 2026-03-10T08:55:17.000 INFO:tasks.workunit.client.0.vm05.stdout:0/114: chown df/f1a 230934393 1 2026-03-10T08:55:17.002 INFO:tasks.workunit.client.0.vm05.stdout:7/79: dwrite f9 [0,4194304] 0 2026-03-10T08:55:17.006 INFO:tasks.workunit.client.0.vm05.stdout:6/124: getdents d4/d7/d10/d1a 0 2026-03-10T08:55:17.013 INFO:tasks.workunit.client.1.vm08.stdout:1/393: rename d1/da/de/d24/d3d/d4a to d1/da/de/d24/d35/d6d/d82 0 2026-03-10T08:55:17.014 INFO:tasks.workunit.client.0.vm05.stdout:0/115: dwrite df/f11 [0,4194304] 0 2026-03-10T08:55:17.015 INFO:tasks.workunit.client.1.vm08.stdout:1/394: truncate d1/da/de/d24/d3d/d40/d56/f73 616060 0 2026-03-10T08:55:17.015 INFO:tasks.workunit.client.0.vm05.stdout:2/78: dread d0/f8 [0,4194304] 0 2026-03-10T08:55:17.018 INFO:tasks.workunit.client.1.vm08.stdout:8/460: write d1/d10/d9/dd/d13/f6a [585858,97751] 0 2026-03-10T08:55:17.019 INFO:tasks.workunit.client.1.vm08.stdout:8/461: read - d1/d10/d9/dd/d13/f92 zero size 2026-03-10T08:55:17.020 INFO:tasks.workunit.client.1.vm08.stdout:8/462: fdatasync d1/d10/d9/dd/d3d/f78 0 2026-03-10T08:55:17.021 INFO:tasks.workunit.client.1.vm08.stdout:8/463: readlink d1/d10/d9/dd/d25/d27/d44/d97/la5 0 2026-03-10T08:55:17.042 INFO:tasks.workunit.client.1.vm08.stdout:7/371: write d0/d11/d1f/d29/d3d/d40/ff [474840,93778] 0 2026-03-10T08:55:17.044 INFO:tasks.workunit.client.1.vm08.stdout:3/363: dwrite d4/d15/d17/f3c [0,4194304] 0 2026-03-10T08:55:17.045 INFO:tasks.workunit.client.0.vm05.stdout:3/136: rmdir d9 39 2026-03-10T08:55:17.052 
INFO:tasks.workunit.client.1.vm08.stdout:5/389: mkdir d0/d1b/d67/d7a 0 2026-03-10T08:55:17.052 INFO:tasks.workunit.client.0.vm05.stdout:8/95: rename d2/l16 to d2/l20 0 2026-03-10T08:55:17.060 INFO:tasks.workunit.client.1.vm08.stdout:4/419: dread d5/d23/d36/f51 [0,4194304] 0 2026-03-10T08:55:17.068 INFO:tasks.workunit.client.1.vm08.stdout:0/312: link d6/dd/c2b d6/dd/d13/d17/d1f/d20/c60 0 2026-03-10T08:55:17.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.072+0000 7f912d519700 1 -- 192.168.123.105:0/3150427189 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f9114000bf0 con 0x7f911006c330 2026-03-10T08:55:17.081 INFO:tasks.workunit.client.0.vm05.stdout:7/80: symlink l13 0 2026-03-10T08:55:17.082 INFO:tasks.workunit.client.0.vm05.stdout:7/81: chown c11 5945 1 2026-03-10T08:55:17.082 INFO:tasks.workunit.client.0.vm05.stdout:0/116: write df/d1f/f25 [366461,36301] 0 2026-03-10T08:55:17.082 INFO:tasks.workunit.client.0.vm05.stdout:5/135: getdents d5 0 2026-03-10T08:55:17.083 INFO:tasks.workunit.client.0.vm05.stdout:5/136: fdatasync d5/f9 0 2026-03-10T08:55:17.084 INFO:tasks.workunit.client.0.vm05.stdout:3/137: write d9/fa [1302776,120108] 0 2026-03-10T08:55:17.085 INFO:tasks.workunit.client.0.vm05.stdout:5/137: write d5/df/d12/d24/f25 [81671,12437] 0 2026-03-10T08:55:17.085 INFO:tasks.workunit.client.0.vm05.stdout:3/138: fdatasync d9/f12 0 2026-03-10T08:55:17.086 INFO:tasks.workunit.client.0.vm05.stdout:3/139: readlink d9/le 0 2026-03-10T08:55:17.087 INFO:tasks.workunit.client.0.vm05.stdout:3/140: write d9/f1a [513185,78446] 0 2026-03-10T08:55:17.088 INFO:tasks.workunit.client.0.vm05.stdout:7/82: dwrite f3 [4194304,4194304] 0 2026-03-10T08:55:17.093 INFO:tasks.workunit.client.0.vm05.stdout:0/117: dwrite df/d18/d19/f1c [0,4194304] 0 2026-03-10T08:55:17.094 INFO:tasks.workunit.client.0.vm05.stdout:3/141: dread d9/f19 [0,4194304] 0 2026-03-10T08:55:17.097 
INFO:tasks.workunit.client.0.vm05.stdout:3/142: write d9/fa [1407205,76734] 0 2026-03-10T08:55:17.102 INFO:tasks.workunit.client.0.vm05.stdout:0/118: dwrite df/d18/d19/f1c [0,4194304] 0 2026-03-10T08:55:17.104 INFO:tasks.workunit.client.1.vm08.stdout:7/372: unlink d0/d14/f7 0 2026-03-10T08:55:17.105 INFO:tasks.workunit.client.1.vm08.stdout:5/390: mknod d0/d11/c7b 0 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (3m) 2m ago 4m 21.4M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (4m) 2m ago 4m 8032k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (3m) 2m ago 3m 8308k - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (4m) 2m ago 4m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 2m ago 3m 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (3m) 2m ago 4m 80.8M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (2m) 2m ago 2m 16.7M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (2m) 2m ago 2m 13.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (2m) 2m ago 2m 16.1M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx 
vm08 running (2m) 2m ago 2m 11.0M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (5m) 2m ago 5m 501M - 18.2.1 5be31c24972a 6ec0cdb38171 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (3m) 2m ago 3m 449M - 18.2.1 5be31c24972a 9cd801f2f7a7 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 2m ago 5m 50.0M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 2m ago 3m 47.9M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 2m ago 4m 12.3M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 2m ago 3m 12.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 2m ago 3m 48.5M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (3m) 2m ago 3m 46.9M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (3m) 2m ago 3m 48.1M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (3m) 2m ago 3m 44.3M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (2m) 2m ago 2m 43.5M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (2m) 2m ago 2m 45.8M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (3m) 2m 
ago 4m 36.9M - 2.43.0 a07b618ecd1d e84b76e5c1c0 2026-03-10T08:55:17.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.104+0000 7f910ffff700 1 -- 192.168.123.105:0/3150427189 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3216 (secure 0 0 0) 0x7f9114000bf0 con 0x7f911006c330 2026-03-10T08:55:17.108 INFO:tasks.workunit.client.1.vm08.stdout:9/397: truncate d2/dd/d15/d1e/d24/f3f 2965397 0 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 -- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f911006c330 msgr2=0x7f911006e7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f911006c330 0x7f911006e7f0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f9118005950 tx=0x7f911800b410 comp rx=0 tx=0).stop 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 -- 192.168.123.105:0/3150427189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 msgr2=0x7f9128083920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 0x7f9128083920 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f9120003c60 tx=0x7f9120003c90 comp rx=0 tx=0).stop 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 -- 192.168.123.105:0/3150427189 shutdown_connections 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 
--2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f911006c330 0x7f911006e7f0 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9128072b50 0x7f9128082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 --2- 192.168.123.105:0/3150427189 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91280834a0 0x7f9128083920 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.107+0000 7f910dffb700 1 -- 192.168.123.105:0/3150427189 >> 192.168.123.105:0/3150427189 conn(0x7f912806dae0 msgr2=0x7f912806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:17.108 INFO:tasks.workunit.client.0.vm05.stdout:1/184: creat dd/d21/d37/d45/f47 x:0 0 0 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.108+0000 7f910dffb700 1 -- 192.168.123.105:0/3150427189 shutdown_connections 2026-03-10T08:55:17.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.108+0000 7f910dffb700 1 -- 192.168.123.105:0/3150427189 wait complete. 
2026-03-10T08:55:17.111 INFO:tasks.workunit.client.0.vm05.stdout:4/134: getdents d0/d1f 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:9/91: link d6/ca d6/c1b 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:6/125: mkdir d4/d2d 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:6/126: dwrite d4/d7/d10/d15/f16 [0,4194304] 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:5/138: fdatasync d5/df/f1c 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:5/139: dwrite d5/f28 [0,4194304] 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:5/140: fdatasync d5/df/d12/f20 0 2026-03-10T08:55:17.132 INFO:tasks.workunit.client.0.vm05.stdout:5/141: truncate d5/df/d12/f13 5099123 0 2026-03-10T08:55:17.133 INFO:tasks.workunit.client.0.vm05.stdout:7/83: mknod c14 0 2026-03-10T08:55:17.138 INFO:tasks.workunit.client.1.vm08.stdout:5/391: dread d0/d11/d27/f2a [0,4194304] 0 2026-03-10T08:55:17.143 INFO:tasks.workunit.client.0.vm05.stdout:0/119: mkdir df/d1f/d26 0 2026-03-10T08:55:17.146 INFO:tasks.workunit.client.0.vm05.stdout:8/96: mkdir d2/db/d1f/d21 0 2026-03-10T08:55:17.151 INFO:tasks.workunit.client.0.vm05.stdout:9/92: dread d6/fe [0,4194304] 0 2026-03-10T08:55:17.152 INFO:tasks.workunit.client.1.vm08.stdout:0/313: mkdir d6/dd/d13/d61 0 2026-03-10T08:55:17.154 INFO:tasks.workunit.client.0.vm05.stdout:6/127: rmdir d4/d7/d10/d15 39 2026-03-10T08:55:17.161 INFO:tasks.workunit.client.1.vm08.stdout:7/373: sync 2026-03-10T08:55:17.163 INFO:tasks.workunit.client.0.vm05.stdout:3/143: sync 2026-03-10T08:55:17.165 INFO:tasks.workunit.client.0.vm05.stdout:6/128: dread d4/d7/ff [0,4194304] 0 2026-03-10T08:55:17.173 INFO:tasks.workunit.client.1.vm08.stdout:1/395: mkdir d1/da/d20/d4c/d83 0 2026-03-10T08:55:17.177 INFO:tasks.workunit.client.0.vm05.stdout:5/142: unlink d5/df/f19 0 2026-03-10T08:55:17.182 INFO:tasks.workunit.client.0.vm05.stdout:7/84: rename f8 to f15 0 2026-03-10T08:55:17.185 
INFO:tasks.workunit.client.1.vm08.stdout:5/392: fsync d0/f36 0 2026-03-10T08:55:17.201 INFO:tasks.workunit.client.1.vm08.stdout:4/420: symlink d5/d23/d49/d8f/l91 0 2026-03-10T08:55:17.203 INFO:tasks.workunit.client.0.vm05.stdout:8/97: rmdir d2/dd 39 2026-03-10T08:55:17.203 INFO:tasks.workunit.client.0.vm05.stdout:1/185: unlink f1 0 2026-03-10T08:55:17.203 INFO:tasks.workunit.client.0.vm05.stdout:1/186: dwrite dd/d10/d19/f35 [0,4194304] 0 2026-03-10T08:55:17.203 INFO:tasks.workunit.client.0.vm05.stdout:1/187: readlink dd/d21/d37/l3d 0 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.195+0000 7f2b4a410700 1 -- 192.168.123.105:0/3113351886 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44075a40 msgr2=0x7f2b44077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.195+0000 7f2b4a410700 1 --2- 192.168.123.105:0/3113351886 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44075a40 0x7f2b44077ed0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f2b3c00d3f0 tx=0x7f2b3c00d700 comp rx=0 tx=0).stop 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.195+0000 7f2b4a410700 1 -- 192.168.123.105:0/3113351886 shutdown_connections 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.195+0000 7f2b4a410700 1 --2- 192.168.123.105:0/3113351886 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44075a40 0x7f2b44077ed0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.195+0000 7f2b4a410700 1 --2- 192.168.123.105:0/3113351886 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b44072b50 0x7f2b44072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.195+0000 7f2b4a410700 1 -- 192.168.123.105:0/3113351886 >> 192.168.123.105:0/3113351886 conn(0x7f2b4406dae0 msgr2=0x7f2b4406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 -- 192.168.123.105:0/3113351886 shutdown_connections 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 -- 192.168.123.105:0/3113351886 wait complete. 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 Processor -- start 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 -- start start 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b44072b50 0x7f2b44082f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44083440 0x7f2b440838c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b4412e6a0 con 0x7f2b44072b50 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b4a410700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b4412e810 con 0x7f2b44083440 2026-03-10T08:55:17.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b437fe700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44083440 0x7f2b440838c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b437fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44083440 0x7f2b440838c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48624/0 (socket says 192.168.123.105:48624) 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b437fe700 1 -- 192.168.123.105:0/1370754065 learned_addr learned my addr 192.168.123.105:0/1370754065 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b437fe700 1 -- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b44072b50 msgr2=0x7f2b44082f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b437fe700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b44072b50 0x7f2b44082f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.196+0000 7f2b437fe700 1 -- 192.168.123.105:0/1370754065 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b3c007ed0 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.197+0000 7f2b437fe700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f2b44083440 0x7f2b440838c0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f2b3c003c60 tx=0x7f2b3c003d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.198+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b3c01c070 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.198+0000 7f2b4a410700 1 -- 192.168.123.105:0/1370754065 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2b4412ea90 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.198+0000 7f2b4a410700 1 -- 192.168.123.105:0/1370754065 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2b4412efe0 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.198+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2b3c00deb0 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.198+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b3c021c10 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.200+0000 7f2b4a410700 1 -- 192.168.123.105:0/1370754065 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2b30005320 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.200+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 
v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2b3c00f810 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.200+0000 7f2b417fa700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2b2c06c600 0x7f2b2c06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.200+0000 7f2b43fff700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2b2c06c600 0x7f2b2c06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.200+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f2b3c013070 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.201+0000 7f2b43fff700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2b2c06c600 0x7f2b2c06eac0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f2b34005950 tx=0x7f2b340058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:17.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.203+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2b3c05b5c0 con 0x7f2b44083440 2026-03-10T08:55:17.204 INFO:tasks.workunit.client.1.vm08.stdout:1/396: mkdir d1/da/de/d24/d3d/d40/d84 0 2026-03-10T08:55:17.212 INFO:tasks.workunit.client.0.vm05.stdout:3/144: unlink 
d9/f11 0 2026-03-10T08:55:17.213 INFO:tasks.workunit.client.1.vm08.stdout:6/431: dwrite d9/d10/f25 [0,4194304] 0 2026-03-10T08:55:17.214 INFO:tasks.workunit.client.1.vm08.stdout:6/432: chown d9/d50 3 1 2026-03-10T08:55:17.214 INFO:tasks.workunit.client.1.vm08.stdout:6/433: readlink d9/d10/d1e/l52 0 2026-03-10T08:55:17.214 INFO:tasks.workunit.client.1.vm08.stdout:4/421: dread - d5/d23/d36/f57 zero size 2026-03-10T08:55:17.218 INFO:tasks.workunit.client.1.vm08.stdout:4/422: dread d5/f6b [0,4194304] 0 2026-03-10T08:55:17.218 INFO:tasks.workunit.client.1.vm08.stdout:1/397: dread d1/da/d20/f2d [0,4194304] 0 2026-03-10T08:55:17.224 INFO:tasks.workunit.client.1.vm08.stdout:0/314: fsync d6/dd/d13/d17/d1f/d20/d2f/f59 0 2026-03-10T08:55:17.226 INFO:tasks.workunit.client.0.vm05.stdout:5/143: symlink d5/l2b 0 2026-03-10T08:55:17.239 INFO:tasks.workunit.client.1.vm08.stdout:2/394: rename d1/f48 to d1/da/d10/f7e 0 2026-03-10T08:55:17.239 INFO:tasks.workunit.client.1.vm08.stdout:2/395: write d1/d5b/d66/f62 [548581,18151] 0 2026-03-10T08:55:17.239 INFO:tasks.workunit.client.0.vm05.stdout:7/85: write f15 [1516242,103628] 0 2026-03-10T08:55:17.239 INFO:tasks.workunit.client.0.vm05.stdout:7/86: chown l13 9693576 1 2026-03-10T08:55:17.239 INFO:tasks.workunit.client.0.vm05.stdout:7/87: chown c14 3469 1 2026-03-10T08:55:17.240 INFO:tasks.workunit.client.0.vm05.stdout:8/98: creat d2/db/f22 x:0 0 0 2026-03-10T08:55:17.242 INFO:tasks.workunit.client.0.vm05.stdout:8/99: dread d2/fa [0,4194304] 0 2026-03-10T08:55:17.243 INFO:tasks.workunit.client.0.vm05.stdout:8/100: read d2/db/f1b [2887,97819] 0 2026-03-10T08:55:17.257 INFO:tasks.workunit.client.0.vm05.stdout:3/145: sync 2026-03-10T08:55:17.257 INFO:tasks.workunit.client.0.vm05.stdout:1/188: creat dd/d21/f48 x:0 0 0 2026-03-10T08:55:17.269 INFO:tasks.workunit.client.0.vm05.stdout:6/129: creat d4/d7/d10/d15/f2e x:0 0 0 2026-03-10T08:55:17.269 INFO:tasks.workunit.client.0.vm05.stdout:6/130: write d4/f21 [992059,11608] 0 
2026-03-10T08:55:17.298 INFO:tasks.workunit.client.0.vm05.stdout:8/101: symlink d2/db/l23 0 2026-03-10T08:55:17.298 INFO:tasks.workunit.client.1.vm08.stdout:8/464: write d1/d10/d9/dd/d18/d34/f57 [372495,55398] 0 2026-03-10T08:55:17.305 INFO:tasks.workunit.client.1.vm08.stdout:9/398: dread d2/dd/f16 [0,4194304] 0 2026-03-10T08:55:17.307 INFO:tasks.workunit.client.0.vm05.stdout:3/146: creat d9/f27 x:0 0 0 2026-03-10T08:55:17.310 INFO:tasks.workunit.client.0.vm05.stdout:3/147: dwrite d9/f23 [0,4194304] 0 2026-03-10T08:55:17.311 INFO:tasks.workunit.client.0.vm05.stdout:3/148: readlink d9/le 0 2026-03-10T08:55:17.323 INFO:tasks.workunit.client.1.vm08.stdout:3/364: rename d4/d15/d8/d2c/c69 to d4/d15/d8/d1d/c77 0 2026-03-10T08:55:17.325 INFO:tasks.workunit.client.0.vm05.stdout:7/88: rename lf to l16 0 2026-03-10T08:55:17.328 INFO:tasks.workunit.client.1.vm08.stdout:2/396: fdatasync d1/da/d10/d1b/d12/d1e/f1f 0 2026-03-10T08:55:17.330 INFO:tasks.workunit.client.1.vm08.stdout:8/465: write d1/d10/d9/f73 [1540483,69919] 0 2026-03-10T08:55:17.330 INFO:tasks.workunit.client.1.vm08.stdout:2/397: read d1/da/d10/d1b/f14 [1391731,3891] 0 2026-03-10T08:55:17.331 INFO:tasks.workunit.client.1.vm08.stdout:2/398: dread - d1/d43/f5d zero size 2026-03-10T08:55:17.331 INFO:tasks.workunit.client.0.vm05.stdout:8/102: symlink d2/db/l24 0 2026-03-10T08:55:17.334 INFO:tasks.workunit.client.0.vm05.stdout:1/189: unlink f2 0 2026-03-10T08:55:17.337 INFO:tasks.workunit.client.1.vm08.stdout:4/423: creat d5/d23/d36/f92 x:0 0 0 2026-03-10T08:55:17.340 INFO:tasks.workunit.client.1.vm08.stdout:0/315: creat d6/f62 x:0 0 0 2026-03-10T08:55:17.343 INFO:tasks.workunit.client.1.vm08.stdout:0/316: dwrite d6/dd/d13/d17/f1d [0,4194304] 0 2026-03-10T08:55:17.343 INFO:tasks.workunit.client.0.vm05.stdout:3/149: creat d9/f28 x:0 0 0 2026-03-10T08:55:17.360 INFO:tasks.workunit.client.1.vm08.stdout:5/393: rename d0/d40 to d0/d11/d27/d68/d7c 0 2026-03-10T08:55:17.365 INFO:tasks.workunit.client.1.vm08.stdout:5/394: read 
d0/d11/d18/f23 [1819845,101221] 0 2026-03-10T08:55:17.365 INFO:tasks.workunit.client.0.vm05.stdout:1/190: dread fc [0,4194304] 0 2026-03-10T08:55:17.366 INFO:tasks.workunit.client.0.vm05.stdout:1/191: dread - dd/d21/d37/d45/f47 zero size 2026-03-10T08:55:17.370 INFO:tasks.workunit.client.0.vm05.stdout:9/93: link d6/f16 d6/d12/f1c 0 2026-03-10T08:55:17.375 INFO:tasks.workunit.client.0.vm05.stdout:7/89: chown cb 769705534 1 2026-03-10T08:55:17.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.375+0000 7f2b4a410700 1 -- 192.168.123.105:0/1370754065 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f2b30006200 con 0x7f2b44083440 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.376+0000 7f2b417fa700 1 -- 192.168.123.105:0/1370754065 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f2b3c05b150 con 0x7f2b44083440 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:17.376 
INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:55:17.376 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:55:17.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 -- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2b2c06c600 msgr2=0x7f2b2c06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2b2c06c600 0x7f2b2c06eac0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f2b34005950 tx=0x7f2b340058e0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 -- 192.168.123.105:0/1370754065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44083440 msgr2=0x7f2b440838c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44083440 0x7f2b440838c0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f2b3c003c60 tx=0x7f2b3c003d40 comp rx=0 tx=0).stop 2026-03-10T08:55:17.379 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 -- 192.168.123.105:0/1370754065 shutdown_connections 2026-03-10T08:55:17.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2b2c06c600 0x7f2b2c06eac0 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b44072b50 0x7f2b44082f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 --2- 192.168.123.105:0/1370754065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b44083440 0x7f2b440838c0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.378+0000 7f2b2affd700 1 -- 192.168.123.105:0/1370754065 >> 192.168.123.105:0/1370754065 conn(0x7f2b4406dae0 msgr2=0x7f2b4406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:17.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.379+0000 7f2b2affd700 1 -- 192.168.123.105:0/1370754065 shutdown_connections 2026-03-10T08:55:17.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.379+0000 7f2b2affd700 1 -- 192.168.123.105:0/1370754065 wait complete. 
2026-03-10T08:55:17.382 INFO:tasks.workunit.client.0.vm05.stdout:2/79: truncate d0/fa 56203 0 2026-03-10T08:55:17.399 INFO:tasks.workunit.client.0.vm05.stdout:0/120: write df/d18/d19/f1c [4379926,117005] 0 2026-03-10T08:55:17.410 INFO:tasks.workunit.client.0.vm05.stdout:3/150: readlink d9/l24 0 2026-03-10T08:55:17.411 INFO:tasks.workunit.client.0.vm05.stdout:3/151: chown d9/l18 24988106 1 2026-03-10T08:55:17.411 INFO:tasks.workunit.client.0.vm05.stdout:9/94: mknod d6/d19/c1d 0 2026-03-10T08:55:17.411 INFO:tasks.workunit.client.0.vm05.stdout:4/135: fsync d0/fe 0 2026-03-10T08:55:17.411 INFO:tasks.workunit.client.0.vm05.stdout:3/152: stat f1 0 2026-03-10T08:55:17.412 INFO:tasks.workunit.client.0.vm05.stdout:8/103: rename d2/c3 to d2/db/d1f/c25 0 2026-03-10T08:55:17.416 INFO:tasks.workunit.client.0.vm05.stdout:3/153: dread d9/f19 [0,4194304] 0 2026-03-10T08:55:17.416 INFO:tasks.workunit.client.0.vm05.stdout:8/104: dwrite d2/db/f19 [0,4194304] 0 2026-03-10T08:55:17.418 INFO:tasks.workunit.client.0.vm05.stdout:8/105: stat d2/c17 0 2026-03-10T08:55:17.427 INFO:tasks.workunit.client.0.vm05.stdout:4/136: symlink d0/l2a 0 2026-03-10T08:55:17.428 INFO:tasks.workunit.client.0.vm05.stdout:2/80: rename d0/ff to d0/d9/f12 0 2026-03-10T08:55:17.429 INFO:tasks.workunit.client.0.vm05.stdout:9/95: dread d6/f7 [0,4194304] 0 2026-03-10T08:55:17.429 INFO:tasks.workunit.client.0.vm05.stdout:0/121: mkdir df/d1f/d26/d27 0 2026-03-10T08:55:17.430 INFO:tasks.workunit.client.0.vm05.stdout:0/122: write f5 [4474910,30713] 0 2026-03-10T08:55:17.432 INFO:tasks.workunit.client.0.vm05.stdout:0/123: write df/f1a [4263635,26231] 0 2026-03-10T08:55:17.432 INFO:tasks.workunit.client.0.vm05.stdout:4/137: fsync d0/fb 0 2026-03-10T08:55:17.447 INFO:tasks.workunit.client.0.vm05.stdout:1/192: dread dd/d10/d19/f1f [0,4194304] 0 2026-03-10T08:55:17.450 INFO:tasks.workunit.client.1.vm08.stdout:4/424: creat d5/d23/d49/d83/f93 x:0 0 0 2026-03-10T08:55:17.450 INFO:tasks.workunit.client.1.vm08.stdout:0/317: 
symlink d6/dd/d13/d17/d1f/d20/d2f/d57/l63 0 2026-03-10T08:55:17.451 INFO:tasks.workunit.client.1.vm08.stdout:0/318: chown d6/dd/d13/d32/l4f 1025043449 1 2026-03-10T08:55:17.451 INFO:tasks.workunit.client.1.vm08.stdout:7/374: truncate d0/d14/f12 2272289 0 2026-03-10T08:55:17.452 INFO:tasks.workunit.client.1.vm08.stdout:7/375: dread - d0/d11/d1f/d2c/f6c zero size 2026-03-10T08:55:17.453 INFO:tasks.workunit.client.1.vm08.stdout:8/466: mkdir d1/da8 0 2026-03-10T08:55:17.454 INFO:tasks.workunit.client.1.vm08.stdout:7/376: write d0/d14/d43/f6e [519338,125819] 0 2026-03-10T08:55:17.455 INFO:tasks.workunit.client.1.vm08.stdout:6/434: write d9/d13/d4e/f57 [1996781,93655] 0 2026-03-10T08:55:17.461 INFO:tasks.workunit.client.1.vm08.stdout:9/399: dwrite d2/dd/f18 [4194304,4194304] 0 2026-03-10T08:55:17.462 INFO:tasks.workunit.client.1.vm08.stdout:8/467: dwrite d1/d10/d9/dd/d13/f6a [0,4194304] 0 2026-03-10T08:55:17.470 INFO:tasks.workunit.client.0.vm05.stdout:6/131: write d4/fc [3356781,122549] 0 2026-03-10T08:55:17.470 INFO:tasks.workunit.client.0.vm05.stdout:6/132: readlink d4/d7/l2b 0 2026-03-10T08:55:17.472 INFO:tasks.workunit.client.0.vm05.stdout:2/81: write d0/f4 [1729253,70608] 0 2026-03-10T08:55:17.481 INFO:tasks.workunit.client.0.vm05.stdout:5/144: mkdir d5/df/d12/d24/d2c 0 2026-03-10T08:55:17.483 INFO:tasks.workunit.client.1.vm08.stdout:0/319: read - d6/dd/d13/d17/d1f/d20/f43 zero size 2026-03-10T08:55:17.491 INFO:tasks.workunit.client.1.vm08.stdout:3/365: creat d4/d15/f78 x:0 0 0 2026-03-10T08:55:17.492 INFO:tasks.workunit.client.1.vm08.stdout:3/366: truncate d4/d15/d8/d2c/d55/f61 632039 0 2026-03-10T08:55:17.493 INFO:tasks.workunit.client.0.vm05.stdout:8/106: creat d2/dd/f26 x:0 0 0 2026-03-10T08:55:17.494 INFO:tasks.workunit.client.1.vm08.stdout:2/399: link d1/da/d10/d42/f79 d1/d43/f7f 0 2026-03-10T08:55:17.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.495+0000 7f3bce5d6700 1 -- 192.168.123.105:0/2795679613 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 msgr2=0x7f3bc81044e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.495+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/2795679613 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 0x7f3bc81044e0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f3bb8009b00 tx=0x7f3bb8009e10 comp rx=0 tx=0).stop 2026-03-10T08:55:17.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.496+0000 7f3bce5d6700 1 -- 192.168.123.105:0/2795679613 shutdown_connections 2026-03-10T08:55:17.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.496+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/2795679613 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 0x7f3bc81044e0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.496+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/2795679613 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc8103290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.496+0000 7f3bce5d6700 1 -- 192.168.123.105:0/2795679613 >> 192.168.123.105:0/2795679613 conn(0x7f3bc80fe440 msgr2=0x7f3bc81008a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:17.497 INFO:tasks.workunit.client.1.vm08.stdout:9/400: truncate d2/d41/d4c/f7c 318632 0 2026-03-10T08:55:17.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.496+0000 7f3bce5d6700 1 -- 192.168.123.105:0/2795679613 shutdown_connections 2026-03-10T08:55:17.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.496+0000 7f3bce5d6700 1 -- 192.168.123.105:0/2795679613 wait complete. 
2026-03-10T08:55:17.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.497+0000 7f3bce5d6700 1 Processor -- start 2026-03-10T08:55:17.497 INFO:tasks.workunit.client.0.vm05.stdout:4/138: rmdir d0/d15 39 2026-03-10T08:55:17.498 INFO:tasks.workunit.client.1.vm08.stdout:9/401: read d2/f77 [259384,129694] 0 2026-03-10T08:55:17.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.498+0000 7f3bce5d6700 1 -- start start 2026-03-10T08:55:17.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.498+0000 7f3bce5d6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc81987b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.498+0000 7f3bce5d6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 0x7f3bc8198cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.498+0000 7f3bce5d6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3bc8199310 con 0x7f3bc8102e70 2026-03-10T08:55:17.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.498+0000 7f3bce5d6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3bc8199450 con 0x7f3bc8104060 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.498+0000 7f3bcd5d4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc81987b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bcd5d4700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc81987b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46622/0 (socket says 192.168.123.105:46622) 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bcd5d4700 1 -- 192.168.123.105:0/3152795561 learned_addr learned my addr 192.168.123.105:0/3152795561 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bccdd3700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 0x7f3bc8198cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bcd5d4700 1 -- 192.168.123.105:0/3152795561 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 msgr2=0x7f3bc8198cf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bcd5d4700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 0x7f3bc8198cf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bcd5d4700 1 -- 192.168.123.105:0/3152795561 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3bb80097e0 con 0x7f3bc8102e70 2026-03-10T08:55:17.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.499+0000 7f3bcd5d4700 1 --2- 192.168.123.105:0/3152795561 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc81987b0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f3bc400d900 tx=0x7f3bc400dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:17.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.500+0000 7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3bc40041d0 con 0x7f3bc8102e70 2026-03-10T08:55:17.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.500+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3bc819df00 con 0x7f3bc8102e70 2026-03-10T08:55:17.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.500+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3bc819e3c0 con 0x7f3bc8102e70 2026-03-10T08:55:17.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.500+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3bc804ea90 con 0x7f3bc8102e70 2026-03-10T08:55:17.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.501+0000 7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3bc4020030 con 0x7f3bc8102e70 2026-03-10T08:55:17.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.501+0000 7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3bc4003bf0 con 0x7f3bc8102e70 2026-03-10T08:55:17.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.503+0000 
7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3bc4003d50 con 0x7f3bc8102e70 2026-03-10T08:55:17.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.503+0000 7f3bbe7fc700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3bb406c4e0 0x7f3bb406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.503+0000 7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f3bc408b7f0 con 0x7f3bc8102e70 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.1.vm08.stdout:1/398: rename d1/da/de/c16 to d1/da/d20/c85 0 2026-03-10T08:55:17.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.503+0000 7f3bccdd3700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3bb406c4e0 0x7f3bb406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.504+0000 7f3bccdd3700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3bb406c4e0 0x7f3bb406e9a0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f3bb80052d0 tx=0x7f3bb800b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:17.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.505+0000 7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3bc4059b20 con 0x7f3bc8102e70 
2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:2/82: symlink d0/d9/l13 0 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:2/83: fsync d0/fb 0 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:2/84: write d0/f4 [1220575,41811] 0 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:2/85: truncate d0/d9/f12 2032384 0 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:2/86: write d0/fb [3928303,65089] 0 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:2/87: chown d0/d9/ce 877350 1 2026-03-10T08:55:17.515 INFO:tasks.workunit.client.0.vm05.stdout:1/193: rename dd/d10/d18/d2d/c3b to dd/d10/d19/c49 0 2026-03-10T08:55:17.519 INFO:tasks.workunit.client.1.vm08.stdout:5/395: link d0/d11/d18/f5a d0/d11/d18/d52/f7d 0 2026-03-10T08:55:17.520 INFO:tasks.workunit.client.1.vm08.stdout:2/400: creat d1/d5b/f80 x:0 0 0 2026-03-10T08:55:17.520 INFO:tasks.workunit.client.1.vm08.stdout:6/435: mknod d9/dc/c96 0 2026-03-10T08:55:17.522 INFO:tasks.workunit.client.0.vm05.stdout:8/107: symlink d2/dd/l27 0 2026-03-10T08:55:17.532 INFO:tasks.workunit.client.1.vm08.stdout:0/320: sync 2026-03-10T08:55:17.532 INFO:tasks.workunit.client.1.vm08.stdout:2/401: sync 2026-03-10T08:55:17.537 INFO:tasks.workunit.client.1.vm08.stdout:1/399: mkdir d1/da/de/d24/d26/d86 0 2026-03-10T08:55:17.542 INFO:tasks.workunit.client.0.vm05.stdout:7/90: write f4 [2759213,7598] 0 2026-03-10T08:55:17.551 INFO:tasks.workunit.client.1.vm08.stdout:7/377: creat d0/d14/f72 x:0 0 0 2026-03-10T08:55:17.551 INFO:tasks.workunit.client.0.vm05.stdout:3/154: write d9/f19 [1783835,69614] 0 2026-03-10T08:55:17.552 INFO:tasks.workunit.client.0.vm05.stdout:3/155: dread - d9/f26 zero size 2026-03-10T08:55:17.552 INFO:tasks.workunit.client.0.vm05.stdout:3/156: dread - d9/f27 zero size 2026-03-10T08:55:17.552 INFO:tasks.workunit.client.0.vm05.stdout:3/157: fdatasync d9/f27 0 2026-03-10T08:55:17.556 INFO:tasks.workunit.client.0.vm05.stdout:3/158: dread 
d9/f20 [0,4194304] 0 2026-03-10T08:55:17.559 INFO:tasks.workunit.client.1.vm08.stdout:4/425: dwrite d5/f1e [0,4194304] 0 2026-03-10T08:55:17.562 INFO:tasks.workunit.client.0.vm05.stdout:0/124: dwrite df/f12 [0,4194304] 0 2026-03-10T08:55:17.569 INFO:tasks.workunit.client.0.vm05.stdout:6/133: dwrite d4/d7/ff [4194304,4194304] 0 2026-03-10T08:55:17.571 INFO:tasks.workunit.client.0.vm05.stdout:6/134: write d4/d7/f14 [4479994,37050] 0 2026-03-10T08:55:17.585 INFO:tasks.workunit.client.1.vm08.stdout:8/468: dwrite d1/d10/f23 [0,4194304] 0 2026-03-10T08:55:17.597 INFO:tasks.workunit.client.0.vm05.stdout:2/88: symlink d0/d9/l14 0 2026-03-10T08:55:17.597 INFO:tasks.workunit.client.0.vm05.stdout:2/89: stat d0/d9/ce 0 2026-03-10T08:55:17.603 INFO:tasks.workunit.client.1.vm08.stdout:6/436: creat d9/dc/d11/d23/d2c/f97 x:0 0 0 2026-03-10T08:55:17.604 INFO:tasks.workunit.client.0.vm05.stdout:9/96: creat d6/f1e x:0 0 0 2026-03-10T08:55:17.607 INFO:tasks.workunit.client.0.vm05.stdout:8/108: rename d2/db/d1f/d21 to d2/db/d28 0 2026-03-10T08:55:17.608 INFO:tasks.workunit.client.1.vm08.stdout:2/402: fsync d1/da/d10/d1b/d12/d23/f31 0 2026-03-10T08:55:17.612 INFO:tasks.workunit.client.1.vm08.stdout:0/321: mknod d6/dd/d13/d17/d1f/d20/d2f/d57/c64 0 2026-03-10T08:55:17.614 INFO:tasks.workunit.client.1.vm08.stdout:0/322: dread - d6/f62 zero size 2026-03-10T08:55:17.614 INFO:tasks.workunit.client.1.vm08.stdout:0/323: chown d6/dd/d13/d17 289004307 1 2026-03-10T08:55:17.615 INFO:tasks.workunit.client.0.vm05.stdout:7/91: mknod c17 0 2026-03-10T08:55:17.615 INFO:tasks.workunit.client.0.vm05.stdout:7/92: write f9 [1087931,34719] 0 2026-03-10T08:55:17.615 INFO:tasks.workunit.client.0.vm05.stdout:7/93: stat f15 0 2026-03-10T08:55:17.631 INFO:tasks.workunit.client.0.vm05.stdout:3/159: write f1 [174344,105969] 0 2026-03-10T08:55:17.636 INFO:tasks.workunit.client.1.vm08.stdout:9/402: dwrite d2/dd/d15/f17 [0,4194304] 0 2026-03-10T08:55:17.636 INFO:tasks.workunit.client.1.vm08.stdout:9/403: write 
d2/dd/d15/d1e/d39/d4e/f78 [435438,12314] 0 2026-03-10T08:55:17.637 INFO:tasks.workunit.client.0.vm05.stdout:3/160: readlink d9/l24 0 2026-03-10T08:55:17.637 INFO:tasks.workunit.client.0.vm05.stdout:3/161: write f7 [2883190,125737] 0 2026-03-10T08:55:17.637 INFO:tasks.workunit.client.0.vm05.stdout:0/125: sync 2026-03-10T08:55:17.637 INFO:tasks.workunit.client.1.vm08.stdout:6/437: sync 2026-03-10T08:55:17.643 INFO:tasks.workunit.client.1.vm08.stdout:1/400: read d1/f8 [6253089,111770] 0 2026-03-10T08:55:17.644 INFO:tasks.workunit.client.1.vm08.stdout:1/401: dread - d1/da/d20/f67 zero size 2026-03-10T08:55:17.644 INFO:tasks.workunit.client.1.vm08.stdout:1/402: fdatasync d1/da/f25 0 2026-03-10T08:55:17.645 INFO:tasks.workunit.client.1.vm08.stdout:7/378: rename d0/d11/d1f/d29/d36/c64 to d0/d14/d43/c73 0 2026-03-10T08:55:17.645 INFO:tasks.workunit.client.1.vm08.stdout:7/379: readlink d0/d11/d1f/d29/d3b/l56 0 2026-03-10T08:55:17.658 INFO:tasks.workunit.client.1.vm08.stdout:4/426: unlink d5/de/f1f 0 2026-03-10T08:55:17.662 INFO:tasks.workunit.client.1.vm08.stdout:4/427: sync 2026-03-10T08:55:17.670 INFO:tasks.workunit.client.1.vm08.stdout:2/403: readlink d1/da/d10/d1b/l30 0 2026-03-10T08:55:17.670 INFO:tasks.workunit.client.1.vm08.stdout:2/404: dread - d1/d43/f5d zero size 2026-03-10T08:55:17.673 INFO:tasks.workunit.client.0.vm05.stdout:2/90: rename d0/d9/l14 to d0/l15 0 2026-03-10T08:55:17.675 INFO:tasks.workunit.client.0.vm05.stdout:4/139: symlink d0/d15/d25/l2b 0 2026-03-10T08:55:17.677 INFO:tasks.workunit.client.0.vm05.stdout:4/140: dread d0/fb [0,4194304] 0 2026-03-10T08:55:17.681 INFO:tasks.workunit.client.0.vm05.stdout:7/94: unlink f10 0 2026-03-10T08:55:17.681 INFO:tasks.workunit.client.0.vm05.stdout:7/95: truncate fd 243245 0 2026-03-10T08:55:17.681 INFO:tasks.workunit.client.0.vm05.stdout:7/96: chown c5 14459834 1 2026-03-10T08:55:17.682 INFO:tasks.workunit.client.0.vm05.stdout:7/97: fsync f4 0 2026-03-10T08:55:17.683 
INFO:tasks.workunit.client.0.vm05.stdout:1/194: stat dd/d10/d19/f35 0 2026-03-10T08:55:17.685 INFO:tasks.workunit.client.1.vm08.stdout:1/403: mknod d1/da/d20/d4c/c87 0 2026-03-10T08:55:17.688 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:17 vm05.local ceph-mon[49713]: from='client.14654 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:17.688 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:17 vm05.local ceph-mon[49713]: from='client.24435 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:17.688 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:17 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/1370754065' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:55:17.689 INFO:tasks.workunit.client.1.vm08.stdout:8/469: rename d1/d10/d9/dd/d25/d27/d44/d21/f66 to d1/d10/d9/dd/d18/d3c/fa9 0 2026-03-10T08:55:17.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.688+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3bc819e6c0 con 0x7f3bc8102e70 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.691+0000 7f3bbe7fc700 1 -- 192.168.123.105:0/3152795561 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 0) 0x7f3bc40596b0 con 0x7f3bc8102e70 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses 
versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:55:17.697 
INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:17.697 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr 
[v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:17.697 INFO:tasks.workunit.client.1.vm08.stdout:7/380: creat d0/d11/d1f/d29/d3d/f74 x:0 0 0 2026-03-10T08:55:17.697 INFO:tasks.workunit.client.1.vm08.stdout:4/428: creat d5/d23/d49/f94 x:0 0 0 2026-03-10T08:55:17.697 INFO:tasks.workunit.client.1.vm08.stdout:4/429: write d5/d2f/f84 [188347,34847] 0 2026-03-10T08:55:17.698 INFO:tasks.workunit.client.0.vm05.stdout:3/162: creat d9/f29 x:0 0 0 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.694+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3bb406c4e0 msgr2=0x7f3bb406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.694+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3bb406c4e0 0x7f3bb406e9a0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f3bb80052d0 tx=0x7f3bb800b540 comp rx=0 tx=0).stop 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.694+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 msgr2=0x7f3bc81987b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.694+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc81987b0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f3bc400d900 tx=0x7f3bc400dc10 comp rx=0 tx=0).stop 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 shutdown_connections 2026-03-10T08:55:17.698 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3bb406c4e0 0x7f3bb406e9a0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3bc8102e70 0x7f3bc81987b0 unknown :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 --2- 192.168.123.105:0/3152795561 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3bc8104060 0x7f3bc8198cf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 >> 192.168.123.105:0/3152795561 conn(0x7f3bc80fe440 msgr2=0x7f3bc8107320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 shutdown_connections 2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.695+0000 7f3bce5d6700 1 -- 192.168.123.105:0/3152795561 wait complete. 
2026-03-10T08:55:17.698 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:55:17.698 INFO:tasks.workunit.client.1.vm08.stdout:4/430: chown d5/d23/c34 31530 1 2026-03-10T08:55:17.698 INFO:tasks.workunit.client.0.vm05.stdout:0/126: dread df/f15 [0,4194304] 0 2026-03-10T08:55:17.699 INFO:tasks.workunit.client.0.vm05.stdout:0/127: rename df/d1f to df/d1f/d26/d27/d28 22 2026-03-10T08:55:17.703 INFO:tasks.workunit.client.0.vm05.stdout:0/128: read fe [1502494,39730] 0 2026-03-10T08:55:17.705 INFO:tasks.workunit.client.0.vm05.stdout:0/129: dread f5 [4194304,4194304] 0 2026-03-10T08:55:17.708 INFO:tasks.workunit.client.0.vm05.stdout:0/130: dwrite df/d18/d19/f1c [0,4194304] 0 2026-03-10T08:55:17.710 INFO:tasks.workunit.client.0.vm05.stdout:0/131: read df/d1f/f21 [739279,74652] 0 2026-03-10T08:55:17.712 INFO:tasks.workunit.client.1.vm08.stdout:5/396: write d0/ff [3190683,13086] 0 2026-03-10T08:55:17.713 INFO:tasks.workunit.client.0.vm05.stdout:6/135: creat d4/d2d/f2f x:0 0 0 2026-03-10T08:55:17.715 INFO:tasks.workunit.client.1.vm08.stdout:5/397: read d0/d11/d27/d68/d7c/d4b/d4e/f71 [3717356,38888] 0 2026-03-10T08:55:17.715 INFO:tasks.workunit.client.1.vm08.stdout:5/398: stat d0/d11/d27/l6b 0 2026-03-10T08:55:17.717 INFO:tasks.workunit.client.0.vm05.stdout:6/136: dwrite d4/f21 [0,4194304] 0 2026-03-10T08:55:17.718 INFO:tasks.workunit.client.0.vm05.stdout:6/137: truncate d4/d7/d10/d15/f2e 34807 0 2026-03-10T08:55:17.724 INFO:tasks.workunit.client.0.vm05.stdout:6/138: fdatasync d4/d7/ff 0 2026-03-10T08:55:17.730 INFO:tasks.workunit.client.1.vm08.stdout:3/367: truncate d4/d15/d17/f5c 1128391 0 2026-03-10T08:55:17.737 INFO:tasks.workunit.client.1.vm08.stdout:9/404: symlink d2/dd/d15/l7e 0 2026-03-10T08:55:17.747 INFO:tasks.workunit.client.1.vm08.stdout:0/324: write d6/dd/d13/d17/f29 [1020940,98885] 0 2026-03-10T08:55:17.747 INFO:tasks.workunit.client.1.vm08.stdout:6/438: symlink d9/dc/d11/d23/l98 0 2026-03-10T08:55:17.747 
INFO:tasks.workunit.client.1.vm08.stdout:8/470: write d1/d10/d9/dd/d13/f24 [72882,102512] 0 2026-03-10T08:55:17.747 INFO:tasks.workunit.client.1.vm08.stdout:2/405: dwrite d1/d43/f7f [0,4194304] 0 2026-03-10T08:55:17.747 INFO:tasks.workunit.client.1.vm08.stdout:7/381: unlink d0/d11/d1f/d29/d3b/f65 0 2026-03-10T08:55:17.750 INFO:tasks.workunit.client.1.vm08.stdout:2/406: dwrite d1/d43/f5d [0,4194304] 0 2026-03-10T08:55:17.750 INFO:tasks.workunit.client.0.vm05.stdout:8/109: mknod d2/c29 0 2026-03-10T08:55:17.755 INFO:tasks.workunit.client.0.vm05.stdout:2/91: creat d0/f16 x:0 0 0 2026-03-10T08:55:17.755 INFO:tasks.workunit.client.0.vm05.stdout:8/110: chown d2/dd/f26 441071569 1 2026-03-10T08:55:17.758 INFO:tasks.workunit.client.0.vm05.stdout:1/195: mkdir dd/d21/d3f/d4a 0 2026-03-10T08:55:17.765 INFO:tasks.workunit.client.0.vm05.stdout:3/163: mknod d9/c2a 0 2026-03-10T08:55:17.765 INFO:tasks.workunit.client.1.vm08.stdout:5/399: dread - d0/d11/d18/d52/f57 zero size 2026-03-10T08:55:17.768 INFO:tasks.workunit.client.1.vm08.stdout:5/400: dwrite d0/d11/d27/f3b [0,4194304] 0 2026-03-10T08:55:17.769 INFO:tasks.workunit.client.1.vm08.stdout:5/401: readlink d0/d11/d27/d68/l6d 0 2026-03-10T08:55:17.772 INFO:tasks.workunit.client.1.vm08.stdout:9/405: creat d2/dd/d15/d1e/d25/d32/d5c/f7f x:0 0 0 2026-03-10T08:55:17.783 INFO:tasks.workunit.client.1.vm08.stdout:7/382: mkdir d0/d11/d1f/d29/d36/d75 0 2026-03-10T08:55:17.785 INFO:tasks.workunit.client.1.vm08.stdout:2/407: rename d1/da/d10/d1b/d12/d23/l68 to d1/da/d10/d1b/d12/d1e/l81 0 2026-03-10T08:55:17.785 INFO:tasks.workunit.client.1.vm08.stdout:2/408: chown d1/da/d10/d1b 304857 1 2026-03-10T08:55:17.786 INFO:tasks.workunit.client.1.vm08.stdout:2/409: chown d1/da/d10/d2d/f67 5820 1 2026-03-10T08:55:17.787 INFO:tasks.workunit.client.0.vm05.stdout:4/141: mkdir d0/d2c 0 2026-03-10T08:55:17.787 INFO:tasks.workunit.client.0.vm05.stdout:4/142: dread - d0/d1f/f26 zero size 2026-03-10T08:55:17.787 
INFO:tasks.workunit.client.0.vm05.stdout:4/143: chown d0/c1b 3 1 2026-03-10T08:55:17.788 INFO:tasks.workunit.client.0.vm05.stdout:4/144: fdatasync d0/d1d/f24 0 2026-03-10T08:55:17.791 INFO:tasks.workunit.client.1.vm08.stdout:7/383: dwrite d0/d11/d1f/d29/d3d/f74 [0,4194304] 0 2026-03-10T08:55:17.795 INFO:tasks.workunit.client.0.vm05.stdout:2/92: creat d0/d9/f17 x:0 0 0 2026-03-10T08:55:17.797 INFO:tasks.workunit.client.0.vm05.stdout:8/111: unlink d2/l4 0 2026-03-10T08:55:17.800 INFO:tasks.workunit.client.0.vm05.stdout:1/196: write fa [221829,37615] 0 2026-03-10T08:55:17.807 INFO:tasks.workunit.client.0.vm05.stdout:3/164: mkdir d9/d2b 0 2026-03-10T08:55:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.814+0000 7fb9a5a23700 1 -- 192.168.123.105:0/2264356119 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 msgr2=0x7fb9a00ff9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.814+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/2264356119 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a00ff9d0 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7fb990009b00 tx=0x7fb990009e10 comp rx=0 tx=0).stop 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.814+0000 7fb9a5a23700 1 -- 192.168.123.105:0/2264356119 shutdown_connections 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.814+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/2264356119 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb9a00fff10 0x7fb9a0100390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.814+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/2264356119 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a00ff9d0 
unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.814+0000 7fb9a5a23700 1 -- 192.168.123.105:0/2264356119 >> 192.168.123.105:0/2264356119 conn(0x7fb9a00fb110 msgr2=0x7fb9a00fd590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.815+0000 7fb9a5a23700 1 -- 192.168.123.105:0/2264356119 shutdown_connections 2026-03-10T08:55:17.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.815+0000 7fb9a5a23700 1 -- 192.168.123.105:0/2264356119 wait complete. 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.815+0000 7fb9a5a23700 1 Processor -- start 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.815+0000 7fb9a5a23700 1 -- start start 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb9a5a23700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a01985b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a01985b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb9a5a23700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb9a00fff10 0x7fb9a0198af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99effd700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a01985b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46634/0 (socket says 192.168.123.105:46634) 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99effd700 1 -- 192.168.123.105:0/751789696 learned_addr learned my addr 192.168.123.105:0/751789696 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9a0199110 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9a0199250 con 0x7fb9a00fff10 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99e7fc700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb9a00fff10 0x7fb9a0198af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99effd700 1 -- 192.168.123.105:0/751789696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb9a00fff10 msgr2=0x7fb9a0198af0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99effd700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb9a00fff10 0x7fb9a0198af0 unknown :-1 s=AUTH_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.816+0000 7fb99effd700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9900097e0 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.817+0000 7fb99effd700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a01985b0 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7fb99000b5c0 tx=0x7fb990004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.817+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb99001d070 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.817+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9a019dca0 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.818+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9a019e190 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.819+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb99000bcd0 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.819+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mon.0 v2:192.168.123.105:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb99000f9f0 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.819+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb99000fc10 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.819+0000 7fb9a4a21700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb98806c600 0x7fb98806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.819+0000 7fb99e7fc700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb98806c600 0x7fb98806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.819+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fb990022470 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.820+0000 7fb99e7fc700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb98806c600 0x7fb98806eac0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fb994005950 tx=0x7fb99400b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.820+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fb98c005320 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:17.823+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb990093050 con 0x7fb9a00ff5b0 2026-03-10T08:55:17.826 INFO:tasks.workunit.client.0.vm05.stdout:0/132: truncate df/d18/f24 4430739 0 2026-03-10T08:55:17.843 INFO:tasks.workunit.client.0.vm05.stdout:9/97: getdents d6/d19 0 2026-03-10T08:55:17.846 INFO:tasks.workunit.client.1.vm08.stdout:2/410: mknod d1/da/d10/d42/c82 0 2026-03-10T08:55:17.847 INFO:tasks.workunit.client.0.vm05.stdout:4/145: write d0/f16 [5194711,90812] 0 2026-03-10T08:55:17.851 INFO:tasks.workunit.client.0.vm05.stdout:2/93: creat d0/f18 x:0 0 0 2026-03-10T08:55:17.854 INFO:tasks.workunit.client.1.vm08.stdout:5/402: unlink d0/d1b/l30 0 2026-03-10T08:55:17.855 INFO:tasks.workunit.client.0.vm05.stdout:1/197: chown dd/d10/c28 5142 1 2026-03-10T08:55:17.856 INFO:tasks.workunit.client.0.vm05.stdout:1/198: write dd/d21/f48 [315120,29466] 0 2026-03-10T08:55:17.857 INFO:tasks.workunit.client.0.vm05.stdout:1/199: write dd/d10/d19/f35 [5913212,13926] 0 2026-03-10T08:55:17.860 INFO:tasks.workunit.client.0.vm05.stdout:1/200: dwrite dd/d10/d18/d20/f34 [0,4194304] 0 2026-03-10T08:55:17.862 INFO:tasks.workunit.client.0.vm05.stdout:3/165: sync 2026-03-10T08:55:17.862 INFO:tasks.workunit.client.0.vm05.stdout:0/133: sync 2026-03-10T08:55:17.863 INFO:tasks.workunit.client.0.vm05.stdout:0/134: write df/f12 [2942440,94398] 0 2026-03-10T08:55:17.865 INFO:tasks.workunit.client.0.vm05.stdout:0/135: readlink df/d18/d19/l22 0 2026-03-10T08:55:17.879 INFO:tasks.workunit.client.0.vm05.stdout:9/98: symlink d6/d19/l1f 0 2026-03-10T08:55:17.888 INFO:tasks.workunit.client.1.vm08.stdout:4/431: dwrite d5/f1d [0,4194304] 0 2026-03-10T08:55:17.892 INFO:tasks.workunit.client.0.vm05.stdout:5/145: dwrite d5/df/d12/f1b 
[0,4194304] 0 2026-03-10T08:55:17.895 INFO:tasks.workunit.client.0.vm05.stdout:6/139: dwrite d4/d7/d10/d15/f16 [4194304,4194304] 0 2026-03-10T08:55:17.898 INFO:tasks.workunit.client.0.vm05.stdout:6/140: truncate d4/d7/d10/d1a/f1e 437239 0 2026-03-10T08:55:17.900 INFO:tasks.workunit.client.0.vm05.stdout:5/146: sync 2026-03-10T08:55:17.908 INFO:tasks.workunit.client.1.vm08.stdout:3/368: dwrite d4/d15/f4b [4194304,4194304] 0 2026-03-10T08:55:17.911 INFO:tasks.workunit.client.1.vm08.stdout:6/439: link d9/dc/d11/d23/f40 d9/d50/d95/f99 0 2026-03-10T08:55:17.912 INFO:tasks.workunit.client.0.vm05.stdout:2/94: creat d0/d9/f19 x:0 0 0 2026-03-10T08:55:17.912 INFO:tasks.workunit.client.0.vm05.stdout:2/95: fdatasync d0/f2 0 2026-03-10T08:55:17.918 INFO:tasks.workunit.client.1.vm08.stdout:2/411: rmdir d1/da/d10/d2d 39 2026-03-10T08:55:17.933 INFO:tasks.workunit.client.1.vm08.stdout:7/384: link d0/d11/d4a/f53 d0/d11/d1f/d29/d3d/f76 0 2026-03-10T08:55:17.943 INFO:tasks.workunit.client.1.vm08.stdout:5/403: readlink d0/d11/d18/l24 0 2026-03-10T08:55:17.944 INFO:tasks.workunit.client.1.vm08.stdout:8/471: truncate d1/d10/d9/f73 3723411 0 2026-03-10T08:55:17.951 INFO:tasks.workunit.client.1.vm08.stdout:9/406: link d2/dd/d15/d1e/d21/f2d d2/d41/d4c/f80 0 2026-03-10T08:55:17.957 INFO:tasks.workunit.client.0.vm05.stdout:1/201: rename dd/d13/c3c to dd/d21/d3f/d4a/c4b 0 2026-03-10T08:55:17.965 INFO:tasks.workunit.client.1.vm08.stdout:0/325: getdents d6/dd/d13/d32 0 2026-03-10T08:55:17.965 INFO:tasks.workunit.client.1.vm08.stdout:0/326: dread d6/f16 [0,4194304] 0 2026-03-10T08:55:17.965 INFO:tasks.workunit.client.1.vm08.stdout:1/404: getdents d1/da/de/d24/d3d/d40/d56/d6b 0 2026-03-10T08:55:17.971 INFO:tasks.workunit.client.0.vm05.stdout:9/99: write d6/f8 [769814,85828] 0 2026-03-10T08:55:17.973 INFO:tasks.workunit.client.1.vm08.stdout:3/369: rename d4/d15/d17 to d4/d15/d8/d2a/d79 0 2026-03-10T08:55:17.983 INFO:tasks.workunit.client.1.vm08.stdout:5/404: dread - d0/d1b/f69 zero size 
2026-03-10T08:55:17.984 INFO:tasks.workunit.client.1.vm08.stdout:5/405: chown d0/d11/d27/l6b 19488177 1 2026-03-10T08:55:17.984 INFO:tasks.workunit.client.1.vm08.stdout:5/406: chown d0/d11/d27/d68/l6d 12 1 2026-03-10T08:55:17.984 INFO:tasks.workunit.client.0.vm05.stdout:5/147: readlink d5/df/l1d 0 2026-03-10T08:55:17.990 INFO:tasks.workunit.client.0.vm05.stdout:8/112: creat d2/f2a x:0 0 0 2026-03-10T08:55:17.994 INFO:tasks.workunit.client.1.vm08.stdout:1/405: unlink d1/da/de/d24/d35/d6d/d82/l74 0 2026-03-10T08:55:17.994 INFO:tasks.workunit.client.1.vm08.stdout:1/406: read - d1/da/d18/d3a/f3c zero size 2026-03-10T08:55:17.995 INFO:tasks.workunit.client.1.vm08.stdout:1/407: truncate d1/da/d20/d3f/d49/f71 1201163 0 2026-03-10T08:55:17.996 INFO:tasks.workunit.client.0.vm05.stdout:3/166: rename d9/f12 to d9/d2b/f2c 0 2026-03-10T08:55:17.998 INFO:tasks.workunit.client.0.vm05.stdout:1/202: creat dd/d21/f4c x:0 0 0 2026-03-10T08:55:18.000 INFO:tasks.workunit.client.1.vm08.stdout:6/440: rename d9/dc/d11/d23/d2c/d41/f51 to d9/d10/d1e/f9a 0 2026-03-10T08:55:18.001 INFO:tasks.workunit.client.1.vm08.stdout:6/441: stat d9/d10/d1e/d32/f64 0 2026-03-10T08:55:18.001 INFO:tasks.workunit.client.1.vm08.stdout:6/442: write d9/d10/f25 [1987862,47399] 0 2026-03-10T08:55:18.002 INFO:tasks.workunit.client.1.vm08.stdout:6/443: write d9/d13/f88 [1226910,92759] 0 2026-03-10T08:55:18.004 INFO:tasks.workunit.client.1.vm08.stdout:6/444: read - d9/d13/f6c zero size 2026-03-10T08:55:18.005 INFO:tasks.workunit.client.0.vm05.stdout:9/100: dwrite d6/d19/f1a [0,4194304] 0 2026-03-10T08:55:18.006 INFO:tasks.workunit.client.1.vm08.stdout:5/407: sync 2026-03-10T08:55:18.015 INFO:tasks.workunit.client.0.vm05.stdout:7/98: write f3 [1788295,41659] 0 2026-03-10T08:55:18.017 INFO:tasks.workunit.client.0.vm05.stdout:0/136: dwrite df/f15 [0,4194304] 0 2026-03-10T08:55:18.020 INFO:tasks.workunit.client.1.vm08.stdout:8/472: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:18.021 
INFO:tasks.workunit.client.1.vm08.stdout:2/412: dwrite d1/da/d10/d1b/d12/d23/f31 [0,4194304] 0 2026-03-10T08:55:18.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.026+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb98c000bf0 con 0x7fb98806c600 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "0/23 daemons upgraded", 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:55:18.030 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:55:18.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.028+0000 7fb9a4a21700 1 -- 192.168.123.105:0/751789696 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fb98c000bf0 con 0x7fb98806c600 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb98806c600 msgr2=0x7fb98806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb98806c600 0x7fb98806eac0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fb994005950 tx=0x7fb99400b410 comp rx=0 tx=0).stop 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 msgr2=0x7fb9a01985b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a01985b0 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7fb99000b5c0 tx=0x7fb990004c30 comp rx=0 tx=0).stop 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 shutdown_connections 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb98806c600 0x7fb98806eac0 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9a00ff5b0 0x7fb9a01985b0 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 --2- 192.168.123.105:0/751789696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fb9a00fff10 0x7fb9a0198af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 >> 192.168.123.105:0/751789696 conn(0x7fb9a00fb110 msgr2=0x7fb9a00fd3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.033+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 shutdown_connections 2026-03-10T08:55:18.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.034+0000 7fb9a5a23700 1 -- 192.168.123.105:0/751789696 wait complete. 2026-03-10T08:55:18.034 INFO:tasks.workunit.client.0.vm05.stdout:4/146: dread d0/f1 [0,4194304] 0 2026-03-10T08:55:18.035 INFO:tasks.workunit.client.0.vm05.stdout:4/147: write d0/f1e [2725775,67730] 0 2026-03-10T08:55:18.039 INFO:tasks.workunit.client.0.vm05.stdout:4/148: read d0/f8 [3849866,114683] 0 2026-03-10T08:55:18.046 INFO:tasks.workunit.client.0.vm05.stdout:5/148: fdatasync d5/fc 0 2026-03-10T08:55:18.046 INFO:tasks.workunit.client.0.vm05.stdout:5/149: chown d5/l2b 3921923 1 2026-03-10T08:55:18.048 INFO:tasks.workunit.client.0.vm05.stdout:8/113: write d2/f5 [5012982,49885] 0 2026-03-10T08:55:18.048 INFO:tasks.workunit.client.0.vm05.stdout:8/114: write d2/ff [3078580,45630] 0 2026-03-10T08:55:18.049 INFO:tasks.workunit.client.0.vm05.stdout:8/115: dread - d2/dd/f26 zero size 2026-03-10T08:55:18.052 INFO:tasks.workunit.client.1.vm08.stdout:4/432: creat d5/f95 x:0 0 0 2026-03-10T08:55:18.061 INFO:tasks.workunit.client.1.vm08.stdout:4/433: stat d5/f1d 0 2026-03-10T08:55:18.062 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:17 vm08.local ceph-mon[57559]: from='client.14654 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:18.062 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:17 vm08.local ceph-mon[57559]: 
from='client.24435 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:18.062 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:17 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/1370754065' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:55:18.062 INFO:tasks.workunit.client.0.vm05.stdout:3/167: creat d9/d2b/f2d x:0 0 0 2026-03-10T08:55:18.062 INFO:tasks.workunit.client.0.vm05.stdout:8/116: dwrite d2/dd/f26 [0,4194304] 0 2026-03-10T08:55:18.062 INFO:tasks.workunit.client.0.vm05.stdout:3/168: fsync d9/f19 0 2026-03-10T08:55:18.062 INFO:tasks.workunit.client.0.vm05.stdout:3/169: write f7 [832589,29025] 0 2026-03-10T08:55:18.062 INFO:tasks.workunit.client.0.vm05.stdout:3/170: write d9/f29 [930961,87509] 0 2026-03-10T08:55:18.091 INFO:tasks.workunit.client.0.vm05.stdout:3/171: dread d9/ff [0,4194304] 0 2026-03-10T08:55:18.092 INFO:tasks.workunit.client.0.vm05.stdout:9/101: rename d6/d19 to d6/d19/d20 22 2026-03-10T08:55:18.092 INFO:tasks.workunit.client.0.vm05.stdout:7/99: mkdir d18 0 2026-03-10T08:55:18.093 INFO:tasks.workunit.client.0.vm05.stdout:1/203: truncate dd/d10/d19/f2e 871394 0 2026-03-10T08:55:18.093 INFO:tasks.workunit.client.0.vm05.stdout:9/102: write d6/d19/f1a [4040295,114964] 0 2026-03-10T08:55:18.094 INFO:tasks.workunit.client.0.vm05.stdout:9/103: dread - d6/d12/f1c zero size 2026-03-10T08:55:18.097 INFO:tasks.workunit.client.0.vm05.stdout:1/204: dwrite dd/d21/f26 [0,4194304] 0 2026-03-10T08:55:18.099 INFO:tasks.workunit.client.1.vm08.stdout:6/445: symlink d9/d50/l9b 0 2026-03-10T08:55:18.099 INFO:tasks.workunit.client.1.vm08.stdout:5/408: creat d0/d1b/d67/f7e x:0 0 0 2026-03-10T08:55:18.116 INFO:tasks.workunit.client.0.vm05.stdout:0/137: dread df/d18/f24 [0,4194304] 0 2026-03-10T08:55:18.125 INFO:tasks.workunit.client.1.vm08.stdout:9/407: dread d2/dd/d15/d1e/d25/f5f [0,4194304] 0 2026-03-10T08:55:18.126 
INFO:tasks.workunit.client.1.vm08.stdout:9/408: dread - d2/dd/d15/d1e/d25/d32/d5c/f7f zero size 2026-03-10T08:55:18.130 INFO:tasks.workunit.client.0.vm05.stdout:6/141: truncate d4/f11 2259586 0 2026-03-10T08:55:18.131 INFO:tasks.workunit.client.1.vm08.stdout:0/327: write d6/dd/d13/d17/d1f/d20/f21 [136181,49467] 0 2026-03-10T08:55:18.135 INFO:tasks.workunit.client.0.vm05.stdout:6/142: dwrite d4/d7/d10/d15/f2e [0,4194304] 0 2026-03-10T08:55:18.135 INFO:tasks.workunit.client.1.vm08.stdout:1/408: dwrite d1/da/d18/d3a/f3c [0,4194304] 0 2026-03-10T08:55:18.148 INFO:tasks.workunit.client.1.vm08.stdout:3/370: write d4/f44 [77582,23929] 0 2026-03-10T08:55:18.156 INFO:tasks.workunit.client.1.vm08.stdout:4/434: mkdir d5/de/d96 0 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 -- 192.168.123.105:0/2879384811 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb508075a40 msgr2=0x7fb508077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 --2- 192.168.123.105:0/2879384811 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb508075a40 0x7fb508077ed0 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fb50000d3f0 tx=0x7fb50000d700 comp rx=0 tx=0).stop 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 -- 192.168.123.105:0/2879384811 shutdown_connections 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 --2- 192.168.123.105:0/2879384811 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb508075a40 0x7fb508077ed0 unknown :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 --2- 192.168.123.105:0/2879384811 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 0x7fb508072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 -- 192.168.123.105:0/2879384811 >> 192.168.123.105:0/2879384811 conn(0x7fb50806dae0 msgr2=0x7fb50806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 -- 192.168.123.105:0/2879384811 shutdown_connections 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.159+0000 7fb510150700 1 -- 192.168.123.105:0/2879384811 wait complete. 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb510150700 1 Processor -- start 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb510150700 1 -- start start 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb510150700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 0x7fb508083970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb510150700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb50812bdb0 0x7fb50812e240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb510150700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb50812e780 con 0x7fb50812bdb0 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb510150700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fb50812e8f0 con 0x7fb508072b50 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50d6eb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb50812bdb0 0x7fb50812e240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50d6eb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb50812bdb0 0x7fb50812e240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46650/0 (socket says 192.168.123.105:46650) 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50d6eb700 1 -- 192.168.123.105:0/3164661493 learned_addr learned my addr 192.168.123.105:0/3164661493 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50deec700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 0x7fb508083970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50deec700 1 -- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb50812bdb0 msgr2=0x7fb50812e240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50deec700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb50812bdb0 0x7fb50812e240 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.160+0000 7fb50deec700 1 -- 192.168.123.105:0/3164661493 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb500007ed0 con 0x7fb508072b50 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.161+0000 7fb50deec700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 0x7fb508083970 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fb50400d8d0 tx=0x7fb50400dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:18.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.161+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb50400f840 con 0x7fb508072b50 2026-03-10T08:55:18.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.161+0000 7fb510150700 1 -- 192.168.123.105:0/3164661493 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb50812ebd0 con 0x7fb508072b50 2026-03-10T08:55:18.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.161+0000 7fb510150700 1 -- 192.168.123.105:0/3164661493 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb50812f120 con 0x7fb508072b50 2026-03-10T08:55:18.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.162+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb50400fe80 con 0x7fb508072b50 2026-03-10T08:55:18.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.162+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb50400e5c0 con 0x7fb508072b50 2026-03-10T08:55:18.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.163+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb5040108f0 con 0x7fb508072b50 2026-03-10T08:55:18.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.163+0000 7fb4feffd700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb4f406c530 0x7fb4f406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:18.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.163+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fb50408b210 con 0x7fb508072b50 2026-03-10T08:55:18.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.163+0000 7fb510150700 1 -- 192.168.123.105:0/3164661493 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb4ec005320 con 0x7fb508072b50 2026-03-10T08:55:18.164 INFO:tasks.workunit.client.0.vm05.stdout:5/150: symlink d5/df/d12/d24/l2d 0 2026-03-10T08:55:18.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.166+0000 7fb50d6eb700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb4f406c530 0x7fb4f406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:18.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.166+0000 7fb50d6eb700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb4f406c530 0x7fb4f406e9f0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto 
rx=0x7fb500007ea0 tx=0x7fb500007db0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:18.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.167+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb5040594c0 con 0x7fb508072b50 2026-03-10T08:55:18.213 INFO:tasks.workunit.client.0.vm05.stdout:7/100: rename c5 to d18/c19 0 2026-03-10T08:55:18.215 INFO:tasks.workunit.client.0.vm05.stdout:7/101: dread f15 [0,4194304] 0 2026-03-10T08:55:18.216 INFO:tasks.workunit.client.0.vm05.stdout:7/102: fsync f4 0 2026-03-10T08:55:18.216 INFO:tasks.workunit.client.0.vm05.stdout:7/103: write f15 [2455654,49383] 0 2026-03-10T08:55:18.216 INFO:tasks.workunit.client.0.vm05.stdout:7/104: chown cc 30483 1 2026-03-10T08:55:18.229 INFO:tasks.workunit.client.1.vm08.stdout:6/446: symlink d9/d10/d1e/d7b/l9c 0 2026-03-10T08:55:18.232 INFO:tasks.workunit.client.1.vm08.stdout:6/447: dread d9/dc/d11/d23/d2c/f49 [4194304,4194304] 0 2026-03-10T08:55:18.233 INFO:tasks.workunit.client.1.vm08.stdout:6/448: write d9/d13/f36 [1583383,39637] 0 2026-03-10T08:55:18.234 INFO:tasks.workunit.client.1.vm08.stdout:6/449: write d9/dc/d84/f89 [578987,103951] 0 2026-03-10T08:55:18.235 INFO:tasks.workunit.client.1.vm08.stdout:6/450: truncate d9/dc/d11/d23/f6f 415357 0 2026-03-10T08:55:18.238 INFO:tasks.workunit.client.1.vm08.stdout:5/409: truncate d0/d11/d27/d68/d7c/f6a 198234 0 2026-03-10T08:55:18.240 INFO:tasks.workunit.client.1.vm08.stdout:7/385: link d0/d11/d4a/l6b d0/d11/d1f/d29/d36/d75/l77 0 2026-03-10T08:55:18.242 INFO:tasks.workunit.client.1.vm08.stdout:8/473: symlink d1/laa 0 2026-03-10T08:55:18.243 INFO:tasks.workunit.client.0.vm05.stdout:9/104: truncate d6/fb 773510 0 2026-03-10T08:55:18.245 INFO:tasks.workunit.client.0.vm05.stdout:9/105: dread d6/fb [0,4194304] 0 2026-03-10T08:55:18.247 
INFO:tasks.workunit.client.1.vm08.stdout:2/413: truncate d1/d5b/d66/f5e 2129105 0 2026-03-10T08:55:18.248 INFO:tasks.workunit.client.1.vm08.stdout:9/409: creat d2/d41/d53/f81 x:0 0 0 2026-03-10T08:55:18.252 INFO:tasks.workunit.client.0.vm05.stdout:0/138: rmdir df/d1f 39 2026-03-10T08:55:18.253 INFO:tasks.workunit.client.0.vm05.stdout:0/139: write df/d18/d19/f1c [3273056,69376] 0 2026-03-10T08:55:18.254 INFO:tasks.workunit.client.1.vm08.stdout:0/328: rename d6/dd/d13/d17/d1f/d2d/f5b to d6/dd/d13/d17/d1f/d20/d2f/d57/f65 0 2026-03-10T08:55:18.262 INFO:tasks.workunit.client.1.vm08.stdout:4/435: read d5/f19 [2596464,31883] 0 2026-03-10T08:55:18.265 INFO:tasks.workunit.client.1.vm08.stdout:3/371: dread d4/d15/fc [0,4194304] 0 2026-03-10T08:55:18.265 INFO:tasks.workunit.client.0.vm05.stdout:4/149: mknod d0/c2d 0 2026-03-10T08:55:18.266 INFO:tasks.workunit.client.1.vm08.stdout:3/372: read d4/d15/d8/d1d/f21 [859934,105568] 0 2026-03-10T08:55:18.267 INFO:tasks.workunit.client.1.vm08.stdout:6/451: creat d9/d10/f9d x:0 0 0 2026-03-10T08:55:18.269 INFO:tasks.workunit.client.1.vm08.stdout:5/410: fsync d0/d11/f1e 0 2026-03-10T08:55:18.271 INFO:tasks.workunit.client.1.vm08.stdout:7/386: fdatasync d0/d11/d1f/d29/d3d/f59 0 2026-03-10T08:55:18.271 INFO:tasks.workunit.client.1.vm08.stdout:7/387: stat d0/f25 0 2026-03-10T08:55:18.272 INFO:tasks.workunit.client.0.vm05.stdout:2/96: getdents d0 0 2026-03-10T08:55:18.276 INFO:tasks.workunit.client.1.vm08.stdout:2/414: creat d1/da/d10/d1b/d12/d1e/f83 x:0 0 0 2026-03-10T08:55:18.279 INFO:tasks.workunit.client.1.vm08.stdout:9/410: mkdir d2/d41/d4c/d66/d82 0 2026-03-10T08:55:18.283 INFO:tasks.workunit.client.0.vm05.stdout:7/105: creat d18/f1a x:0 0 0 2026-03-10T08:55:18.285 INFO:tasks.workunit.client.1.vm08.stdout:1/409: link d1/da/d18/f72 d1/da/de/d24/d81/f88 0 2026-03-10T08:55:18.292 INFO:tasks.workunit.client.1.vm08.stdout:4/436: rename d5/f1e to d5/d2f/d5a/d69/f97 0 2026-03-10T08:55:18.303 INFO:tasks.workunit.client.1.vm08.stdout:6/452: 
creat d9/dc/d84/d80/f9e x:0 0 0 2026-03-10T08:55:18.303 INFO:tasks.workunit.client.1.vm08.stdout:5/411: creat d0/f7f x:0 0 0 2026-03-10T08:55:18.303 INFO:tasks.workunit.client.0.vm05.stdout:9/106: truncate d6/fe 509559 0 2026-03-10T08:55:18.303 INFO:tasks.workunit.client.0.vm05.stdout:0/140: creat df/d18/f29 x:0 0 0 2026-03-10T08:55:18.303 INFO:tasks.workunit.client.0.vm05.stdout:5/151: creat d5/df/d12/d24/d2c/f2e x:0 0 0 2026-03-10T08:55:18.303 INFO:tasks.workunit.client.0.vm05.stdout:2/97: symlink d0/l1a 0 2026-03-10T08:55:18.304 INFO:tasks.workunit.client.0.vm05.stdout:2/98: chown d0/c11 59 1 2026-03-10T08:55:18.310 INFO:tasks.workunit.client.0.vm05.stdout:1/205: dwrite dd/d10/d19/f1f [0,4194304] 0 2026-03-10T08:55:18.319 INFO:tasks.workunit.client.0.vm05.stdout:8/117: link d2/dd/l27 d2/db/d1f/l2b 0 2026-03-10T08:55:18.319 INFO:tasks.workunit.client.0.vm05.stdout:8/118: dread - d2/f2a zero size 2026-03-10T08:55:18.322 INFO:tasks.workunit.client.1.vm08.stdout:2/415: stat d1/d5b/d66/f20 0 2026-03-10T08:55:18.328 INFO:tasks.workunit.client.1.vm08.stdout:1/410: rename d1/da/de/f27 to d1/da/d18/d3b/f89 0 2026-03-10T08:55:18.334 INFO:tasks.workunit.client.1.vm08.stdout:4/437: mknod d5/d5f/c98 0 2026-03-10T08:55:18.334 INFO:tasks.workunit.client.0.vm05.stdout:0/141: creat df/d18/f2a x:0 0 0 2026-03-10T08:55:18.334 INFO:tasks.workunit.client.0.vm05.stdout:0/142: dread - df/d18/f29 zero size 2026-03-10T08:55:18.335 INFO:tasks.workunit.client.0.vm05.stdout:0/143: dwrite df/d18/d19/f1c [0,4194304] 0 2026-03-10T08:55:18.342 INFO:tasks.workunit.client.0.vm05.stdout:6/143: getdents d4/d7/d10/d1a 0 2026-03-10T08:55:18.350 INFO:tasks.workunit.client.0.vm05.stdout:5/152: creat d5/df/f2f x:0 0 0 2026-03-10T08:55:18.352 INFO:tasks.workunit.client.1.vm08.stdout:5/412: mkdir d0/d1b/d67/d80 0 2026-03-10T08:55:18.353 INFO:tasks.workunit.client.0.vm05.stdout:2/99: rmdir d0 39 2026-03-10T08:55:18.354 INFO:tasks.workunit.client.1.vm08.stdout:5/413: stat d0/d11/d27/f64 0 
2026-03-10T08:55:18.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.353+0000 7fb510150700 1 -- 192.168.123.105:0/3164661493 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb4ec005190 con 0x7fb508072b50 2026-03-10T08:55:18.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.357+0000 7fb4feffd700 1 -- 192.168.123.105:0/3164661493 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fb504021090 con 0x7fb508072b50 2026-03-10T08:55:18.357 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:55:18.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 -- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb4f406c530 msgr2=0x7fb4f406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:18.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb4f406c530 0x7fb4f406e9f0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fb500007ea0 tx=0x7fb500007db0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 -- 192.168.123.105:0/3164661493 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 msgr2=0x7fb508083970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:18.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 0x7fb508083970 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fb50400d8d0 tx=0x7fb50400dbe0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.362 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 -- 192.168.123.105:0/3164661493 shutdown_connections 2026-03-10T08:55:18.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb4f406c530 0x7fb4f406e9f0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb508072b50 0x7fb508083970 secure :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fb50400d8d0 tx=0x7fb50400dbe0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 --2- 192.168.123.105:0/3164661493 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb50812bdb0 0x7fb50812e240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:18.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.360+0000 7fb4fcff9700 1 -- 192.168.123.105:0/3164661493 >> 192.168.123.105:0/3164661493 conn(0x7fb50806dae0 msgr2=0x7fb50806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:18.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.361+0000 7fb4fcff9700 1 -- 192.168.123.105:0/3164661493 shutdown_connections 2026-03-10T08:55:18.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:18.361+0000 7fb4fcff9700 1 -- 192.168.123.105:0/3164661493 wait complete. 
2026-03-10T08:55:18.363 INFO:tasks.workunit.client.1.vm08.stdout:9/411: mknod d2/dd/d15/d1e/d25/d32/d5c/d7b/c83 0 2026-03-10T08:55:18.364 INFO:tasks.workunit.client.0.vm05.stdout:8/119: mkdir d2/dd/d2c 0 2026-03-10T08:55:18.365 INFO:tasks.workunit.client.0.vm05.stdout:8/120: chown d2/dd/l27 320932 1 2026-03-10T08:55:18.368 INFO:tasks.workunit.client.1.vm08.stdout:0/329: creat d6/dd/d13/d17/f66 x:0 0 0 2026-03-10T08:55:18.369 INFO:tasks.workunit.client.0.vm05.stdout:3/172: getdents d9 0 2026-03-10T08:55:18.374 INFO:tasks.workunit.client.0.vm05.stdout:9/107: mkdir d6/d19/d21 0 2026-03-10T08:55:18.375 INFO:tasks.workunit.client.1.vm08.stdout:1/411: dread d1/da/de/f19 [0,4194304] 0 2026-03-10T08:55:18.377 INFO:tasks.workunit.client.0.vm05.stdout:9/108: dwrite d6/d12/f14 [0,4194304] 0 2026-03-10T08:55:18.387 INFO:tasks.workunit.client.0.vm05.stdout:0/144: dwrite df/d1f/f25 [0,4194304] 0 2026-03-10T08:55:18.389 INFO:tasks.workunit.client.0.vm05.stdout:0/145: dread df/d1f/f25 [0,4194304] 0 2026-03-10T08:55:18.391 INFO:tasks.workunit.client.0.vm05.stdout:4/150: rename d0/d15/d25 to d0/d2e 0 2026-03-10T08:55:18.391 INFO:tasks.workunit.client.0.vm05.stdout:4/151: stat d0/d15/f1c 0 2026-03-10T08:55:18.392 INFO:tasks.workunit.client.1.vm08.stdout:5/414: truncate d0/d11/d27/d68/d7c/f42 989634 0 2026-03-10T08:55:18.396 INFO:tasks.workunit.client.0.vm05.stdout:2/100: dwrite d0/d9/f17 [0,4194304] 0 2026-03-10T08:55:18.399 INFO:tasks.workunit.client.0.vm05.stdout:1/206: mkdir dd/d10/d19/d4d 0 2026-03-10T08:55:18.399 INFO:tasks.workunit.client.0.vm05.stdout:8/121: creat d2/db/d28/f2d x:0 0 0 2026-03-10T08:55:18.399 INFO:tasks.workunit.client.0.vm05.stdout:3/173: symlink d9/d2b/l2e 0 2026-03-10T08:55:18.399 INFO:tasks.workunit.client.1.vm08.stdout:8/474: getdents d1/d10/d9/dd/d25/d27/d44/d21/d51 0 2026-03-10T08:55:18.400 INFO:tasks.workunit.client.0.vm05.stdout:1/207: write fa [192987,98252] 0 2026-03-10T08:55:18.400 INFO:tasks.workunit.client.1.vm08.stdout:9/412: rename d2/l28 to 
d2/dd/d15/d1e/d24/l84 0 2026-03-10T08:55:18.400 INFO:tasks.workunit.client.0.vm05.stdout:1/208: write dd/f11 [1126057,16400] 0 2026-03-10T08:55:18.404 INFO:tasks.workunit.client.1.vm08.stdout:9/413: dread d2/dd/d15/f44 [0,4194304] 0 2026-03-10T08:55:18.405 INFO:tasks.workunit.client.1.vm08.stdout:4/438: mkdir d5/d23/d36/d99 0 2026-03-10T08:55:18.407 INFO:tasks.workunit.client.1.vm08.stdout:3/373: link d4/c38 d4/d15/d8/c7a 0 2026-03-10T08:55:18.407 INFO:tasks.workunit.client.1.vm08.stdout:5/415: creat d0/d46/f81 x:0 0 0 2026-03-10T08:55:18.409 INFO:tasks.workunit.client.1.vm08.stdout:7/388: getdents d0/d14/d2f 0 2026-03-10T08:55:18.415 INFO:tasks.workunit.client.1.vm08.stdout:7/389: dwrite d0/d11/d1f/d29/d3d/d40/ff [0,4194304] 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:5/153: dread d5/f9 [4194304,4194304] 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:2/101: unlink d0/d9/ce 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:1/209: stat fc 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:7/106: getdents d18 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:2/102: creat d0/d9/f1b x:0 0 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:8/122: mkdir d2/dd/d2c/d2e 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:0/330: getdents d6/dd/d13/d32 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:8/475: write d1/d10/d9/dd/d18/d3c/f4e [494308,80660] 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:1/412: creat d1/da/de/d5c/f8a x:0 0 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:9/414: mkdir d2/dd/d15/d1e/d25/d32/d79/d85 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:4/439: mknod d5/d2f/d5d/c9a 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:3/374: mknod d4/d15/d8/d2a/d79/d20/c7b 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:3/375: chown 
d4/d15/d8/d2c/d55/l66 7747 1 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:3/376: dwrite d4/d15/f78 [0,4194304] 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.1.vm08.stdout:1/413: creat d1/da/d20/d3f/d49/d68/f8b x:0 0 0 2026-03-10T08:55:18.440 INFO:tasks.workunit.client.0.vm05.stdout:2/103: dread d0/d9/f17 [0,4194304] 0 2026-03-10T08:55:18.468 INFO:tasks.workunit.client.0.vm05.stdout:7/107: mkdir d18/d1b 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.1.vm08.stdout:8/476: link d1/d10/d9/dd/d25/f6e d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fab 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.1.vm08.stdout:1/414: symlink d1/da/d4b/d4e/l8c 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.1.vm08.stdout:8/477: creat d1/d10/fac x:0 0 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.1.vm08.stdout:7/390: getdents d0/d11/d1f/d29/d36 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.1.vm08.stdout:1/415: dwrite d1/da/de/d24/d35/d43/f7d [0,4194304] 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.1.vm08.stdout:1/416: readlink d1/da/l6f 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:8/123: creat d2/dd/d2c/f2f x:0 0 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:1/210: creat dd/d10/d19/d27/f4e x:0 0 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:7/108: rmdir d18 39 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:9/109: link d6/l10 d6/d12/l22 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:9/110: write d6/f1e [353997,98071] 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:8/124: creat d2/dd/d2c/f30 x:0 0 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:1/211: write dd/d10/d19/f24 [738824,16753] 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:7/109: dread f15 [0,4194304] 0 2026-03-10T08:55:18.481 INFO:tasks.workunit.client.0.vm05.stdout:7/110: chown ce 22571 1 2026-03-10T08:55:18.484 
INFO:tasks.workunit.client.1.vm08.stdout:7/391: creat d0/d51/f78 x:0 0 0 2026-03-10T08:55:18.485 INFO:tasks.workunit.client.0.vm05.stdout:7/111: dwrite f4 [0,4194304] 0 2026-03-10T08:55:18.485 INFO:tasks.workunit.client.1.vm08.stdout:7/392: dread - d0/d11/d1f/d2c/f6c zero size 2026-03-10T08:55:18.487 INFO:tasks.workunit.client.1.vm08.stdout:4/440: read d5/d23/f56 [318426,81998] 0 2026-03-10T08:55:18.490 INFO:tasks.workunit.client.1.vm08.stdout:7/393: dwrite d0/d1c/f67 [0,4194304] 0 2026-03-10T08:55:18.490 INFO:tasks.workunit.client.0.vm05.stdout:1/212: mknod dd/d10/d18/d2d/c4f 0 2026-03-10T08:55:18.491 INFO:tasks.workunit.client.1.vm08.stdout:7/394: chown d0/d11/d1f/d29/d3d 7600983 1 2026-03-10T08:55:18.491 INFO:tasks.workunit.client.0.vm05.stdout:1/213: stat dd/d10/d19/d4d 0 2026-03-10T08:55:18.491 INFO:tasks.workunit.client.1.vm08.stdout:7/395: readlink d0/d14/l4d 0 2026-03-10T08:55:18.492 INFO:tasks.workunit.client.0.vm05.stdout:1/214: write dd/d21/d37/d45/f47 [618994,78984] 0 2026-03-10T08:55:18.493 INFO:tasks.workunit.client.0.vm05.stdout:1/215: truncate dd/d21/d37/f39 707593 0 2026-03-10T08:55:18.496 INFO:tasks.workunit.client.0.vm05.stdout:8/125: mkdir d2/dd/d2c/d2e/d31 0 2026-03-10T08:55:18.498 INFO:tasks.workunit.client.0.vm05.stdout:2/104: getdents d0 0 2026-03-10T08:55:18.498 INFO:tasks.workunit.client.0.vm05.stdout:2/105: fsync d0/f16 0 2026-03-10T08:55:18.503 INFO:tasks.workunit.client.1.vm08.stdout:4/441: symlink d5/d5f/l9b 0 2026-03-10T08:55:18.515 INFO:tasks.workunit.client.0.vm05.stdout:1/216: unlink dd/d21/f26 0 2026-03-10T08:55:18.526 INFO:tasks.workunit.client.1.vm08.stdout:1/417: creat d1/da/de/d24/d3d/d40/d5b/f8d x:0 0 0 2026-03-10T08:55:18.526 INFO:tasks.workunit.client.1.vm08.stdout:4/442: readlink d5/d2f/l8d 0 2026-03-10T08:55:18.526 INFO:tasks.workunit.client.1.vm08.stdout:4/443: readlink d5/d23/d36/d76/l7a 0 2026-03-10T08:55:18.526 INFO:tasks.workunit.client.1.vm08.stdout:4/444: fdatasync d5/d23/d36/f7d 0 2026-03-10T08:55:18.526 
INFO:tasks.workunit.client.1.vm08.stdout:4/445: chown d5/d2f/d5d/c78 444001 1 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:1/217: chown dd/d21/d37/d45 336432 1 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:2/106: creat d0/f1c x:0 0 0 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:1/218: dwrite fa [0,4194304] 0 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:2/107: link d0/d9/f12 d0/d9/f1d 0 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:2/108: dread d0/f8 [0,4194304] 0 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:2/109: truncate d0/f18 1030191 0 2026-03-10T08:55:18.527 INFO:tasks.workunit.client.0.vm05.stdout:2/110: write d0/fb [4698803,76684] 0 2026-03-10T08:55:18.532 INFO:tasks.workunit.client.1.vm08.stdout:4/446: creat d5/d2f/f9c x:0 0 0 2026-03-10T08:55:18.532 INFO:tasks.workunit.client.0.vm05.stdout:2/111: mkdir d0/d9/d1e 0 2026-03-10T08:55:18.533 INFO:tasks.workunit.client.0.vm05.stdout:2/112: mknod d0/c1f 0 2026-03-10T08:55:18.534 INFO:tasks.workunit.client.0.vm05.stdout:2/113: mkdir d0/d9/d1e/d20 0 2026-03-10T08:55:18.541 INFO:tasks.workunit.client.0.vm05.stdout:2/114: mkdir d0/d9/d1e/d20/d21 0 2026-03-10T08:55:18.541 INFO:tasks.workunit.client.0.vm05.stdout:2/115: chown d0/fa 127068 1 2026-03-10T08:55:18.555 INFO:tasks.workunit.client.1.vm08.stdout:6/453: write d9/dc/d11/d23/d2c/d81/f62 [3300922,71536] 0 2026-03-10T08:55:18.557 INFO:tasks.workunit.client.1.vm08.stdout:6/454: chown d9/dc 7013 1 2026-03-10T08:55:18.560 INFO:tasks.workunit.client.1.vm08.stdout:6/455: creat d9/d10/d1e/d32/f9f x:0 0 0 2026-03-10T08:55:18.560 INFO:tasks.workunit.client.1.vm08.stdout:6/456: fdatasync d9/dc/d11/f8d 0 2026-03-10T08:55:18.616 INFO:tasks.workunit.client.1.vm08.stdout:6/457: read d9/d10/f53 [715719,120400] 0 2026-03-10T08:55:18.617 INFO:tasks.workunit.client.1.vm08.stdout:6/458: write f5 [88235,107467] 0 2026-03-10T08:55:18.621 
INFO:tasks.workunit.client.1.vm08.stdout:6/459: dwrite d9/dc/d11/d23/d2c/f79 [0,4194304] 0 2026-03-10T08:55:18.709 INFO:tasks.workunit.client.0.vm05.stdout:1/219: sync 2026-03-10T08:55:18.709 INFO:tasks.workunit.client.0.vm05.stdout:1/220: readlink dd/le 0 2026-03-10T08:55:18.712 INFO:tasks.workunit.client.1.vm08.stdout:2/416: write d1/da/d10/d1b/d12/d1e/f49 [1379642,95947] 0 2026-03-10T08:55:18.712 INFO:tasks.workunit.client.0.vm05.stdout:1/221: link dd/d13/f40 dd/d10/d19/f50 0 2026-03-10T08:55:18.716 INFO:tasks.workunit.client.0.vm05.stdout:1/222: chown dd/d10/d19/f1d 0 1 2026-03-10T08:55:18.720 INFO:tasks.workunit.client.0.vm05.stdout:1/223: dwrite dd/d21/f48 [0,4194304] 0 2026-03-10T08:55:18.730 INFO:tasks.workunit.client.0.vm05.stdout:1/224: mkdir dd/d10/d18/d2d/d51 0 2026-03-10T08:55:18.730 INFO:tasks.workunit.client.0.vm05.stdout:1/225: dread - dd/d21/f3a zero size 2026-03-10T08:55:18.730 INFO:tasks.workunit.client.0.vm05.stdout:1/226: mkdir dd/d10/d18/d20/d52 0 2026-03-10T08:55:18.730 INFO:tasks.workunit.client.0.vm05.stdout:1/227: symlink dd/d10/d19/l53 0 2026-03-10T08:55:18.763 INFO:tasks.workunit.client.1.vm08.stdout:7/396: fsync d0/d11/d1f/d29/d3d/d40/ff 0 2026-03-10T08:55:18.764 INFO:tasks.workunit.client.1.vm08.stdout:7/397: truncate d0/d11/d4a/f5c 768926 0 2026-03-10T08:55:18.794 INFO:tasks.workunit.client.1.vm08.stdout:9/415: dread d2/dd/d15/f22 [0,4194304] 0 2026-03-10T08:55:18.794 INFO:tasks.workunit.client.1.vm08.stdout:9/416: dread - d2/dd/d61/f67 zero size 2026-03-10T08:55:18.795 INFO:tasks.workunit.client.1.vm08.stdout:9/417: fsync d2/dd/d15/f44 0 2026-03-10T08:55:18.826 INFO:tasks.workunit.client.0.vm05.stdout:4/152: write d0/f18 [4524849,32634] 0 2026-03-10T08:55:18.827 INFO:tasks.workunit.client.0.vm05.stdout:4/153: write d0/d1d/f24 [809820,35501] 0 2026-03-10T08:55:18.827 INFO:tasks.workunit.client.0.vm05.stdout:4/154: stat d0/f1 0 2026-03-10T08:55:18.833 INFO:tasks.workunit.client.0.vm05.stdout:3/174: dwrite d9/f1f [4194304,4194304] 0 
2026-03-10T08:55:18.845 INFO:tasks.workunit.client.1.vm08.stdout:5/416: write d0/d11/d27/d68/d7c/d4b/d4e/f71 [3365827,70693] 0 2026-03-10T08:55:18.845 INFO:tasks.workunit.client.0.vm05.stdout:3/175: mkdir d9/d2b/d2f 0 2026-03-10T08:55:18.845 INFO:tasks.workunit.client.1.vm08.stdout:5/417: dread - d0/d1b/d67/f7e zero size 2026-03-10T08:55:18.846 INFO:tasks.workunit.client.0.vm05.stdout:4/155: creat d0/d2c/f2f x:0 0 0 2026-03-10T08:55:18.848 INFO:tasks.workunit.client.1.vm08.stdout:0/331: write d6/dd/d13/d17/d1f/d2d/d39/f4a [225822,42947] 0 2026-03-10T08:55:18.850 INFO:tasks.workunit.client.0.vm05.stdout:3/176: unlink d9/f1f 0 2026-03-10T08:55:18.854 INFO:tasks.workunit.client.0.vm05.stdout:3/177: dwrite d9/f19 [0,4194304] 0 2026-03-10T08:55:18.854 INFO:tasks.workunit.client.0.vm05.stdout:3/178: truncate d9/f1a 4856845 0 2026-03-10T08:55:18.857 INFO:tasks.workunit.client.0.vm05.stdout:3/179: dwrite f1 [0,4194304] 0 2026-03-10T08:55:18.867 INFO:tasks.workunit.client.1.vm08.stdout:0/332: creat d6/dd/d13/d17/d1f/f67 x:0 0 0 2026-03-10T08:55:18.867 INFO:tasks.workunit.client.1.vm08.stdout:0/333: dread - d6/dd/d13/d17/d1f/d20/d2f/d24/f37 zero size 2026-03-10T08:55:18.868 INFO:tasks.workunit.client.1.vm08.stdout:0/334: readlink d6/dd/d13/d17/d1f/d20/d2f/d57/l63 0 2026-03-10T08:55:18.872 INFO:tasks.workunit.client.1.vm08.stdout:0/335: link f3 d6/dd/d13/d17/d1f/d20/d2f/d24/f68 0 2026-03-10T08:55:18.872 INFO:tasks.workunit.client.0.vm05.stdout:4/156: sync 2026-03-10T08:55:18.872 INFO:tasks.workunit.client.1.vm08.stdout:0/336: stat d6/dd/d13/d17/d1f/d20/c49 0 2026-03-10T08:55:18.873 INFO:tasks.workunit.client.0.vm05.stdout:4/157: fsync d0/f8 0 2026-03-10T08:55:18.874 INFO:tasks.workunit.client.1.vm08.stdout:0/337: mknod d6/dd/d13/d17/d1f/d2d/d39/c69 0 2026-03-10T08:55:18.875 INFO:tasks.workunit.client.1.vm08.stdout:0/338: stat d6/dd/d13/d17/d1f/d2d 0 2026-03-10T08:55:18.896 INFO:tasks.workunit.client.0.vm05.stdout:3/180: fsync d9/f20 0 2026-03-10T08:55:18.899 
INFO:tasks.workunit.client.0.vm05.stdout:3/181: dwrite d9/f13 [0,4194304] 0 2026-03-10T08:55:18.901 INFO:tasks.workunit.client.0.vm05.stdout:3/182: mknod d9/c30 0 2026-03-10T08:55:18.903 INFO:tasks.workunit.client.0.vm05.stdout:3/183: creat d9/f31 x:0 0 0 2026-03-10T08:55:18.905 INFO:tasks.workunit.client.0.vm05.stdout:3/184: symlink d9/d2b/l32 0 2026-03-10T08:55:18.916 INFO:tasks.workunit.client.0.vm05.stdout:9/111: getdents d6/d12 0 2026-03-10T08:55:18.917 INFO:tasks.workunit.client.0.vm05.stdout:9/112: symlink d6/d15/l23 0 2026-03-10T08:55:18.919 INFO:tasks.workunit.client.1.vm08.stdout:8/478: write d1/d2c/f30 [133272,99946] 0 2026-03-10T08:55:18.920 INFO:tasks.workunit.client.1.vm08.stdout:8/479: write d1/d10/d9/dd/d18/d3c/f4e [1149624,87624] 0 2026-03-10T08:55:18.924 INFO:tasks.workunit.client.1.vm08.stdout:8/480: creat d1/d10/fad x:0 0 0 2026-03-10T08:55:18.927 INFO:tasks.workunit.client.1.vm08.stdout:7/398: dwrite d0/d1c/f67 [4194304,4194304] 0 2026-03-10T08:55:18.932 INFO:tasks.workunit.client.1.vm08.stdout:8/481: creat d1/d10/d9/dd/d25/d27/d44/d97/d7d/fae x:0 0 0 2026-03-10T08:55:18.933 INFO:tasks.workunit.client.1.vm08.stdout:8/482: write d1/f8 [456422,27888] 0 2026-03-10T08:55:18.933 INFO:tasks.workunit.client.1.vm08.stdout:7/399: rename d0/d11/d1f/d29/d3b/f5b to d0/d14/d2f/f79 0 2026-03-10T08:55:18.934 INFO:tasks.workunit.client.1.vm08.stdout:8/483: chown d1/d10/d9/dd/d25/d27/d44/d21/c50 65709304 1 2026-03-10T08:55:18.934 INFO:tasks.workunit.client.0.vm05.stdout:8/126: write d2/db/f1b [180623,90003] 0 2026-03-10T08:55:18.935 INFO:tasks.workunit.client.1.vm08.stdout:8/484: chown d1/d10/d9/f5b 91707406 1 2026-03-10T08:55:18.935 INFO:tasks.workunit.client.0.vm05.stdout:8/127: write d2/dd/f26 [2762687,78265] 0 2026-03-10T08:55:18.935 INFO:tasks.workunit.client.1.vm08.stdout:8/485: chown d1/d10/d9/d4d 4 1 2026-03-10T08:55:18.940 INFO:tasks.workunit.client.0.vm05.stdout:9/113: dread f3 [0,4194304] 0 2026-03-10T08:55:18.941 
INFO:tasks.workunit.client.0.vm05.stdout:9/114: write d6/d15/f18 [735027,100428] 0 2026-03-10T08:55:18.943 INFO:tasks.workunit.client.1.vm08.stdout:7/400: creat d0/f7a x:0 0 0 2026-03-10T08:55:18.943 INFO:tasks.workunit.client.1.vm08.stdout:7/401: chown d0/d11 706408 1 2026-03-10T08:55:18.943 INFO:tasks.workunit.client.1.vm08.stdout:7/402: stat d0/d14/l4d 0 2026-03-10T08:55:18.944 INFO:tasks.workunit.client.0.vm05.stdout:9/115: creat d6/d15/f24 x:0 0 0 2026-03-10T08:55:18.944 INFO:tasks.workunit.client.1.vm08.stdout:7/403: creat d0/d14/d43/f7b x:0 0 0 2026-03-10T08:55:18.945 INFO:tasks.workunit.client.1.vm08.stdout:7/404: symlink d0/d11/d1f/d2c/l7c 0 2026-03-10T08:55:18.947 INFO:tasks.workunit.client.0.vm05.stdout:2/116: rmdir d0/d9 39 2026-03-10T08:55:18.948 INFO:tasks.workunit.client.0.vm05.stdout:2/117: write d0/fb [5706814,48485] 0 2026-03-10T08:55:18.953 INFO:tasks.workunit.client.1.vm08.stdout:7/405: dread d0/d11/d1f/d29/d3d/f59 [0,4194304] 0 2026-03-10T08:55:18.966 INFO:tasks.workunit.client.0.vm05.stdout:2/118: link d0/fa d0/d9/d1e/d20/f22 0 2026-03-10T08:55:18.966 INFO:tasks.workunit.client.0.vm05.stdout:2/119: dwrite d0/d9/f1b [0,4194304] 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.0.vm05.stdout:2/120: creat d0/d9/d1e/d20/d21/f23 x:0 0 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.0.vm05.stdout:2/121: mkdir d0/d9/d1e/d20/d24 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.0.vm05.stdout:2/122: readlink d0/l1a 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/406: fsync d0/d11/d1f/d29/d3d/d40/f24 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/407: creat d0/d11/d1f/d29/d3b/f7d x:0 0 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/408: write d0/d11/d1f/d29/d3d/f74 [4807887,9901] 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/409: dread d0/d14/d43/f58 [0,4194304] 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/410: stat d0 0 
2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/411: read - d0/d11/d1f/d29/d3b/f7d zero size 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/412: readlink d0/d11/d1f/d29/d3d/l50 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/413: write d0/f25 [1734239,92398] 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.1.vm08.stdout:7/414: truncate d0/d14/f68 1654626 0 2026-03-10T08:55:18.967 INFO:tasks.workunit.client.0.vm05.stdout:9/116: sync 2026-03-10T08:55:18.968 INFO:tasks.workunit.client.1.vm08.stdout:7/415: write d0/d11/d1f/d29/d3d/f59 [2843783,41970] 0 2026-03-10T08:55:18.968 INFO:tasks.workunit.client.0.vm05.stdout:2/123: read d0/f2 [3174562,14913] 0 2026-03-10T08:55:18.969 INFO:tasks.workunit.client.0.vm05.stdout:9/117: readlink d6/l10 0 2026-03-10T08:55:18.969 INFO:tasks.workunit.client.1.vm08.stdout:7/416: write d0/d51/f5d [393143,91943] 0 2026-03-10T08:55:18.975 INFO:tasks.workunit.client.1.vm08.stdout:8/486: dread d1/d10/d9/f5b [0,4194304] 0 2026-03-10T08:55:18.978 INFO:tasks.workunit.client.1.vm08.stdout:8/487: dwrite d1/d10/f23 [0,4194304] 0 2026-03-10T08:55:18.987 INFO:tasks.workunit.client.0.vm05.stdout:2/124: link d0/c11 d0/d9/d1e/d20/d24/c25 0 2026-03-10T08:55:18.988 INFO:tasks.workunit.client.0.vm05.stdout:2/125: rmdir d0/d9/d1e/d20 39 2026-03-10T08:55:18.992 INFO:tasks.workunit.client.0.vm05.stdout:2/126: dwrite d0/f1c [0,4194304] 0 2026-03-10T08:55:19.002 INFO:tasks.workunit.client.0.vm05.stdout:2/127: rmdir d0/d9/d1e/d20 39 2026-03-10T08:55:19.004 INFO:tasks.workunit.client.0.vm05.stdout:2/128: dread d0/f8 [0,4194304] 0 2026-03-10T08:55:19.006 INFO:tasks.workunit.client.0.vm05.stdout:2/129: creat d0/d9/d1e/d20/f26 x:0 0 0 2026-03-10T08:55:19.007 INFO:tasks.workunit.client.1.vm08.stdout:6/460: write d9/d10/d1e/d32/f48 [720687,120126] 0 2026-03-10T08:55:19.007 INFO:tasks.workunit.client.1.vm08.stdout:6/461: fdatasync d9/d10/d1e/d32/f9f 0 2026-03-10T08:55:19.008 
INFO:tasks.workunit.client.1.vm08.stdout:6/462: chown d9/d13 230566 1 2026-03-10T08:55:19.009 INFO:tasks.workunit.client.0.vm05.stdout:2/130: dread d0/d9/f1b [0,4194304] 0 2026-03-10T08:55:19.010 INFO:tasks.workunit.client.0.vm05.stdout:1/228: write f6 [641749,35183] 0 2026-03-10T08:55:19.012 INFO:tasks.workunit.client.1.vm08.stdout:6/463: creat d9/d13/fa0 x:0 0 0 2026-03-10T08:55:19.013 INFO:tasks.workunit.client.0.vm05.stdout:2/131: mkdir d0/d9/d27 0 2026-03-10T08:55:19.013 INFO:tasks.workunit.client.0.vm05.stdout:1/229: creat dd/d21/f54 x:0 0 0 2026-03-10T08:55:19.015 INFO:tasks.workunit.client.1.vm08.stdout:2/417: dwrite d1/da/f50 [0,4194304] 0 2026-03-10T08:55:19.015 INFO:tasks.workunit.client.0.vm05.stdout:2/132: unlink d0/l1 0 2026-03-10T08:55:19.016 INFO:tasks.workunit.client.0.vm05.stdout:2/133: write d0/d9/d1e/d20/f26 [460291,36378] 0 2026-03-10T08:55:19.022 INFO:tasks.workunit.client.0.vm05.stdout:2/134: dwrite d0/d9/f19 [0,4194304] 0 2026-03-10T08:55:19.028 INFO:tasks.workunit.client.1.vm08.stdout:2/418: rename d1/da/d10/d1b/d12/d23/f44 to d1/da/d10/d1b/d12/d1e/f84 0 2026-03-10T08:55:19.031 INFO:tasks.workunit.client.0.vm05.stdout:2/135: symlink d0/l28 0 2026-03-10T08:55:19.033 INFO:tasks.workunit.client.1.vm08.stdout:2/419: dread d1/d5b/d66/f62 [0,4194304] 0 2026-03-10T08:55:19.033 INFO:tasks.workunit.client.1.vm08.stdout:2/420: chown d1/da/l16 997 1 2026-03-10T08:55:19.036 INFO:tasks.workunit.client.1.vm08.stdout:2/421: dwrite d1/da/d10/d1b/d12/d23/f31 [0,4194304] 0 2026-03-10T08:55:19.042 INFO:tasks.workunit.client.1.vm08.stdout:2/422: mknod d1/d43/d4f/d52/c85 0 2026-03-10T08:55:19.047 INFO:tasks.workunit.client.0.vm05.stdout:2/136: dread d0/f2 [0,4194304] 0 2026-03-10T08:55:19.052 INFO:tasks.workunit.client.0.vm05.stdout:2/137: creat d0/d9/d1e/d20/d24/f29 x:0 0 0 2026-03-10T08:55:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:18 vm08.local ceph-mon[57559]: from='client.24439 -' entity='client.admin' cmd=[{"prefix": "orch ps", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:18 vm08.local ceph-mon[57559]: pgmap v148: 65 pgs: 65 active+clean; 1.2 GiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 21 MiB/s rd, 128 MiB/s wr, 349 op/s 2026-03-10T08:55:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:18 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/3152795561' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:55:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:18 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/3164661493' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:55:19.062 INFO:tasks.workunit.client.1.vm08.stdout:2/423: dread d1/da/d10/f7e [0,4194304] 0 2026-03-10T08:55:19.062 INFO:tasks.workunit.client.1.vm08.stdout:2/424: chown d1/da/f64 907 1 2026-03-10T08:55:19.076 INFO:tasks.workunit.client.1.vm08.stdout:2/425: creat d1/d43/d4f/f86 x:0 0 0 2026-03-10T08:55:19.077 INFO:tasks.workunit.client.1.vm08.stdout:2/426: fdatasync d1/da/f64 0 2026-03-10T08:55:19.080 INFO:tasks.workunit.client.1.vm08.stdout:2/427: truncate d1/da/d10/d1b/f28 1334235 0 2026-03-10T08:55:19.082 INFO:tasks.workunit.client.1.vm08.stdout:2/428: chown d1/da/d10/d2d/f4d 26134 1 2026-03-10T08:55:19.084 INFO:tasks.workunit.client.1.vm08.stdout:6/464: sync 2026-03-10T08:55:19.093 INFO:tasks.workunit.client.1.vm08.stdout:7/417: dread d0/d51/f5d [0,4194304] 0 2026-03-10T08:55:19.095 INFO:tasks.workunit.client.1.vm08.stdout:7/418: dread - d0/d14/d43/f7b zero size 2026-03-10T08:55:19.096 INFO:tasks.workunit.client.1.vm08.stdout:7/419: dread d0/d11/d4a/f5c [0,4194304] 0 2026-03-10T08:55:19.097 INFO:tasks.workunit.client.1.vm08.stdout:7/420: dread d0/d14/d43/f6e [0,4194304] 0 2026-03-10T08:55:19.100 INFO:tasks.workunit.client.1.vm08.stdout:7/421: dwrite d0/d14/f72 [0,4194304] 0 2026-03-10T08:55:19.100 
INFO:tasks.workunit.client.1.vm08.stdout:7/422: chown d0/d14/d2f 27372 1 2026-03-10T08:55:19.115 INFO:tasks.workunit.client.1.vm08.stdout:7/423: dwrite d0/d14/d2f/f79 [0,4194304] 0 2026-03-10T08:55:19.189 INFO:tasks.workunit.client.1.vm08.stdout:7/424: sync 2026-03-10T08:55:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:18 vm05.local ceph-mon[49713]: from='client.24439 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:18 vm05.local ceph-mon[49713]: pgmap v148: 65 pgs: 65 active+clean; 1.2 GiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 21 MiB/s rd, 128 MiB/s wr, 349 op/s 2026-03-10T08:55:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:18 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/3152795561' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:55:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:18 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/3164661493' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T08:55:19.243 INFO:tasks.workunit.client.1.vm08.stdout:5/418: write d0/d11/f25 [421606,73586] 0
2026-03-10T08:55:19.249 INFO:tasks.workunit.client.1.vm08.stdout:5/419: read d0/d11/d27/d68/d7c/f6a [45032,95075] 0
2026-03-10T08:55:19.253 INFO:tasks.workunit.client.1.vm08.stdout:1/418: mkdir d1/da/de/d24/d3d/d40/d8e 0
2026-03-10T08:55:19.254 INFO:tasks.workunit.client.1.vm08.stdout:1/419: chown d1/f1f 232 1
2026-03-10T08:55:19.254 INFO:tasks.workunit.client.0.vm05.stdout:4/158: write d0/f9 [1824386,125218] 0
2026-03-10T08:55:19.255 INFO:tasks.workunit.client.1.vm08.stdout:0/339: write d6/dd/d13/d17/d1f/d20/f43 [385901,4934] 0
2026-03-10T08:55:19.256 INFO:tasks.workunit.client.1.vm08.stdout:1/420: read d1/da/f39 [112505,62503] 0
2026-03-10T08:55:19.256 INFO:tasks.workunit.client.1.vm08.stdout:5/420: sync
2026-03-10T08:55:19.264 INFO:tasks.workunit.client.1.vm08.stdout:5/421: dwrite d0/d11/d27/d50/f55 [4194304,4194304] 0
2026-03-10T08:55:19.264 INFO:tasks.workunit.client.1.vm08.stdout:0/340: getdents d6/dd/d13/d17/d1f/d2d/d39 0
2026-03-10T08:55:19.273 INFO:tasks.workunit.client.1.vm08.stdout:8/488: symlink d1/d10/d9/dd/d25/d27/d44/laf 0
2026-03-10T08:55:19.273 INFO:tasks.workunit.client.1.vm08.stdout:8/489: write d1/d10/f2a [2389058,58267] 0
2026-03-10T08:55:19.280 INFO:tasks.workunit.client.1.vm08.stdout:2/429: rename d1/da/d10/d1b/d12/d23/l69 to d1/da/d10/d1b/l87 0
2026-03-10T08:55:19.284 INFO:tasks.workunit.client.1.vm08.stdout:0/341: creat d6/dd/d13/d17/d1f/d20/f6a x:0 0 0
2026-03-10T08:55:19.285 INFO:tasks.workunit.client.1.vm08.stdout:5/422: rename d0/d11/d27/d68/d7c/d4b/d4e/f71 to d0/d11/d27/d68/d7c/d4b/f82 0
2026-03-10T08:55:19.285 INFO:tasks.workunit.client.1.vm08.stdout:1/421: dread d1/f1f [0,4194304] 0
2026-03-10T08:55:19.286 INFO:tasks.workunit.client.0.vm05.stdout:6/144: rename d4/d7/d10/d15/d20/f29 to d4/f30 0
2026-03-10T08:55:19.286 INFO:tasks.workunit.client.1.vm08.stdout:8/490: dread d1/d10/d9/dd/d25/d27/f52 [0,4194304] 0
2026-03-10T08:55:19.287 INFO:tasks.workunit.client.1.vm08.stdout:2/430: stat d1/d5b/d66/f5e 0
2026-03-10T08:55:19.287 INFO:tasks.workunit.client.1.vm08.stdout:1/422: truncate d1/da/d20/d3f/d49/d68/f8b 1025956 0
2026-03-10T08:55:19.288 INFO:tasks.workunit.client.0.vm05.stdout:0/146: rename df/d1f/d26 to df/d18/d2b 0
2026-03-10T08:55:19.289 INFO:tasks.workunit.client.0.vm05.stdout:0/147: chown df/f15 122499368 1
2026-03-10T08:55:19.289 INFO:tasks.workunit.client.0.vm05.stdout:0/148: chown df/d18/d2b 1852 1
2026-03-10T08:55:19.291 INFO:tasks.workunit.client.1.vm08.stdout:4/447: creat d5/f9d x:0 0 0
2026-03-10T08:55:19.292 INFO:tasks.workunit.client.0.vm05.stdout:6/145: creat d4/d7/d10/d15/d1b/f31 x:0 0 0
2026-03-10T08:55:19.296 INFO:tasks.workunit.client.1.vm08.stdout:8/491: fsync d1/d10/d9/dd/d25/d27/d44/d97/d7d/fae 0
2026-03-10T08:55:19.298 INFO:tasks.workunit.client.0.vm05.stdout:5/154: rename d5/df/d12/f15 to d5/df/d12/d21/f30 0
2026-03-10T08:55:19.299 INFO:tasks.workunit.client.0.vm05.stdout:5/155: truncate d5/df/d12/f2a 523315 0
2026-03-10T08:55:19.301 INFO:tasks.workunit.client.0.vm05.stdout:6/146: creat d4/d7/d10/d1a/f32 x:0 0 0
2026-03-10T08:55:19.302 INFO:tasks.workunit.client.1.vm08.stdout:4/448: dwrite d5/d2f/f84 [0,4194304] 0
2026-03-10T08:55:19.310 INFO:tasks.workunit.client.1.vm08.stdout:1/423: dread d1/da/d4b/d4e/f51 [0,4194304] 0
2026-03-10T08:55:19.311 INFO:tasks.workunit.client.1.vm08.stdout:2/431: rename d1/da/l16 to d1/da/d10/d1b/d6a/l88 0
2026-03-10T08:55:19.312 INFO:tasks.workunit.client.0.vm05.stdout:7/112: rename cb to d18/d1b/c1c 0
2026-03-10T08:55:19.315 INFO:tasks.workunit.client.0.vm05.stdout:5/156: creat d5/df/f31 x:0 0 0
2026-03-10T08:55:19.317 INFO:tasks.workunit.client.0.vm05.stdout:6/147: symlink d4/d7/d10/d15/d20/l33 0
2026-03-10T08:55:19.320 INFO:tasks.workunit.client.1.vm08.stdout:4/449: unlink d5/d23/c53 0
2026-03-10T08:55:19.322 INFO:tasks.workunit.client.1.vm08.stdout:5/423: dread d0/d11/d18/f5a [0,4194304] 0
2026-03-10T08:55:19.323 INFO:tasks.workunit.client.1.vm08.stdout:4/450: read d5/de/f1b [2986382,61154] 0
2026-03-10T08:55:19.323 INFO:tasks.workunit.client.0.vm05.stdout:0/149: sync
2026-03-10T08:55:19.324 INFO:tasks.workunit.client.0.vm05.stdout:0/150: dread - df/d18/f2a zero size
2026-03-10T08:55:19.325 INFO:tasks.workunit.client.0.vm05.stdout:9/118: write d6/fe [126978,122184] 0
2026-03-10T08:55:19.325 INFO:tasks.workunit.client.1.vm08.stdout:2/432: dread d1/d5b/d66/f63 [0,4194304] 0
2026-03-10T08:55:19.325 INFO:tasks.workunit.client.0.vm05.stdout:9/119: read - d6/f16 zero size
2026-03-10T08:55:19.325 INFO:tasks.workunit.client.0.vm05.stdout:9/120: readlink d6/d12/l22 0
2026-03-10T08:55:19.326 INFO:tasks.workunit.client.0.vm05.stdout:9/121: chown d6/l10 763844 1
2026-03-10T08:55:19.330 INFO:tasks.workunit.client.0.vm05.stdout:5/157: symlink d5/df/l32 0
2026-03-10T08:55:19.330 INFO:tasks.workunit.client.0.vm05.stdout:5/158: fdatasync d5/df/d12/f13 0
2026-03-10T08:55:19.333 INFO:tasks.workunit.client.0.vm05.stdout:5/159: dwrite d5/df/d12/d24/f25 [0,4194304] 0
2026-03-10T08:55:19.349 INFO:tasks.workunit.client.0.vm05.stdout:6/148: dread d4/d7/f14 [0,4194304] 0
2026-03-10T08:55:19.350 INFO:tasks.workunit.client.1.vm08.stdout:0/342: symlink d6/dd/d13/l6b 0
2026-03-10T08:55:19.354 INFO:tasks.workunit.client.0.vm05.stdout:6/149: dread d4/d7/d10/d15/d1b/f23 [0,4194304] 0
2026-03-10T08:55:19.355 INFO:tasks.workunit.client.0.vm05.stdout:1/230: write dd/f16 [1295025,19415] 0
2026-03-10T08:55:19.359 INFO:tasks.workunit.client.1.vm08.stdout:5/424: creat d0/d11/d27/f83 x:0 0 0
2026-03-10T08:55:19.362 INFO:tasks.workunit.client.0.vm05.stdout:9/122: creat d6/d15/f25 x:0 0 0
2026-03-10T08:55:19.363 INFO:tasks.workunit.client.0.vm05.stdout:0/151: dwrite df/d1f/f25 [4194304,4194304] 0
2026-03-10T08:55:19.365 INFO:tasks.workunit.client.1.vm08.stdout:4/451: unlink d5/d2f/d5a/d69/f6f 0
2026-03-10T08:55:19.366 INFO:tasks.workunit.client.0.vm05.stdout:2/138: write d0/fa [1080785,112807] 0
2026-03-10T08:55:19.367 INFO:tasks.workunit.client.0.vm05.stdout:2/139: truncate d0/f10 909518 0
2026-03-10T08:55:19.368 INFO:tasks.workunit.client.1.vm08.stdout:2/433: rename d1/da/d10/d1b/d6a/f71 to d1/da/d10/d42/f89 0
2026-03-10T08:55:19.369 INFO:tasks.workunit.client.1.vm08.stdout:8/492: creat d1/d10/d9/dd/d25/d27/d44/fb0 x:0 0 0
2026-03-10T08:55:19.370 INFO:tasks.workunit.client.1.vm08.stdout:4/452: dwrite d5/f9d [0,4194304] 0
2026-03-10T08:55:19.370 INFO:tasks.workunit.client.0.vm05.stdout:9/123: sync
2026-03-10T08:55:19.370 INFO:tasks.workunit.client.1.vm08.stdout:8/493: fsync d1/d10/d9/dd/d13/f24 0
2026-03-10T08:55:19.372 INFO:tasks.workunit.client.1.vm08.stdout:8/494: chown d1/d10/d9/dd/d13/d40/l6b 12559988 1
2026-03-10T08:55:19.376 INFO:tasks.workunit.client.1.vm08.stdout:5/425: chown d0/l7 0 1
2026-03-10T08:55:19.378 INFO:tasks.workunit.client.1.vm08.stdout:6/465: dwrite d9/dc/d11/d23/d2c/d41/f56 [0,4194304] 0
2026-03-10T08:55:19.378 INFO:tasks.workunit.client.0.vm05.stdout:7/113: creat d18/f1d x:0 0 0
2026-03-10T08:55:19.380 INFO:tasks.workunit.client.1.vm08.stdout:3/377: unlink d4/d15/d8/f24 0
2026-03-10T08:55:19.380 INFO:tasks.workunit.client.1.vm08.stdout:5/426: dread d0/d11/d27/d50/f55 [4194304,4194304] 0
2026-03-10T08:55:19.386 INFO:tasks.workunit.client.1.vm08.stdout:2/434: creat d1/da/d10/d1b/d12/d22/f8a x:0 0 0
2026-03-10T08:55:19.388 INFO:tasks.workunit.client.0.vm05.stdout:0/152: rmdir df/d18/d19 39
2026-03-10T08:55:19.388 INFO:tasks.workunit.client.1.vm08.stdout:7/425: write d0/d14/f12 [2467689,116047] 0
2026-03-10T08:55:19.392 INFO:tasks.workunit.client.1.vm08.stdout:7/426: dwrite d0/d14/f68 [0,4194304] 0
2026-03-10T08:55:19.404 INFO:tasks.workunit.client.1.vm08.stdout:9/418: creat d2/f86 x:0 0 0
2026-03-10T08:55:19.404 INFO:tasks.workunit.client.0.vm05.stdout:1/231: mkdir dd/d55 0
2026-03-10T08:55:19.404 INFO:tasks.workunit.client.0.vm05.stdout:2/140: chown d0/d9/d1e/d20/d24/c25 40 1
2026-03-10T08:55:19.405 INFO:tasks.workunit.client.1.vm08.stdout:8/495: chown d1/d10/d9/d8a/f99 25030326 1
2026-03-10T08:55:19.407 INFO:tasks.workunit.client.0.vm05.stdout:2/141: dwrite d0/d9/d1e/d20/d21/f23 [0,4194304] 0
2026-03-10T08:55:19.408 INFO:tasks.workunit.client.0.vm05.stdout:2/142: readlink d0/l28 0
2026-03-10T08:55:19.411 INFO:tasks.workunit.client.0.vm05.stdout:3/185: rename d9/f26 to d9/d2b/d2f/f33 0
2026-03-10T08:55:19.416 INFO:tasks.workunit.client.1.vm08.stdout:0/343: creat d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c x:0 0 0
2026-03-10T08:55:19.424 INFO:tasks.workunit.client.0.vm05.stdout:0/153: dwrite df/d18/d19/f1c [0,4194304] 0
2026-03-10T08:55:19.428 INFO:tasks.workunit.client.0.vm05.stdout:0/154: readlink df/d18/d19/l22 0
2026-03-10T08:55:19.428 INFO:tasks.workunit.client.0.vm05.stdout:0/155: write df/f1d [852807,12322] 0
2026-03-10T08:55:19.428 INFO:tasks.workunit.client.0.vm05.stdout:0/156: readlink df/d18/d19/l22 0
2026-03-10T08:55:19.428 INFO:tasks.workunit.client.0.vm05.stdout:1/232: unlink dd/d10/d19/f50 0
2026-03-10T08:55:19.435 INFO:tasks.workunit.client.0.vm05.stdout:5/160: link d5/df/d12/d24/f25 d5/f33 0
2026-03-10T08:55:19.436 INFO:tasks.workunit.client.1.vm08.stdout:0/344: sync
2026-03-10T08:55:19.436 INFO:tasks.workunit.client.1.vm08.stdout:0/345: write d6/f62 [409324,61493] 0
2026-03-10T08:55:19.437 INFO:tasks.workunit.client.1.vm08.stdout:0/346: fdatasync d6/dd/d13/d17/d1f/d20/d2f/d24/f37 0
2026-03-10T08:55:19.438 INFO:tasks.workunit.client.0.vm05.stdout:5/161: dwrite d5/fc [0,4194304] 0
2026-03-10T08:55:19.439 INFO:tasks.workunit.client.0.vm05.stdout:5/162: chown d5/df/d12/f2a 1498 1
2026-03-10T08:55:19.440 INFO:tasks.workunit.client.0.vm05.stdout:5/163: write d5/df/f1c [678946,2023] 0
2026-03-10T08:55:19.455 INFO:tasks.workunit.client.1.vm08.stdout:5/427: mkdir d0/d11/d27/d68/d7c/d4b/d4e/d84 0
2026-03-10T08:55:19.457 INFO:tasks.workunit.client.0.vm05.stdout:8/128: rename d2/dd/l1e to d2/db/l32 0
2026-03-10T08:55:19.460 INFO:tasks.workunit.client.1.vm08.stdout:7/427: mknod d0/d51/c7e 0
2026-03-10T08:55:19.461 INFO:tasks.workunit.client.0.vm05.stdout:7/114: symlink d18/l1e 0
2026-03-10T08:55:19.461 INFO:tasks.workunit.client.0.vm05.stdout:7/115: truncate fd 724775 0
2026-03-10T08:55:19.463 INFO:tasks.workunit.client.1.vm08.stdout:4/453: creat d5/d23/d36/d76/f9e x:0 0 0
2026-03-10T08:55:19.463 INFO:tasks.workunit.client.1.vm08.stdout:1/424: write d1/da/d18/f1d [917324,47473] 0
2026-03-10T08:55:19.464 INFO:tasks.workunit.client.1.vm08.stdout:1/425: read - d1/da/d20/f67 zero size
2026-03-10T08:55:19.465 INFO:tasks.workunit.client.1.vm08.stdout:8/496: fsync f0 0
2026-03-10T08:55:19.466 INFO:tasks.workunit.client.0.vm05.stdout:5/164: read d5/fd [1054694,44990] 0
2026-03-10T08:55:19.467 INFO:tasks.workunit.client.0.vm05.stdout:5/165: write d5/df/d12/f2a [887425,12554] 0
2026-03-10T08:55:19.470 INFO:tasks.workunit.client.1.vm08.stdout:0/347: write d6/dd/d13/d17/d1f/d20/d2f/d57/f65 [76782,44570] 0
2026-03-10T08:55:19.471 INFO:tasks.workunit.client.0.vm05.stdout:4/159: rename d0/d15 to d0/d1d/d30 0
2026-03-10T08:55:19.471 INFO:tasks.workunit.client.0.vm05.stdout:4/160: chown d0/d1d 598 1
2026-03-10T08:55:19.472 INFO:tasks.workunit.client.1.vm08.stdout:0/348: fdatasync d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c 0
2026-03-10T08:55:19.477 INFO:tasks.workunit.client.0.vm05.stdout:8/129: truncate d2/fa 5780216 0
2026-03-10T08:55:19.477 INFO:tasks.workunit.client.1.vm08.stdout:4/454: sync
2026-03-10T08:55:19.478 INFO:tasks.workunit.client.1.vm08.stdout:4/455: dread - d5/d23/d36/f92 zero size
2026-03-10T08:55:19.486 INFO:tasks.workunit.client.0.vm05.stdout:4/161: dread d0/f1e [0,4194304] 0
2026-03-10T08:55:19.489 INFO:tasks.workunit.client.1.vm08.stdout:7/428: chown d0/c1a 220 1
2026-03-10T08:55:19.510 INFO:tasks.workunit.client.0.vm05.stdout:5/166: symlink d5/df/d12/d24/d2c/l34 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.0.vm05.stdout:2/143: link d0/d9/d1e/d20/d21/f23 d0/d9/d1e/f2a 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.0.vm05.stdout:6/150: rename d4/f5 to d4/d7/f34 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.0.vm05.stdout:9/124: rename d6/d12 to d6/d12/d26 22
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.0.vm05.stdout:9/125: dwrite d6/f8 [0,4194304] 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:1/426: creat d1/da/de/d24/d3d/d40/d56/d6b/f8f x:0 0 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:1/427: readlink d1/da/d4b/d4e/l8c 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:1/428: chown d1/da/de/f1a 8 1
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:6/466: rename d9/d13/f4a to d9/d10/d1e/d32/fa1 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:6/467: dread d9/d13/f88 [0,4194304] 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:4/456: symlink d5/de/l9f 0
2026-03-10T08:55:19.511 INFO:tasks.workunit.client.1.vm08.stdout:4/457: dwrite d5/d2f/d5a/f90 [0,4194304] 0
2026-03-10T08:55:19.520 INFO:tasks.workunit.client.1.vm08.stdout:0/349: dread d6/f2c [0,4194304] 0
2026-03-10T08:55:19.521 INFO:tasks.workunit.client.1.vm08.stdout:0/350: write d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c [168893,92307] 0
2026-03-10T08:55:19.525 INFO:tasks.workunit.client.1.vm08.stdout:7/429: fsync d0/d11/d1f/d29/d3d/d40/f24 0
2026-03-10T08:55:19.525 INFO:tasks.workunit.client.1.vm08.stdout:0/351: dwrite d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c [0,4194304] 0
2026-03-10T08:55:19.527 INFO:tasks.workunit.client.1.vm08.stdout:0/352: readlink d6/dd/d13/l1a 0
2026-03-10T08:55:19.530 INFO:tasks.workunit.client.1.vm08.stdout:1/429: fsync d1/da/d18/d3b/f5e 0
2026-03-10T08:55:19.534 INFO:tasks.workunit.client.0.vm05.stdout:5/167: write d5/f28 [3193943,53917] 0
2026-03-10T08:55:19.536 INFO:tasks.workunit.client.0.vm05.stdout:2/144: symlink d0/d9/d1e/d20/d21/l2b 0
2026-03-10T08:55:19.538 INFO:tasks.workunit.client.1.vm08.stdout:4/458: rmdir d5/d2f/d5d 39
2026-03-10T08:55:19.539 INFO:tasks.workunit.client.0.vm05.stdout:6/151: rename d4/d7/d10/d1a/f32 to d4/d2c/f35 0
2026-03-10T08:55:19.540 INFO:tasks.workunit.client.0.vm05.stdout:6/152: chown d4/d7/d10 0 1
2026-03-10T08:55:19.541 INFO:tasks.workunit.client.0.vm05.stdout:9/126: rmdir d6/d12 39
2026-03-10T08:55:19.541 INFO:tasks.workunit.client.0.vm05.stdout:9/127: chown d6/d15/l23 1788090379 1
2026-03-10T08:55:19.542 INFO:tasks.workunit.client.1.vm08.stdout:7/430: symlink d0/d14/d43/l7f 0
2026-03-10T08:55:19.543 INFO:tasks.workunit.client.0.vm05.stdout:8/130: getdents d2/dd/d2c/d2e/d31 0
2026-03-10T08:55:19.554 INFO:tasks.workunit.client.0.vm05.stdout:8/131: chown d2/db/d28/f2d 209055027 1
2026-03-10T08:55:19.554 INFO:tasks.workunit.client.0.vm05.stdout:8/132: dwrite d2/db/f22 [0,4194304] 0
2026-03-10T08:55:19.554 INFO:tasks.workunit.client.1.vm08.stdout:0/353: unlink d6/dd/d13/l1a 0
2026-03-10T08:55:19.554 INFO:tasks.workunit.client.1.vm08.stdout:7/431: dwrite d0/d11/d1f/d29/d3d/f74 [0,4194304] 0
2026-03-10T08:55:19.554 INFO:tasks.workunit.client.1.vm08.stdout:7/432: stat d0/d1c 0
2026-03-10T08:55:19.555 INFO:tasks.workunit.client.0.vm05.stdout:0/157: getdents df/d18 0
2026-03-10T08:55:19.555 INFO:tasks.workunit.client.0.vm05.stdout:0/158: read df/f13 [723472,48860] 0
2026-03-10T08:55:19.556 INFO:tasks.workunit.client.0.vm05.stdout:0/159: write df/f1d [1614611,34632] 0
2026-03-10T08:55:19.558 INFO:tasks.workunit.client.1.vm08.stdout:6/468: mkdir d9/d10/d1e/d4c/d69/da2 0
2026-03-10T08:55:19.559 INFO:tasks.workunit.client.1.vm08.stdout:4/459: unlink d5/de/l37 0
2026-03-10T08:55:19.560 INFO:tasks.workunit.client.1.vm08.stdout:0/354: fdatasync d6/dd/d13/d17/d1f/f48 0
2026-03-10T08:55:19.561 INFO:tasks.workunit.client.1.vm08.stdout:7/433: unlink d0/d14/d43/c5a 0
2026-03-10T08:55:19.562 INFO:tasks.workunit.client.1.vm08.stdout:0/355: write d6/dd/d13/d17/d1f/d20/d2f/d57/f5c [1375977,23096] 0
2026-03-10T08:55:19.562 INFO:tasks.workunit.client.1.vm08.stdout:0/356: chown d6/dd/d13/l6b 198736 1
2026-03-10T08:55:19.567 INFO:tasks.workunit.client.0.vm05.stdout:6/153: read d4/d7/d10/f12 [1040473,81487] 0
2026-03-10T08:55:19.568 INFO:tasks.workunit.client.1.vm08.stdout:6/469: read d9/dc/d11/d23/d2c/f5c [589441,12704] 0
2026-03-10T08:55:19.575 INFO:tasks.workunit.client.1.vm08.stdout:4/460: symlink d5/d23/d49/d8f/la0 0
2026-03-10T08:55:19.576 INFO:tasks.workunit.client.0.vm05.stdout:5/168: link d5/fc d5/df/d12/d24/d2c/f35 0
2026-03-10T08:55:19.578 INFO:tasks.workunit.client.0.vm05.stdout:5/169: dread d5/f33 [0,4194304] 0
2026-03-10T08:55:19.578 INFO:tasks.workunit.client.0.vm05.stdout:5/170: fdatasync d5/f9 0
2026-03-10T08:55:19.581 INFO:tasks.workunit.client.0.vm05.stdout:5/171: dwrite d5/df/d12/f1b [4194304,4194304] 0
2026-03-10T08:55:19.581 INFO:tasks.workunit.client.1.vm08.stdout:6/470: truncate d9/d50/f78 789507 0
2026-03-10T08:55:19.584 INFO:tasks.workunit.client.0.vm05.stdout:2/145: mknod d0/d9/d27/c2c 0
2026-03-10T08:55:19.586 INFO:tasks.workunit.client.0.vm05.stdout:9/128: mkdir d6/d27 0
2026-03-10T08:55:19.587 INFO:tasks.workunit.client.0.vm05.stdout:9/129: truncate d6/d15/f24 608902 0
2026-03-10T08:55:19.588 INFO:tasks.workunit.client.0.vm05.stdout:4/162: link d0/d1f/l27 d0/d2e/l31 0
2026-03-10T08:55:19.589 INFO:tasks.workunit.client.0.vm05.stdout:4/163: write d0/f8 [5011846,58933] 0
2026-03-10T08:55:19.589 INFO:tasks.workunit.client.0.vm05.stdout:2/146: dwrite d0/d9/d1e/d20/f22 [0,4194304] 0
2026-03-10T08:55:19.590 INFO:tasks.workunit.client.0.vm05.stdout:2/147: stat d0/fb 0
2026-03-10T08:55:19.591 INFO:tasks.workunit.client.0.vm05.stdout:2/148: chown d0/d9/d1e/d20/d21 811896 1
2026-03-10T08:55:19.598 INFO:tasks.workunit.client.0.vm05.stdout:2/149: dwrite d0/f10 [0,4194304] 0
2026-03-10T08:55:19.600 INFO:tasks.workunit.client.0.vm05.stdout:2/150: chown d0/d9/d1e/d20/d24 1 1
2026-03-10T08:55:19.600 INFO:tasks.workunit.client.0.vm05.stdout:2/151: rename d0 to d0/d9/d1e/d20/d2d 22
2026-03-10T08:55:19.603 INFO:tasks.workunit.client.1.vm08.stdout:6/471: rename d9/dc/d11/d23/d2c/f43 to d9/d50/fa3 0
2026-03-10T08:55:19.603 INFO:tasks.workunit.client.1.vm08.stdout:3/378: write d4/d15/d8/d1d/f21 [1664859,27848] 0
2026-03-10T08:55:19.603 INFO:tasks.workunit.client.0.vm05.stdout:8/133: link d2/f2a d2/dd/d2c/d2e/f33 0
2026-03-10T08:55:19.604 INFO:tasks.workunit.client.1.vm08.stdout:3/379: chown d4/d15/d8/d2a/f63 4002 1
2026-03-10T08:55:19.604 INFO:tasks.workunit.client.1.vm08.stdout:6/472: truncate d9/d10/f9d 580342 0
2026-03-10T08:55:19.605 INFO:tasks.workunit.client.1.vm08.stdout:2/435: write d1/da/d10/f7e [2791728,113366] 0
2026-03-10T08:55:19.605 INFO:tasks.workunit.client.1.vm08.stdout:3/380: readlink d4/d15/d8/d2a/d79/d20/l29 0
2026-03-10T08:55:19.606 INFO:tasks.workunit.client.1.vm08.stdout:3/381: chown d4/d15/d8/d2a/d79/f34 982917 1
2026-03-10T08:55:19.609 INFO:tasks.workunit.client.1.vm08.stdout:2/436: mknod d1/c8b 0
2026-03-10T08:55:19.612 INFO:tasks.workunit.client.0.vm05.stdout:5/172: creat d5/df/d12/d21/f36 x:0 0 0
2026-03-10T08:55:19.623 INFO:tasks.workunit.client.0.vm05.stdout:5/173: stat d5/df/l32 0
2026-03-10T08:55:19.623 INFO:tasks.workunit.client.1.vm08.stdout:9/419: write d2/dd/f16 [149515,105009] 0
2026-03-10T08:55:19.623 INFO:tasks.workunit.client.1.vm08.stdout:8/497: write f0 [4903377,57142] 0
2026-03-10T08:55:19.625 INFO:tasks.workunit.client.1.vm08.stdout:2/437: fsync d1/da/d10/d1b/f14 0
2026-03-10T08:55:19.631 INFO:tasks.workunit.client.1.vm08.stdout:3/382: getdents d4/d6f 0
2026-03-10T08:55:19.635 INFO:tasks.workunit.client.0.vm05.stdout:7/116: dwrite f15 [0,4194304] 0
2026-03-10T08:55:19.635 INFO:tasks.workunit.client.1.vm08.stdout:9/420: dread d2/dd/d15/d1e/d24/f34 [0,4194304] 0
2026-03-10T08:55:19.642 INFO:tasks.workunit.client.0.vm05.stdout:7/117: dwrite f3 [0,4194304] 0
2026-03-10T08:55:19.643 INFO:tasks.workunit.client.0.vm05.stdout:7/118: write f4 [2450705,20935] 0
2026-03-10T08:55:19.646 INFO:tasks.workunit.client.0.vm05.stdout:7/119: dread f15 [0,4194304] 0
2026-03-10T08:55:19.659 INFO:tasks.workunit.client.1.vm08.stdout:5/428: write d0/d11/d18/d52/f57 [278039,97226] 0
2026-03-10T08:55:19.660 INFO:tasks.workunit.client.0.vm05.stdout:4/164: truncate d0/fb 5125025 0
2026-03-10T08:55:19.663 INFO:tasks.workunit.client.0.vm05.stdout:6/154: sync
2026-03-10T08:55:19.664 INFO:tasks.workunit.client.0.vm05.stdout:1/233: dwrite dd/d13/f40 [0,4194304] 0
2026-03-10T08:55:19.665 INFO:tasks.workunit.client.0.vm05.stdout:4/165: read d0/f8 [2396149,20557] 0
2026-03-10T08:55:19.665 INFO:tasks.workunit.client.0.vm05.stdout:4/166: fdatasync d0/f16 0
2026-03-10T08:55:19.667 INFO:tasks.workunit.client.1.vm08.stdout:8/498: mknod d1/d10/d9/dd/d13/cb1 0
2026-03-10T08:55:19.671 INFO:tasks.workunit.client.0.vm05.stdout:2/152: unlink d0/c1f 0
2026-03-10T08:55:19.683 INFO:tasks.workunit.client.1.vm08.stdout:3/383: symlink d4/l7c 0
2026-03-10T08:55:19.683 INFO:tasks.workunit.client.1.vm08.stdout:3/384: truncate d4/f44 648243 0
2026-03-10T08:55:19.683 INFO:tasks.workunit.client.0.vm05.stdout:5/174: mkdir d5/df/d37 0
2026-03-10T08:55:19.683 INFO:tasks.workunit.client.0.vm05.stdout:5/175: dwrite d5/df/f31 [0,4194304] 0
2026-03-10T08:55:19.691 INFO:tasks.workunit.client.0.vm05.stdout:3/186: rmdir d9/d2b/d2f 39
2026-03-10T08:55:19.699 INFO:tasks.workunit.client.1.vm08.stdout:3/385: rmdir d4/d15/d8/d2a/d79/d20 39
2026-03-10T08:55:19.699 INFO:tasks.workunit.client.1.vm08.stdout:3/386: rename d4/d15/d8/d2a/d79/l5b to d4/l7d 0
2026-03-10T08:55:19.699 INFO:tasks.workunit.client.0.vm05.stdout:7/120: mkdir d18/d1b/d1f 0
2026-03-10T08:55:19.699 INFO:tasks.workunit.client.0.vm05.stdout:6/155: creat d4/d7/d10/d15/d1b/d22/f36 x:0 0 0
2026-03-10T08:55:19.699 INFO:tasks.workunit.client.0.vm05.stdout:6/156: chown d4/d7/d10/d15/d1b/d22/l27 207104 1
2026-03-10T08:55:19.699 INFO:tasks.workunit.client.0.vm05.stdout:1/234: truncate fc 4331780 0
2026-03-10T08:55:19.701 INFO:tasks.workunit.client.0.vm05.stdout:4/167: mkdir d0/d1d/d30/d32 0
2026-03-10T08:55:19.702 INFO:tasks.workunit.client.1.vm08.stdout:2/438: dread d1/da/d10/d1b/d6a/f73 [0,4194304] 0
2026-03-10T08:55:19.705 INFO:tasks.workunit.client.1.vm08.stdout:3/387: mknod d4/d15/d8/d2c/c7e 0
2026-03-10T08:55:19.705 INFO:tasks.workunit.client.1.vm08.stdout:2/439: chown d1/da/d10/c7c 31 1
2026-03-10T08:55:19.708 INFO:tasks.workunit.client.0.vm05.stdout:7/121: dwrite f15 [0,4194304] 0
2026-03-10T08:55:19.709 INFO:tasks.workunit.client.1.vm08.stdout:3/388: dwrite d4/d15/d8/d2a/f63 [0,4194304] 0
2026-03-10T08:55:19.712 INFO:tasks.workunit.client.0.vm05.stdout:1/235: mkdir dd/d10/d18/d20/d56 0
2026-03-10T08:55:19.717 INFO:tasks.workunit.client.0.vm05.stdout:1/236: dwrite dd/d13/f33 [0,4194304] 0
2026-03-10T08:55:19.717 INFO:tasks.workunit.client.0.vm05.stdout:7/122: dwrite f3 [4194304,4194304] 0
2026-03-10T08:55:19.725 INFO:tasks.workunit.client.0.vm05.stdout:7/123: dwrite fd [0,4194304] 0
2026-03-10T08:55:19.726 INFO:tasks.workunit.client.0.vm05.stdout:7/124: write d18/f1d [311986,88943] 0
2026-03-10T08:55:19.741 INFO:tasks.workunit.client.1.vm08.stdout:3/389: rename d4/l4a to d4/d15/d8/d2c/d55/l7f 0
2026-03-10T08:55:19.746 INFO:tasks.workunit.client.0.vm05.stdout:4/168: rename d0/c21 to d0/d1d/d30/c33 0
2026-03-10T08:55:19.746 INFO:tasks.workunit.client.1.vm08.stdout:2/440: read d1/da/d10/d42/f58 [2546992,78100] 0
2026-03-10T08:55:19.746 INFO:tasks.workunit.client.1.vm08.stdout:3/390: dwrite d4/d15/d8/d1d/f6e [0,4194304] 0
2026-03-10T08:55:19.747 INFO:tasks.workunit.client.1.vm08.stdout:2/441: fdatasync d1/d5b/d66/f5e 0
2026-03-10T08:55:19.754 INFO:tasks.workunit.client.0.vm05.stdout:1/237: stat dd/d10/d19/f35 0
2026-03-10T08:55:19.766 INFO:tasks.workunit.client.0.vm05.stdout:7/125: mknod d18/d1b/c20 0
2026-03-10T08:55:19.769 INFO:tasks.workunit.client.0.vm05.stdout:8/134: getdents d2/db/d1f 0
2026-03-10T08:55:19.772 INFO:tasks.workunit.client.0.vm05.stdout:4/169: mkdir d0/d1d/d30/d34 0
2026-03-10T08:55:19.783 INFO:tasks.workunit.client.1.vm08.stdout:2/442: link d1/da/d10/d2d/f4c d1/d5b/f8c 0
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:4/170: truncate d0/d1d/d30/f1c 49602 0
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:4/171: readlink d0/la 0
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:4/172: chown d0/d1d/d30/f28 111931 1
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:3/187: creat d9/d2b/f34 x:0 0 0
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:6/157: link d4/d7/d10/d15/d1b/d22/l27 d4/d7/d10/d15/d1b/d22/l37 0
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:3/188: dwrite f1 [0,4194304] 0
2026-03-10T08:55:19.784 INFO:tasks.workunit.client.0.vm05.stdout:2/153: getdents d0/d9/d1e 0
2026-03-10T08:55:19.787 INFO:tasks.workunit.client.0.vm05.stdout:7/126: read - d18/f1a zero size
2026-03-10T08:55:19.787 INFO:tasks.workunit.client.0.vm05.stdout:7/127: chown fd 40537 1
2026-03-10T08:55:19.789 INFO:tasks.workunit.client.0.vm05.stdout:4/173: mknod d0/d2e/c35 0
2026-03-10T08:55:19.790 INFO:tasks.workunit.client.1.vm08.stdout:2/443: creat d1/da/d10/d1b/d12/f8d x:0 0 0
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.1.vm08.stdout:2/444: chown d1/da/d10/d1b/d12/d1e/d7b 69688271 1
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.1.vm08.stdout:2/445: unlink d1/da/d10/d42/c82 0
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.1.vm08.stdout:2/446: dwrite d1/d43/f5d [0,4194304] 0
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.0.vm05.stdout:3/189: rmdir d9 39
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.0.vm05.stdout:2/154: mknod d0/d9/d1e/d20/c2e 0
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.0.vm05.stdout:1/238: rename dd/d10/d19/f1f to dd/d21/d3f/f57 0
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.0.vm05.stdout:7/128: chown d18/d1b/c1c 0 1
2026-03-10T08:55:19.801 INFO:tasks.workunit.client.0.vm05.stdout:7/129: chown l13 8249005 1
2026-03-10T08:55:19.808 INFO:tasks.workunit.client.0.vm05.stdout:8/135: rename d2/dd/d2c/d2e/f33 to d2/dd/d2c/f34 0
2026-03-10T08:55:19.812 INFO:tasks.workunit.client.1.vm08.stdout:1/430: write d1/da/d18/f48 [679177,44856] 0
2026-03-10T08:55:19.814 INFO:tasks.workunit.client.1.vm08.stdout:7/434: write d0/d11/f39 [1578840,88109] 0
2026-03-10T08:55:19.814 INFO:tasks.workunit.client.0.vm05.stdout:0/160: write fe [4604772,40508] 0
2026-03-10T08:55:19.814 INFO:tasks.workunit.client.1.vm08.stdout:0/357: write d6/dd/d13/d17/d1f/d2d/d39/f47 [629144,18239] 0
2026-03-10T08:55:19.817 INFO:tasks.workunit.client.1.vm08.stdout:4/461: write d5/fd [27693,11726] 0
2026-03-10T08:55:19.818 INFO:tasks.workunit.client.1.vm08.stdout:1/431: unlink d1/da/de/f1a 0
2026-03-10T08:55:19.819 INFO:tasks.workunit.client.1.vm08.stdout:1/432: write d1/da/de/d24/d35/d43/f7d [3883998,28139] 0
2026-03-10T08:55:19.820 INFO:tasks.workunit.client.1.vm08.stdout:7/435: mkdir d0/d11/d1f/d29/d3b/d80 0
2026-03-10T08:55:19.821 INFO:tasks.workunit.client.1.vm08.stdout:6/473: truncate d9/dc/d11/d23/d2c/f3d 2595477 0
2026-03-10T08:55:19.821 INFO:tasks.workunit.client.1.vm08.stdout:6/474: write d9/dc/d11/f47 [2096882,96924] 0
2026-03-10T08:55:19.823 INFO:tasks.workunit.client.0.vm05.stdout:3/190: dread d9/ff [0,4194304] 0
2026-03-10T08:55:19.828 INFO:tasks.workunit.client.0.vm05.stdout:1/239: truncate fb 1096805 0
2026-03-10T08:55:19.832 INFO:tasks.workunit.client.0.vm05.stdout:4/174: rename d0/d1d/d30/d34 to d0/d1f/d36 0
2026-03-10T08:55:19.832 INFO:tasks.workunit.client.0.vm05.stdout:8/136: mknod d2/db/d1f/c35 0
2026-03-10T08:55:19.832 INFO:tasks.workunit.client.0.vm05.stdout:0/161: rmdir df/d1f 39
2026-03-10T08:55:19.838 INFO:tasks.workunit.client.1.vm08.stdout:2/447: creat d1/da/d10/d1b/d12/f8e x:0 0 0
2026-03-10T08:55:19.838 INFO:tasks.workunit.client.1.vm08.stdout:4/462: write d5/de/f5e [1879975,351] 0
2026-03-10T08:55:19.839 INFO:tasks.workunit.client.1.vm08.stdout:1/433: creat d1/da/de/d24/d3d/d40/f90 x:0 0 0
2026-03-10T08:55:19.841 INFO:tasks.workunit.client.1.vm08.stdout:1/434: write d1/da/de/d24/d35/d43/f7d [2357343,104996] 0
2026-03-10T08:55:19.844 INFO:tasks.workunit.client.1.vm08.stdout:2/448: dread d1/da/d10/d1b/d12/d1e/f1f [0,4194304] 0
2026-03-10T08:55:19.845 INFO:tasks.workunit.client.1.vm08.stdout:7/436: dwrite d0/d51/f5d [0,4194304] 0
2026-03-10T08:55:19.847 INFO:tasks.workunit.client.0.vm05.stdout:2/155: creat d0/f2f x:0 0 0
2026-03-10T08:55:19.847 INFO:tasks.workunit.client.1.vm08.stdout:0/358: link d6/dd/d13/d17/d1f/d20/d2f/d57/f58 d6/dd/d13/d17/f6d 0
2026-03-10T08:55:19.850 INFO:tasks.workunit.client.0.vm05.stdout:2/156: dwrite d0/f2 [0,4194304] 0
2026-03-10T08:55:19.852 INFO:tasks.workunit.client.0.vm05.stdout:2/157: write d0/f10 [4335042,75703] 0
2026-03-10T08:55:19.864 INFO:tasks.workunit.client.0.vm05.stdout:8/137: symlink d2/db/l36 0
2026-03-10T08:55:19.867 INFO:tasks.workunit.client.1.vm08.stdout:2/449: write d1/da/d10/d1b/f14 [719232,85740] 0
2026-03-10T08:55:19.867 INFO:tasks.workunit.client.1.vm08.stdout:7/437: creat d0/d14/d2f/f81 x:0 0 0
2026-03-10T08:55:19.869 INFO:tasks.workunit.client.0.vm05.stdout:3/191: mknod d9/d2b/d2f/c35 0
2026-03-10T08:55:19.870 INFO:tasks.workunit.client.1.vm08.stdout:0/359: creat d6/dd/d13/d17/d1f/d20/d2f/d24/f6e x:0 0 0
2026-03-10T08:55:19.871 INFO:tasks.workunit.client.1.vm08.stdout:0/360: fsync d6/f11 0
2026-03-10T08:55:19.877 INFO:tasks.workunit.client.0.vm05.stdout:1/240: mkdir dd/d10/d18/d2d/d51/d58 0
2026-03-10T08:55:19.877 INFO:tasks.workunit.client.1.vm08.stdout:4/463: rename d5/fd to d5/d23/fa1 0
2026-03-10T08:55:19.877 INFO:tasks.workunit.client.1.vm08.stdout:7/438: mknod d0/d11/d4a/d5e/c82 0
2026-03-10T08:55:19.877 INFO:tasks.workunit.client.1.vm08.stdout:1/435: rename d1/da/d20/d4c to d1/da/d20/d91 0
2026-03-10T08:55:19.877 INFO:tasks.workunit.client.1.vm08.stdout:1/436: chown d1/da/de/d24/d3d/d40/d5b 360893 1
2026-03-10T08:55:19.879 INFO:tasks.workunit.client.0.vm05.stdout:3/192: write d9/f27 [805976,71441] 0
2026-03-10T08:55:19.879 INFO:tasks.workunit.client.0.vm05.stdout:3/193: chown f2 1150348715 1
2026-03-10T08:55:19.880 INFO:tasks.workunit.client.1.vm08.stdout:0/361: unlink d6/dd/f28 0
2026-03-10T08:55:19.882 INFO:tasks.workunit.client.1.vm08.stdout:1/437: dwrite d1/da/d18/f1d [0,4194304] 0
2026-03-10T08:55:19.886 INFO:tasks.workunit.client.1.vm08.stdout:7/439: write d0/d14/d43/f6e [393328,37666] 0
2026-03-10T08:55:19.886 INFO:tasks.workunit.client.0.vm05.stdout:3/194: dwrite d9/d2b/f2d [0,4194304] 0
2026-03-10T08:55:19.887 INFO:tasks.workunit.client.1.vm08.stdout:0/362: dwrite d6/dd/d13/d17/d1f/d2d/d38/f53 [0,4194304] 0
2026-03-10T08:55:19.888 INFO:tasks.workunit.client.1.vm08.stdout:0/363: fdatasync d6/dd/d13/d17/d1f/d2d/d39/f4a 0
2026-03-10T08:55:19.898 INFO:tasks.workunit.client.0.vm05.stdout:1/241: dwrite dd/d21/f3e [0,4194304] 0
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.1.vm08.stdout:1/438: mkdir d1/da/de/d24/d3d/d40/d92 0
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:0/162: rename df/c1e to df/d1f/c2c 0
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:2/158: creat d0/f30 x:0 0 0
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:0/163: chown df/f1a 27273804 1
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:0/164: dwrite fe [0,4194304] 0
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:1/242: dread - dd/d10/d18/f36 zero size
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:1/243: chown dd/d21/d37 175304177 1
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:2/159: creat d0/d9/d1e/d20/d21/f31 x:0 0 0
2026-03-10T08:55:19.920 INFO:tasks.workunit.client.0.vm05.stdout:0/165: rename df/d18/d19/f1c to df/d1f/f2d 0
2026-03-10T08:55:19.921 INFO:tasks.workunit.client.1.vm08.stdout:7/440: symlink d0/d1c/l83 0
2026-03-10T08:55:19.923 INFO:tasks.workunit.client.1.vm08.stdout:1/439: dwrite d1/da/d18/f1d [0,4194304] 0
2026-03-10T08:55:19.923 INFO:tasks.workunit.client.0.vm05.stdout:2/160: creat d0/d9/d1e/d20/f32 x:0 0 0
2026-03-10T08:55:19.924 INFO:tasks.workunit.client.1.vm08.stdout:1/440: chown d1/da/l6f 12395 1
2026-03-10T08:55:19.924 INFO:tasks.workunit.client.0.vm05.stdout:2/161: write d0/f16 [402415,109215] 0
2026-03-10T08:55:19.926 INFO:tasks.workunit.client.0.vm05.stdout:1/244: fdatasync fb 0
2026-03-10T08:55:19.932 INFO:tasks.workunit.client.0.vm05.stdout:1/245: chown dd/d10/l1b 609230 1
2026-03-10T08:55:19.932 INFO:tasks.workunit.client.0.vm05.stdout:2/162: creat d0/d9/d1e/d20/d24/f33 x:0 0 0
2026-03-10T08:55:19.933 INFO:tasks.workunit.client.1.vm08.stdout:0/364: mkdir d6/dd/d13/d61/d6f 0
2026-03-10T08:55:19.936 INFO:tasks.workunit.client.0.vm05.stdout:2/163: rename d0/f1c to d0/d9/d1e/f34 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.0.vm05.stdout:0/166: link df/d1f/f21 df/d18/d2b/d27/f2e 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.0.vm05.stdout:1/246: chown fc 23 1
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.0.vm05.stdout:1/247: write dd/d21/d3f/f57 [2076280,23322] 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.0.vm05.stdout:1/248: stat fb 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.1.vm08.stdout:0/365: mknod d6/dd/d13/d17/d1f/d2d/d38/c70 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.1.vm08.stdout:0/366: chown d6/dd/d13/d17/d1f/d20/d2f/d57/f58 4791 1
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.1.vm08.stdout:7/441: rename d0/d11/d1f/d29/d3b/l56 to d0/d11/d1f/l84 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.1.vm08.stdout:7/442: fsync d0/d14/f72 0
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.1.vm08.stdout:7/443: chown d0/d11/d4a/d5e/c82 191710 1
2026-03-10T08:55:19.947 INFO:tasks.workunit.client.0.vm05.stdout:1/249: stat dd/d10/d18/d20/d56 0
2026-03-10T08:55:19.951 INFO:tasks.workunit.client.1.vm08.stdout:1/441: link d1/f1f d1/da/f93 0
2026-03-10T08:55:19.953 INFO:tasks.workunit.client.1.vm08.stdout:1/442: creat d1/da/de/d24/d26/f94 x:0 0 0
2026-03-10T08:55:19.954 INFO:tasks.workunit.client.1.vm08.stdout:7/444: rename d0/d14/d2f/f79 to d0/d11/d1f/d29/d36/d75/f85 0
2026-03-10T08:55:19.954 INFO:tasks.workunit.client.1.vm08.stdout:0/367: link d6/fa d6/dd/d13/d17/d50/f71 0
2026-03-10T08:55:19.955 INFO:tasks.workunit.client.0.vm05.stdout:1/250: dread dd/d13/f42 [0,4194304] 0
2026-03-10T08:55:19.957 INFO:tasks.workunit.client.1.vm08.stdout:7/445: creat d0/d11/d1f/d29/d3b/f86 x:0 0 0
2026-03-10T08:55:19.959 INFO:tasks.workunit.client.0.vm05.stdout:1/251: dwrite dd/d21/f3a [0,4194304] 0
2026-03-10T08:55:19.962 INFO:tasks.workunit.client.1.vm08.stdout:0/368: fdatasync d6/f16 0
2026-03-10T08:55:19.963 INFO:tasks.workunit.client.1.vm08.stdout:7/446: dwrite d0/d11/f39 [0,4194304] 0
2026-03-10T08:55:19.965 INFO:tasks.workunit.client.0.vm05.stdout:1/252: dwrite dd/d10/d19/f35 [4194304,4194304] 0
2026-03-10T08:55:19.966 INFO:tasks.workunit.client.1.vm08.stdout:1/443: dread d1/da/f93 [0,4194304] 0
2026-03-10T08:55:19.967 INFO:tasks.workunit.client.1.vm08.stdout:7/447: dread d0/d11/f66 [0,4194304] 0
2026-03-10T08:55:19.971 INFO:tasks.workunit.client.1.vm08.stdout:0/369: unlink d6/dd/d13/d17/d1f/d2d/d38/f53 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:1/444: symlink d1/da/de/d24/d35/d6d/l95 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:7/448: rename d0/f44 to d0/d11/d4a/f87 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:7/449: readlink d0/d11/d1f/d29/d3d/d40/l48 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:0/370: symlink d6/dd/d13/d17/d1f/d20/d2f/d26/l72 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:0/371: truncate f3 3134134 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:7/450: getdents d0/d11/d1f/d29/d36 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:1/445: getdents d1/da/de/d24/d35 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:1/446: rename d1/fd to d1/da/d20/d3f/d49/f96 0
2026-03-10T08:55:19.983 INFO:tasks.workunit.client.1.vm08.stdout:7/451: creat d0/d11/d1f/d29/d3b/d80/f88 x:0 0 0
2026-03-10T08:55:19.984 INFO:tasks.workunit.client.1.vm08.stdout:7/452: mkdir d0/d11/d1f/d29/d3d/d89 0
2026-03-10T08:55:19.984 INFO:tasks.workunit.client.1.vm08.stdout:1/447: creat d1/da/d20/d3f/d49/d68/d7f/f97 x:0 0 0
2026-03-10T08:55:19.986 INFO:tasks.workunit.client.1.vm08.stdout:7/453: chown d0/d11/d1f/d29/l6f 800382906 1
2026-03-10T08:55:20.004 INFO:tasks.workunit.client.1.vm08.stdout:1/448: dread d1/da/de/f19 [0,4194304] 0
2026-03-10T08:55:20.004 INFO:tasks.workunit.client.1.vm08.stdout:1/449: mknod d1/da/de/d24/d3d/d40/d56/c98 0
2026-03-10T08:55:20.004 INFO:tasks.workunit.client.1.vm08.stdout:1/450: chown d1/da/f1e 60 1
2026-03-10T08:55:20.004 INFO:tasks.workunit.client.1.vm08.stdout:1/451: write d1/da/d18/d3b/d62/f76 [5067044,85317] 0
2026-03-10T08:55:20.006 INFO:tasks.workunit.client.0.vm05.stdout:0/167: sync
2026-03-10T08:55:20.015 INFO:tasks.workunit.client.0.vm05.stdout:0/168: rmdir df/d1f 39
2026-03-10T08:55:20.019 INFO:tasks.workunit.client.0.vm05.stdout:0/169: creat df/d18/d2b/f2f x:0 0 0
2026-03-10T08:55:20.020 INFO:tasks.workunit.client.0.vm05.stdout:0/170: mknod df/d18/d19/c30 0
2026-03-10T08:55:20.044 INFO:tasks.workunit.client.1.vm08.stdout:5/429: write d0/d11/d18/f5a [747481,63200] 0
2026-03-10T08:55:20.044 INFO:tasks.workunit.client.1.vm08.stdout:5/430: chown d0/d11/d27/f61 236 1
2026-03-10T08:55:20.045 INFO:tasks.workunit.client.1.vm08.stdout:5/431: chown d0/d11/d27/f3d 259773 1
2026-03-10T08:55:20.045 INFO:tasks.workunit.client.0.vm05.stdout:9/130: truncate d6/f8 2393057 0
2026-03-10T08:55:20.046 INFO:tasks.workunit.client.1.vm08.stdout:8/499: write d1/d10/d9/dd/d25/d27/d44/d97/f79 [528938,107773] 0
2026-03-10T08:55:20.050 INFO:tasks.workunit.client.0.vm05.stdout:9/131: mknod d6/d19/d21/c28 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.0.vm05.stdout:5/176: dwrite d5/fd [4194304,4194304] 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.0.vm05.stdout:5/177: stat d5/df/d12/c22 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.1.vm08.stdout:8/500: dwrite d1/d10/d9/dd/d18/d34/f57 [0,4194304] 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.1.vm08.stdout:8/501: mkdir d1/d10/d9/d4d/db2 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.1.vm08.stdout:5/432: link d0/d11/d27/d68/d7c/d4b/l6e d0/d11/d18/l85 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.1.vm08.stdout:5/433: stat d0/d11/d27/d68/l6d 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.1.vm08.stdout:5/434: stat d0/d46/f5f 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.1.vm08.stdout:5/435: fsync d0/d11/f60 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.0.vm05.stdout:5/178: stat d5/df/d12/f2a 0
2026-03-10T08:55:20.062 INFO:tasks.workunit.client.0.vm05.stdout:0/171: sync
2026-03-10T08:55:20.065 INFO:tasks.workunit.client.1.vm08.stdout:4/464: read d5/de/f6d [3288400,129479] 0
2026-03-10T08:55:20.068 INFO:tasks.workunit.client.1.vm08.stdout:8/502: symlink d1/d10/d9/dd/d13/lb3 0
2026-03-10T08:55:20.069 INFO:tasks.workunit.client.1.vm08.stdout:4/465: readlink d5/de/l9f 0
2026-03-10T08:55:20.069 INFO:tasks.workunit.client.0.vm05.stdout:9/132: creat d6/d19/f29 x:0 0 0
2026-03-10T08:55:20.071 INFO:tasks.workunit.client.1.vm08.stdout:5/436: fdatasync d0/d11/d27/d68/d7c/f6a 0
2026-03-10T08:55:20.078 INFO:tasks.workunit.client.0.vm05.stdout:5/179: creat d5/df/d12/d24/f38 x:0 0 0
2026-03-10T08:55:20.078 INFO:tasks.workunit.client.0.vm05.stdout:9/133: mkdir d6/d19/d2a 0
2026-03-10T08:55:20.084 INFO:tasks.workunit.client.1.vm08.stdout:1/452: dread d1/da/f25 [4194304,4194304] 0
2026-03-10T08:55:20.085 INFO:tasks.workunit.client.0.vm05.stdout:0/172: sync
2026-03-10T08:55:20.090 INFO:tasks.workunit.client.0.vm05.stdout:0/173: dwrite df/d18/f2a [0,4194304] 0
2026-03-10T08:55:20.092 INFO:tasks.workunit.client.0.vm05.stdout:0/174: write df/d18/f29 [908733,60456] 0
2026-03-10T08:55:20.094 INFO:tasks.workunit.client.0.vm05.stdout:9/134: getdents d6/d19/d2a 0
2026-03-10T08:55:20.100 INFO:tasks.workunit.client.0.vm05.stdout:0/175: dwrite df/f11 [4194304,4194304] 0
2026-03-10T08:55:20.110 INFO:tasks.workunit.client.0.vm05.stdout:9/135: link d6/d15/f18 d6/d27/f2b 0
2026-03-10T08:55:20.112 INFO:tasks.workunit.client.0.vm05.stdout:0/176: sync
2026-03-10T08:55:20.113 INFO:tasks.workunit.client.0.vm05.stdout:0/177: rename df to df/d18/d2b/d27/d31 22
2026-03-10T08:55:20.115 INFO:tasks.workunit.client.1.vm08.stdout:5/437: truncate d0/d1b/f2f 939665 0
2026-03-10T08:55:20.117 INFO:tasks.workunit.client.0.vm05.stdout:9/136: mkdir d6/d19/d2c 0
2026-03-10T08:55:20.119 INFO:tasks.workunit.client.1.vm08.stdout:8/503: creat d1/d10/d9/dd/d9a/da6/fb4 x:0 0 0
2026-03-10T08:55:20.119 INFO:tasks.workunit.client.1.vm08.stdout:8/504: rename d1/d10 to d1/d10/d9/dd/d25/d27/d44/d89/db5 22
2026-03-10T08:55:20.120 INFO:tasks.workunit.client.1.vm08.stdout:1/453: creat d1/da/de/d24/d3d/d40/d92/f99 x:0 0 0
2026-03-10T08:55:20.121 INFO:tasks.workunit.client.1.vm08.stdout:5/438: write d0/d11/d27/d68/d7c/d4b/f82 [259704,27959] 0
2026-03-10T08:55:20.123 INFO:tasks.workunit.client.1.vm08.stdout:8/505: mknod d1/d10/d9/d4d/cb6 0
2026-03-10T08:55:20.124 INFO:tasks.workunit.client.1.vm08.stdout:1/454: creat d1/da/d20/d3f/d49/f9a x:0 0 0
2026-03-10T08:55:20.126 INFO:tasks.workunit.client.1.vm08.stdout:5/439: write d0/d1b/f39 [3928812,81688] 0
2026-03-10T08:55:20.126 INFO:tasks.workunit.client.0.vm05.stdout:0/178: truncate df/d18/d2b/d27/f2e 115835 0
2026-03-10T08:55:20.128 INFO:tasks.workunit.client.1.vm08.stdout:1/455: dwrite d1/da/d20/d3f/d49/f9a [0,4194304] 0
2026-03-10T08:55:20.130 INFO:tasks.workunit.client.1.vm08.stdout:8/506: rename d1/d10/d9/dd/d3d/f78 to d1/d10/d9/d8a/fb7 0
2026-03-10T08:55:20.130 INFO:tasks.workunit.client.0.vm05.stdout:0/179: dwrite df/d18/f29 [0,4194304] 0 2026-03-10T08:55:20.138 INFO:tasks.workunit.client.0.vm05.stdout:9/137: getdents d6/d15 0 2026-03-10T08:55:20.139 INFO:tasks.workunit.client.1.vm08.stdout:1/456: symlink d1/da/d4b/l9b 0 2026-03-10T08:55:20.139 INFO:tasks.workunit.client.0.vm05.stdout:0/180: fdatasync df/d1f/f25 0 2026-03-10T08:55:20.139 INFO:tasks.workunit.client.0.vm05.stdout:0/181: write fe [1866351,62129] 0 2026-03-10T08:55:20.140 INFO:tasks.workunit.client.1.vm08.stdout:5/440: read d0/d11/f2d [1482644,66867] 0 2026-03-10T08:55:20.140 INFO:tasks.workunit.client.1.vm08.stdout:1/457: chown d1/da/de/d24/d3d/d40/d56/d6b/f8f 14 1 2026-03-10T08:55:20.141 INFO:tasks.workunit.client.0.vm05.stdout:9/138: fdatasync d6/f16 0 2026-03-10T08:55:20.143 INFO:tasks.workunit.client.1.vm08.stdout:5/441: creat d0/d11/f86 x:0 0 0 2026-03-10T08:55:20.144 INFO:tasks.workunit.client.1.vm08.stdout:8/507: link d1/d10/d9/dd/l5a d1/d10/d9/dd/d18/d3c/lb8 0 2026-03-10T08:55:20.145 INFO:tasks.workunit.client.1.vm08.stdout:8/508: stat d1/d10/d9/dd/d25/d27/f52 0 2026-03-10T08:55:20.146 INFO:tasks.workunit.client.0.vm05.stdout:0/182: mkdir df/d18/d2b/d27/d32 0 2026-03-10T08:55:20.146 INFO:tasks.workunit.client.1.vm08.stdout:8/509: dread - d1/d10/d9/d8a/f99 zero size 2026-03-10T08:55:20.147 INFO:tasks.workunit.client.0.vm05.stdout:9/139: link d6/l10 d6/d19/d2a/l2d 0 2026-03-10T08:55:20.148 INFO:tasks.workunit.client.1.vm08.stdout:5/442: dwrite d0/d11/f86 [0,4194304] 0 2026-03-10T08:55:20.149 INFO:tasks.workunit.client.1.vm08.stdout:1/458: dread d1/da/f93 [0,4194304] 0 2026-03-10T08:55:20.151 INFO:tasks.workunit.client.1.vm08.stdout:1/459: chown d1/da/d18/d3a/f57 360260 1 2026-03-10T08:55:20.152 INFO:tasks.workunit.client.0.vm05.stdout:9/140: dwrite f2 [4194304,4194304] 0 2026-03-10T08:55:20.159 INFO:tasks.workunit.client.1.vm08.stdout:1/460: mkdir d1/da/d20/d3f/d49/d9c 0 2026-03-10T08:55:20.161 
INFO:tasks.workunit.client.1.vm08.stdout:5/443: mkdir d0/d11/d27/d68/d7c/d4b/d87 0 2026-03-10T08:55:20.161 INFO:tasks.workunit.client.1.vm08.stdout:8/510: truncate d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fab 222408 0 2026-03-10T08:55:20.161 INFO:tasks.workunit.client.1.vm08.stdout:1/461: dread d1/da/f22 [0,4194304] 0 2026-03-10T08:55:20.170 INFO:tasks.workunit.client.1.vm08.stdout:5/444: fdatasync d0/d11/f2d 0 2026-03-10T08:55:20.172 INFO:tasks.workunit.client.1.vm08.stdout:1/462: fdatasync d1/da/f1e 0 2026-03-10T08:55:20.174 INFO:tasks.workunit.client.1.vm08.stdout:8/511: getdents d1/d10/d9/dd/d18/d34 0 2026-03-10T08:55:20.175 INFO:tasks.workunit.client.1.vm08.stdout:8/512: mknod d1/d10/d9/dd/d25/d27/d44/d21/cb9 0 2026-03-10T08:55:20.176 INFO:tasks.workunit.client.1.vm08.stdout:8/513: write d1/d10/d9/dd/d9a/da6/fb4 [626880,42717] 0 2026-03-10T08:55:20.189 INFO:tasks.workunit.client.0.vm05.stdout:0/183: sync 2026-03-10T08:55:20.194 INFO:tasks.workunit.client.0.vm05.stdout:0/184: creat df/d18/d2b/f33 x:0 0 0 2026-03-10T08:55:20.195 INFO:tasks.workunit.client.0.vm05.stdout:0/185: unlink df/d18/d2b/f2f 0 2026-03-10T08:55:20.195 INFO:tasks.workunit.client.0.vm05.stdout:0/186: symlink df/d18/l34 0 2026-03-10T08:55:20.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:19 vm05.local ceph-mon[49713]: from='client.14672 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:20.233 INFO:tasks.workunit.client.0.vm05.stdout:1/253: dread dd/d10/d19/f2e [0,4194304] 0 2026-03-10T08:55:20.237 INFO:tasks.workunit.client.0.vm05.stdout:1/254: dwrite dd/d10/d18/f36 [0,4194304] 0 2026-03-10T08:55:20.239 INFO:tasks.workunit.client.0.vm05.stdout:1/255: chown dd/d13/l25 36209561 1 2026-03-10T08:55:20.240 INFO:tasks.workunit.client.0.vm05.stdout:1/256: mknod dd/c59 0 2026-03-10T08:55:20.242 INFO:tasks.workunit.client.0.vm05.stdout:1/257: creat dd/d21/d3f/f5a x:0 0 0 2026-03-10T08:55:20.242 
INFO:tasks.workunit.client.0.vm05.stdout:1/258: chown dd/c59 6 1 2026-03-10T08:55:20.261 INFO:tasks.workunit.client.1.vm08.stdout:3/391: readlink d4/d15/d8/d2c/d55/l7f 0 2026-03-10T08:55:20.262 INFO:tasks.workunit.client.1.vm08.stdout:3/392: chown d4/d15/d8/d2a/f63 55 1 2026-03-10T08:55:20.263 INFO:tasks.workunit.client.0.vm05.stdout:6/158: getdents d4/d7/d10/d15/d1b/d22 0 2026-03-10T08:55:20.263 INFO:tasks.workunit.client.0.vm05.stdout:6/159: dread - d4/d7/d10/d1a/f25 zero size 2026-03-10T08:55:20.265 INFO:tasks.workunit.client.1.vm08.stdout:3/393: getdents d4/d15/d8/d71 0 2026-03-10T08:55:20.266 INFO:tasks.workunit.client.0.vm05.stdout:7/130: write fd [4450672,94352] 0 2026-03-10T08:55:20.268 INFO:tasks.workunit.client.0.vm05.stdout:7/131: dread f3 [4194304,4194304] 0 2026-03-10T08:55:20.269 INFO:tasks.workunit.client.0.vm05.stdout:1/259: dread dd/f11 [0,4194304] 0 2026-03-10T08:55:20.276 INFO:tasks.workunit.client.0.vm05.stdout:6/160: dread d4/fc [4194304,4194304] 0 2026-03-10T08:55:20.280 INFO:tasks.workunit.client.0.vm05.stdout:6/161: mkdir d4/d7/d10/d15/d38 0 2026-03-10T08:55:20.280 INFO:tasks.workunit.client.0.vm05.stdout:6/162: chown d4/d7/d10/d15/d1b 0 1 2026-03-10T08:55:20.281 INFO:tasks.workunit.client.1.vm08.stdout:3/394: creat d4/d15/d8/d2a/d79/f80 x:0 0 0 2026-03-10T08:55:20.282 INFO:tasks.workunit.client.0.vm05.stdout:7/132: rename d18/d1b/c1c to d18/c21 0 2026-03-10T08:55:20.285 INFO:tasks.workunit.client.1.vm08.stdout:3/395: readlink d4/l74 0 2026-03-10T08:55:20.285 INFO:tasks.workunit.client.0.vm05.stdout:6/163: link d4/f30 d4/d7/d10/d15/d20/f39 0 2026-03-10T08:55:20.286 INFO:tasks.workunit.client.0.vm05.stdout:7/133: symlink d18/d1b/d1f/l22 0 2026-03-10T08:55:20.291 INFO:tasks.workunit.client.0.vm05.stdout:6/164: creat d4/d7/d10/d15/d38/f3a x:0 0 0 2026-03-10T08:55:20.291 INFO:tasks.workunit.client.0.vm05.stdout:6/165: write d4/d7/ff [3206574,17049] 0 2026-03-10T08:55:20.292 INFO:tasks.workunit.client.0.vm05.stdout:6/166: readlink 
d4/d7/d10/d15/d20/l33 0 2026-03-10T08:55:20.293 INFO:tasks.workunit.client.1.vm08.stdout:3/396: symlink d4/d15/d8/d2a/d79/l81 0 2026-03-10T08:55:20.293 INFO:tasks.workunit.client.0.vm05.stdout:7/134: link l13 d18/d1b/d1f/l23 0 2026-03-10T08:55:20.302 INFO:tasks.workunit.client.0.vm05.stdout:7/135: dread d18/f1d [0,4194304] 0 2026-03-10T08:55:20.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:19 vm08.local ceph-mon[57559]: from='client.14672 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:20.303 INFO:tasks.workunit.client.0.vm05.stdout:7/136: write f3 [3265461,13227] 0 2026-03-10T08:55:20.304 INFO:tasks.workunit.client.0.vm05.stdout:7/137: creat d18/f24 x:0 0 0 2026-03-10T08:55:20.305 INFO:tasks.workunit.client.1.vm08.stdout:3/397: dread d4/d15/d8/d2a/d79/f3c [0,4194304] 0 2026-03-10T08:55:20.305 INFO:tasks.workunit.client.0.vm05.stdout:7/138: mkdir d18/d1b/d1f/d25 0 2026-03-10T08:55:20.306 INFO:tasks.workunit.client.0.vm05.stdout:7/139: dread d18/f1d [0,4194304] 0 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.1.vm08.stdout:1/463: dread d1/da/d18/f48 [0,4194304] 0 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.1.vm08.stdout:1/464: stat d1/da/f39 0 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.0.vm05.stdout:7/140: chown f9 1851548083 1 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.0.vm05.stdout:7/141: dread - d18/f1a zero size 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.0.vm05.stdout:7/142: creat d18/f26 x:0 0 0 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.0.vm05.stdout:7/143: mknod d18/d1b/c27 0 2026-03-10T08:55:20.310 INFO:tasks.workunit.client.0.vm05.stdout:7/144: fsync d18/f1a 0 2026-03-10T08:55:20.312 INFO:tasks.workunit.client.1.vm08.stdout:1/465: mknod d1/da/de/d24/d3d/c9d 0 2026-03-10T08:55:20.313 INFO:tasks.workunit.client.0.vm05.stdout:7/145: mknod d18/d1b/d1f/d25/c28 0 2026-03-10T08:55:20.314 INFO:tasks.workunit.client.1.vm08.stdout:1/466: 
truncate d1/da/f39 1692819 0 2026-03-10T08:55:20.315 INFO:tasks.workunit.client.0.vm05.stdout:7/146: dread f15 [0,4194304] 0 2026-03-10T08:55:20.317 INFO:tasks.workunit.client.1.vm08.stdout:1/467: mkdir d1/da/d20/d9e 0 2026-03-10T08:55:20.319 INFO:tasks.workunit.client.1.vm08.stdout:3/398: dread d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:20.322 INFO:tasks.workunit.client.0.vm05.stdout:8/138: fsync d2/dd/d2c/f34 0 2026-03-10T08:55:20.334 INFO:tasks.workunit.client.1.vm08.stdout:6/475: dwrite d9/d50/f78 [0,4194304] 0 2026-03-10T08:55:20.336 INFO:tasks.workunit.client.0.vm05.stdout:8/139: rmdir d2/dd 39 2026-03-10T08:55:20.337 INFO:tasks.workunit.client.0.vm05.stdout:8/140: chown d2/db/l24 2129006338 1 2026-03-10T08:55:20.344 INFO:tasks.workunit.client.1.vm08.stdout:2/450: write d1/d43/f4b [2387519,36800] 0 2026-03-10T08:55:20.345 INFO:tasks.workunit.client.1.vm08.stdout:2/451: fsync d1/da/d10/d1b/d12/d1e/f83 0 2026-03-10T08:55:20.346 INFO:tasks.workunit.client.1.vm08.stdout:2/452: chown d1/da/d10/d1b/d12/d1e 3140 1 2026-03-10T08:55:20.347 INFO:tasks.workunit.client.0.vm05.stdout:4/175: truncate d0/f9 358360 0 2026-03-10T08:55:20.352 INFO:tasks.workunit.client.0.vm05.stdout:4/176: mkdir d0/d1f/d36/d37 0 2026-03-10T08:55:20.354 INFO:tasks.workunit.client.0.vm05.stdout:8/141: creat d2/dd/d2c/d2e/f37 x:0 0 0 2026-03-10T08:55:20.355 INFO:tasks.workunit.client.0.vm05.stdout:8/142: chown d2/db/d28 751994 1 2026-03-10T08:55:20.355 INFO:tasks.workunit.client.0.vm05.stdout:8/143: truncate d2/dd/f1a 1580401 0 2026-03-10T08:55:20.356 INFO:tasks.workunit.client.1.vm08.stdout:6/476: mknod d9/dc/d11/d23/ca4 0 2026-03-10T08:55:20.359 INFO:tasks.workunit.client.1.vm08.stdout:3/399: link d4/d15/d8/d2c/f5a d4/d15/d8/f82 0 2026-03-10T08:55:20.359 INFO:tasks.workunit.client.1.vm08.stdout:3/400: chown f1 21 1 2026-03-10T08:55:20.362 INFO:tasks.workunit.client.0.vm05.stdout:3/195: dwrite d9/f20 [8388608,4194304] 0 2026-03-10T08:55:20.363 INFO:tasks.workunit.client.0.vm05.stdout:3/196: 
truncate d9/d2b/d2f/f33 154093 0 2026-03-10T08:55:20.366 INFO:tasks.workunit.client.0.vm05.stdout:3/197: dread - d9/d2b/f34 zero size 2026-03-10T08:55:20.366 INFO:tasks.workunit.client.0.vm05.stdout:4/177: truncate d0/d1d/d30/f29 4699699 0 2026-03-10T08:55:20.371 INFO:tasks.workunit.client.1.vm08.stdout:2/453: rename d1/da/d10/d1b/f72 to d1/da/d10/d1b/d12/f8f 0 2026-03-10T08:55:20.372 INFO:tasks.workunit.client.1.vm08.stdout:2/454: stat d1/d43/f4b 0 2026-03-10T08:55:20.373 INFO:tasks.workunit.client.0.vm05.stdout:8/144: mknod d2/c38 0 2026-03-10T08:55:20.374 INFO:tasks.workunit.client.0.vm05.stdout:8/145: write d2/dd/d2c/f2f [220473,60438] 0 2026-03-10T08:55:20.375 INFO:tasks.workunit.client.0.vm05.stdout:8/146: chown d2/dd/d2c/f34 20040 1 2026-03-10T08:55:20.378 INFO:tasks.workunit.client.1.vm08.stdout:9/421: dread d2/dd/d15/d1e/d24/f3f [0,4194304] 0 2026-03-10T08:55:20.379 INFO:tasks.workunit.client.0.vm05.stdout:3/198: dread d9/f20 [0,4194304] 0 2026-03-10T08:55:20.383 INFO:tasks.workunit.client.0.vm05.stdout:4/178: mknod d0/d1d/c38 0 2026-03-10T08:55:20.383 INFO:tasks.workunit.client.0.vm05.stdout:4/179: readlink d0/l2a 0 2026-03-10T08:55:20.385 INFO:tasks.workunit.client.1.vm08.stdout:2/455: creat d1/da/d10/d1b/d12/d1e/f90 x:0 0 0 2026-03-10T08:55:20.385 INFO:tasks.workunit.client.1.vm08.stdout:2/456: fdatasync d1/d43/f5d 0 2026-03-10T08:55:20.393 INFO:tasks.workunit.client.0.vm05.stdout:2/164: dwrite d0/d9/d1e/d20/d24/f29 [0,4194304] 0 2026-03-10T08:55:20.394 INFO:tasks.workunit.client.0.vm05.stdout:2/165: chown d0/d9/d1e 40552 1 2026-03-10T08:55:20.403 INFO:tasks.workunit.client.1.vm08.stdout:7/454: getdents d0/d11/d4a 0 2026-03-10T08:55:20.404 INFO:tasks.workunit.client.0.vm05.stdout:4/180: rename d0/c2d to d0/d2e/c39 0 2026-03-10T08:55:20.406 INFO:tasks.workunit.client.1.vm08.stdout:2/457: symlink d1/d5b/l91 0 2026-03-10T08:55:20.407 INFO:tasks.workunit.client.1.vm08.stdout:3/401: creat d4/d15/d8/f83 x:0 0 0 2026-03-10T08:55:20.409 
INFO:tasks.workunit.client.1.vm08.stdout:6/477: getdents d9/d10 0 2026-03-10T08:55:20.411 INFO:tasks.workunit.client.1.vm08.stdout:6/478: chown d9/dc/c96 2 1 2026-03-10T08:55:20.412 INFO:tasks.workunit.client.1.vm08.stdout:0/372: dwrite d6/fe [0,4194304] 0 2026-03-10T08:55:20.413 INFO:tasks.workunit.client.1.vm08.stdout:0/373: stat d6/dd/d13/d17/f66 0 2026-03-10T08:55:20.413 INFO:tasks.workunit.client.1.vm08.stdout:2/458: rmdir d1/da/d78 39 2026-03-10T08:55:20.414 INFO:tasks.workunit.client.1.vm08.stdout:3/402: rmdir d4/d15 39 2026-03-10T08:55:20.415 INFO:tasks.workunit.client.1.vm08.stdout:5/445: getdents d0/d11/d18 0 2026-03-10T08:55:20.417 INFO:tasks.workunit.client.0.vm05.stdout:4/181: creat d0/d1d/d30/f3a x:0 0 0 2026-03-10T08:55:20.420 INFO:tasks.workunit.client.0.vm05.stdout:4/182: dread d0/f8 [4194304,4194304] 0 2026-03-10T08:55:20.421 INFO:tasks.workunit.client.1.vm08.stdout:6/479: mknod d9/d13/ca5 0 2026-03-10T08:55:20.424 INFO:tasks.workunit.client.0.vm05.stdout:4/183: dwrite d0/d1d/f22 [0,4194304] 0 2026-03-10T08:55:20.432 INFO:tasks.workunit.client.1.vm08.stdout:7/455: truncate d0/d11/d1f/d2c/f33 920773 0 2026-03-10T08:55:20.432 INFO:tasks.workunit.client.1.vm08.stdout:7/456: stat d0/d11/d1f/d2c/f6c 0 2026-03-10T08:55:20.433 INFO:tasks.workunit.client.0.vm05.stdout:5/180: getdents d5/df/d12/d24 0 2026-03-10T08:55:20.433 INFO:tasks.workunit.client.0.vm05.stdout:4/184: symlink d0/d2e/l3b 0 2026-03-10T08:55:20.436 INFO:tasks.workunit.client.1.vm08.stdout:2/459: mknod d1/d43/d4f/d52/c92 0 2026-03-10T08:55:20.436 INFO:tasks.workunit.client.0.vm05.stdout:2/166: getdents d0/d9/d1e/d20 0 2026-03-10T08:55:20.438 INFO:tasks.workunit.client.1.vm08.stdout:5/446: mknod d0/d46/c88 0 2026-03-10T08:55:20.439 INFO:tasks.workunit.client.0.vm05.stdout:2/167: dread d0/f2 [0,4194304] 0 2026-03-10T08:55:20.440 INFO:tasks.workunit.client.0.vm05.stdout:5/181: mkdir d5/df/d12/d39 0 2026-03-10T08:55:20.442 INFO:tasks.workunit.client.1.vm08.stdout:5/447: dwrite 
d0/d11/d18/d52/f57 [0,4194304] 0 2026-03-10T08:55:20.444 INFO:tasks.workunit.client.0.vm05.stdout:4/185: creat d0/d1d/f3c x:0 0 0 2026-03-10T08:55:20.444 INFO:tasks.workunit.client.1.vm08.stdout:5/448: chown d0/d11/d3e/d45 4 1 2026-03-10T08:55:20.447 INFO:tasks.workunit.client.1.vm08.stdout:4/466: write d5/f14 [2620625,63495] 0 2026-03-10T08:55:20.458 INFO:tasks.workunit.client.1.vm08.stdout:7/457: read d0/d11/d1f/d2c/f30 [280559,60114] 0 2026-03-10T08:55:20.462 INFO:tasks.workunit.client.0.vm05.stdout:2/168: creat d0/d9/d1e/d20/d21/f35 x:0 0 0 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.0.vm05.stdout:2/169: write d0/f4 [1827076,20759] 0 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.0.vm05.stdout:9/141: write d6/f7 [1729336,45544] 0 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.1.vm08.stdout:7/458: dread d0/d11/d1f/d29/d3d/d40/f24 [0,4194304] 0 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.1.vm08.stdout:2/460: write d1/da/d10/d42/f89 [840638,54937] 0 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.1.vm08.stdout:2/461: chown d1/da/d10/d1b/d12/d1e 32064728 1 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.1.vm08.stdout:2/462: chown d1/d43/d5c 1261657 1 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.1.vm08.stdout:4/467: symlink d5/d23/d49/d83/la2 0 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.1.vm08.stdout:4/468: dread - d5/d23/d36/d76/f9e zero size 2026-03-10T08:55:20.476 INFO:tasks.workunit.client.0.vm05.stdout:2/170: dwrite d0/d9/d1e/d20/d21/f23 [0,4194304] 0 2026-03-10T08:55:20.481 INFO:tasks.workunit.client.0.vm05.stdout:2/171: dwrite d0/f2f [0,4194304] 0 2026-03-10T08:55:20.482 INFO:tasks.workunit.client.0.vm05.stdout:5/182: mkdir d5/d3a 0 2026-03-10T08:55:20.487 INFO:tasks.workunit.client.0.vm05.stdout:2/172: truncate d0/d9/d1e/d20/f32 777678 0 2026-03-10T08:55:20.490 INFO:tasks.workunit.client.0.vm05.stdout:2/173: read - d0/d9/d1e/d20/d24/f33 zero size 2026-03-10T08:55:20.490 INFO:tasks.workunit.client.0.vm05.stdout:2/174: 
readlink d0/l28 0 2026-03-10T08:55:20.491 INFO:tasks.workunit.client.0.vm05.stdout:2/175: fsync d0/d9/d1e/d20/d24/f33 0 2026-03-10T08:55:20.492 INFO:tasks.workunit.client.0.vm05.stdout:2/176: chown d0/d9/d1e/d20/f22 6290 1 2026-03-10T08:55:20.493 INFO:tasks.workunit.client.1.vm08.stdout:3/403: creat d4/d15/d8/d2a/d79/d20/f84 x:0 0 0 2026-03-10T08:55:20.499 INFO:tasks.workunit.client.0.vm05.stdout:0/187: write f6 [363725,30357] 0 2026-03-10T08:55:20.499 INFO:tasks.workunit.client.0.vm05.stdout:0/188: chown df/d1f/f25 27311 1 2026-03-10T08:55:20.514 INFO:tasks.workunit.client.1.vm08.stdout:4/469: creat d5/d2f/d5a/d69/fa3 x:0 0 0 2026-03-10T08:55:20.522 INFO:tasks.workunit.client.0.vm05.stdout:6/167: dwrite d4/f11 [0,4194304] 0 2026-03-10T08:55:20.528 INFO:tasks.workunit.client.0.vm05.stdout:6/168: dwrite d4/d7/d10/d15/f16 [0,4194304] 0 2026-03-10T08:55:20.530 INFO:tasks.workunit.client.1.vm08.stdout:7/459: sync 2026-03-10T08:55:20.535 INFO:tasks.workunit.client.0.vm05.stdout:7/147: getdents d18 0 2026-03-10T08:55:20.537 INFO:tasks.workunit.client.0.vm05.stdout:2/177: creat d0/f36 x:0 0 0 2026-03-10T08:55:20.545 INFO:tasks.workunit.client.1.vm08.stdout:7/460: dwrite d0/d11/d1f/d29/d36/d75/f85 [0,4194304] 0 2026-03-10T08:55:20.561 INFO:tasks.workunit.client.1.vm08.stdout:1/468: write d1/da/d20/f2d [2699554,13871] 0 2026-03-10T08:55:20.561 INFO:tasks.workunit.client.1.vm08.stdout:1/469: fsync d1/da/de/f12 0 2026-03-10T08:55:20.561 INFO:tasks.workunit.client.0.vm05.stdout:0/189: mkdir df/d18/d19/d35 0 2026-03-10T08:55:20.561 INFO:tasks.workunit.client.0.vm05.stdout:7/148: mknod d18/d1b/c29 0 2026-03-10T08:55:20.561 INFO:tasks.workunit.client.0.vm05.stdout:1/260: write fb [1444018,119054] 0 2026-03-10T08:55:20.566 INFO:tasks.workunit.client.0.vm05.stdout:0/190: dread df/f15 [0,4194304] 0 2026-03-10T08:55:20.600 INFO:tasks.workunit.client.1.vm08.stdout:1/470: symlink d1/l9f 0 2026-03-10T08:55:20.600 INFO:tasks.workunit.client.1.vm08.stdout:7/461: link d0/d1c/l31 d0/d1c/l8a 
0 2026-03-10T08:55:20.600 INFO:tasks.workunit.client.1.vm08.stdout:1/471: mknod d1/da/de/d24/d26/ca0 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/149: mknod d18/d1b/d1f/d25/c2a 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/150: dwrite d18/f26 [0,4194304] 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:1/261: rename dd/d10/d19/f24 to dd/d10/d18/d2d/d51/d58/f5b 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:6/169: link d4/d7/d10/d15/d20/l33 d4/d7/d10/d15/l3b 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:6/170: write d4/d2d/f2f [97762,113203] 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/151: unlink d18/d1b/d1f/d25/c28 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/152: chown d18/d1b/d1f 5240 1 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:6/171: truncate d4/d7/d10/f12 2271219 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/153: symlink d18/l2b 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:1/262: getdents dd/d55 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:1/263: write dd/d21/d3f/f57 [4168915,14029] 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/154: creat d18/d1b/f2c x:0 0 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/155: write d18/f1a [664708,55103] 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:6/172: getdents d4/d2d 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/156: creat d18/d1b/d1f/f2d x:0 0 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:6/173: readlink d4/d7/d10/d15/d1b/d22/l27 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/157: dwrite d18/f26 [0,4194304] 0 2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:6/174: rename d4/d7/d10/d15/d1b/f24 to d4/d7/d10/d15/d38/f3c 0 
2026-03-10T08:55:20.601 INFO:tasks.workunit.client.0.vm05.stdout:7/158: dread f15 [0,4194304] 0 2026-03-10T08:55:20.604 INFO:tasks.workunit.client.0.vm05.stdout:6/175: dwrite d4/d7/d10/d1a/f1e [0,4194304] 0 2026-03-10T08:55:20.605 INFO:tasks.workunit.client.0.vm05.stdout:7/159: mkdir d18/d1b/d1f/d25/d2e 0 2026-03-10T08:55:20.605 INFO:tasks.workunit.client.0.vm05.stdout:7/160: stat d18/d1b/c27 0 2026-03-10T08:55:20.605 INFO:tasks.workunit.client.0.vm05.stdout:6/176: readlink d4/d7/d10/d15/d1b/d22/l27 0 2026-03-10T08:55:20.605 INFO:tasks.workunit.client.0.vm05.stdout:6/177: write d4/d2d/f2f [1083427,11438] 0 2026-03-10T08:55:20.607 INFO:tasks.workunit.client.0.vm05.stdout:6/178: dread - d4/d7/d10/d15/d1b/d22/f36 zero size 2026-03-10T08:55:20.614 INFO:tasks.workunit.client.0.vm05.stdout:2/178: sync 2026-03-10T08:55:20.614 INFO:tasks.workunit.client.0.vm05.stdout:2/179: stat d0/l15 0 2026-03-10T08:55:20.616 INFO:tasks.workunit.client.0.vm05.stdout:0/191: dread df/f12 [0,4194304] 0 2026-03-10T08:55:20.617 INFO:tasks.workunit.client.0.vm05.stdout:2/180: dread d0/f2f [0,4194304] 0 2026-03-10T08:55:20.619 INFO:tasks.workunit.client.0.vm05.stdout:0/192: dread df/f12 [0,4194304] 0 2026-03-10T08:55:20.619 INFO:tasks.workunit.client.0.vm05.stdout:7/161: mkdir d18/d1b/d1f/d25/d2e/d2f 0 2026-03-10T08:55:20.620 INFO:tasks.workunit.client.0.vm05.stdout:7/162: read fd [724131,21837] 0 2026-03-10T08:55:20.622 INFO:tasks.workunit.client.1.vm08.stdout:2/463: rename d1/da/d10/d1b/d12 to d1/da/d10/d42/d93 0 2026-03-10T08:55:20.625 INFO:tasks.workunit.client.1.vm08.stdout:2/464: chown d1/da/d10/d42/d93/d22/l33 84509 1 2026-03-10T08:55:20.625 INFO:tasks.workunit.client.0.vm05.stdout:0/193: creat df/d1f/f36 x:0 0 0 2026-03-10T08:55:20.625 INFO:tasks.workunit.client.0.vm05.stdout:0/194: chown df/f12 1629045210 1 2026-03-10T08:55:20.625 INFO:tasks.workunit.client.0.vm05.stdout:3/199: write d9/d2b/d2f/f33 [1001972,105191] 0 2026-03-10T08:55:20.629 
INFO:tasks.workunit.client.1.vm08.stdout:1/472: rename d1/da/de/d24/d3d/d40/f90 to d1/da/de/d5c/fa1 0 2026-03-10T08:55:20.634 INFO:tasks.workunit.client.1.vm08.stdout:6/480: write d9/dc/d11/f73 [376203,74459] 0 2026-03-10T08:55:20.645 INFO:tasks.workunit.client.1.vm08.stdout:0/374: write d6/dd/d13/d17/d1f/d20/f46 [7214,93781] 0 2026-03-10T08:55:20.645 INFO:tasks.workunit.client.1.vm08.stdout:5/449: dwrite d0/d11/d27/d50/f55 [4194304,4194304] 0 2026-03-10T08:55:20.645 INFO:tasks.workunit.client.1.vm08.stdout:1/473: dwrite d1/da/d20/d3f/d49/f71 [0,4194304] 0 2026-03-10T08:55:20.645 INFO:tasks.workunit.client.0.vm05.stdout:4/186: rmdir d0 39 2026-03-10T08:55:20.645 INFO:tasks.workunit.client.0.vm05.stdout:7/163: dread - d18/f24 zero size 2026-03-10T08:55:20.645 INFO:tasks.workunit.client.0.vm05.stdout:2/181: link d0/f2f d0/d9/d27/f37 0 2026-03-10T08:55:20.646 INFO:tasks.workunit.client.0.vm05.stdout:9/142: dwrite d6/fb [0,4194304] 0 2026-03-10T08:55:20.649 INFO:tasks.workunit.client.1.vm08.stdout:5/450: dread d0/d11/d18/d52/f57 [0,4194304] 0 2026-03-10T08:55:20.649 INFO:tasks.workunit.client.0.vm05.stdout:3/200: dread d9/f13 [0,4194304] 0 2026-03-10T08:55:20.651 INFO:tasks.workunit.client.0.vm05.stdout:3/201: write d9/d2b/d2f/f33 [584610,63426] 0 2026-03-10T08:55:20.652 INFO:tasks.workunit.client.0.vm05.stdout:3/202: write f1 [1129876,99748] 0 2026-03-10T08:55:20.661 INFO:tasks.workunit.client.1.vm08.stdout:1/474: dwrite d1/da/de/d24/d35/f64 [0,4194304] 0 2026-03-10T08:55:20.664 INFO:tasks.workunit.client.0.vm05.stdout:4/187: fsync d0/f10 0 2026-03-10T08:55:20.664 INFO:tasks.workunit.client.1.vm08.stdout:0/375: truncate d6/dd/d13/d32/f34 4110317 0 2026-03-10T08:55:20.665 INFO:tasks.workunit.client.1.vm08.stdout:0/376: dread - d6/f5f zero size 2026-03-10T08:55:20.669 INFO:tasks.workunit.client.1.vm08.stdout:5/451: dread d0/d11/f2d [0,4194304] 0 2026-03-10T08:55:20.670 INFO:tasks.workunit.client.0.vm05.stdout:7/164: rename d18/f1a to d18/d1b/f30 0 
2026-03-10T08:55:20.672 INFO:tasks.workunit.client.0.vm05.stdout:7/165: chown d18/d1b/c29 7161209 1 2026-03-10T08:55:20.674 INFO:tasks.workunit.client.0.vm05.stdout:2/182: dwrite d0/d9/f17 [4194304,4194304] 0 2026-03-10T08:55:20.681 INFO:tasks.workunit.client.0.vm05.stdout:2/183: readlink d0/d9/l13 0 2026-03-10T08:55:20.681 INFO:tasks.workunit.client.0.vm05.stdout:2/184: stat d0/d9 0 2026-03-10T08:55:20.692 INFO:tasks.workunit.client.0.vm05.stdout:5/183: dwrite d5/df/d12/d21/f1f [0,4194304] 0 2026-03-10T08:55:20.692 INFO:tasks.workunit.client.1.vm08.stdout:3/404: truncate d4/d15/f4b 7575776 0 2026-03-10T08:55:20.696 INFO:tasks.workunit.client.1.vm08.stdout:4/470: truncate d5/d2f/f84 1560297 0 2026-03-10T08:55:20.699 INFO:tasks.workunit.client.1.vm08.stdout:7/462: rmdir d0/d1c 39 2026-03-10T08:55:20.702 INFO:tasks.workunit.client.1.vm08.stdout:7/463: dread d0/d11/d1f/d29/d3d/d40/ff [0,4194304] 0 2026-03-10T08:55:20.703 INFO:tasks.workunit.client.0.vm05.stdout:3/203: sync 2026-03-10T08:55:20.705 INFO:tasks.workunit.client.0.vm05.stdout:9/143: readlink d6/d19/d2a/l2d 0 2026-03-10T08:55:20.706 INFO:tasks.workunit.client.1.vm08.stdout:0/377: fdatasync d6/dd/d13/d32/f3d 0 2026-03-10T08:55:20.708 INFO:tasks.workunit.client.0.vm05.stdout:4/188: chown d0/d1d/d30/c33 1 1 2026-03-10T08:55:20.716 INFO:tasks.workunit.client.0.vm05.stdout:2/185: creat d0/d9/d27/f38 x:0 0 0 2026-03-10T08:55:20.733 INFO:tasks.workunit.client.0.vm05.stdout:4/189: creat d0/d1f/d36/f3d x:0 0 0 2026-03-10T08:55:20.745 INFO:tasks.workunit.client.0.vm05.stdout:4/190: dread - d0/d1d/d30/f28 zero size 2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:4/471: dwrite d5/d23/fa1 [0,4194304] 0 2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:2/465: getdents d1/da 0 2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:0/378: creat d6/dd/d13/d17/d1f/d20/d2f/d26/f73 x:0 0 0 2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:3/405: mkdir d4/d6f/d85 0 
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:7/464: link d0/d14/d2f/f81 d0/d11/d1f/d29/d3d/d89/f8b 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:7/465: readlink d0/d11/d1f/d2c/l7c 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:7/466: truncate d0/d14/d2f/f81 574283 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:0/379: truncate d6/dd/f35 1148125 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:3/406: creat d4/d15/d8/d2a/f86 x:0 0 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:2/466: truncate d1/f19 816174 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:2/467: chown d1 117 1
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:0/380: rename d6/dd/d13/d32/c51 to d6/dd/d13/d61/c74 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:7/467: creat d0/d1c/f8c x:0 0 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:3/407: creat d4/d6f/d85/f87 x:0 0 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:7/468: read d0/d11/d1f/d29/d36/d75/f85 [3412699,33394] 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:2/468: mknod d1/d5b/d66/c94 0
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:3/408: rmdir d4/d15/d8/d2c/d55 39
2026-03-10T08:55:20.745 INFO:tasks.workunit.client.1.vm08.stdout:7/469: creat d0/d11/d1f/d29/f8d x:0 0 0
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:7/166: creat d18/f31 x:0 0 0
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:5/184: creat d5/f3b x:0 0 0
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:5/185: write d5/f28 [3454437,107607] 0
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:5/186: rename d5 to d5/df/d12/d21/d3c 22
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:5/187: dread d5/df/f31 [0,4194304] 0
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:4/191: creat d0/d1d/d30/d32/f3e x:0 0 0
2026-03-10T08:55:20.746 INFO:tasks.workunit.client.0.vm05.stdout:7/167: readlink d18/d1b/d1f/l23 0
2026-03-10T08:55:20.747 INFO:tasks.workunit.client.0.vm05.stdout:5/188: rename d5/f28 to d5/df/d12/d24/d2c/f3d 0
2026-03-10T08:55:20.749 INFO:tasks.workunit.client.0.vm05.stdout:1/264: fdatasync fb 0
2026-03-10T08:55:20.749 INFO:tasks.workunit.client.0.vm05.stdout:8/147: dread d2/dd/d2c/f2f [0,4194304] 0
2026-03-10T08:55:20.751 INFO:tasks.workunit.client.0.vm05.stdout:3/204: rename d9/l24 to d9/l36 0
2026-03-10T08:55:20.751 INFO:tasks.workunit.client.0.vm05.stdout:3/205: fsync d9/f27 0
2026-03-10T08:55:20.753 INFO:tasks.workunit.client.0.vm05.stdout:3/206: dread d9/ff [0,4194304] 0
2026-03-10T08:55:20.756 INFO:tasks.workunit.client.1.vm08.stdout:0/381: getdents d6/dd/d13/d17/d1f/d20/d2f/d24 0
2026-03-10T08:55:20.756 INFO:tasks.workunit.client.0.vm05.stdout:5/189: mknod d5/df/d12/d21/c3e 0
2026-03-10T08:55:20.757 INFO:tasks.workunit.client.0.vm05.stdout:5/190: write d5/df/d12/f20 [1138500,117982] 0
2026-03-10T08:55:20.759 INFO:tasks.workunit.client.1.vm08.stdout:2/469: link d1/da/d10/d42/d93/d23/f37 d1/da/d78/f95 0
2026-03-10T08:55:20.762 INFO:tasks.workunit.client.0.vm05.stdout:1/265: mkdir dd/d10/d18/d2d/d5c 0
2026-03-10T08:55:20.762 INFO:tasks.workunit.client.1.vm08.stdout:0/382: unlink d6/dd/d13/d17/d1f/d20/d2f/d26/f5a 0
2026-03-10T08:55:20.765 INFO:tasks.workunit.client.0.vm05.stdout:3/207: dwrite d9/f13 [4194304,4194304] 0
2026-03-10T08:55:20.766 INFO:tasks.workunit.client.1.vm08.stdout:0/383: stat d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c 0
2026-03-10T08:55:20.766 INFO:tasks.workunit.client.1.vm08.stdout:2/470: dwrite d1/d43/d4f/f86 [0,4194304] 0
2026-03-10T08:55:20.767 INFO:tasks.workunit.client.1.vm08.stdout:0/384: write d6/dd/d13/d17/f1d [1663569,102371] 0
2026-03-10T08:55:20.767 INFO:tasks.workunit.client.0.vm05.stdout:4/192: mknod d0/c3f 0
2026-03-10T08:55:20.774 INFO:tasks.workunit.client.0.vm05.stdout:7/168: dwrite d18/f1d [0,4194304] 0
2026-03-10T08:55:20.785 INFO:tasks.workunit.client.1.vm08.stdout:0/385: dwrite d6/dd/d13/d17/d1f/d20/d2f/d57/f65 [0,4194304] 0
2026-03-10T08:55:20.785 INFO:tasks.workunit.client.0.vm05.stdout:7/169: chown f3 1659 1
2026-03-10T08:55:20.785 INFO:tasks.workunit.client.0.vm05.stdout:5/191: mknod d5/d3a/c3f 0
2026-03-10T08:55:20.785 INFO:tasks.workunit.client.0.vm05.stdout:3/208: symlink d9/d2b/l37 0
2026-03-10T08:55:20.793 INFO:tasks.workunit.client.0.vm05.stdout:7/170: mkdir d18/d1b/d1f/d25/d2e/d32 0
2026-03-10T08:55:20.796 INFO:tasks.workunit.client.0.vm05.stdout:7/171: dwrite d18/f31 [0,4194304] 0
2026-03-10T08:55:20.796 INFO:tasks.workunit.client.0.vm05.stdout:1/266: sync
2026-03-10T08:55:20.800 INFO:tasks.workunit.client.0.vm05.stdout:3/209: write d9/f31 [98165,27661] 0
2026-03-10T08:55:20.800 INFO:tasks.workunit.client.1.vm08.stdout:0/386: mknod d6/dd/d13/d17/d1f/d20/d2f/d26/c75 0
2026-03-10T08:55:20.804 INFO:tasks.workunit.client.1.vm08.stdout:0/387: mknod d6/dd/d13/d17/d50/c76 0
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:7/172: chown l16 7653296 1
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:1/267: rmdir dd/d21/d37 39
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:4/193: creat d0/f40 x:0 0 0
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:5/192: creat d5/f40 x:0 0 0
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:4/194: mkdir d0/d1d/d30/d32/d41 0
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:7/173: creat d18/d1b/d1f/d25/d2e/d2f/f33 x:0 0 0
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:7/174: chown d18/d1b/d1f/d25 3618 1
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:7/175: write d18/f1d [87918,30575] 0
2026-03-10T08:55:20.822 INFO:tasks.workunit.client.0.vm05.stdout:5/193: dread d5/df/d12/f1a [0,4194304] 0
2026-03-10T08:55:20.824 INFO:tasks.workunit.client.0.vm05.stdout:4/195: dwrite d0/f1 [4194304,4194304] 0
2026-03-10T08:55:20.828 INFO:tasks.workunit.client.0.vm05.stdout:5/194: dwrite d5/df/f1c [0,4194304] 0
2026-03-10T08:55:20.833 INFO:tasks.workunit.client.0.vm05.stdout:4/196: mkdir d0/d2e/d42 0
2026-03-10T08:55:20.836 INFO:tasks.workunit.client.0.vm05.stdout:7/176: mknod d18/c34 0
2026-03-10T08:55:20.837 INFO:tasks.workunit.client.0.vm05.stdout:5/195: dwrite d5/df/d12/f20 [0,4194304] 0
2026-03-10T08:55:20.841 INFO:tasks.workunit.client.1.vm08.stdout:6/481: truncate d9/dc/d11/d23/f8a 456093 0
2026-03-10T08:55:20.842 INFO:tasks.workunit.client.1.vm08.stdout:6/482: chown d9/dc/d11/d23/c37 56 1
2026-03-10T08:55:20.842 INFO:tasks.workunit.client.1.vm08.stdout:6/483: write d9/dc/d11/f73 [363049,119439] 0
2026-03-10T08:55:20.846 INFO:tasks.workunit.client.1.vm08.stdout:6/484: truncate f5 499739 0
2026-03-10T08:55:20.848 INFO:tasks.workunit.client.0.vm05.stdout:5/196: mkdir d5/df/d12/d24/d2c/d41 0
2026-03-10T08:55:20.848 INFO:tasks.workunit.client.0.vm05.stdout:5/197: write d5/fd [2786906,2805] 0
2026-03-10T08:55:20.865 INFO:tasks.workunit.client.0.vm05.stdout:4/197: sync
2026-03-10T08:55:20.865 INFO:tasks.workunit.client.0.vm05.stdout:0/195: read df/d1f/f21 [35578,47074] 0
2026-03-10T08:55:20.866 INFO:tasks.workunit.client.0.vm05.stdout:4/198: chown d0/f10 5778215 1
2026-03-10T08:55:20.872 INFO:tasks.workunit.client.0.vm05.stdout:4/199: dwrite d0/f1 [8388608,4194304] 0
2026-03-10T08:55:20.877 INFO:tasks.workunit.client.0.vm05.stdout:4/200: fsync d0/d1d/d30/f1c 0
2026-03-10T08:55:20.882 INFO:tasks.workunit.client.0.vm05.stdout:4/201: dread d0/fb [4194304,4194304] 0
2026-03-10T08:55:20.890 INFO:tasks.workunit.client.0.vm05.stdout:4/202: symlink d0/d1f/d36/l43 0
2026-03-10T08:55:20.890 INFO:tasks.workunit.client.0.vm05.stdout:0/196: rename df/d18/d2b/f33 to df/f37 0
2026-03-10T08:55:20.890 INFO:tasks.workunit.client.0.vm05.stdout:4/203: symlink d0/d2e/l44 0
2026-03-10T08:55:20.892 INFO:tasks.workunit.client.0.vm05.stdout:4/204: truncate d0/fb 2991212 0
2026-03-10T08:55:20.894 INFO:tasks.workunit.client.0.vm05.stdout:4/205: mkdir d0/d2e/d42/d45 0
2026-03-10T08:55:20.896 INFO:tasks.workunit.client.0.vm05.stdout:4/206: rename d0/d1d/d30/c33 to d0/d2c/c46 0
2026-03-10T08:55:20.898 INFO:tasks.workunit.client.0.vm05.stdout:4/207: rename d0/f8 to d0/d1f/f47 0
2026-03-10T08:55:20.899 INFO:tasks.workunit.client.0.vm05.stdout:4/208: symlink d0/d2e/d42/d45/l48 0
2026-03-10T08:55:20.900 INFO:tasks.workunit.client.0.vm05.stdout:4/209: unlink d0/d1f/d36/l43 0
2026-03-10T08:55:20.908 INFO:tasks.workunit.client.0.vm05.stdout:4/210: dread d0/f23 [0,4194304] 0
2026-03-10T08:55:20.908 INFO:tasks.workunit.client.0.vm05.stdout:4/211: write d0/fe [2673218,19447] 0
2026-03-10T08:55:20.912 INFO:tasks.workunit.client.0.vm05.stdout:4/212: dwrite d0/f18 [0,4194304] 0
2026-03-10T08:55:20.977 INFO:tasks.workunit.client.1.vm08.stdout:4/472: rmdir d5/d2f 39
2026-03-10T08:55:21.012 INFO:tasks.workunit.client.1.vm08.stdout:8/514: dread d1/d10/d9/dd/d25/f6e [0,4194304] 0
2026-03-10T08:55:21.016 INFO:tasks.workunit.client.1.vm08.stdout:4/473: read d5/de/f6d [2572873,102068] 0
2026-03-10T08:55:21.018 INFO:tasks.workunit.client.1.vm08.stdout:4/474: stat d5/d2f/d5d/c78 0
2026-03-10T08:55:21.023 INFO:tasks.workunit.client.1.vm08.stdout:4/475: mkdir d5/d23/d49/d8f/da4 0
2026-03-10T08:55:21.023 INFO:tasks.workunit.client.1.vm08.stdout:4/476: rename d5/d23/d49/d83/f7f to d5/d23/d36/d76/fa5 0
2026-03-10T08:55:21.023 INFO:tasks.workunit.client.1.vm08.stdout:4/477: symlink d5/d23/d36/d76/la6 0
2026-03-10T08:55:21.023 INFO:tasks.workunit.client.1.vm08.stdout:4/478: creat d5/d23/d36/d76/fa7 x:0 0 0
2026-03-10T08:55:21.051 INFO:tasks.workunit.client.1.vm08.stdout:1/475: dwrite d1/da/d4b/d4e/f51 [0,4194304] 0
2026-03-10T08:55:21.054 INFO:tasks.workunit.client.1.vm08.stdout:1/476: mkdir d1/da/de/d24/d35/d6d/d82/da2 0
2026-03-10T08:55:21.064 INFO:tasks.workunit.client.1.vm08.stdout:1/477: dread d1/da/d18/d3a/f57 [0,4194304] 0
2026-03-10T08:55:21.073 INFO:tasks.workunit.client.1.vm08.stdout:1/478: mknod d1/da/de/d24/d3d/d40/d84/ca3 0
2026-03-10T08:55:21.076 INFO:tasks.workunit.client.1.vm08.stdout:1/479: dread d1/da/d20/d3f/d49/f71 [0,4194304] 0
2026-03-10T08:55:21.092 INFO:tasks.workunit.client.0.vm05.stdout:1/268: dread dd/d21/f48 [0,4194304] 0
2026-03-10T08:55:21.094 INFO:tasks.workunit.client.0.vm05.stdout:1/269: dread dd/d21/f48 [0,4194304] 0
2026-03-10T08:55:21.097 INFO:tasks.workunit.client.0.vm05.stdout:1/270: symlink dd/d10/d18/d2d/l5d 0
2026-03-10T08:55:21.100 INFO:tasks.workunit.client.0.vm05.stdout:1/271: creat dd/f5e x:0 0 0
2026-03-10T08:55:21.103 INFO:tasks.workunit.client.0.vm05.stdout:1/272: mknod dd/d13/c5f 0
2026-03-10T08:55:21.104 INFO:tasks.workunit.client.1.vm08.stdout:1/480: sync
2026-03-10T08:55:21.108 INFO:tasks.workunit.client.1.vm08.stdout:3/409: rmdir d4/d15/d8 39
2026-03-10T08:55:21.108 INFO:tasks.workunit.client.0.vm05.stdout:9/144: write d6/d12/f1c [349878,112980] 0
2026-03-10T08:55:21.113 INFO:tasks.workunit.client.0.vm05.stdout:2/186: truncate d0/d9/d1e/d20/f26 61633 0
2026-03-10T08:55:21.117 INFO:tasks.workunit.client.1.vm08.stdout:3/410: chown d4/l4c 949281442 1
2026-03-10T08:55:21.120 INFO:tasks.workunit.client.1.vm08.stdout:1/481: symlink d1/da/d20/d9e/la4 0
2026-03-10T08:55:21.124 INFO:tasks.workunit.client.0.vm05.stdout:2/187: rename d0/d9/d1e/d20/d24/f33 to d0/d9/d1e/f39 0
2026-03-10T08:55:21.125 INFO:tasks.workunit.client.1.vm08.stdout:3/411: rename d4/d15/d8/f4e to d4/d6f/d85/f88 0
2026-03-10T08:55:21.126 INFO:tasks.workunit.client.0.vm05.stdout:3/210: getdents d9/d2b 0
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.1.vm08.stdout:2/471: dwrite d1/da/d10/d2d/f4c [0,4194304] 0
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.1.vm08.stdout:0/388: dwrite d6/dd/d13/d32/f3d [0,4194304] 0
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.0.vm05.stdout:3/211: chown d9/c2a 76805 1
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.0.vm05.stdout:3/212: write f7 [5318158,65305] 0
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.0.vm05.stdout:3/213: dwrite d9/f13 [0,4194304] 0
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.0.vm05.stdout:5/198: getdents d5 0
2026-03-10T08:55:21.144 INFO:tasks.workunit.client.0.vm05.stdout:7/177: rmdir d18/d1b/d1f 39
2026-03-10T08:55:21.151 INFO:tasks.workunit.client.1.vm08.stdout:2/472: dread - d1/da/d10/d2d/f67 zero size
2026-03-10T08:55:21.154 INFO:tasks.workunit.client.0.vm05.stdout:3/214: rename d9/d2b/l37 to d9/d2b/l38 0
2026-03-10T08:55:21.159 INFO:tasks.workunit.client.1.vm08.stdout:3/412: getdents d4/d15/d8/d2c/d6d 0
2026-03-10T08:55:21.162 INFO:tasks.workunit.client.1.vm08.stdout:3/413: mkdir d4/d15/d8/d2c/d89 0
2026-03-10T08:55:21.165 INFO:tasks.workunit.client.0.vm05.stdout:6/179: dread d4/d7/ff [4194304,4194304] 0
2026-03-10T08:55:21.165 INFO:tasks.workunit.client.0.vm05.stdout:7/178: unlink d18/d1b/d1f/l22 0
2026-03-10T08:55:21.166 INFO:tasks.workunit.client.0.vm05.stdout:7/179: write d18/d1b/d1f/d25/d2e/d2f/f33 [628714,84988] 0
2026-03-10T08:55:21.167 INFO:tasks.workunit.client.1.vm08.stdout:3/414: read d4/d15/d8/d2c/d55/f75 [3713012,67051] 0
2026-03-10T08:55:21.167 INFO:tasks.workunit.client.0.vm05.stdout:7/180: readlink l1 0
2026-03-10T08:55:21.167 INFO:tasks.workunit.client.0.vm05.stdout:6/180: chown d4/f30 16252 1
2026-03-10T08:55:21.169 INFO:tasks.workunit.client.1.vm08.stdout:3/415: fdatasync d4/d15/d8/d2c/f67 0
2026-03-10T08:55:21.172 INFO:tasks.workunit.client.1.vm08.stdout:3/416: dwrite d4/d15/d8/d2a/d79/f80 [0,4194304] 0
2026-03-10T08:55:21.203 INFO:tasks.workunit.client.0.vm05.stdout:3/215: getdents d9 0
2026-03-10T08:55:21.203 INFO:tasks.workunit.client.0.vm05.stdout:7/181: creat d18/d1b/d1f/d25/d2e/d32/f35 x:0 0 0
2026-03-10T08:55:21.203 INFO:tasks.workunit.client.0.vm05.stdout:3/216: mkdir d9/d39 0
2026-03-10T08:55:21.203 INFO:tasks.workunit.client.0.vm05.stdout:6/181: getdents d4/d7/d10 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/417: dread - d4/d15/d8/d1d/f73 zero size
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/418: dwrite d4/f44 [0,4194304] 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/419: dwrite d4/d15/d8/d2a/d79/d20/f84 [0,4194304] 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/420: rename d4/d15/d8/d2a/d79/c27 to d4/d15/d8/c8a 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/421: creat d4/d15/d8/d2a/d79/d20/f8b x:0 0 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/422: creat d4/d15/d8/d2c/f8c x:0 0 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/423: creat d4/d15/d8/d71/f8d x:0 0 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/424: dread d4/d15/d8/d1d/f6e [0,4194304] 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/425: dread - d4/d15/d8/d71/f8d zero size
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/426: write d4/d15/d8/d1d/f73 [634900,88233] 0
2026-03-10T08:55:21.204 INFO:tasks.workunit.client.1.vm08.stdout:3/427: creat d4/d15/d8/d71/f8e x:0 0 0
2026-03-10T08:55:21.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:20 vm05.local ceph-mon[49713]: pgmap v149: 65 pgs: 65 active+clean; 1.3 GiB data, 5.1 GiB used, 115 GiB / 120 GiB avail; 21 MiB/s rd, 120 MiB/s wr, 259 op/s
2026-03-10T08:55:21.215 INFO:tasks.workunit.client.1.vm08.stdout:0/389: sync
2026-03-10T08:55:21.239 INFO:tasks.workunit.client.1.vm08.stdout:0/390: readlink d6/dd/l52 0
2026-03-10T08:55:21.239 INFO:tasks.workunit.client.1.vm08.stdout:0/391: mkdir d6/dd/d13/d17/d1f/d20/d2f/d57/d77 0
2026-03-10T08:55:21.239 INFO:tasks.workunit.client.1.vm08.stdout:0/392: readlink d6/dd/d13/d17/d1f/d20/d2f/d57/l63 0
2026-03-10T08:55:21.239 INFO:tasks.workunit.client.1.vm08.stdout:0/393: symlink d6/dd/d13/d17/d1f/d20/l78 0
2026-03-10T08:55:21.239 INFO:tasks.workunit.client.1.vm08.stdout:0/394: mknod d6/dd/d13/d17/d1f/d20/d2f/d26/d56/c79 0
2026-03-10T08:55:21.239 INFO:tasks.workunit.client.1.vm08.stdout:0/395: symlink d6/dd/d13/d17/d1f/d2d/d39/l7a 0
2026-03-10T08:55:21.260 INFO:tasks.workunit.client.0.vm05.stdout:0/197: write df/d1f/f21 [536766,73873] 0
2026-03-10T08:55:21.261 INFO:tasks.workunit.client.0.vm05.stdout:9/145: sync
2026-03-10T08:55:21.267 INFO:tasks.workunit.client.1.vm08.stdout:7/470: truncate d0/d11/d1f/d2c/f33 477987 0
2026-03-10T08:55:21.281 INFO:tasks.workunit.client.1.vm08.stdout:7/471: chown d0/d14/d43/f6e 238 1
2026-03-10T08:55:21.281 INFO:tasks.workunit.client.1.vm08.stdout:8/515: truncate d1/d10/d9/dd/d18/d34/f57 1427128 0
2026-03-10T08:55:21.281 INFO:tasks.workunit.client.1.vm08.stdout:4/479: write d5/d2f/f3a [4128794,55260] 0
2026-03-10T08:55:21.282 INFO:tasks.workunit.client.1.vm08.stdout:8/516: dread d1/d10/d9/f5b [0,4194304] 0
2026-03-10T08:55:21.282 INFO:tasks.workunit.client.0.vm05.stdout:0/198: fdatasync df/f13 0
2026-03-10T08:55:21.282 INFO:tasks.workunit.client.0.vm05.stdout:4/213: dwrite d0/f16 [0,4194304] 0
2026-03-10T08:55:21.282 INFO:tasks.workunit.client.0.vm05.stdout:9/146: mkdir d6/d19/d2c/d2e 0
2026-03-10T08:55:21.282 INFO:tasks.workunit.client.0.vm05.stdout:9/147: dread - d6/d19/f29 zero size
2026-03-10T08:55:21.282 INFO:tasks.workunit.client.0.vm05.stdout:4/214: unlink d0/f16 0
2026-03-10T08:55:21.283 INFO:tasks.workunit.client.0.vm05.stdout:4/215: dwrite d0/d1d/d30/f1c [0,4194304] 0
2026-03-10T08:55:21.284 INFO:tasks.workunit.client.0.vm05.stdout:4/216: fsync d0/fc 0
2026-03-10T08:55:21.284 INFO:tasks.workunit.client.0.vm05.stdout:4/217: truncate d0/f10 4456331 0
2026-03-10T08:55:21.289 INFO:tasks.workunit.client.0.vm05.stdout:9/148: fsync d6/d27/f2b 0
2026-03-10T08:55:21.293 INFO:tasks.workunit.client.0.vm05.stdout:9/149: dwrite f2 [4194304,4194304] 0
2026-03-10T08:55:21.299 INFO:tasks.workunit.client.1.vm08.stdout:4/480: mknod d5/d2f/ca8 0
2026-03-10T08:55:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:20 vm08.local ceph-mon[57559]: pgmap v149: 65 pgs: 65 active+clean; 1.3 GiB data, 5.1 GiB used, 115 GiB / 120 GiB avail; 21 MiB/s rd, 120 MiB/s wr, 259 op/s
2026-03-10T08:55:21.319 INFO:tasks.workunit.client.0.vm05.stdout:4/218: mkdir d0/d1d/d30/d49 0
2026-03-10T08:55:21.322 INFO:tasks.workunit.client.0.vm05.stdout:1/273: dwrite dd/d10/d19/f1d [0,4194304] 0
2026-03-10T08:55:21.323 INFO:tasks.workunit.client.1.vm08.stdout:7/472: dread d0/f25 [0,4194304] 0
2026-03-10T08:55:21.323 INFO:tasks.workunit.client.1.vm08.stdout:7/473: mkdir d0/d1c/d8e 0
2026-03-10T08:55:21.323 INFO:tasks.workunit.client.1.vm08.stdout:7/474: stat d0/d14/f68 0
2026-03-10T08:55:21.323 INFO:tasks.workunit.client.0.vm05.stdout:6/182: sync
2026-03-10T08:55:21.323 INFO:tasks.workunit.client.0.vm05.stdout:5/199: sync
2026-03-10T08:55:21.323 INFO:tasks.workunit.client.0.vm05.stdout:9/150: creat d6/d19/d21/f2f x:0 0 0
2026-03-10T08:55:21.327 INFO:tasks.workunit.client.0.vm05.stdout:6/183: fdatasync d4/fc 0
2026-03-10T08:55:21.327 INFO:tasks.workunit.client.1.vm08.stdout:8/517: getdents d1/d10/d9/dd 0
2026-03-10T08:55:21.327 INFO:tasks.workunit.client.1.vm08.stdout:7/475: symlink d0/d11/d1f/d2c/l8f 0
2026-03-10T08:55:21.330 INFO:tasks.workunit.client.1.vm08.stdout:4/481: sync
2026-03-10T08:55:21.346 INFO:tasks.workunit.client.1.vm08.stdout:2/473: dwrite d1/f19 [0,4194304] 0
2026-03-10T08:55:21.346 INFO:tasks.workunit.client.0.vm05.stdout:1/274: mkdir dd/d10/d18/d20/d56/d60 0
2026-03-10T08:55:21.346 INFO:tasks.workunit.client.0.vm05.stdout:1/275: dread - dd/f5e zero size
2026-03-10T08:55:21.346 INFO:tasks.workunit.client.0.vm05.stdout:1/276: dwrite dd/d10/d18/d20/f34 [0,4194304] 0
2026-03-10T08:55:21.346 INFO:tasks.workunit.client.0.vm05.stdout:2/188: rmdir d0 39
2026-03-10T08:55:21.346 INFO:tasks.workunit.client.0.vm05.stdout:1/277: fdatasync dd/f1c 0
2026-03-10T08:55:21.352 INFO:tasks.workunit.client.0.vm05.stdout:4/219: dread d0/d1f/f47 [0,4194304] 0
2026-03-10T08:55:21.356 INFO:tasks.workunit.client.1.vm08.stdout:1/482: truncate d1/da/d20/f2d 3374077 0
2026-03-10T08:55:21.363 INFO:tasks.workunit.client.0.vm05.stdout:6/184: chown d4/d7/l19 0 1
2026-03-10T08:55:21.363 INFO:tasks.workunit.client.1.vm08.stdout:2/474: read d1/da/d10/d2d/f4d [1641099,112896] 0
2026-03-10T08:55:21.368 INFO:tasks.workunit.client.1.vm08.stdout:4/482: fsync d5/d23/f27 0
2026-03-10T08:55:21.370 INFO:tasks.workunit.client.1.vm08.stdout:1/483: creat d1/da/de/d24/d3d/d40/d84/fa5 x:0 0 0
2026-03-10T08:55:21.371 INFO:tasks.workunit.client.1.vm08.stdout:1/484: write d1/da/de/f12 [4585025,14477] 0
2026-03-10T08:55:21.372 INFO:tasks.workunit.client.0.vm05.stdout:4/220: rename d0/d1f to d0/d2e/d42/d45/d4a 0
2026-03-10T08:55:21.373 INFO:tasks.workunit.client.1.vm08.stdout:4/483: read d5/d23/f56 [336875,36069] 0
2026-03-10T08:55:21.375 INFO:tasks.workunit.client.0.vm05.stdout:9/151: creat d6/f30 x:0 0 0
2026-03-10T08:55:21.375 INFO:tasks.workunit.client.0.vm05.stdout:4/221: dwrite d0/fe [0,4194304] 0
2026-03-10T08:55:21.382 INFO:tasks.workunit.client.0.vm05.stdout:2/189: rmdir d0 39
2026-03-10T08:55:21.383 INFO:tasks.workunit.client.0.vm05.stdout:1/278: mknod dd/d10/d19/d4d/c61 0
2026-03-10T08:55:21.383 INFO:tasks.workunit.client.0.vm05.stdout:1/279: chown dd/f1c 53239 1
2026-03-10T08:55:21.384 INFO:tasks.workunit.client.0.vm05.stdout:1/280: fdatasync f6 0
2026-03-10T08:55:21.386 INFO:tasks.workunit.client.0.vm05.stdout:9/152: unlink f2 0
2026-03-10T08:55:21.387 INFO:tasks.workunit.client.1.vm08.stdout:2/475: dread d1/da/d10/d2d/f4d [0,4194304] 0
2026-03-10T08:55:21.387 INFO:tasks.workunit.client.0.vm05.stdout:4/222: mknod d0/d2e/d42/d45/d4a/d36/c4b 0
2026-03-10T08:55:21.388 INFO:tasks.workunit.client.1.vm08.stdout:1/485: link d1/da/de/d24/d26/ca0 d1/da/d20/d3f/d49/d9c/ca6 0
2026-03-10T08:55:21.390 INFO:tasks.workunit.client.1.vm08.stdout:4/484: creat d5/d23/d36/fa9 x:0 0 0
2026-03-10T08:55:21.401 INFO:tasks.workunit.client.1.vm08.stdout:2/476: readlink d1/da/d10/d1b/l11 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.1.vm08.stdout:1/486: stat d1/da/de/d24/c34 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.1.vm08.stdout:2/477: dwrite d1/d43/f4b [0,4194304] 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.1.vm08.stdout:1/487: fsync d1/f8 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:2/190: write d0/f16 [287581,39668] 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:1/281: fdatasync dd/d21/d37/d45/f47 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:9/153: creat d6/d19/d21/f31 x:0 0 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:4/223: rename d0/d1d/c38 to d0/d2e/d42/d45/d4a/c4c 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:2/191: stat d0/d9/d1e/d20/d24/c25 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:1/282: mkdir dd/d21/d3f/d4a/d62 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:9/154: creat d6/d19/d21/f32 x:0 0 0
2026-03-10T08:55:21.402 INFO:tasks.workunit.client.0.vm05.stdout:4/224: rmdir d0/d2e/d42 39
2026-03-10T08:55:21.405 INFO:tasks.workunit.client.0.vm05.stdout:4/225: dwrite d0/d2c/f2f [0,4194304] 0
2026-03-10T08:55:21.405 INFO:tasks.workunit.client.0.vm05.stdout:6/185: getdents d4/d7/d10/d15/d20 0
2026-03-10T08:55:21.408 INFO:tasks.workunit.client.0.vm05.stdout:1/283: symlink dd/d10/d18/d2d/l63 0
2026-03-10T08:55:21.411 INFO:tasks.workunit.client.0.vm05.stdout:1/284: chown dd/d21/f3a 0 1
2026-03-10T08:55:21.416 INFO:tasks.workunit.client.0.vm05.stdout:2/192: dwrite d0/f2f [0,4194304] 0
2026-03-10T08:55:21.419 INFO:tasks.workunit.client.0.vm05.stdout:9/155: symlink d6/l33 0
2026-03-10T08:55:21.423 INFO:tasks.workunit.client.0.vm05.stdout:4/226: symlink d0/d1d/d30/d32/d41/l4d 0
2026-03-10T08:55:21.428 INFO:tasks.workunit.client.0.vm05.stdout:4/227: dwrite d0/f18 [0,4194304] 0
2026-03-10T08:55:21.430 INFO:tasks.workunit.client.1.vm08.stdout:4/485: getdents d5 0
2026-03-10T08:55:21.439 INFO:tasks.workunit.client.0.vm05.stdout:0/199: fdatasync f6 0
2026-03-10T08:55:21.440 INFO:tasks.workunit.client.0.vm05.stdout:0/200: dread - df/d1f/f36 zero size
2026-03-10T08:55:21.445 INFO:tasks.workunit.client.0.vm05.stdout:2/193: dread d0/d9/d1e/f34 [0,4194304] 0
2026-03-10T08:55:21.452 INFO:tasks.workunit.client.1.vm08.stdout:3/428: fdatasync d4/d15/d8/d2a/d79/f80 0
2026-03-10T08:55:21.452 INFO:tasks.workunit.client.0.vm05.stdout:2/194: chown d0/f18 23 1
2026-03-10T08:55:21.453 INFO:tasks.workunit.client.0.vm05.stdout:2/195: write d0/f10 [2619402,66433] 0
2026-03-10T08:55:21.453 INFO:tasks.workunit.client.0.vm05.stdout:2/196: chown d0/d9/d1e/d20/d24 3 1
2026-03-10T08:55:21.453 INFO:tasks.workunit.client.0.vm05.stdout:2/197: dread d0/d9/d1e/f34 [0,4194304] 0
2026-03-10T08:55:21.457 INFO:tasks.workunit.client.0.vm05.stdout:1/285: symlink dd/d55/l64 0
2026-03-10T08:55:21.457 INFO:tasks.workunit.client.0.vm05.stdout:1/286: chown dd/d10/d19/d27/l2b 3 1
2026-03-10T08:55:21.457 INFO:tasks.workunit.client.1.vm08.stdout:4/486: truncate d5/de/f72 282966 0
2026-03-10T08:55:21.458 INFO:tasks.workunit.client.1.vm08.stdout:4/487: write d5/f14 [2298656,67514] 0
2026-03-10T08:55:21.462 INFO:tasks.workunit.client.1.vm08.stdout:1/488: read d1/da/d18/d3b/d62/f76 [3415021,111048] 0
2026-03-10T08:55:21.469 INFO:tasks.workunit.client.1.vm08.stdout:4/488: creat d5/d23/d36/d76/faa x:0 0 0
2026-03-10T08:55:21.475 INFO:tasks.workunit.client.0.vm05.stdout:1/287: mknod dd/d21/d3f/c65 0
2026-03-10T08:55:21.480 INFO:tasks.workunit.client.1.vm08.stdout:0/396: rmdir d6/dd/d13/d17/d1f/d20/d2f/d57 39
2026-03-10T08:55:21.481 INFO:tasks.workunit.client.1.vm08.stdout:1/489: read d1/da/f39 [1682117,91178] 0
2026-03-10T08:55:21.482 INFO:tasks.workunit.client.1.vm08.stdout:1/490: stat d1/da/de/d24/d3d/d40/d56/d6b/f8f 0
2026-03-10T08:55:21.482 INFO:tasks.workunit.client.1.vm08.stdout:1/491: chown d1/da/d20/d91/d83 1 1
2026-03-10T08:55:21.483 INFO:tasks.workunit.client.1.vm08.stdout:1/492: chown d1/da/de/d24/d3d/d40/f42 1829305994 1
2026-03-10T08:55:21.484 INFO:tasks.workunit.client.0.vm05.stdout:2/198: sync
2026-03-10T08:55:21.488 INFO:tasks.workunit.client.0.vm05.stdout:0/201: symlink df/d18/d2b/d27/d32/l38 0
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.1.vm08.stdout:1/493: mkdir d1/da/d18/d3a/da7 0
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.1.vm08.stdout:1/494: chown d1/da/d4b/f4f 155248292 1
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.1.vm08.stdout:1/495: rename d1/da/de/d24/d81/f88 to d1/da/de/d24/d35/d6d/fa8 0
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.0.vm05.stdout:0/202: chown df 2504 1
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.0.vm05.stdout:0/203: dwrite fe [0,4194304] 0
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.0.vm05.stdout:1/288: rename dd/d21/c2f to dd/d10/d18/d20/c66 0
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.0.vm05.stdout:4/228: creat d0/d2e/f4e x:0 0 0
2026-03-10T08:55:21.505 INFO:tasks.workunit.client.1.vm08.stdout:1/496: chown d1/da/d18/f1d 213 1
2026-03-10T08:55:21.507 INFO:tasks.workunit.client.0.vm05.stdout:0/204: rmdir df/d1f 39
2026-03-10T08:55:21.509 INFO:tasks.workunit.client.0.vm05.stdout:1/289: write dd/d10/d18/d2d/d51/d58/f5b [897562,120608] 0
2026-03-10T08:55:21.511 INFO:tasks.workunit.client.1.vm08.stdout:1/497: creat d1/da/de/d24/d35/fa9 x:0 0 0
2026-03-10T08:55:21.511 INFO:tasks.workunit.client.0.vm05.stdout:1/290: dread dd/d13/f33 [0,4194304] 0
2026-03-10T08:55:21.514 INFO:tasks.workunit.client.0.vm05.stdout:4/229: sync
2026-03-10T08:55:21.514 INFO:tasks.workunit.client.0.vm05.stdout:2/199: rename d0/f18 to d0/d9/d1e/d20/f3a 0
2026-03-10T08:55:21.516 INFO:tasks.workunit.client.0.vm05.stdout:2/200: write d0/d9/d1e/d20/d21/f31 [205222,3305] 0
2026-03-10T08:55:21.516 INFO:tasks.workunit.client.0.vm05.stdout:2/201: readlink d0/d9/d1e/d20/d21/l2b 0
2026-03-10T08:55:21.519 INFO:tasks.workunit.client.1.vm08.stdout:1/498: mkdir d1/da/d20/d91/daa 0
2026-03-10T08:55:21.519 INFO:tasks.workunit.client.0.vm05.stdout:9/156: fsync f3 0
2026-03-10T08:55:21.519 INFO:tasks.workunit.client.1.vm08.stdout:1/499: readlink d1/da/d4b/d4e/l8c 0
2026-03-10T08:55:21.520 INFO:tasks.workunit.client.1.vm08.stdout:1/500: read d1/da/d20/d3f/d49/f71 [1648849,115483] 0
2026-03-10T08:55:21.523 INFO:tasks.workunit.client.0.vm05.stdout:9/157: dwrite f4 [0,4194304] 0
2026-03-10T08:55:21.529 INFO:tasks.workunit.client.0.vm05.stdout:3/217: write d9/d2b/d2f/f33 [527150,24646] 0
2026-03-10T08:55:21.529 INFO:tasks.workunit.client.0.vm05.stdout:9/158: dread - d6/f30 zero size
2026-03-10T08:55:21.537 INFO:tasks.workunit.client.1.vm08.stdout:1/501: mknod d1/da/de/d24/d3d/d40/d5b/cab 0
2026-03-10T08:55:21.552 INFO:tasks.workunit.client.0.vm05.stdout:9/159: creat d6/d12/f34 x:0 0 0
2026-03-10T08:55:21.558 INFO:tasks.workunit.client.1.vm08.stdout:1/502: creat d1/fac x:0 0 0
2026-03-10T08:55:21.558 INFO:tasks.workunit.client.0.vm05.stdout:1/291: fsync f7 0
2026-03-10T08:55:21.558 INFO:tasks.workunit.client.0.vm05.stdout:1/292: write dd/d10/d19/d27/f4e [742921,108064] 0
2026-03-10T08:55:21.562 INFO:tasks.workunit.client.1.vm08.stdout:3/429: dread d4/d15/d8/d2c/d55/f75 [0,4194304] 0
2026-03-10T08:55:21.565 INFO:tasks.workunit.client.1.vm08.stdout:3/430: dwrite d4/d15/d8/d1d/f2d [0,4194304] 0
2026-03-10T08:55:21.582 INFO:tasks.workunit.client.0.vm05.stdout:9/160: mkdir d6/d15/d35 0
2026-03-10T08:55:21.591 INFO:tasks.workunit.client.1.vm08.stdout:1/503: getdents d1/da/de/d24/d3d/d40 0
2026-03-10T08:55:21.594 INFO:tasks.workunit.client.0.vm05.stdout:1/293: mknod dd/d10/d18/d2d/c67 0
2026-03-10T08:55:21.598 INFO:tasks.workunit.client.1.vm08.stdout:1/504: creat d1/da/de/fad x:0 0 0
2026-03-10T08:55:21.602 INFO:tasks.workunit.client.0.vm05.stdout:1/294: dwrite dd/d21/f48 [4194304,4194304] 0
2026-03-10T08:55:21.607 INFO:tasks.workunit.client.0.vm05.stdout:9/161: rename d6/c1b to d6/d19/c36 0
2026-03-10T08:55:21.612 INFO:tasks.workunit.client.0.vm05.stdout:1/295: dwrite dd/d21/f3a [4194304,4194304] 0
2026-03-10T08:55:21.612 INFO:tasks.workunit.client.0.vm05.stdout:1/296: chown dd/d21/d37/d45/f47 95283 1
2026-03-10T08:55:21.613 INFO:tasks.workunit.client.0.vm05.stdout:1/297: write dd/d21/d37/f39 [840011,108480] 0
2026-03-10T08:55:21.616 INFO:tasks.workunit.client.1.vm08.stdout:6/485: dread d9/dc/d11/d23/d2c/f3d [0,4194304] 0
2026-03-10T08:55:21.620 INFO:tasks.workunit.client.1.vm08.stdout:3/431: getdents d4/d15/d8/d2c/d55 0
2026-03-10T08:55:21.620 INFO:tasks.workunit.client.0.vm05.stdout:9/162: mkdir d6/d15/d37 0
2026-03-10T08:55:21.621 INFO:tasks.workunit.client.1.vm08.stdout:1/505: symlink d1/da/d18/d3a/da7/lae 0
2026-03-10T08:55:21.622 INFO:tasks.workunit.client.1.vm08.stdout:1/506: dread - d1/da/de/d24/d35/fa9 zero size
2026-03-10T08:55:21.623 INFO:tasks.workunit.client.1.vm08.stdout:1/507: dread - d1/da/de/d24/d3d/d40/d84/fa5 zero size
2026-03-10T08:55:21.624 INFO:tasks.workunit.client.1.vm08.stdout:3/432: write d4/d15/d8/d1d/f6e [3483299,53206] 0
2026-03-10T08:55:21.625 INFO:tasks.workunit.client.0.vm05.stdout:9/163: creat d6/d15/d35/f38 x:0 0 0
2026-03-10T08:55:21.627 INFO:tasks.workunit.client.0.vm05.stdout:9/164: unlink d6/f1e 0
2026-03-10T08:55:21.627 INFO:tasks.workunit.client.0.vm05.stdout:9/165: readlink d6/d12/l22 0
2026-03-10T08:55:21.629 INFO:tasks.workunit.client.1.vm08.stdout:1/508: rename d1/da/d18/d3b/f89 to d1/da/d18/d3b/faf 0
2026-03-10T08:55:21.630 INFO:tasks.workunit.client.0.vm05.stdout:9/166: rename d6/l33 to d6/d27/l39 0
2026-03-10T08:55:21.635 INFO:tasks.workunit.client.1.vm08.stdout:1/509: mknod d1/da/d20/d9e/cb0 0
2026-03-10T08:55:21.636 INFO:tasks.workunit.client.0.vm05.stdout:9/167: read d6/d15/f24 [562581,112303] 0
2026-03-10T08:55:21.636 INFO:tasks.workunit.client.0.vm05.stdout:9/168: mkdir d6/d12/d3a 0
2026-03-10T08:55:21.637 INFO:tasks.workunit.client.0.vm05.stdout:9/169: mknod d6/d15/d37/c3b 0
2026-03-10T08:55:21.638 INFO:tasks.workunit.client.1.vm08.stdout:1/510: creat d1/da/d18/fb1 x:0 0 0
2026-03-10T08:55:21.638 INFO:tasks.workunit.client.1.vm08.stdout:1/511: chown d1/da/d20/d91/daa 30733 1
2026-03-10T08:55:21.638 INFO:tasks.workunit.client.0.vm05.stdout:9/170: chown d6/cc 108585667 1
2026-03-10T08:55:21.639 INFO:tasks.workunit.client.0.vm05.stdout:9/171: mkdir d6/d15/d3c 0
2026-03-10T08:55:21.642 INFO:tasks.workunit.client.1.vm08.stdout:1/512: rename d1/da/de/d24/d35/d43/f7d to d1/da/de/d24/d35/d43/fb2 0
2026-03-10T08:55:21.642 INFO:tasks.workunit.client.1.vm08.stdout:1/513: chown d1/da/d18/d3a/da7/lae 14224 1
2026-03-10T08:55:21.643 INFO:tasks.workunit.client.1.vm08.stdout:3/433: sync
2026-03-10T08:55:21.643 INFO:tasks.workunit.client.0.vm05.stdout:9/172: dwrite f4 [0,4194304] 0
2026-03-10T08:55:21.655 INFO:tasks.workunit.client.1.vm08.stdout:1/514: symlink d1/da/de/d24/d3d/d40/d92/lb3 0
2026-03-10T08:55:21.655 INFO:tasks.workunit.client.1.vm08.stdout:3/434: mkdir d4/d15/d8/d2a/d79/d8f 0
2026-03-10T08:55:21.657 INFO:tasks.workunit.client.1.vm08.stdout:3/435: stat d4/d15/f3f 0
2026-03-10T08:55:21.658 INFO:tasks.workunit.client.1.vm08.stdout:3/436: write d4/d15/d8/d1d/f73 [1348493,99207] 0
2026-03-10T08:55:21.660 INFO:tasks.workunit.client.1.vm08.stdout:3/437: creat d4/d15/d8/d2c/f90 x:0 0 0
2026-03-10T08:55:21.661 INFO:tasks.workunit.client.1.vm08.stdout:3/438: creat d4/d15/d8/d2a/d79/d8f/f91 x:0 0 0
2026-03-10T08:55:21.661 INFO:tasks.workunit.client.1.vm08.stdout:3/439: fsync d4/d15/f3f 0
2026-03-10T08:55:21.663 INFO:tasks.workunit.client.1.vm08.stdout:3/440: unlink d4/d15/f3f 0
2026-03-10T08:55:21.663 INFO:tasks.workunit.client.1.vm08.stdout:3/441: rename d4/d6f to d4/d6f/d92 22
2026-03-10T08:55:21.664 INFO:tasks.workunit.client.1.vm08.stdout:0/397: read d6/dd/d13/d32/f34 [244049,8112] 0
2026-03-10T08:55:21.672 INFO:tasks.workunit.client.1.vm08.stdout:0/398: dread d6/f16 [0,4194304] 0
2026-03-10T08:55:21.673 INFO:tasks.workunit.client.1.vm08.stdout:0/399: symlink d6/l7b 0
2026-03-10T08:55:21.674 INFO:tasks.workunit.client.1.vm08.stdout:0/400: mknod d6/dd/d13/c7c 0
2026-03-10T08:55:21.684 INFO:tasks.workunit.client.0.vm05.stdout:5/200: truncate d5/df/f1c 3648422 0
2026-03-10T08:55:21.692 INFO:tasks.workunit.client.0.vm05.stdout:5/201: chown d5/df/l32 396301831 1
2026-03-10T08:55:21.692 INFO:tasks.workunit.client.0.vm05.stdout:5/202: chown d5/df/c26 3780 1
2026-03-10T08:55:21.695 INFO:tasks.workunit.client.1.vm08.stdout:0/401: sync
2026-03-10T08:55:21.699 INFO:tasks.workunit.client.0.vm05.stdout:6/186: dread d4/d7/d10/d15/d20/f39 [0,4194304] 0
2026-03-10T08:55:21.699 INFO:tasks.workunit.client.0.vm05.stdout:6/187: chown d4/d2c/f35 379655935 1
2026-03-10T08:55:21.701 INFO:tasks.workunit.client.1.vm08.stdout:0/402: symlink d6/dd/d13/d17/d1f/d20/d2f/l7d 0
2026-03-10T08:55:21.704 INFO:tasks.workunit.client.0.vm05.stdout:6/188: symlink d4/d7/d10/d15/d20/l3d 0
2026-03-10T08:55:21.704 INFO:tasks.workunit.client.0.vm05.stdout:7/182: dread f4 [0,4194304] 0
2026-03-10T08:55:21.710 INFO:tasks.workunit.client.0.vm05.stdout:7/183: dwrite d18/d1b/f2c [0,4194304] 0
2026-03-10T08:55:21.711 INFO:tasks.workunit.client.0.vm05.stdout:7/184: truncate d18/d1b/d1f/f2d 278927 0
2026-03-10T08:55:21.715 INFO:tasks.workunit.client.0.vm05.stdout:5/203: dread d5/df/d12/f1b [4194304,4194304] 0
2026-03-10T08:55:21.721 INFO:tasks.workunit.client.0.vm05.stdout:6/189: rename d4/d7/d10/d15/d1b/d22/l37 to d4/d2c/l3e 0
2026-03-10T08:55:21.722 INFO:tasks.workunit.client.1.vm08.stdout:0/403: rename d6/fb to d6/dd/d13/d17/d1f/d20/d2f/d57/d77/f7e 0
2026-03-10T08:55:21.723 INFO:tasks.workunit.client.0.vm05.stdout:5/204: read d5/fc [1525215,53946] 0
2026-03-10T08:55:21.732 INFO:tasks.workunit.client.0.vm05.stdout:6/190: rename d4/d7/d10/d15/d1b/f23 to d4/d7/d10/d15/d1b/f3f 0
2026-03-10T08:55:21.732 INFO:tasks.workunit.client.0.vm05.stdout:6/191: chown d4/f21 3 1
2026-03-10T08:55:21.732 INFO:tasks.workunit.client.0.vm05.stdout:6/192: dread - d4/d7/d10/d15/f2a zero size
2026-03-10T08:55:21.734 INFO:tasks.workunit.client.0.vm05.stdout:5/205: dread d5/fd [0,4194304] 0
2026-03-10T08:55:21.739 INFO:tasks.workunit.client.0.vm05.stdout:7/185: dread f3 [0,4194304] 0
2026-03-10T08:55:21.740 INFO:tasks.workunit.client.0.vm05.stdout:7/186: fsync d18/f1d 0
2026-03-10T08:55:21.740 INFO:tasks.workunit.client.0.vm05.stdout:7/187: chown c12 19721 1
2026-03-10T08:55:21.747 INFO:tasks.workunit.client.0.vm05.stdout:7/188: dread d18/d1b/d1f/d25/d2e/d2f/f33 [0,4194304] 0
2026-03-10T08:55:21.748 INFO:tasks.workunit.client.0.vm05.stdout:7/189: fdatasync d18/d1b/d1f/f2d 0
2026-03-10T08:55:21.749 INFO:tasks.workunit.client.1.vm08.stdout:8/518: write d1/d10/d9/dd/f91 [891566,79852] 0
2026-03-10T08:55:21.754 INFO:tasks.workunit.client.0.vm05.stdout:6/193: chown d4/d7/d10/d15/d20/l33 1729535411 1
2026-03-10T08:55:21.760 INFO:tasks.workunit.client.1.vm08.stdout:7/476: truncate d0/f25 327236 0
2026-03-10T08:55:21.768 INFO:tasks.workunit.client.1.vm08.stdout:9/422: dread d2/dd/d15/d1e/d39/d4e/f71 [0,4194304] 0
2026-03-10T08:55:21.768 INFO:tasks.workunit.client.1.vm08.stdout:9/423: readlink d2/dd/le 0
2026-03-10T08:55:21.769 INFO:tasks.workunit.client.0.vm05.stdout:5/206: rename d5/df/d12/d24/d2c/f3d to d5/df/d37/f42 0
2026-03-10T08:55:21.770 INFO:tasks.workunit.client.0.vm05.stdout:3/218: read d9/f19 [452795,57472] 0
2026-03-10T08:55:21.783 INFO:tasks.workunit.client.1.vm08.stdout:2/478: dwrite d1/da/d10/d42/d93/d1e/f1f [0,4194304] 0
2026-03-10T08:55:21.791 INFO:tasks.workunit.client.0.vm05.stdout:8/148: dread d2/f5 [0,4194304] 0
2026-03-10T08:55:21.792 INFO:tasks.workunit.client.0.vm05.stdout:8/149: write d2/ff [4620667,102701] 0
2026-03-10T08:55:21.799 INFO:tasks.workunit.client.0.vm05.stdout:7/190: creat d18/d1b/d1f/d25/f36 x:0 0 0
2026-03-10T08:55:21.801 INFO:tasks.workunit.client.1.vm08.stdout:4/489: dwrite d5/d2f/f84 [0,4194304] 0
2026-03-10T08:55:21.805 INFO:tasks.workunit.client.0.vm05.stdout:0/205: dwrite df/f13 [0,4194304] 0
2026-03-10T08:55:21.825 INFO:tasks.workunit.client.1.vm08.stdout:8/519: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dba 0
2026-03-10T08:55:21.830 INFO:tasks.workunit.client.0.vm05.stdout:2/202: truncate d0/f10 2694427 0
2026-03-10T08:55:21.830 INFO:tasks.workunit.client.0.vm05.stdout:1/298: stat dd/d10/d18/d20/c66 0
2026-03-10T08:55:21.834 INFO:tasks.workunit.client.0.vm05.stdout:2/203: dread d0/d9/d1e/f34 [0,4194304] 0
2026-03-10T08:55:21.840 INFO:tasks.workunit.client.1.vm08.stdout:8/520: rename d1/d10/d9/dd/d25/d27/d44/l4b to d1/d10/d9/dd/d25/d27/d44/d97/lbb 0
2026-03-10T08:55:21.840 INFO:tasks.workunit.client.1.vm08.stdout:8/521: readlink d1/laa 0
2026-03-10T08:55:21.843 INFO:tasks.workunit.client.0.vm05.stdout:7/191: dwrite f4 [0,4194304] 0
2026-03-10T08:55:21.845 INFO:tasks.workunit.client.1.vm08.stdout:8/522: dread - d1/d10/d9/dd/d25/d27/d44/d97/f9c zero size
2026-03-10T08:55:21.851 INFO:tasks.workunit.client.0.vm05.stdout:0/206: mkdir df/d18/d19/d39 0
2026-03-10T08:55:21.853 INFO:tasks.workunit.client.0.vm05.stdout:3/219: mkdir d9/d2b/d3a 0
2026-03-10T08:55:21.856 INFO:tasks.workunit.client.1.vm08.stdout:6/486: write d9/dc/d11/f55 [391130,84190] 0
2026-03-10T08:55:21.856 INFO:tasks.workunit.client.1.vm08.stdout:2/479: dread d1/da/d10/d1b/f28 [0,4194304] 0
2026-03-10T08:55:21.857 INFO:tasks.workunit.client.1.vm08.stdout:2/480: stat d1/d43 0
2026-03-10T08:55:21.858 INFO:tasks.workunit.client.0.vm05.stdout:9/173: getdents d6/d15/d35 0
2026-03-10T08:55:21.861 INFO:tasks.workunit.client.1.vm08.stdout:8/523: dwrite d1/d10/d9/dd/d18/d34/f57 [0,4194304] 0
2026-03-10T08:55:21.873 INFO:tasks.workunit.client.1.vm08.stdout:6/487: dread d9/d10/f53 [0,4194304] 0
2026-03-10T08:55:21.875 INFO:tasks.workunit.client.0.vm05.stdout:2/204: creat d0/d9/f3b x:0 0 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:2/205: stat d0/d9/d1e/f34 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:1/299: unlink dd/d10/d19/c49 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:1/300: dwrite dd/d21/f48 [8388608,4194304] 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:2/206: symlink d0/d9/d27/l3c 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:9/174: creat d6/d19/d2c/f3d x:0 0 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:1/301: dread dd/d13/f33 [0,4194304] 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:4/230: mkdir d0/d1d/d30/d49/d4f 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.0.vm05.stdout:3/220: creat d9/d2b/f3b x:0 0 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:8/524: symlink d1/d10/d9/d8a/lbc 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:8/525: write d1/d10/d9/dd/f91 [1959206,2248] 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:2/481: link d1/da/d10/d42/d93/d22/f8a d1/da/d10/d42/f96 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:3/442: dwrite d4/f18 [0,4194304] 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:6/488: mknod d9/dc/ca6 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:8/526: creat d1/d10/d9/dd/d25/d27/d44/d21/d5f/fbd x:0 0 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:8/527: dwrite d1/d10/fac [0,4194304] 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:8/528: unlink d1/d10/d9/dd/d25/d27/d44/d21/cb9 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:8/529: mknod d1/d10/d9/dd/d18/cbe 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:3/443: getdents d4/d15/d8/d71 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:3/444: write d4/f18 [4910242,45607] 0
2026-03-10T08:55:21.908 INFO:tasks.workunit.client.1.vm08.stdout:4/490: sync
2026-03-10T08:55:21.914 INFO:tasks.workunit.client.1.vm08.stdout:8/530:
dwrite d1/d10/f2a [0,4194304] 0 2026-03-10T08:55:21.925 INFO:tasks.workunit.client.1.vm08.stdout:3/445: mkdir d4/d15/d8/d2c/d55/d93 0 2026-03-10T08:55:21.927 INFO:tasks.workunit.client.1.vm08.stdout:3/446: dread d4/f18 [0,4194304] 0 2026-03-10T08:55:21.930 INFO:tasks.workunit.client.0.vm05.stdout:4/231: creat d0/d1d/f50 x:0 0 0 2026-03-10T08:55:21.930 INFO:tasks.workunit.client.0.vm05.stdout:7/192: getdents d18/d1b 0 2026-03-10T08:55:21.948 INFO:tasks.workunit.client.0.vm05.stdout:9/175: symlink d6/d15/d3c/l3e 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.0.vm05.stdout:5/207: rmdir d5/df 39 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.0.vm05.stdout:3/221: creat d9/f3c x:0 0 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:8/531: mkdir d1/d4f/d60/dbf 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:3/447: unlink d4/l4c 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:8/532: mknod d1/d10/d9/dd/d9a/da6/cc0 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:8/533: stat d1/d10/d9/dd/d25/d27/d44/d21 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:3/448: fsync f1 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:0/404: write d6/dd/d13/d17/d1f/d20/d2f/d24/f68 [3381931,48530] 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:0/405: truncate d6/f62 1027639 0 2026-03-10T08:55:21.951 INFO:tasks.workunit.client.1.vm08.stdout:3/449: chown d4/d15/d8/f1e 3841927 1 2026-03-10T08:55:21.954 INFO:tasks.workunit.client.0.vm05.stdout:4/232: creat d0/d1d/d30/d49/d4f/f51 x:0 0 0 2026-03-10T08:55:21.955 INFO:tasks.workunit.client.0.vm05.stdout:4/233: dread - d0/d2e/f4e zero size 2026-03-10T08:55:21.957 INFO:tasks.workunit.client.1.vm08.stdout:0/406: mknod d6/dd/d13/d17/d1f/d20/c7f 0 2026-03-10T08:55:21.958 INFO:tasks.workunit.client.1.vm08.stdout:3/450: rename d4/d15/d8/c8a to d4/d15/d8/d2c/d6d/c94 0 2026-03-10T08:55:21.958 
INFO:tasks.workunit.client.0.vm05.stdout:9/176: creat d6/f3f x:0 0 0 2026-03-10T08:55:21.963 INFO:tasks.workunit.client.1.vm08.stdout:0/407: creat d6/dd/d13/d17/d1f/d20/d2f/d26/f80 x:0 0 0 2026-03-10T08:55:21.966 INFO:tasks.workunit.client.0.vm05.stdout:3/222: rename d9/c2a to d9/d2b/c3d 0 2026-03-10T08:55:21.966 INFO:tasks.workunit.client.0.vm05.stdout:0/207: sync 2026-03-10T08:55:21.966 INFO:tasks.workunit.client.0.vm05.stdout:0/208: stat df 0 2026-03-10T08:55:21.967 INFO:tasks.workunit.client.1.vm08.stdout:0/408: getdents d6/dd/d13/d61/d6f 0 2026-03-10T08:55:21.967 INFO:tasks.workunit.client.0.vm05.stdout:0/209: mkdir df/d18/d2b/d3a 0 2026-03-10T08:55:21.968 INFO:tasks.workunit.client.0.vm05.stdout:0/210: fdatasync df/f1a 0 2026-03-10T08:55:21.968 INFO:tasks.workunit.client.1.vm08.stdout:8/534: sync 2026-03-10T08:55:21.971 INFO:tasks.workunit.client.0.vm05.stdout:0/211: dwrite fe [0,4194304] 0 2026-03-10T08:55:21.978 INFO:tasks.workunit.client.0.vm05.stdout:0/212: dread df/d18/d2b/d27/f2e [0,4194304] 0 2026-03-10T08:55:21.978 INFO:tasks.workunit.client.1.vm08.stdout:0/409: mknod d6/dd/d13/d17/d1f/c81 0 2026-03-10T08:55:21.978 INFO:tasks.workunit.client.1.vm08.stdout:8/535: symlink d1/d10/d9/dd/d13/d40/lc1 0 2026-03-10T08:55:21.978 INFO:tasks.workunit.client.1.vm08.stdout:0/410: fsync d6/dd/d13/d32/f34 0 2026-03-10T08:55:21.980 INFO:tasks.workunit.client.0.vm05.stdout:0/213: dwrite df/d18/f24 [0,4194304] 0 2026-03-10T08:55:21.987 INFO:tasks.workunit.client.0.vm05.stdout:0/214: truncate df/d18/f2a 5154085 0 2026-03-10T08:55:21.988 INFO:tasks.workunit.client.0.vm05.stdout:0/215: readlink df/d18/l34 0 2026-03-10T08:55:21.988 INFO:tasks.workunit.client.0.vm05.stdout:0/216: readlink df/d18/l34 0 2026-03-10T08:55:21.988 INFO:tasks.workunit.client.0.vm05.stdout:0/217: chown df/d18/d19/c30 29458911 1 2026-03-10T08:55:21.990 INFO:tasks.workunit.client.0.vm05.stdout:0/218: dread df/d18/d2b/d27/f2e [0,4194304] 0 2026-03-10T08:55:21.990 
INFO:tasks.workunit.client.1.vm08.stdout:0/411: link d6/dd/d13/d32/f34 d6/dd/d13/d17/f82 0 2026-03-10T08:55:22.002 INFO:tasks.workunit.client.1.vm08.stdout:9/424: write d2/dd/d15/d1e/d25/f4b [1307723,13009] 0 2026-03-10T08:55:22.004 INFO:tasks.workunit.client.1.vm08.stdout:0/412: symlink d6/dd/d13/d17/l83 0 2026-03-10T08:55:22.005 INFO:tasks.workunit.client.1.vm08.stdout:9/425: mkdir d2/dd/d15/d1e/d39/d4e/d87 0 2026-03-10T08:55:22.005 INFO:tasks.workunit.client.1.vm08.stdout:7/477: dwrite d0/d11/d1f/d29/d3d/d89/f8b [0,4194304] 0 2026-03-10T08:55:22.007 INFO:tasks.workunit.client.0.vm05.stdout:0/219: dread f6 [0,4194304] 0 2026-03-10T08:55:22.019 INFO:tasks.workunit.client.1.vm08.stdout:2/482: dread d1/da/d10/d42/f89 [0,4194304] 0 2026-03-10T08:55:22.022 INFO:tasks.workunit.client.1.vm08.stdout:0/413: mknod d6/dd/d13/d17/d50/c84 0 2026-03-10T08:55:22.024 INFO:tasks.workunit.client.0.vm05.stdout:0/220: creat df/d18/d2b/f3b x:0 0 0 2026-03-10T08:55:22.025 INFO:tasks.workunit.client.0.vm05.stdout:1/302: read dd/d10/d18/d2d/d51/d58/f5b [441090,58477] 0 2026-03-10T08:55:22.026 INFO:tasks.workunit.client.1.vm08.stdout:2/483: mkdir d1/d97 0 2026-03-10T08:55:22.032 INFO:tasks.workunit.client.0.vm05.stdout:1/303: mknod dd/c68 0 2026-03-10T08:55:22.035 INFO:tasks.workunit.client.0.vm05.stdout:1/304: mkdir dd/d10/d18/d20/d69 0 2026-03-10T08:55:22.037 INFO:tasks.workunit.client.1.vm08.stdout:7/478: creat d0/d11/d1f/f90 x:0 0 0 2026-03-10T08:55:22.038 INFO:tasks.workunit.client.0.vm05.stdout:1/305: mknod dd/d21/d3f/c6a 0 2026-03-10T08:55:22.041 INFO:tasks.workunit.client.0.vm05.stdout:1/306: rename dd/d21/f54 to dd/d10/d18/d2d/d51/f6b 0 2026-03-10T08:55:22.042 INFO:tasks.workunit.client.0.vm05.stdout:1/307: creat dd/d10/d18/d20/f6c x:0 0 0 2026-03-10T08:55:22.044 INFO:tasks.workunit.client.1.vm08.stdout:7/479: truncate d0/d11/d4a/f4f 2756547 0 2026-03-10T08:55:22.044 INFO:tasks.workunit.client.0.vm05.stdout:1/308: creat dd/d10/d18/d2d/f6d x:0 0 0 2026-03-10T08:55:22.046 
INFO:tasks.workunit.client.1.vm08.stdout:7/480: fsync d0/d14/d43/f58 0 2026-03-10T08:55:22.047 INFO:tasks.workunit.client.0.vm05.stdout:1/309: creat dd/d10/d18/d2d/d51/f6e x:0 0 0 2026-03-10T08:55:22.048 INFO:tasks.workunit.client.1.vm08.stdout:7/481: dread d0/d11/d1f/d29/d3d/f74 [0,4194304] 0 2026-03-10T08:55:22.050 INFO:tasks.workunit.client.0.vm05.stdout:7/193: rmdir d18 39 2026-03-10T08:55:22.056 INFO:tasks.workunit.client.0.vm05.stdout:1/310: creat dd/d21/f6f x:0 0 0 2026-03-10T08:55:22.062 INFO:tasks.workunit.client.0.vm05.stdout:7/194: link l1 d18/d1b/d1f/d25/d2e/l37 0 2026-03-10T08:55:22.068 INFO:tasks.workunit.client.0.vm05.stdout:7/195: dwrite f15 [0,4194304] 0 2026-03-10T08:55:22.070 INFO:tasks.workunit.client.0.vm05.stdout:7/196: chown l13 0 1 2026-03-10T08:55:22.071 INFO:tasks.workunit.client.0.vm05.stdout:1/311: dread dd/d21/f3a [0,4194304] 0 2026-03-10T08:55:22.077 INFO:tasks.workunit.client.0.vm05.stdout:1/312: unlink dd/d10/d19/d4d/c61 0 2026-03-10T08:55:22.084 INFO:tasks.workunit.client.0.vm05.stdout:1/313: unlink dd/f11 0 2026-03-10T08:55:22.084 INFO:tasks.workunit.client.0.vm05.stdout:1/314: read - dd/d10/d18/d2d/d51/f6e zero size 2026-03-10T08:55:22.085 INFO:tasks.workunit.client.0.vm05.stdout:1/315: write dd/d21/f48 [7381049,60118] 0 2026-03-10T08:55:22.088 INFO:tasks.workunit.client.1.vm08.stdout:9/426: dread d2/dd/d15/d1e/d39/d4e/f78 [0,4194304] 0 2026-03-10T08:55:22.095 INFO:tasks.workunit.client.1.vm08.stdout:9/427: dread f1 [4194304,4194304] 0 2026-03-10T08:55:22.097 INFO:tasks.workunit.client.1.vm08.stdout:9/428: truncate d2/dd/d15/d1e/d39/d4e/f78 533089 0 2026-03-10T08:55:22.097 INFO:tasks.workunit.client.0.vm05.stdout:8/150: write d2/dd/d2c/f34 [542289,15390] 0 2026-03-10T08:55:22.099 INFO:tasks.workunit.client.0.vm05.stdout:6/194: truncate d4/f21 2556653 0 2026-03-10T08:55:22.100 INFO:tasks.workunit.client.0.vm05.stdout:6/195: chown d4/d7/d10/d15/f2e 67 1 2026-03-10T08:55:22.104 INFO:tasks.workunit.client.1.vm08.stdout:7/482: sync 
2026-03-10T08:55:22.108 INFO:tasks.workunit.client.0.vm05.stdout:6/196: rename d4/d7/d10/d15/d1b/d22/l26 to d4/d7/d10/d1a/l40 0 2026-03-10T08:55:22.110 INFO:tasks.workunit.client.1.vm08.stdout:6/489: dwrite d9/d50/f75 [0,4194304] 0 2026-03-10T08:55:22.112 INFO:tasks.workunit.client.1.vm08.stdout:7/483: symlink d0/d11/d4a/d5e/l91 0 2026-03-10T08:55:22.115 INFO:tasks.workunit.client.1.vm08.stdout:7/484: readlink d0/d11/d1f/d2c/l3c 0 2026-03-10T08:55:22.122 INFO:tasks.workunit.client.1.vm08.stdout:9/429: mknod d2/dd/d15/d1e/d25/d32/c88 0 2026-03-10T08:55:22.132 INFO:tasks.workunit.client.1.vm08.stdout:6/490: mknod d9/d10/d1e/ca7 0 2026-03-10T08:55:22.132 INFO:tasks.workunit.client.0.vm05.stdout:6/197: rename d4/d7/d10/d15/d20/f39 to d4/d7/d10/d15/d38/f41 0 2026-03-10T08:55:22.133 INFO:tasks.workunit.client.1.vm08.stdout:7/485: mknod d0/d11/d1f/d29/d3b/c92 0 2026-03-10T08:55:22.134 INFO:tasks.workunit.client.1.vm08.stdout:9/430: fdatasync d2/dd/d15/d1e/d39/d4e/f55 0 2026-03-10T08:55:22.134 INFO:tasks.workunit.client.0.vm05.stdout:6/198: write d4/d7/f14 [165750,14745] 0 2026-03-10T08:55:22.136 INFO:tasks.workunit.client.0.vm05.stdout:6/199: read d4/d7/f34 [3913674,45124] 0 2026-03-10T08:55:22.139 INFO:tasks.workunit.client.1.vm08.stdout:9/431: fsync d2/dd/d15/d1e/d24/f3f 0 2026-03-10T08:55:22.142 INFO:tasks.workunit.client.0.vm05.stdout:6/200: symlink d4/l42 0 2026-03-10T08:55:22.144 INFO:tasks.workunit.client.1.vm08.stdout:9/432: truncate d2/d41/d4c/f7c 497030 0 2026-03-10T08:55:22.144 INFO:tasks.workunit.client.0.vm05.stdout:6/201: symlink d4/d7/d10/d15/d1b/l43 0 2026-03-10T08:55:22.145 INFO:tasks.workunit.client.1.vm08.stdout:7/486: dwrite d0/d11/d1f/d29/d3d/f59 [4194304,4194304] 0 2026-03-10T08:55:22.145 INFO:tasks.workunit.client.0.vm05.stdout:6/202: dread - d4/d7/d10/d15/f2a zero size 2026-03-10T08:55:22.145 INFO:tasks.workunit.client.0.vm05.stdout:6/203: chown d4/d7/d10/d15/f16 133973644 1 2026-03-10T08:55:22.146 INFO:tasks.workunit.client.1.vm08.stdout:9/433: 
read d2/dd/d15/f17 [3605042,95589] 0 2026-03-10T08:55:22.147 INFO:tasks.workunit.client.1.vm08.stdout:7/487: chown d0/d14/l4d 5 1 2026-03-10T08:55:22.150 INFO:tasks.workunit.client.1.vm08.stdout:7/488: write d0/d11/d1f/d29/f8d [71633,48640] 0 2026-03-10T08:55:22.155 INFO:tasks.workunit.client.1.vm08.stdout:0/414: dread f3 [0,4194304] 0 2026-03-10T08:55:22.157 INFO:tasks.workunit.client.1.vm08.stdout:9/434: fdatasync d2/dd/d15/d1e/d39/d4e/f78 0 2026-03-10T08:55:22.158 INFO:tasks.workunit.client.1.vm08.stdout:7/489: creat d0/d11/d4a/d5e/f93 x:0 0 0 2026-03-10T08:55:22.166 INFO:tasks.workunit.client.1.vm08.stdout:9/435: dwrite d2/dd/f16 [0,4194304] 0 2026-03-10T08:55:22.167 INFO:tasks.workunit.client.0.vm05.stdout:2/207: truncate d0/d9/f17 5034883 0 2026-03-10T08:55:22.173 INFO:tasks.workunit.client.1.vm08.stdout:4/491: dwrite d5/d23/d36/f44 [0,4194304] 0 2026-03-10T08:55:22.182 INFO:tasks.workunit.client.0.vm05.stdout:9/177: dwrite d6/d27/f2b [0,4194304] 0 2026-03-10T08:55:22.183 INFO:tasks.workunit.client.0.vm05.stdout:4/234: dwrite d0/f9 [0,4194304] 0 2026-03-10T08:55:22.184 INFO:tasks.workunit.client.0.vm05.stdout:3/223: write d9/ff [4921786,101192] 0 2026-03-10T08:55:22.184 INFO:tasks.workunit.client.0.vm05.stdout:4/235: fsync d0/f18 0 2026-03-10T08:55:22.184 INFO:tasks.workunit.client.1.vm08.stdout:3/451: getdents d4/d15/d8/d2c/d6d 0 2026-03-10T08:55:22.194 INFO:tasks.workunit.client.0.vm05.stdout:2/208: creat d0/d9/d1e/d20/d21/f3d x:0 0 0 2026-03-10T08:55:22.203 INFO:tasks.workunit.client.0.vm05.stdout:9/178: rmdir d6/d15/d37 39 2026-03-10T08:55:22.203 INFO:tasks.workunit.client.0.vm05.stdout:9/179: dwrite d6/d12/f34 [0,4194304] 0 2026-03-10T08:55:22.203 INFO:tasks.workunit.client.0.vm05.stdout:9/180: readlink d6/d15/d3c/l3e 0 2026-03-10T08:55:22.205 INFO:tasks.workunit.client.1.vm08.stdout:8/536: write d1/d10/d9/dd/d25/d27/d44/d21/d51/f56 [227654,51447] 0 2026-03-10T08:55:22.207 INFO:tasks.workunit.client.0.vm05.stdout:4/236: chown d0/d1d/d30/d32 6711 1 
2026-03-10T08:55:22.209 INFO:tasks.workunit.client.1.vm08.stdout:3/452: fdatasync d4/d15/d8/ff 0 2026-03-10T08:55:22.211 INFO:tasks.workunit.client.1.vm08.stdout:8/537: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fc2 x:0 0 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.1.vm08.stdout:4/492: creat d5/d2f/fab x:0 0 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.1.vm08.stdout:4/493: dread - d5/d23/d36/d76/faa zero size 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.1.vm08.stdout:4/494: readlink d5/d23/d49/d8f/l91 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.0.vm05.stdout:9/181: symlink d6/d27/l40 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.0.vm05.stdout:9/182: truncate d6/f30 249320 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.0.vm05.stdout:3/224: mknod d9/c3e 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.0.vm05.stdout:3/225: fsync f7 0 2026-03-10T08:55:22.217 INFO:tasks.workunit.client.1.vm08.stdout:0/415: sync 2026-03-10T08:55:22.219 INFO:tasks.workunit.client.0.vm05.stdout:0/221: truncate df/f12 1743021 0 2026-03-10T08:55:22.221 INFO:tasks.workunit.client.1.vm08.stdout:0/416: chown d6/dd/d13/d17/f1d 184 1 2026-03-10T08:55:22.223 INFO:tasks.workunit.client.0.vm05.stdout:2/209: link d0/fa d0/d9/d1e/d20/d21/f3e 0 2026-03-10T08:55:22.223 INFO:tasks.workunit.client.0.vm05.stdout:0/222: dwrite df/f13 [0,4194304] 0 2026-03-10T08:55:22.224 INFO:tasks.workunit.client.0.vm05.stdout:0/223: write fe [3281720,90978] 0 2026-03-10T08:55:22.227 INFO:tasks.workunit.client.0.vm05.stdout:3/226: write d9/f1a [3217117,93999] 0 2026-03-10T08:55:22.229 INFO:tasks.workunit.client.0.vm05.stdout:2/210: mknod d0/d9/d27/c3f 0 2026-03-10T08:55:22.229 INFO:tasks.workunit.client.1.vm08.stdout:8/538: mknod d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/cc3 0 2026-03-10T08:55:22.231 INFO:tasks.workunit.client.0.vm05.stdout:3/227: truncate d9/d2b/f2c 324372 0 2026-03-10T08:55:22.231 INFO:tasks.workunit.client.0.vm05.stdout:4/237: symlink 
d0/d2e/d42/d45/d4a/d36/d37/l52 0 2026-03-10T08:55:22.233 INFO:tasks.workunit.client.0.vm05.stdout:0/224: symlink df/d18/d19/d35/l3c 0 2026-03-10T08:55:22.233 INFO:tasks.workunit.client.1.vm08.stdout:8/539: read - d1/d10/d9/dd/d25/d27/d44/fa7 zero size 2026-03-10T08:55:22.233 INFO:tasks.workunit.client.0.vm05.stdout:3/228: chown d9/d2b/f34 56271 1 2026-03-10T08:55:22.234 INFO:tasks.workunit.client.1.vm08.stdout:0/417: mkdir d6/dd/d13/d17/d1f/d2d/d85 0 2026-03-10T08:55:22.242 INFO:tasks.workunit.client.1.vm08.stdout:4/495: getdents d5/d2f/d5a/d69 0 2026-03-10T08:55:22.242 INFO:tasks.workunit.client.1.vm08.stdout:8/540: creat d1/d4f/d60/fc4 x:0 0 0 2026-03-10T08:55:22.242 INFO:tasks.workunit.client.1.vm08.stdout:4/496: write d5/d2f/d5a/f90 [2376392,79707] 0 2026-03-10T08:55:22.244 INFO:tasks.workunit.client.0.vm05.stdout:2/211: creat d0/f40 x:0 0 0 2026-03-10T08:55:22.244 INFO:tasks.workunit.client.1.vm08.stdout:8/541: creat d1/d10/d9/dd/fc5 x:0 0 0 2026-03-10T08:55:22.248 INFO:tasks.workunit.client.0.vm05.stdout:3/229: rmdir d9/d39 0 2026-03-10T08:55:22.249 INFO:tasks.workunit.client.1.vm08.stdout:4/497: fsync d5/d2f/d5a/d69/f8c 0 2026-03-10T08:55:22.252 INFO:tasks.workunit.client.0.vm05.stdout:2/212: rename d0/d9/d1e/d20/f26 to d0/d9/d1e/d20/d21/f41 0 2026-03-10T08:55:22.254 INFO:tasks.workunit.client.1.vm08.stdout:0/418: getdents d6/dd/d13/d17/d1f/d20/d2f/d57/d77 0 2026-03-10T08:55:22.255 INFO:tasks.workunit.client.0.vm05.stdout:0/225: link df/d1f/c2c df/d18/d19/d39/c3d 0 2026-03-10T08:55:22.255 INFO:tasks.workunit.client.0.vm05.stdout:2/213: dread d0/f16 [0,4194304] 0 2026-03-10T08:55:22.255 INFO:tasks.workunit.client.0.vm05.stdout:2/214: readlink d0/d9/l13 0 2026-03-10T08:55:22.257 INFO:tasks.workunit.client.0.vm05.stdout:2/215: fdatasync d0/d9/d1e/d20/f3a 0 2026-03-10T08:55:22.257 INFO:tasks.workunit.client.1.vm08.stdout:0/419: creat d6/dd/d13/d61/f86 x:0 0 0 2026-03-10T08:55:22.258 INFO:tasks.workunit.client.0.vm05.stdout:2/216: truncate d0/f36 555745 0 
2026-03-10T08:55:22.259 INFO:tasks.workunit.client.1.vm08.stdout:8/542: dread d1/d2c/f47 [0,4194304] 0 2026-03-10T08:55:22.263 INFO:tasks.workunit.client.0.vm05.stdout:0/226: dread df/d18/f29 [0,4194304] 0 2026-03-10T08:55:22.264 INFO:tasks.workunit.client.0.vm05.stdout:2/217: symlink d0/d9/l42 0 2026-03-10T08:55:22.265 INFO:tasks.workunit.client.1.vm08.stdout:0/420: creat d6/dd/d13/d17/d1f/d2d/d39/f87 x:0 0 0 2026-03-10T08:55:22.265 INFO:tasks.workunit.client.1.vm08.stdout:8/543: mkdir d1/d10/d9/dd/d3d/dc6 0 2026-03-10T08:55:22.266 INFO:tasks.workunit.client.1.vm08.stdout:0/421: chown d6/dd/d13/d32/l4f 15 1 2026-03-10T08:55:22.267 INFO:tasks.workunit.client.1.vm08.stdout:4/498: creat d5/d23/fac x:0 0 0 2026-03-10T08:55:22.274 INFO:tasks.workunit.client.1.vm08.stdout:0/422: symlink d6/dd/l88 0 2026-03-10T08:55:22.279 INFO:tasks.workunit.client.0.vm05.stdout:0/227: getdents df 0 2026-03-10T08:55:22.280 INFO:tasks.workunit.client.1.vm08.stdout:4/499: creat d5/d2f/d5a/fad x:0 0 0 2026-03-10T08:55:22.280 INFO:tasks.workunit.client.1.vm08.stdout:4/500: chown d5/d5f/c98 33850978 1 2026-03-10T08:55:22.280 INFO:tasks.workunit.client.1.vm08.stdout:0/423: mknod d6/dd/d13/d17/d1f/d20/d2f/d57/c89 0 2026-03-10T08:55:22.280 INFO:tasks.workunit.client.1.vm08.stdout:0/424: write d6/dd/d13/d17/d50/f71 [1431915,117475] 0 2026-03-10T08:55:22.280 INFO:tasks.workunit.client.0.vm05.stdout:0/228: mknod df/d1f/c3e 0 2026-03-10T08:55:22.281 INFO:tasks.workunit.client.1.vm08.stdout:0/425: creat d6/dd/d13/d17/d1f/d2d/d39/f8a x:0 0 0 2026-03-10T08:55:22.284 INFO:tasks.workunit.client.0.vm05.stdout:0/229: getdents df/d18/d2b 0 2026-03-10T08:55:22.299 INFO:tasks.workunit.client.0.vm05.stdout:0/230: creat df/d18/d2b/d3a/f3f x:0 0 0 2026-03-10T08:55:22.299 INFO:tasks.workunit.client.0.vm05.stdout:0/231: creat df/d18/d19/d35/f40 x:0 0 0 2026-03-10T08:55:22.299 INFO:tasks.workunit.client.0.vm05.stdout:0/232: mknod df/d1f/c41 0 2026-03-10T08:55:22.299 INFO:tasks.workunit.client.0.vm05.stdout:0/233: 
creat df/d18/d19/d39/f42 x:0 0 0 2026-03-10T08:55:22.299 INFO:tasks.workunit.client.0.vm05.stdout:0/234: dwrite f6 [0,4194304] 0 2026-03-10T08:55:22.338 INFO:tasks.workunit.client.0.vm05.stdout:3/230: sync 2026-03-10T08:55:22.340 INFO:tasks.workunit.client.0.vm05.stdout:3/231: getdents d9/d2b/d3a 0 2026-03-10T08:55:22.342 INFO:tasks.workunit.client.0.vm05.stdout:3/232: creat d9/d2b/d2f/f3f x:0 0 0 2026-03-10T08:55:22.346 INFO:tasks.workunit.client.0.vm05.stdout:3/233: dwrite d9/f31 [0,4194304] 0 2026-03-10T08:55:22.354 INFO:tasks.workunit.client.0.vm05.stdout:3/234: dwrite f1 [0,4194304] 0 2026-03-10T08:55:22.396 INFO:tasks.workunit.client.0.vm05.stdout:9/183: fsync d6/d27/f2b 0 2026-03-10T08:55:22.396 INFO:tasks.workunit.client.0.vm05.stdout:9/184: truncate d6/f7 2204360 0 2026-03-10T08:55:22.400 INFO:tasks.workunit.client.0.vm05.stdout:0/235: dread f5 [0,4194304] 0 2026-03-10T08:55:22.402 INFO:tasks.workunit.client.0.vm05.stdout:0/236: unlink df/d18/d19/d35/f40 0 2026-03-10T08:55:22.403 INFO:tasks.workunit.client.0.vm05.stdout:0/237: symlink df/d18/d19/l43 0 2026-03-10T08:55:22.404 INFO:tasks.workunit.client.0.vm05.stdout:0/238: creat df/d18/d2b/d27/d32/f44 x:0 0 0 2026-03-10T08:55:22.417 INFO:tasks.workunit.client.1.vm08.stdout:2/484: write d1/da/d10/d42/d93/d23/f70 [4666313,51070] 0 2026-03-10T08:55:22.419 INFO:tasks.workunit.client.0.vm05.stdout:7/197: write fd [1582524,99797] 0 2026-03-10T08:55:22.426 INFO:tasks.workunit.client.1.vm08.stdout:2/485: dread d1/fd [0,4194304] 0 2026-03-10T08:55:22.427 INFO:tasks.workunit.client.0.vm05.stdout:7/198: mkdir d18/d38 0 2026-03-10T08:55:22.427 INFO:tasks.workunit.client.0.vm05.stdout:7/199: dread f15 [0,4194304] 0 2026-03-10T08:55:22.427 INFO:tasks.workunit.client.0.vm05.stdout:7/200: mknod d18/d1b/d1f/d25/d2e/d2f/c39 0 2026-03-10T08:55:22.427 INFO:tasks.workunit.client.0.vm05.stdout:7/201: dread f3 [0,4194304] 0 2026-03-10T08:55:22.428 INFO:tasks.workunit.client.0.vm05.stdout:7/202: creat d18/d1b/d1f/d25/f3a x:0 0 0 
2026-03-10T08:55:22.429 INFO:tasks.workunit.client.0.vm05.stdout:7/203: mknod d18/d1b/c3b 0 2026-03-10T08:55:22.433 INFO:tasks.workunit.client.1.vm08.stdout:2/486: write d1/d5b/f8c [1913260,39692] 0 2026-03-10T08:55:22.433 INFO:tasks.workunit.client.0.vm05.stdout:1/316: truncate fa 1483805 0 2026-03-10T08:55:22.434 INFO:tasks.workunit.client.0.vm05.stdout:1/317: dread - dd/d10/d18/d2d/d51/f6b zero size 2026-03-10T08:55:22.434 INFO:tasks.workunit.client.1.vm08.stdout:2/487: read - d1/da/d10/d42/d93/f8d zero size 2026-03-10T08:55:22.436 INFO:tasks.workunit.client.0.vm05.stdout:1/318: creat dd/d10/d19/d4d/f70 x:0 0 0 2026-03-10T08:55:22.436 INFO:tasks.workunit.client.0.vm05.stdout:1/319: truncate dd/f16 3910932 0 2026-03-10T08:55:22.439 INFO:tasks.workunit.client.0.vm05.stdout:1/320: rename dd/d21/d3f/d4a to dd/d10/d18/d2d/d51/d58/d71 0 2026-03-10T08:55:22.439 INFO:tasks.workunit.client.0.vm05.stdout:1/321: read - dd/d10/d19/d4d/f70 zero size 2026-03-10T08:55:22.444 INFO:tasks.workunit.client.0.vm05.stdout:1/322: creat dd/d21/d37/f72 x:0 0 0 2026-03-10T08:55:22.446 INFO:tasks.workunit.client.0.vm05.stdout:1/323: mkdir dd/d10/d18/d2d/d51/d58/d71/d73 0 2026-03-10T08:55:22.448 INFO:tasks.workunit.client.0.vm05.stdout:1/324: creat dd/d10/d19/d4d/f74 x:0 0 0 2026-03-10T08:55:22.452 INFO:tasks.workunit.client.0.vm05.stdout:5/208: dwrite d5/fe [0,4194304] 0 2026-03-10T08:55:22.453 INFO:tasks.workunit.client.0.vm05.stdout:1/325: dread dd/d21/d37/f39 [0,4194304] 0 2026-03-10T08:55:22.458 INFO:tasks.workunit.client.0.vm05.stdout:8/151: truncate d2/db/f1b 2278009 0 2026-03-10T08:55:22.460 INFO:tasks.workunit.client.0.vm05.stdout:1/326: dwrite dd/d10/d18/d20/f6c [0,4194304] 0 2026-03-10T08:55:22.462 INFO:tasks.workunit.client.0.vm05.stdout:1/327: chown dd/d21/d37/l3d 13 1 2026-03-10T08:55:22.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:22 vm05.local ceph-mon[49713]: pgmap v150: 65 pgs: 65 active+clean; 1.4 GiB data, 5.5 GiB used, 115 GiB / 120 GiB avail; 25 MiB/s 
rd, 127 MiB/s wr, 249 op/s 2026-03-10T08:55:22.469 INFO:tasks.workunit.client.0.vm05.stdout:5/209: unlink d5/df/d12/d24/f38 0 2026-03-10T08:55:22.515 INFO:tasks.workunit.client.0.vm05.stdout:8/152: mknod d2/dd/d2c/d2e/c39 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:1/328: symlink dd/d10/d18/l75 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:5/210: mkdir d5/d3a/d43 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:8/153: fdatasync d2/db/f1b 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:1/329: rmdir dd/d10/d18/d20 39 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:5/211: chown d5/f9 0 1 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:6/204: write d4/d7/d10/d15/f17 [259469,28119] 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:4/238: truncate d0/fc 2176561 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:8/154: link d2/dd/d2c/f2f d2/dd/d2c/f3a 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:1/330: symlink dd/d10/d18/d2d/d51/d58/d71/d73/l76 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:9/185: truncate d6/d27/f2b 3027005 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:6/205: mknod d4/c44 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.0.vm05.stdout:1/331: fsync dd/d10/d18/d2d/d51/d58/f5b 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:6/491: dwrite d9/dc/d11/d23/d2c/f8e [0,4194304] 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:9/436: read d2/f77 [247727,35398] 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:7/490: dwrite d0/d11/d1f/d2c/f30 [0,4194304] 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:4/501: getdents d5/d2f 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:9/437: truncate d2/dd/d15/f22 4619080 0 2026-03-10T08:55:22.516 
INFO:tasks.workunit.client.1.vm08.stdout:7/491: rename d0/d11/d1f/d29/d36/d75/l77 to d0/d51/l94 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:7/492: fsync d0/d14/d43/f7b 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:7/493: dread - d0/d11/d1f/d29/d3b/f7d zero size 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:4/502: mkdir d5/d2f/d5d/dae 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:7/494: dread - d0/f7a zero size 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:3/453: write d4/d15/fa [1090912,63624] 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:9/438: mkdir d2/d41/d4c/d89 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:9/439: creat d2/d41/d4c/f8a x:0 0 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:7/495: truncate d0/d11/d1f/d2c/f33 1104879 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:7/496: fsync d0/d11/f39 0 2026-03-10T08:55:22.516 INFO:tasks.workunit.client.1.vm08.stdout:3/454: rename d4/d15/d8/f82 to d4/d15/d8/d2a/f95 0 2026-03-10T08:55:22.517 INFO:tasks.workunit.client.1.vm08.stdout:9/440: fdatasync d2/dd/d15/d1e/d24/f2b 0 2026-03-10T08:55:22.519 INFO:tasks.workunit.client.0.vm05.stdout:8/155: fsync d2/fa 0 2026-03-10T08:55:22.522 INFO:tasks.workunit.client.1.vm08.stdout:7/497: mkdir d0/d11/d4a/d95 0 2026-03-10T08:55:22.523 INFO:tasks.workunit.client.1.vm08.stdout:3/455: symlink d4/d15/d8/d2a/d79/d20/l96 0 2026-03-10T08:55:22.525 INFO:tasks.workunit.client.0.vm05.stdout:4/239: link d0/c1b d0/d1d/d30/d49/c53 0 2026-03-10T08:55:22.527 INFO:tasks.workunit.client.0.vm05.stdout:8/156: creat d2/dd/d2c/d2e/f3b x:0 0 0 2026-03-10T08:55:22.539 INFO:tasks.workunit.client.1.vm08.stdout:9/441: mknod d2/c8b 0 2026-03-10T08:55:22.539 INFO:tasks.workunit.client.1.vm08.stdout:7/498: creat d0/d11/d1f/d29/d3d/d89/f96 x:0 0 0 2026-03-10T08:55:22.539 INFO:tasks.workunit.client.1.vm08.stdout:7/499: write 
d0/d11/d1f/d29/d3d/d89/f8b [4457424,114] 0
2026-03-10T08:55:22.539 INFO:tasks.workunit.client.0.vm05.stdout:6/206: symlink d4/l45 0
2026-03-10T08:55:22.539 INFO:tasks.workunit.client.0.vm05.stdout:1/332: mknod dd/d21/c77 0
2026-03-10T08:55:22.539 INFO:tasks.workunit.client.0.vm05.stdout:4/240: dread d0/d1d/d30/f1c [0,4194304] 0
2026-03-10T08:55:22.539 INFO:tasks.workunit.client.0.vm05.stdout:4/241: truncate d0/d2e/d42/d45/d4a/f26 617646 0
2026-03-10T08:55:22.539 INFO:tasks.workunit.client.0.vm05.stdout:1/333: symlink dd/d10/d18/d2d/d51/l78 0
2026-03-10T08:55:22.539 INFO:tasks.workunit.client.0.vm05.stdout:6/207: dwrite d4/d7/f34 [4194304,4194304] 0
2026-03-10T08:55:22.542 INFO:tasks.workunit.client.1.vm08.stdout:2/488: fsync d1/d5b/f8c 0
2026-03-10T08:55:22.548 INFO:tasks.workunit.client.0.vm05.stdout:4/242: mknod d0/d1d/d30/d32/d41/c54 0
2026-03-10T08:55:22.548 INFO:tasks.workunit.client.0.vm05.stdout:8/157: symlink d2/dd/d2c/d2e/d31/l3c 0
2026-03-10T08:55:22.549 INFO:tasks.workunit.client.0.vm05.stdout:8/158: write d2/dd/d2c/f30 [820166,28462] 0
2026-03-10T08:55:22.549 INFO:tasks.workunit.client.0.vm05.stdout:4/243: write d0/d1d/d30/d49/d4f/f51 [159247,10194] 0
2026-03-10T08:55:22.549 INFO:tasks.workunit.client.0.vm05.stdout:6/208: mknod d4/d7/d10/d15/c46 0
2026-03-10T08:55:22.550 INFO:tasks.workunit.client.0.vm05.stdout:4/244: readlink d0/d2e/l44 0
2026-03-10T08:55:22.552 INFO:tasks.workunit.client.0.vm05.stdout:6/209: dwrite d4/d7/d10/d15/f17 [0,4194304] 0
2026-03-10T08:55:22.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:22 vm08.local ceph-mon[57559]: pgmap v150: 65 pgs: 65 active+clean; 1.4 GiB data, 5.5 GiB used, 115 GiB / 120 GiB avail; 25 MiB/s rd, 127 MiB/s wr, 249 op/s
2026-03-10T08:55:22.557 INFO:tasks.workunit.client.0.vm05.stdout:1/334: mknod dd/d10/d19/c79 0
2026-03-10T08:55:22.558 INFO:tasks.workunit.client.0.vm05.stdout:8/159: mknod d2/dd/c3d 0
2026-03-10T08:55:22.558 INFO:tasks.workunit.client.0.vm05.stdout:8/160: truncate d2/dd/f1a 1866497 0
2026-03-10T08:55:22.559 INFO:tasks.workunit.client.0.vm05.stdout:8/161: fdatasync d2/dd/d2c/f34 0
2026-03-10T08:55:22.560 INFO:tasks.workunit.client.1.vm08.stdout:7/500: getdents d0/d11/d4a/d5e 0
2026-03-10T08:55:22.560 INFO:tasks.workunit.client.0.vm05.stdout:6/210: creat d4/d7/d10/d15/d20/f47 x:0 0 0
2026-03-10T08:55:22.562 INFO:tasks.workunit.client.0.vm05.stdout:8/162: rmdir d2/dd/d2c 39
2026-03-10T08:55:22.563 INFO:tasks.workunit.client.0.vm05.stdout:4/245: mkdir d0/d55 0
2026-03-10T08:55:22.564 INFO:tasks.workunit.client.0.vm05.stdout:6/211: creat d4/d7/d10/d15/d20/f48 x:0 0 0
2026-03-10T08:55:22.565 INFO:tasks.workunit.client.1.vm08.stdout:7/501: rmdir d0/d11/d4a/d5e/d5f 0
2026-03-10T08:55:22.567 INFO:tasks.workunit.client.0.vm05.stdout:6/212: rename d4/d2c/f35 to d4/d7/d10/d15/d38/f49 0
2026-03-10T08:55:22.568 INFO:tasks.workunit.client.1.vm08.stdout:7/502: rename d0/d11/d1f/d2c/l8f to d0/d11/d1f/d29/d3d/d89/l97 0
2026-03-10T08:55:22.571 INFO:tasks.workunit.client.0.vm05.stdout:8/163: unlink d2/dd/ce 0
2026-03-10T08:55:22.573 INFO:tasks.workunit.client.0.vm05.stdout:6/213: write d4/d7/ff [5010945,30244] 0
2026-03-10T08:55:22.576 INFO:tasks.workunit.client.0.vm05.stdout:6/214: mkdir d4/d7/d10/d15/d38/d4a 0
2026-03-10T08:55:22.577 INFO:tasks.workunit.client.0.vm05.stdout:8/164: mkdir d2/dd/d2c/d2e/d31/d3e 0
2026-03-10T08:55:22.577 INFO:tasks.workunit.client.0.vm05.stdout:4/246: link d0/d2e/d42/d45/d4a/l27 d0/d1d/d30/d32/d41/l56 0
2026-03-10T08:55:22.578 INFO:tasks.workunit.client.0.vm05.stdout:6/215: rename d4/d7/d10/d15/f16 to d4/d7/d10/d1a/d1f/f4b 0
2026-03-10T08:55:22.579 INFO:tasks.workunit.client.0.vm05.stdout:6/216: rename d4/d7/d10 to d4/d7/d10/d15/d20/d4c 22
2026-03-10T08:55:22.582 INFO:tasks.workunit.client.0.vm05.stdout:6/217: dwrite d4/d7/d10/d1a/f1e [0,4194304] 0
2026-03-10T08:55:22.583 INFO:tasks.workunit.client.0.vm05.stdout:4/247: creat d0/d1d/d30/d32/d41/f57 x:0 0
2026-03-10T08:55:22.586 INFO:tasks.workunit.client.0.vm05.stdout:8/165: creat d2/dd/f3f x:0 0 0
2026-03-10T08:55:22.593 INFO:tasks.workunit.client.0.vm05.stdout:8/166: dwrite d2/db/d28/f2d [0,4194304] 0
2026-03-10T08:55:22.602 INFO:tasks.workunit.client.0.vm05.stdout:3/235: truncate f7 1219208 0
2026-03-10T08:55:22.608 INFO:tasks.workunit.client.0.vm05.stdout:8/167: link d2/db/c15 d2/db/d1f/c40 0
2026-03-10T08:55:22.611 INFO:tasks.workunit.client.0.vm05.stdout:3/236: creat d9/d2b/f40 x:0 0 0
2026-03-10T08:55:22.613 INFO:tasks.workunit.client.0.vm05.stdout:3/237: chown d9/c1d 1725 1
2026-03-10T08:55:22.614 INFO:tasks.workunit.client.0.vm05.stdout:8/168: mkdir d2/dd/d41 0
2026-03-10T08:55:22.615 INFO:tasks.workunit.client.0.vm05.stdout:8/169: truncate d2/ff 5146018 0
2026-03-10T08:55:22.624 INFO:tasks.workunit.client.0.vm05.stdout:8/170: fdatasync d2/db/f1b 0
2026-03-10T08:55:22.625 INFO:tasks.workunit.client.0.vm05.stdout:2/218: dwrite d0/fa [4194304,4194304] 0
2026-03-10T08:55:22.625 INFO:tasks.workunit.client.0.vm05.stdout:8/171: chown d2/fa 35120 1
2026-03-10T08:55:22.625 INFO:tasks.workunit.client.0.vm05.stdout:2/219: stat d0/d9/d1e/d20/f3a 0
2026-03-10T08:55:22.626 INFO:tasks.workunit.client.1.vm08.stdout:8/544: dwrite d1/d10/d9/dd/f8f [0,4194304] 0
2026-03-10T08:55:22.629 INFO:tasks.workunit.client.0.vm05.stdout:2/220: dwrite d0/f40 [0,4194304] 0
2026-03-10T08:55:22.645 INFO:tasks.workunit.client.1.vm08.stdout:5/452: dread d0/d11/d27/f3b [0,4194304] 0
2026-03-10T08:55:22.646 INFO:tasks.workunit.client.0.vm05.stdout:3/238: fsync d9/d2b/d2f/f3f 0
2026-03-10T08:55:22.648 INFO:tasks.workunit.client.1.vm08.stdout:0/426: dwrite d6/f25 [0,4194304] 0
2026-03-10T08:55:22.657 INFO:tasks.workunit.client.1.vm08.stdout:5/453: unlink d0/d46/l5d 0
2026-03-10T08:55:22.657 INFO:tasks.workunit.client.1.vm08.stdout:0/427: stat d6/dd/c14 0
2026-03-10T08:55:22.660 INFO:tasks.workunit.client.1.vm08.stdout:5/454: fdatasync d0/d11/d27/d68/d7c/f42 0
2026-03-10T08:55:22.662 INFO:tasks.workunit.client.0.vm05.stdout:3/239: link d9/c22 d9/d2b/c41 0
2026-03-10T08:55:22.664 INFO:tasks.workunit.client.0.vm05.stdout:3/240: write d9/f20 [5850582,105034] 0
2026-03-10T08:55:22.664 INFO:tasks.workunit.client.0.vm05.stdout:3/241: truncate d9/d2b/d2f/f3f 188723 0
2026-03-10T08:55:22.666 INFO:tasks.workunit.client.0.vm05.stdout:3/242: symlink d9/d2b/l42 0
2026-03-10T08:55:22.673 INFO:tasks.workunit.client.1.vm08.stdout:8/545: dread d1/d10/d9/fb [0,4194304] 0
2026-03-10T08:55:22.737 INFO:tasks.workunit.client.1.vm08.stdout:6/492: fsync d9/dc/d11/d23/d2c/f8e 0
2026-03-10T08:55:22.739 INFO:tasks.workunit.client.1.vm08.stdout:6/493: creat d9/d13/d4e/fa8 x:0 0 0
2026-03-10T08:55:22.828 INFO:tasks.workunit.client.0.vm05.stdout:3/243: sync
2026-03-10T08:55:22.831 INFO:tasks.workunit.client.0.vm05.stdout:0/239: write df/d1f/f21 [1492585,44705] 0
2026-03-10T08:55:22.835 INFO:tasks.workunit.client.1.vm08.stdout:5/455: dread d0/d11/d27/d68/d7c/f6a [0,4194304] 0
2026-03-10T08:55:22.835 INFO:tasks.workunit.client.0.vm05.stdout:0/240: write df/d18/f2a [3885066,72779] 0
2026-03-10T08:55:22.835 INFO:tasks.workunit.client.0.vm05.stdout:3/244: chown d9/fa 383588428 1
2026-03-10T08:55:22.835 INFO:tasks.workunit.client.1.vm08.stdout:5/456: fsync d0/d46/f81 0
2026-03-10T08:55:22.835 INFO:tasks.workunit.client.1.vm08.stdout:5/457: stat d0/l7 0
2026-03-10T08:55:22.837 INFO:tasks.workunit.client.1.vm08.stdout:5/458: chown d0/l3c 4 1
2026-03-10T08:55:22.841 INFO:tasks.workunit.client.0.vm05.stdout:0/241: dwrite df/d18/f29 [0,4194304] 0
2026-03-10T08:55:22.842 INFO:tasks.workunit.client.0.vm05.stdout:0/242: chown df/d18/d2b/d27/d32/f44 427471272 1
2026-03-10T08:55:22.848 INFO:tasks.workunit.client.0.vm05.stdout:3/245: mkdir d9/d2b/d3a/d43 0
2026-03-10T08:55:22.851 INFO:tasks.workunit.client.1.vm08.stdout:5/459: creat d0/d11/d27/d68/d7c/d4b/d4e/f89 x:0 0 0
2026-03-10T08:55:22.854 INFO:tasks.workunit.client.0.vm05.stdout:0/243: creat df/d18/d2b/d3a/f45 x:0 0 0
2026-03-10T08:55:22.858 INFO:tasks.workunit.client.0.vm05.stdout:0/244: dwrite df/d18/d2b/d27/f2e [0,4194304] 0
2026-03-10T08:55:22.859 INFO:tasks.workunit.client.1.vm08.stdout:5/460: creat d0/f8a x:0 0 0
2026-03-10T08:55:22.859 INFO:tasks.workunit.client.1.vm08.stdout:5/461: chown d0/d11/d18/d52/c66 13 1
2026-03-10T08:55:22.867 INFO:tasks.workunit.client.0.vm05.stdout:0/245: unlink df/f11 0
2026-03-10T08:55:22.879 INFO:tasks.workunit.client.1.vm08.stdout:5/462: mknod d0/d11/d27/d68/d7c/d4b/d4e/c8b 0
2026-03-10T08:55:22.879 INFO:tasks.workunit.client.1.vm08.stdout:5/463: unlink d0/d11/d27/d68/l6d 0
2026-03-10T08:55:22.915 INFO:tasks.workunit.client.0.vm05.stdout:1/335: getdents dd/d10/d18/d2d/d51/d58/d71/d73 0
2026-03-10T08:55:22.920 INFO:tasks.workunit.client.0.vm05.stdout:2/221: write d0/d9/f17 [2895783,64432] 0
2026-03-10T08:55:22.921 INFO:tasks.workunit.client.0.vm05.stdout:2/222: write d0/d9/d1e/d20/f3a [1441592,74395] 0
2026-03-10T08:55:22.922 INFO:tasks.workunit.client.0.vm05.stdout:5/212: dwrite d5/fc [0,4194304] 0
2026-03-10T08:55:22.923 INFO:tasks.workunit.client.0.vm05.stdout:2/223: fdatasync d0/d9/d1e/d20/d21/f23 0
2026-03-10T08:55:22.923 INFO:tasks.workunit.client.0.vm05.stdout:1/336: dread f6 [0,4194304] 0
2026-03-10T08:55:22.923 INFO:tasks.workunit.client.0.vm05.stdout:5/213: stat d5/df/d12/f2a 0
2026-03-10T08:55:22.931 INFO:tasks.workunit.client.0.vm05.stdout:1/337: dwrite dd/f5e [0,4194304] 0
2026-03-10T08:55:22.932 INFO:tasks.workunit.client.0.vm05.stdout:1/338: dread - dd/d21/d3f/f5a zero size
2026-03-10T08:55:22.932 INFO:tasks.workunit.client.0.vm05.stdout:2/224: dwrite d0/d9/d1e/d20/d21/f31 [0,4194304] 0
2026-03-10T08:55:22.935 INFO:tasks.workunit.client.0.vm05.stdout:2/225: chown d0/d9/d1e/d20/d24/f29 102 1
2026-03-10T08:55:22.936 INFO:tasks.workunit.client.1.vm08.stdout:1/515: dread d1/da/de/d24/d3d/d40/f42 [0,4194304] 0
2026-03-10T08:55:22.949 INFO:tasks.workunit.client.0.vm05.stdout:5/214: creat d5/df/d12/f44 x:0 0 0
2026-03-10T08:55:22.951 INFO:tasks.workunit.client.1.vm08.stdout:4/503: write d5/de/f50 [2131982,93526] 0
2026-03-10T08:55:22.952 INFO:tasks.workunit.client.1.vm08.stdout:1/516: chown d1/da/de/d24/d35/d6d/d82/f7b 224536167 1
2026-03-10T08:55:22.961 INFO:tasks.workunit.client.0.vm05.stdout:9/186: truncate d6/d19/f1a 2344326 0
2026-03-10T08:55:22.967 INFO:tasks.workunit.client.1.vm08.stdout:4/504: symlink d5/d2f/d5a/laf 0
2026-03-10T08:55:22.967 INFO:tasks.workunit.client.1.vm08.stdout:1/517: symlink d1/da/d18/d3b/lb4 0
2026-03-10T08:55:22.968 INFO:tasks.workunit.client.1.vm08.stdout:3/456: write d4/d15/d8/d2c/d55/f60 [406076,116637] 0
2026-03-10T08:55:22.968 INFO:tasks.workunit.client.0.vm05.stdout:5/215: mknod d5/df/d12/d24/c45 0
2026-03-10T08:55:22.969 INFO:tasks.workunit.client.1.vm08.stdout:1/518: stat d1/da/d18/f1d 0
2026-03-10T08:55:22.971 INFO:tasks.workunit.client.0.vm05.stdout:2/226: mknod d0/c43 0
2026-03-10T08:55:22.972 INFO:tasks.workunit.client.0.vm05.stdout:5/216: chown d5/c11 4 1
2026-03-10T08:55:22.974 INFO:tasks.workunit.client.1.vm08.stdout:1/519: stat d1/da/de/d24/d3d/d40/d92 0
2026-03-10T08:55:22.975 INFO:tasks.workunit.client.1.vm08.stdout:4/505: dwrite d5/d23/d36/d76/fa7 [0,4194304] 0
2026-03-10T08:55:22.989 INFO:tasks.workunit.client.1.vm08.stdout:9/442: dwrite d2/d41/d53/f6d [0,4194304] 0
2026-03-10T08:55:22.989 INFO:tasks.workunit.client.1.vm08.stdout:3/457: dwrite d4/d15/d8/d2a/d79/f80 [0,4194304] 0
2026-03-10T08:55:22.990 INFO:tasks.workunit.client.0.vm05.stdout:2/227: dwrite d0/d9/f17 [0,4194304] 0
2026-03-10T08:55:22.990 INFO:tasks.workunit.client.0.vm05.stdout:1/339: symlink dd/d10/d18/d20/d56/d60/l7a 0
2026-03-10T08:55:22.990 INFO:tasks.workunit.client.0.vm05.stdout:1/340: dread - dd/d21/f6f zero size
2026-03-10T08:55:22.990 INFO:tasks.workunit.client.0.vm05.stdout:1/341: unlink dd/d10/d18/d2d/c67 0
2026-03-10T08:55:22.990 INFO:tasks.workunit.client.0.vm05.stdout:1/342: truncate dd/d21/f4c 265793 0
2026-03-10T08:55:22.990 INFO:tasks.workunit.client.1.vm08.stdout:2/489: dwrite d1/d5b/d66/f20 [0,4194304] 0
2026-03-10T08:55:22.994 INFO:tasks.workunit.client.0.vm05.stdout:2/228: creat d0/d9/d1e/d20/d21/f44 x:0 0 0
2026-03-10T08:55:22.996 INFO:tasks.workunit.client.1.vm08.stdout:2/490: chown d1/d97 12764232 1
2026-03-10T08:55:22.997 INFO:tasks.workunit.client.1.vm08.stdout:1/520: unlink d1/da/d20/f2d 0
2026-03-10T08:55:22.997 INFO:tasks.workunit.client.0.vm05.stdout:4/248: rmdir d0/d2e 39
2026-03-10T08:55:22.998 INFO:tasks.workunit.client.0.vm05.stdout:4/249: dread - d0/d1d/f3c zero size
2026-03-10T08:55:23.003 INFO:tasks.workunit.client.0.vm05.stdout:1/343: mkdir dd/d10/d18/d2d/d51/d7b 0
2026-03-10T08:55:23.003 INFO:tasks.workunit.client.1.vm08.stdout:4/506: mknod d5/d2f/d5d/dae/cb0 0
2026-03-10T08:55:23.004 INFO:tasks.workunit.client.1.vm08.stdout:2/491: write d1/da/d10/d42/d93/d1e/f83 [715373,39425] 0
2026-03-10T08:55:23.004 INFO:tasks.workunit.client.1.vm08.stdout:3/458: chown d4/c38 2292 1
2026-03-10T08:55:23.005 INFO:tasks.workunit.client.1.vm08.stdout:2/492: fdatasync d1/da/d10/d2d/f4c 0
2026-03-10T08:55:23.006 INFO:tasks.workunit.client.1.vm08.stdout:7/503: dwrite d0/d11/f6a [0,4194304] 0
2026-03-10T08:55:23.008 INFO:tasks.workunit.client.1.vm08.stdout:7/504: stat d0/f7a 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.1.vm08.stdout:4/507: chown d5/lc 42 1
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.1.vm08.stdout:0/428: write d6/dd/d13/d17/d1f/f48 [1745937,125116] 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.1.vm08.stdout:8/546: write d1/d10/d9/dd/d25/d27/d44/d21/f32 [4677469,65287] 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.1.vm08.stdout:6/494: write d9/dc/d11/d23/f8b [606176,106259] 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:2/229: mkdir d0/d9/d1e/d20/d21/d45 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:2/230: fdatasync d0/f30 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:6/218: dwrite d4/d7/d10/f12 [0,4194304] 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:6/219: chown d4/d7/f14 158 1
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:8/172: truncate d2/db/d28/f2d 602710 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:6/220: creat d4/d7/f4d x:0 0 0
2026-03-10T08:55:23.035 INFO:tasks.workunit.client.0.vm05.stdout:6/221: rmdir d4/d7/d10/d1a 39
2026-03-10T08:55:23.038 INFO:tasks.workunit.client.0.vm05.stdout:7/204: write d18/d1b/d1f/d25/d2e/d2f/f33 [1707048,47509] 0
2026-03-10T08:55:23.038 INFO:tasks.workunit.client.0.vm05.stdout:6/222: dread d4/d7/d10/f12 [0,4194304] 0
2026-03-10T08:55:23.039 INFO:tasks.workunit.client.0.vm05.stdout:7/205: chown d18/d1b/c3b 244 1
2026-03-10T08:55:23.041 INFO:tasks.workunit.client.1.vm08.stdout:7/505: creat d0/d14/f98 x:0 0 0
2026-03-10T08:55:23.045 INFO:tasks.workunit.client.0.vm05.stdout:8/173: mknod d2/dd/d2c/d2e/d31/d3e/c42 0
2026-03-10T08:55:23.046 INFO:tasks.workunit.client.1.vm08.stdout:8/547: rename d1/d10/d9/dd/d18/cbe to d1/d10/d9/dd/d25/cc7 0
2026-03-10T08:55:23.046 INFO:tasks.workunit.client.0.vm05.stdout:8/174: dread - d2/dd/d2c/d2e/f3b zero size
2026-03-10T08:55:23.046 INFO:tasks.workunit.client.1.vm08.stdout:2/493: link d1/da/d10/d1b/f28 d1/da/d10/d1b/d6a/f98 0
2026-03-10T08:55:23.047 INFO:tasks.workunit.client.1.vm08.stdout:4/508: link d5/de/f41 d5/d23/d49/d8f/fb1 0
2026-03-10T08:55:23.047 INFO:tasks.workunit.client.1.vm08.stdout:7/506: symlink d0/d11/d1f/d29/l99 0
2026-03-10T08:55:23.048 INFO:tasks.workunit.client.1.vm08.stdout:7/507: dread - d0/d14/d43/f7b zero size
2026-03-10T08:55:23.051 INFO:tasks.workunit.client.1.vm08.stdout:8/548: write d1/f8 [1684150,23644] 0
2026-03-10T08:55:23.079 INFO:tasks.workunit.client.1.vm08.stdout:6/495: rename d9/dc/d11/d23/ca4 to d9/d10/d1e/d4c/d69/da2/ca9 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:5/464: dwrite d0/d11/f29 [0,4194304] 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:8/549: write d1/d10/d9/dd/d25/d27/d44/d21/d51/f56 [1869499,71159] 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:6/496: readlink d9/dc/d11/d23/d2c/l93 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:2/494: unlink d1/d43/d4f/f86 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:6/497: symlink d9/dc/d11/d23/d2c/laa 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:2/495: rmdir d1/da 39
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.1.vm08.stdout:7/508: link d0/d11/d1f/d29/d3b/f86 d0/d14/d43/d62/f9a 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:6/223: mknod d4/d7/d10/d15/c4e 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:7/206: dwrite d18/d1b/f2c [0,4194304] 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:6/224: truncate d4/d7/d10/d15/d20/f47 303212 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:6/225: mknod d4/d7/d10/d15/d1b/c4f 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:0/246: fdatasync df/f37 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:0/247: read fe [1338074,90546] 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:0/248: chown c8 2010237457 1
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:0/249: stat df/d18/d19/l22 0
2026-03-10T08:55:23.080 INFO:tasks.workunit.client.0.vm05.stdout:0/250: readlink df/d18/d2b/d27/d32/l38 0
2026-03-10T08:55:23.081 INFO:tasks.workunit.client.0.vm05.stdout:6/226: truncate d4/fc 3861363 0
2026-03-10T08:55:23.081 INFO:tasks.workunit.client.0.vm05.stdout:6/227: dread - d4/d7/d10/d15/d38/f3c zero size
2026-03-10T08:55:23.081 INFO:tasks.workunit.client.0.vm05.stdout:8/175: getdents d2/dd/d2c 0
2026-03-10T08:55:23.081 INFO:tasks.workunit.client.0.vm05.stdout:8/176: readlink d2/l6 0
2026-03-10T08:55:23.081 INFO:tasks.workunit.client.1.vm08.stdout:7/509: dread d0/d11/d1f/d29/d3d/d40/f24 [0,4194304] 0
2026-03-10T08:55:23.082 INFO:tasks.workunit.client.0.vm05.stdout:6/228: fsync d4/d7/d10/d15/d38/f49 0
2026-03-10T08:55:23.084 INFO:tasks.workunit.client.0.vm05.stdout:6/229: write d4/d7/d10/d1a/f25 [822004,52944] 0
2026-03-10T08:55:23.085 INFO:tasks.workunit.client.0.vm05.stdout:6/230: chown d4/d2c 1044 1
2026-03-10T08:55:23.085 INFO:tasks.workunit.client.0.vm05.stdout:8/177: creat d2/dd/d41/f43 x:0 0 0
2026-03-10T08:55:23.086 INFO:tasks.workunit.client.0.vm05.stdout:0/251: rename df/d18/l34 to df/d18/l46 0
2026-03-10T08:55:23.088 INFO:tasks.workunit.client.0.vm05.stdout:0/252: rmdir df/d18/d19/d39 39
2026-03-10T08:55:23.090 INFO:tasks.workunit.client.1.vm08.stdout:6/498: creat d9/d10/fab x:0 0 0
2026-03-10T08:55:23.091 INFO:tasks.workunit.client.0.vm05.stdout:0/253: mkdir df/d18/d19/d47 0
2026-03-10T08:55:23.092 INFO:tasks.workunit.client.1.vm08.stdout:7/510: unlink d0/d11/d4a/f5c 0
2026-03-10T08:55:23.095 INFO:tasks.workunit.client.1.vm08.stdout:2/496: creat d1/da/d10/d42/d93/d23/f99 x:0 0 0
2026-03-10T08:55:23.096 INFO:tasks.workunit.client.0.vm05.stdout:0/254: dwrite df/d1f/f21 [0,4194304] 0
2026-03-10T08:55:23.097 INFO:tasks.workunit.client.1.vm08.stdout:7/511: creat d0/d11/d4a/d95/f9b x:0 0 0
2026-03-10T08:55:23.099 INFO:tasks.workunit.client.1.vm08.stdout:7/512: mknod d0/d14/d43/d62/c9c 0
2026-03-10T08:55:23.100 INFO:tasks.workunit.client.1.vm08.stdout:7/513: mkdir d0/d14/d43/d9d 0
2026-03-10T08:55:23.104 INFO:tasks.workunit.client.1.vm08.stdout:7/514: dwrite d0/d11/f6a [0,4194304] 0
2026-03-10T08:55:23.112 INFO:tasks.workunit.client.0.vm05.stdout:7/207: sync
2026-03-10T08:55:23.115 INFO:tasks.workunit.client.1.vm08.stdout:7/515: symlink d0/d14/d2f/l9e 0
2026-03-10T08:55:23.119 INFO:tasks.workunit.client.1.vm08.stdout:7/516: unlink d0/d11/d1f/d29/d3d/d89/l97 0
2026-03-10T08:55:23.120 INFO:tasks.workunit.client.1.vm08.stdout:7/517: creat d0/d11/d1f/d29/d3b/f9f x:0 0 0
2026-03-10T08:55:23.122 INFO:tasks.workunit.client.1.vm08.stdout:7/518: write d0/d11/d1f/f90 [497280,81883] 0
2026-03-10T08:55:23.126 INFO:tasks.workunit.client.1.vm08.stdout:7/519: chown d0/d11/d4a/d5e/l91 63962891 1
2026-03-10T08:55:23.135 INFO:tasks.workunit.client.1.vm08.stdout:8/550: dread f0 [0,4194304] 0
2026-03-10T08:55:23.136 INFO:tasks.workunit.client.1.vm08.stdout:8/551: chown d1/d10/d9/dd/d13 211447 1
2026-03-10T08:55:23.139 INFO:tasks.workunit.client.1.vm08.stdout:8/552: symlink d1/lc8 0
2026-03-10T08:55:23.151 INFO:tasks.workunit.client.1.vm08.stdout:8/553: dread d1/d10/d9/dd/d25/d27/d44/d97/f79 [0,4194304] 0
2026-03-10T08:55:23.152 INFO:tasks.workunit.client.1.vm08.stdout:8/554: chown d1/d10/d9/dd/d25/d27/d44/d21 808637 1
2026-03-10T08:55:23.157 INFO:tasks.workunit.client.0.vm05.stdout:1/344: dread dd/d21/d37/d45/f47 [0,4194304] 0
2026-03-10T08:55:23.158 INFO:tasks.workunit.client.0.vm05.stdout:1/345: readlink dd/l12 0
2026-03-10T08:55:23.167 INFO:tasks.workunit.client.0.vm05.stdout:8/178: write d2/db/f1b [452002,17826] 0
2026-03-10T08:55:23.171 INFO:tasks.workunit.client.0.vm05.stdout:1/346: rename dd/d10/d18/d20/d56 to dd/d21/d37/d7c 0
2026-03-10T08:55:23.175 INFO:tasks.workunit.client.0.vm05.stdout:1/347: write dd/d10/d19/d4d/f70 [540411,19262] 0
2026-03-10T08:55:23.175 INFO:tasks.workunit.client.0.vm05.stdout:1/348: chown dd/d21/f3a 98625 1
2026-03-10T08:55:23.175 INFO:tasks.workunit.client.0.vm05.stdout:1/349: chown dd/d10/l1b 13 1
2026-03-10T08:55:23.175 INFO:tasks.workunit.client.0.vm05.stdout:8/179: rename d2/dd/d2c/f3a to d2/db/d1f/f44 0
2026-03-10T08:55:23.178 INFO:tasks.workunit.client.0.vm05.stdout:1/350: mkdir dd/d10/d19/d4d/d7d 0
2026-03-10T08:55:23.199 INFO:tasks.workunit.client.0.vm05.stdout:5/217: dwrite d5/df/f31 [0,4194304] 0
2026-03-10T08:55:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/509: rename d5/d2f to d5/d23/d36/d99/db2 0
2026-03-10T08:55:23.205 INFO:tasks.workunit.client.0.vm05.stdout:5/218: chown d5/fd 19340603 1
2026-03-10T08:55:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/510: chown d5/d23/d36/f92 450508577 1
2026-03-10T08:55:23.206 INFO:tasks.workunit.client.0.vm05.stdout:5/219: chown d5/df/d12/f2a 9020937 1
2026-03-10T08:55:23.207 INFO:tasks.workunit.client.1.vm08.stdout:1/521: write d1/da/d4b/f4f [3500614,109212] 0
2026-03-10T08:55:23.211 INFO:tasks.workunit.client.0.vm05.stdout:5/220: dwrite d5/f3b [0,4194304] 0
2026-03-10T08:55:23.214 INFO:tasks.workunit.client.1.vm08.stdout:4/511: creat d5/d23/d36/d99/db2/d5a/d69/fb3 x:0 0 0
2026-03-10T08:55:23.217 INFO:tasks.workunit.client.1.vm08.stdout:1/522: fdatasync d1/da/d18/f72 0
2026-03-10T08:55:23.218 INFO:tasks.workunit.client.0.vm05.stdout:2/231: write d0/d9/d1e/d20/d21/f41 [1051078,10424] 0
2026-03-10T08:55:23.221 INFO:tasks.workunit.client.1.vm08.stdout:6/499: rename d9/dc/f39 to d9/fac 0
2026-03-10T08:55:23.222 INFO:tasks.workunit.client.0.vm05.stdout:2/232: dwrite d0/d9/f19 [4194304,4194304] 0
2026-03-10T08:55:23.229 INFO:tasks.workunit.client.1.vm08.stdout:1/523: creat d1/da/de/d5c/fb5 x:0 0 0
2026-03-10T08:55:23.235 INFO:tasks.workunit.client.0.vm05.stdout:4/250: dread d0/d1d/d30/d49/d4f/f51 [0,4194304] 0
2026-03-10T08:55:23.236 INFO:tasks.workunit.client.0.vm05.stdout:4/251: mkdir d0/d1d/d30/d49/d58 0
2026-03-10T08:55:23.238 INFO:tasks.workunit.client.0.vm05.stdout:4/252: creat d0/d2e/d42/f59 x:0 0 0
2026-03-10T08:55:23.238 INFO:tasks.workunit.client.1.vm08.stdout:6/500: rename d9/dc/c34 to d9/dc/d11/d23/cad 0
2026-03-10T08:55:23.239 INFO:tasks.workunit.client.1.vm08.stdout:6/501: readlink d9/dc/l33 0
2026-03-10T08:55:23.241 INFO:tasks.workunit.client.0.vm05.stdout:4/253: mknod d0/d1d/d30/d32/d41/c5a 0
2026-03-10T08:55:23.241 INFO:tasks.workunit.client.1.vm08.stdout:6/502: readlink d9/d10/l20 0
2026-03-10T08:55:23.242 INFO:tasks.workunit.client.1.vm08.stdout:1/524: dwrite d1/da/d20/f67 [0,4194304] 0
2026-03-10T08:55:23.242 INFO:tasks.workunit.client.0.vm05.stdout:4/254: rmdir d0/d2e/d42/d45/d4a/d36 39
2026-03-10T08:55:23.245 INFO:tasks.workunit.client.1.vm08.stdout:6/503: unlink d9/d10/f72 0
2026-03-10T08:55:23.245 INFO:tasks.workunit.client.1.vm08.stdout:1/525: write d1/da/de/d24/d35/fa9 [1023064,114229] 0
2026-03-10T08:55:23.250 INFO:tasks.workunit.client.1.vm08.stdout:6/504: creat d9/dc/d84/fae x:0 0 0
2026-03-10T08:55:23.250 INFO:tasks.workunit.client.1.vm08.stdout:1/526: creat d1/da/d20/d3f/d49/fb6 x:0 0 0
2026-03-10T08:55:23.251 INFO:tasks.workunit.client.1.vm08.stdout:6/505: creat d9/d10/d1e/d92/faf x:0 0 0
2026-03-10T08:55:23.253 INFO:tasks.workunit.client.1.vm08.stdout:6/506: unlink d9/dc/d11/d23/d2c/laa 0
2026-03-10T08:55:23.253 INFO:tasks.workunit.client.1.vm08.stdout:6/507: chown d9/d10/d1e/d7b/l9c 5194522 1
2026-03-10T08:55:23.257 INFO:tasks.workunit.client.1.vm08.stdout:6/508: write d9/d13/fa0 [163976,26091] 0
2026-03-10T08:55:23.258 INFO:tasks.workunit.client.1.vm08.stdout:6/509: chown d9/dc/d11/d23/d2c 7 1
2026-03-10T08:55:23.264 INFO:tasks.workunit.client.1.vm08.stdout:2/497: dread d1/da/f50 [0,4194304] 0
2026-03-10T08:55:23.270 INFO:tasks.workunit.client.0.vm05.stdout:3/246: write d9/f23 [1637645,25052] 0
2026-03-10T08:55:23.312 INFO:tasks.workunit.client.0.vm05.stdout:3/247: dread - d9/d2b/f3b zero size
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/248: chown d9/d2b/c41 7047490 1
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/249: dwrite d9/f27 [0,4194304] 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/250: creat d9/d2b/d3a/f44 x:0 0 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/180: rename d2/dd/d41 to d2/d45 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/251: creat d9/d2b/d3a/f45 x:0 0 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/252: chown d9/d2b 1 1
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:0/255: write df/d1f/f21 [5241224,20720] 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/181: symlink d2/dd/l46 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/182: stat d2/dd/d2c/f2f 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/183: write d2/dd/d2c/d2e/f37 [1036182,63889] 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/253: mknod d9/c46 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/184: dwrite d2/ff [0,4194304] 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/185: truncate d2/f5 5598156 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/186: mkdir d2/db/d47 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/187: chown d2/l1d 704974 1
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/254: symlink d9/d2b/l47 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/255: chown d9/f28 81 1
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:8/188: dwrite d2/dd/d2c/f30 [0,4194304] 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:0/256: getdents df/d18 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:0/257: mkdir df/d1f/d48 0
2026-03-10T08:55:23.313 INFO:tasks.workunit.client.0.vm05.stdout:3/256: mknod d9/d2b/d3a/d43/c48 0
2026-03-10T08:55:23.316 INFO:tasks.workunit.client.0.vm05.stdout:0/258: dwrite df/d18/d2b/d3a/f3f [0,4194304] 0
2026-03-10T08:55:23.317 INFO:tasks.workunit.client.1.vm08.stdout:6/510: dread d9/d10/d1e/f9a [0,4194304] 0
2026-03-10T08:55:23.318 INFO:tasks.workunit.client.1.vm08.stdout:6/511: chown d9/d10/f8f 220394252 1
2026-03-10T08:55:23.320 INFO:tasks.workunit.client.0.vm05.stdout:3/257: rename d9/l15 to d9/d2b/d3a/l49 0
2026-03-10T08:55:23.322 INFO:tasks.workunit.client.1.vm08.stdout:6/512: rename d9/d10/d1e/d32/ff to d9/d10/d1e/d7e/fb0 0
2026-03-10T08:55:23.322 INFO:tasks.workunit.client.0.vm05.stdout:8/189: link d2/db/d1f/c40 d2/dd/d2c/d2e/d31/d3e/c48 0
2026-03-10T08:55:23.324 INFO:tasks.workunit.client.0.vm05.stdout:3/258: fdatasync f2 0
2026-03-10T08:55:23.325 INFO:tasks.workunit.client.1.vm08.stdout:6/513: mkdir d9/d10/d1e/d4c/db1 0
2026-03-10T08:55:23.327 INFO:tasks.workunit.client.0.vm05.stdout:3/259: dwrite d9/d2b/d3a/f45 [0,4194304] 0
2026-03-10T08:55:23.330 INFO:tasks.workunit.client.0.vm05.stdout:8/190: creat d2/f49 x:0 0 0
2026-03-10T08:55:23.330 INFO:tasks.workunit.client.0.vm05.stdout:3/260: readlink d9/d2b/l38 0
2026-03-10T08:55:23.335 INFO:tasks.workunit.client.0.vm05.stdout:3/261: dwrite d9/ff [0,4194304] 0
2026-03-10T08:55:23.340 INFO:tasks.workunit.client.1.vm08.stdout:6/514: link d9/dc/d84/d80/f94 d9/d10/d1e/d32/fb2 0
2026-03-10T08:55:23.343 INFO:tasks.workunit.client.0.vm05.stdout:3/262: link d9/d2b/f40 d9/f4a 0
2026-03-10T08:55:23.343 INFO:tasks.workunit.client.0.vm05.stdout:3/263: chown d9/d2b/c3d 528 1
2026-03-10T08:55:23.345 INFO:tasks.workunit.client.0.vm05.stdout:3/264: rename f7 to d9/d2b/d2f/f4b 0
2026-03-10T08:55:23.347 INFO:tasks.workunit.client.0.vm05.stdout:3/265: rename d9/f31 to d9/d2b/d3a/d43/f4c 0
2026-03-10T08:55:23.348 INFO:tasks.workunit.client.0.vm05.stdout:3/266: mkdir d9/d4d 0
2026-03-10T08:55:23.349 INFO:tasks.workunit.client.0.vm05.stdout:3/267: read f1 [3931227,110107] 0
2026-03-10T08:55:23.349 INFO:tasks.workunit.client.0.vm05.stdout:3/268: chown d9/ff 25 1
2026-03-10T08:55:23.416 INFO:tasks.workunit.client.1.vm08.stdout:2/498: dread d1/d43/f6d [0,4194304] 0
2026-03-10T08:55:23.417 INFO:tasks.workunit.client.1.vm08.stdout:2/499: creat d1/d5b/d66/f9a x:0 0 0
2026-03-10T08:55:23.419 INFO:tasks.workunit.client.1.vm08.stdout:2/500: rename d1/d43/d4f to d1/d9b 0
2026-03-10T08:55:23.420 INFO:tasks.workunit.client.1.vm08.stdout:2/501: readlink d1/d43/d5c/l65 0
2026-03-10T08:55:23.420 INFO:tasks.workunit.client.1.vm08.stdout:2/502: stat d1/da/d10/d1b/d6a 0
2026-03-10T08:55:23.421 INFO:tasks.workunit.client.1.vm08.stdout:2/503: creat d1/da/f9c x:0 0 0
2026-03-10T08:55:23.422 INFO:tasks.workunit.client.1.vm08.stdout:2/504: symlink d1/l9d 0
2026-03-10T08:55:23.437 INFO:tasks.workunit.client.0.vm05.stdout:7/208: truncate d18/f26 1405467 0
2026-03-10T08:55:23.438 INFO:tasks.workunit.client.0.vm05.stdout:7/209: chown fd 0 1
2026-03-10T08:55:23.439 INFO:tasks.workunit.client.0.vm05.stdout:7/210: creat d18/d1b/d1f/f3c x:0 0 0
2026-03-10T08:55:23.440 INFO:tasks.workunit.client.0.vm05.stdout:1/351: sync
2026-03-10T08:55:23.441 INFO:tasks.workunit.client.0.vm05.stdout:4/255: sync
2026-03-10T08:55:23.441 INFO:tasks.workunit.client.0.vm05.stdout:3/269: sync
2026-03-10T08:55:23.445 INFO:tasks.workunit.client.0.vm05.stdout:1/352: dwrite dd/d21/d37/d45/f47 [0,4194304] 0
2026-03-10T08:55:23.446 INFO:tasks.workunit.client.1.vm08.stdout:9/443: sync
2026-03-10T08:55:23.448 INFO:tasks.workunit.client.1.vm08.stdout:8/555: dwrite d1/d10/d9/dd/d13/f92 [0,4194304] 0
2026-03-10T08:55:23.460 INFO:tasks.workunit.client.1.vm08.stdout:0/429: sync
2026-03-10T08:55:23.461 INFO:tasks.workunit.client.1.vm08.stdout:3/459: sync
2026-03-10T08:55:23.468 INFO:tasks.workunit.client.1.vm08.stdout:0/430: read d6/dd/d13/d17/d1f/d20/f21 [70046,57230] 0
2026-03-10T08:55:23.505 INFO:tasks.workunit.client.0.vm05.stdout:4/256: mkdir d0/d1d/d30/d49/d4f/d5b 0
2026-03-10T08:55:23.518 INFO:tasks.workunit.client.1.vm08.stdout:3/460: truncate d4/d15/d8/d2c/d55/f61 574645 0
2026-03-10T08:55:23.519 INFO:tasks.workunit.client.0.vm05.stdout:5/221: truncate d5/fc 2984077 0
2026-03-10T08:55:23.520 INFO:tasks.workunit.client.1.vm08.stdout:4/512: write d5/d23/d36/d76/f82 [76477,57210] 0
2026-03-10T08:55:23.523 INFO:tasks.workunit.client.0.vm05.stdout:1/353: creat dd/d10/d18/d2d/d5c/f7e x:0 0 0
2026-03-10T08:55:23.523 INFO:tasks.workunit.client.0.vm05.stdout:1/354: chown dd/d21/d37/d45 52055830 1
2026-03-10T08:55:23.545 INFO:tasks.workunit.client.1.vm08.stdout:0/431: mkdir d6/d8b 0
2026-03-10T08:55:23.546 INFO:tasks.workunit.client.0.vm05.stdout:0/259: dread df/d18/d2b/d27/f2e [4194304,4194304] 0
2026-03-10T08:55:23.547 INFO:tasks.workunit.client.0.vm05.stdout:0/260: chown df/d18/l46 57704817 1
2026-03-10T08:55:23.554 INFO:tasks.workunit.client.1.vm08.stdout:3/461: creat d4/f97 x:0 0 0
2026-03-10T08:55:23.555 INFO:tasks.workunit.client.1.vm08.stdout:3/462: chown d4/d15/d8/d2c/f6a 6451129 1
2026-03-10T08:55:23.556 INFO:tasks.workunit.client.1.vm08.stdout:3/463: chown d4/f47 7 1
2026-03-10T08:55:23.557 INFO:tasks.workunit.client.0.vm05.stdout:5/222: link d5/fd d5/df/d12/d24/d2c/f46 0
2026-03-10T08:55:23.557 INFO:tasks.workunit.client.0.vm05.stdout:0/261: chown df/d18/d19/d39/f42 58 1
2026-03-10T08:55:23.558 INFO:tasks.workunit.client.0.vm05.stdout:5/223: chown d5/df/f2f 13 1
2026-03-10T08:55:23.561 INFO:tasks.workunit.client.1.vm08.stdout:0/432: link d6/dd/d13/d17/d1f/d20/c49 d6/dd/d13/d17/d1f/d20/d2f/d26/d56/c8c 0
2026-03-10T08:55:23.562 INFO:tasks.workunit.client.0.vm05.stdout:5/224: creat d5/df/d37/f47 x:0 0 0
2026-03-10T08:55:23.562 INFO:tasks.workunit.client.0.vm05.stdout:5/225: chown d5/fe 0 1
2026-03-10T08:55:23.565 INFO:tasks.workunit.client.1.vm08.stdout:3/464: truncate d4/d15/f12 4298441 0
2026-03-10T08:55:23.569 INFO:tasks.workunit.client.1.vm08.stdout:3/465: creat d4/d15/d8/d1d/f98 x:0 0 0
2026-03-10T08:55:23.570 INFO:tasks.workunit.client.1.vm08.stdout:3/466: fdatasync d4/f97 0
2026-03-10T08:55:23.570 INFO:tasks.workunit.client.0.vm05.stdout:5/226: write d5/f23 [1101759,46922] 0
2026-03-10T08:55:23.574 INFO:tasks.workunit.client.1.vm08.stdout:3/467: creat d4/d15/d8/d2a/d79/d20/f99 x:0 0 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.1.vm08.stdout:3/468: write d4/d15/d8/d2c/f8c [133967,2323] 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.1.vm08.stdout:3/469: symlink d4/d15/d8/d1d/l9a 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.1.vm08.stdout:2/505: mkdir d1/da/d10/d42/d93/d23/d9e 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.1.vm08.stdout:2/506: truncate d1/da/d10/d42/d93/f8d 817262 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.0.vm05.stdout:6/231: dwrite d4/fc [0,4194304] 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.0.vm05.stdout:5/227: unlink d5/df/f1c 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.0.vm05.stdout:5/228: mkdir d5/d48 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.0.vm05.stdout:5/229: dwrite d5/f3b [0,4194304] 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.0.vm05.stdout:3/270: getdents d9/d2b/d3a/d43 0
2026-03-10T08:55:23.612 INFO:tasks.workunit.client.0.vm05.stdout:3/271: write d9/d2b/d2f/f3f [424211,20648] 0
2026-03-10T08:55:23.615 INFO:tasks.workunit.client.1.vm08.stdout:2/507: dread d1/d5b/d66/f5e [0,4194304] 0
2026-03-10T08:55:23.617 INFO:tasks.workunit.client.1.vm08.stdout:2/508: mknod d1/da/d10/d42/d93/d22/c9f 0
2026-03-10T08:55:23.617 INFO:tasks.workunit.client.1.vm08.stdout:2/509: fsync d1/da/d10/d2d/f4c 0
2026-03-10T08:55:23.676 INFO:tasks.workunit.client.1.vm08.stdout:4/513: sync
2026-03-10T08:55:23.685 INFO:tasks.workunit.client.0.vm05.stdout:7/211: rename d18/f26 to d18/d1b/d1f/d25/d2e/d32/f3d 0
2026-03-10T08:55:23.688 INFO:tasks.workunit.client.0.vm05.stdout:7/212: dread d18/d1b/f2c [0,4194304] 0
2026-03-10T08:55:23.691 INFO:tasks.workunit.client.0.vm05.stdout:7/213: write f15 [1491074,57346] 0
2026-03-10T08:55:23.696 INFO:tasks.workunit.client.0.vm05.stdout:7/214: truncate f3 8624335 0
2026-03-10T08:55:23.696 INFO:tasks.workunit.client.0.vm05.stdout:7/215: mknod d18/d1b/d1f/d25/d2e/d32/c3e 0
2026-03-10T08:55:23.696 INFO:tasks.workunit.client.0.vm05.stdout:7/216: dread - d18/d1b/d1f/d25/d2e/d32/f35 zero size
2026-03-10T08:55:23.696 INFO:tasks.workunit.client.0.vm05.stdout:7/217: unlink d18/d1b/d1f/d25/d2e/d2f/c39 0
2026-03-10T08:55:23.713 INFO:tasks.workunit.client.1.vm08.stdout:1/527: read d1/da/f1e [2944817,10708] 0
2026-03-10T08:55:23.724 INFO:tasks.workunit.client.1.vm08.stdout:1/528: mknod d1/da/de/d24/d3d/d40/d56/d7a/cb7 0
2026-03-10T08:55:23.734 INFO:tasks.workunit.client.0.vm05.stdout:7/218: sync
2026-03-10T08:55:23.739 INFO:tasks.workunit.client.0.vm05.stdout:7/219: truncate d18/f1d 953195 0
2026-03-10T08:55:23.742 INFO:tasks.workunit.client.1.vm08.stdout:9/444: dwrite d2/f77 [0,4194304] 0
2026-03-10T08:55:23.751 INFO:tasks.workunit.client.0.vm05.stdout:7/220: dread d18/d1b/f2c [0,4194304] 0
2026-03-10T08:55:23.771 INFO:tasks.workunit.client.1.vm08.stdout:8/556: truncate d1/d10/f23 1678209 0
2026-03-10T08:55:23.771 INFO:tasks.workunit.client.1.vm08.stdout:8/557: fdatasync d1/d2c/f30 0
2026-03-10T08:55:23.771 INFO:tasks.workunit.client.0.vm05.stdout:4/257: write d0/f18 [3222799,26709] 0
2026-03-10T08:55:23.798 INFO:tasks.workunit.client.0.vm05.stdout:4/258: sync
2026-03-10T08:55:23.800 INFO:tasks.workunit.client.0.vm05.stdout:4/259: symlink d0/d2e/d42/l5c 0
2026-03-10T08:55:23.800 INFO:tasks.workunit.client.1.vm08.stdout:7/520: read d0/d11/f66 [387731,111358] 0
2026-03-10T08:55:23.804 INFO:tasks.workunit.client.1.vm08.stdout:7/521: mknod d0/d1c/ca0 0
2026-03-10T08:55:23.813 INFO:tasks.workunit.client.0.vm05.stdout:4/260: link d0/d2e/d42/d45/d4a/l27 d0/d2e/d42/d45/d4a/d36/l5d 0
2026-03-10T08:55:23.822 INFO:tasks.workunit.client.0.vm05.stdout:2/233: dread d0/d9/f12 [0,4194304] 0
2026-03-10T08:55:23.826 INFO:tasks.workunit.client.0.vm05.stdout:2/234: creat d0/d9/d1e/d20/d21/f46 x:0 0 0
2026-03-10T08:55:23.854 INFO:tasks.workunit.client.0.vm05.stdout:8/191: dread d2/db/d1f/f44 [0,4194304] 0
2026-03-10T08:55:23.855 INFO:tasks.workunit.client.0.vm05.stdout:8/192: chown d2/db/l24 242597 1
2026-03-10T08:55:23.858 INFO:tasks.workunit.client.0.vm05.stdout:8/193: dwrite d2/f49 [0,4194304] 0
2026-03-10T08:55:23.860 INFO:tasks.workunit.client.0.vm05.stdout:8/194: symlink d2/dd/d2c/d2e/d31/l4a 0
2026-03-10T08:55:23.867 INFO:tasks.workunit.client.0.vm05.stdout:8/195: link d2/dd/l13 d2/d45/l4b 0
2026-03-10T08:55:23.870 INFO:tasks.workunit.client.0.vm05.stdout:8/196: dwrite d2/dd/d2c/d2e/f3b [0,4194304] 0
2026-03-10T08:55:23.871 INFO:tasks.workunit.client.0.vm05.stdout:8/197: read d2/db/d1f/f44 [242339,83786] 0
2026-03-10T08:55:23.873 INFO:tasks.workunit.client.0.vm05.stdout:8/198: mkdir d2/dd/d2c/d2e/d31/d4c 0
2026-03-10T08:55:23.873 INFO:tasks.workunit.client.0.vm05.stdout:8/199: chown d2/dd/d2c/d2e/d31/d4c 0 1
2026-03-10T08:55:23.874 INFO:tasks.workunit.client.0.vm05.stdout:8/200: stat d2/dd/d2c/d2e/d31/d4c 0
2026-03-10T08:55:23.875 INFO:tasks.workunit.client.0.vm05.stdout:8/201: creat d2/dd/d2c/f4d x:0 0 0
2026-03-10T08:55:23.880 INFO:tasks.workunit.client.0.vm05.stdout:8/202: symlink d2/dd/d2c/d2e/d31/d3e/l4e 0
2026-03-10T08:55:23.880 INFO:tasks.workunit.client.0.vm05.stdout:8/203: dread d2/f49 [0,4194304] 0
2026-03-10T08:55:23.880 INFO:tasks.workunit.client.0.vm05.stdout:8/204: write d2/dd/f26 [4962661,24712] 0
2026-03-10T08:55:23.882 INFO:tasks.workunit.client.0.vm05.stdout:8/205: write d2/dd/d2c/f2f [547479,81172] 0
2026-03-10T08:55:23.888 INFO:tasks.workunit.client.0.vm05.stdout:8/206: rmdir d2/dd/d2c/d2e/d31 39
2026-03-10T08:55:23.890 INFO:tasks.workunit.client.0.vm05.stdout:8/207: chown d2/l6 2934624 1
2026-03-10T08:55:23.893 INFO:tasks.workunit.client.0.vm05.stdout:8/208: unlink d2/ff 0
2026-03-10T08:55:23.893 INFO:tasks.workunit.client.1.vm08.stdout:0/433: getdents d6/dd/d13/d17/d1f/d20/d2f/d26/d56 0
2026-03-10T08:55:23.893 INFO:tasks.workunit.client.0.vm05.stdout:0/262: write df/f12 [2781290,96609] 0
2026-03-10T08:55:23.894 INFO:tasks.workunit.client.1.vm08.stdout:0/434: stat d6/dd/d13/d17/d1f/d20/d2f/l7d 0
2026-03-10T08:55:23.906 INFO:tasks.workunit.client.1.vm08.stdout:6/515: dread d9/dc/d11/d23/d2c/d41/f56 [0,4194304] 0
2026-03-10T08:55:23.913 INFO:tasks.workunit.client.0.vm05.stdout:3/272: dwrite f1 [0,4194304] 0
2026-03-10T08:55:23.916 INFO:tasks.workunit.client.0.vm05.stdout:3/273: dwrite d9/d2b/d2f/f33 [0,4194304] 0
2026-03-10T08:55:23.923 INFO:tasks.workunit.client.0.vm05.stdout:0/263: mknod df/d18/d2b/d27/d32/c49 0
2026-03-10T08:55:23.923 INFO:tasks.workunit.client.1.vm08.stdout:2/510: rmdir d1 39
2026-03-10T08:55:23.923 INFO:tasks.workunit.client.1.vm08.stdout:6/516: dread - d9/d50/fa3 zero size
2026-03-10T08:55:23.930 INFO:tasks.workunit.client.0.vm05.stdout:0/264: dwrite df/f1a [4194304,4194304] 0
2026-03-10T08:55:23.938 INFO:tasks.workunit.client.1.vm08.stdout:2/511: symlink d1/d5b/la0 0
2026-03-10T08:55:23.945 INFO:tasks.workunit.client.1.vm08.stdout:2/512: creat d1/da/d10/d42/d93/d23/d9e/fa1 x:0 0 0
2026-03-10T08:55:23.948 INFO:tasks.workunit.client.0.vm05.stdout:3/274: dread d9/f1a [0,4194304] 0
2026-03-10T08:55:23.951 INFO:tasks.workunit.client.0.vm05.stdout:3/275: chown d9/f20 39 1
2026-03-10T08:55:23.956 INFO:tasks.workunit.client.1.vm08.stdout:2/513: getdents d1/da/d78 0
2026-03-10T08:55:24.003 INFO:tasks.workunit.client.1.vm08.stdout:4/514: write d5/d23/d49/d8f/fb1 [1902870,34976] 0
2026-03-10T08:55:24.004 INFO:tasks.workunit.client.1.vm08.stdout:4/515: fsync d5/d23/d36/d99/db2/f9c 0
2026-03-10T08:55:24.017 INFO:tasks.workunit.client.1.vm08.stdout:4/516: dread d5/d23/d36/f58 [0,4194304] 0
2026-03-10T08:55:24.018 INFO:tasks.workunit.client.1.vm08.stdout:4/517: creat d5/fb4 x:0 0 0
2026-03-10T08:55:24.024 INFO:tasks.workunit.client.1.vm08.stdout:4/518: link d5/d23/d49/l4b d5/de/lb5 0
2026-03-10T08:55:24.074 INFO:tasks.workunit.client.1.vm08.stdout:5/465: dread d0/d1b/f2f [0,4194304] 0
2026-03-10T08:55:24.074 INFO:tasks.workunit.client.1.vm08.stdout:5/466: chown d0/d11/d3e/d45 881107603 1
2026-03-10T08:55:24.075 INFO:tasks.workunit.client.0.vm05.stdout:7/221: dread fd [0,4194304] 0
2026-03-10T08:55:24.080 INFO:tasks.workunit.client.1.vm08.stdout:6/517: sync
2026-03-10T08:55:24.083 INFO:tasks.workunit.client.0.vm05.stdout:7/222: unlink d18/d1b/d1f/l23 0
2026-03-10T08:55:24.086 INFO:tasks.workunit.client.0.vm05.stdout:7/223: creat d18/d1b/d1f/f3f x:0 0 0
2026-03-10T08:55:24.088 INFO:tasks.workunit.client.0.vm05.stdout:7/224: mknod d18/d1b/d1f/c40 0
2026-03-10T08:55:24.091
INFO:tasks.workunit.client.0.vm05.stdout:7/225: dwrite d18/d1b/d1f/f2d [0,4194304] 0 2026-03-10T08:55:24.097 INFO:tasks.workunit.client.0.vm05.stdout:7/226: symlink d18/d1b/d1f/d25/d2e/d32/l41 0 2026-03-10T08:55:24.128 INFO:tasks.workunit.client.0.vm05.stdout:7/227: mkdir d18/d1b/d1f/d25/d2e/d42 0 2026-03-10T08:55:24.128 INFO:tasks.workunit.client.0.vm05.stdout:7/228: mkdir d18/d38/d43 0 2026-03-10T08:55:24.143 INFO:tasks.workunit.client.1.vm08.stdout:5/467: dread d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:24.144 INFO:tasks.workunit.client.1.vm08.stdout:5/468: mknod d0/d1b/d67/d7a/c8c 0 2026-03-10T08:55:24.157 INFO:tasks.workunit.client.1.vm08.stdout:9/445: creat d2/dd/d15/d1e/d25/d32/f8c x:0 0 0 2026-03-10T08:55:24.159 INFO:tasks.workunit.client.1.vm08.stdout:3/470: rename d4/d15/d8/d2a to d4/d15/d8/d2c/d9b 0 2026-03-10T08:55:24.160 INFO:tasks.workunit.client.0.vm05.stdout:9/187: truncate d6/d19/f1a 158956 0 2026-03-10T08:55:24.160 INFO:tasks.workunit.client.0.vm05.stdout:9/188: write d6/f3f [752844,62384] 0 2026-03-10T08:55:24.163 INFO:tasks.workunit.client.0.vm05.stdout:6/232: rename d4/d7/d10/d15/c4e to d4/d7/c50 0 2026-03-10T08:55:24.164 INFO:tasks.workunit.client.1.vm08.stdout:1/529: write d1/da/de/f79 [662649,42384] 0 2026-03-10T08:55:24.169 INFO:tasks.workunit.client.0.vm05.stdout:5/230: mknod d5/df/d12/c49 0 2026-03-10T08:55:24.169 INFO:tasks.workunit.client.0.vm05.stdout:5/231: write d5/f23 [8299229,94366] 0 2026-03-10T08:55:24.173 INFO:tasks.workunit.client.0.vm05.stdout:8/209: fdatasync d2/dd/d2c/d2e/f3b 0 2026-03-10T08:55:24.174 INFO:tasks.workunit.client.0.vm05.stdout:8/210: read d2/dd/d2c/f2f [561780,76908] 0 2026-03-10T08:55:24.179 INFO:tasks.workunit.client.1.vm08.stdout:3/471: mknod d4/d6f/d85/c9c 0 2026-03-10T08:55:24.181 INFO:tasks.workunit.client.0.vm05.stdout:2/235: rename d0/d9/d1e/d20/d21/f3e to d0/d9/d1e/d20/f47 0 2026-03-10T08:55:24.181 INFO:tasks.workunit.client.0.vm05.stdout:2/236: chown d0/d9/d27 119215564 1 2026-03-10T08:55:24.183 
INFO:tasks.workunit.client.1.vm08.stdout:9/446: dread d2/dd/d15/d1e/d39/d4e/f55 [0,4194304] 0 2026-03-10T08:55:24.183 INFO:tasks.workunit.client.0.vm05.stdout:1/355: creat dd/d21/f7f x:0 0 0 2026-03-10T08:55:24.184 INFO:tasks.workunit.client.0.vm05.stdout:1/356: readlink dd/le 0 2026-03-10T08:55:24.186 INFO:tasks.workunit.client.1.vm08.stdout:1/530: creat d1/da/d20/d3f/d49/d68/fb8 x:0 0 0 2026-03-10T08:55:24.187 INFO:tasks.workunit.client.0.vm05.stdout:6/233: mkdir d4/d2d/d51 0 2026-03-10T08:55:24.188 INFO:tasks.workunit.client.1.vm08.stdout:1/531: write d1/da/de/d24/d26/f94 [737833,90074] 0 2026-03-10T08:55:24.190 INFO:tasks.workunit.client.0.vm05.stdout:2/237: dread d0/f2f [0,4194304] 0 2026-03-10T08:55:24.191 INFO:tasks.workunit.client.0.vm05.stdout:2/238: write d0/f40 [1549812,52726] 0 2026-03-10T08:55:24.192 INFO:tasks.workunit.client.0.vm05.stdout:2/239: readlink d0/d9/l42 0 2026-03-10T08:55:24.193 INFO:tasks.workunit.client.0.vm05.stdout:6/234: dwrite d4/d7/d10/d15/f2a [0,4194304] 0 2026-03-10T08:55:24.202 INFO:tasks.workunit.client.0.vm05.stdout:6/235: dwrite d4/d7/ff [8388608,4194304] 0 2026-03-10T08:55:24.205 INFO:tasks.workunit.client.0.vm05.stdout:6/236: write d4/f11 [3412842,46995] 0 2026-03-10T08:55:24.214 INFO:tasks.workunit.client.0.vm05.stdout:5/232: creat d5/d3a/f4a x:0 0 0 2026-03-10T08:55:24.216 INFO:tasks.workunit.client.1.vm08.stdout:1/532: creat d1/da/d20/d3f/d49/d68/d7f/fb9 x:0 0 0 2026-03-10T08:55:24.219 INFO:tasks.workunit.client.1.vm08.stdout:9/447: symlink d2/dd/d15/d1e/d39/d69/l8d 0 2026-03-10T08:55:24.219 INFO:tasks.workunit.client.0.vm05.stdout:9/189: symlink d6/d15/d37/l41 0 2026-03-10T08:55:24.220 INFO:tasks.workunit.client.0.vm05.stdout:9/190: dread - d6/d19/d21/f2f zero size 2026-03-10T08:55:24.225 INFO:tasks.workunit.client.1.vm08.stdout:8/558: truncate d1/d10/d9/dd/d25/d27/d44/d97/f79 245819 0 2026-03-10T08:55:24.230 INFO:tasks.workunit.client.0.vm05.stdout:4/261: dwrite d0/fb [0,4194304] 0 2026-03-10T08:55:24.231 
INFO:tasks.workunit.client.1.vm08.stdout:7/522: dwrite d0/d11/d1f/d29/d3b/f86 [0,4194304] 0 2026-03-10T08:55:24.239 INFO:tasks.workunit.client.1.vm08.stdout:9/448: mkdir d2/d54/d8e 0 2026-03-10T08:55:24.241 INFO:tasks.workunit.client.0.vm05.stdout:4/262: read d0/d2e/d42/d45/d4a/f47 [687527,107194] 0 2026-03-10T08:55:24.242 INFO:tasks.workunit.client.0.vm05.stdout:2/240: mknod d0/d9/d1e/d20/d24/c48 0 2026-03-10T08:55:24.245 INFO:tasks.workunit.client.1.vm08.stdout:7/523: mkdir d0/d11/d1f/d29/d3b/da1 0 2026-03-10T08:55:24.246 INFO:tasks.workunit.client.0.vm05.stdout:6/237: creat d4/d7/f52 x:0 0 0 2026-03-10T08:55:24.247 INFO:tasks.workunit.client.1.vm08.stdout:9/449: truncate d2/dd/d15/d1e/d21/f50 769120 0 2026-03-10T08:55:24.249 INFO:tasks.workunit.client.1.vm08.stdout:8/559: creat d1/d10/d9/dd/d25/d27/d44/fc9 x:0 0 0 2026-03-10T08:55:24.249 INFO:tasks.workunit.client.0.vm05.stdout:0/265: creat df/f4a x:0 0 0 2026-03-10T08:55:24.252 INFO:tasks.workunit.client.0.vm05.stdout:0/266: dread f6 [0,4194304] 0 2026-03-10T08:55:24.261 INFO:tasks.workunit.client.0.vm05.stdout:3/276: unlink d9/f1a 0 2026-03-10T08:55:24.261 INFO:tasks.workunit.client.0.vm05.stdout:8/211: mkdir d2/dd/d2c/d2e/d31/d4f 0 2026-03-10T08:55:24.261 INFO:tasks.workunit.client.1.vm08.stdout:7/524: rmdir d0/d11 39 2026-03-10T08:55:24.261 INFO:tasks.workunit.client.1.vm08.stdout:8/560: unlink d1/d10/d9/dd/d18/d34/c84 0 2026-03-10T08:55:24.261 INFO:tasks.workunit.client.1.vm08.stdout:0/435: dwrite d6/dd/d13/d17/f29 [0,4194304] 0 2026-03-10T08:55:24.262 INFO:tasks.workunit.client.0.vm05.stdout:9/191: mknod d6/d19/d2c/c42 0 2026-03-10T08:55:24.262 INFO:tasks.workunit.client.0.vm05.stdout:9/192: chown d6/d12/l13 29038143 1 2026-03-10T08:55:24.266 INFO:tasks.workunit.client.1.vm08.stdout:2/514: write d1/da/d10/d42/d93/f8f [366900,17235] 0 2026-03-10T08:55:24.276 INFO:tasks.workunit.client.0.vm05.stdout:3/277: dread d9/fa [0,4194304] 0 2026-03-10T08:55:24.276 INFO:tasks.workunit.client.0.vm05.stdout:3/278: chown 
d9/d4d 27274 1 2026-03-10T08:55:24.278 INFO:tasks.workunit.client.0.vm05.stdout:1/357: mkdir dd/d10/d18/d20/d52/d80 0 2026-03-10T08:55:24.279 INFO:tasks.workunit.client.0.vm05.stdout:7/229: dread d18/d1b/d1f/d25/d2e/d32/f3d [0,4194304] 0 2026-03-10T08:55:24.279 INFO:tasks.workunit.client.0.vm05.stdout:7/230: fdatasync d18/d1b/d1f/f3f 0 2026-03-10T08:55:24.280 INFO:tasks.workunit.client.0.vm05.stdout:7/231: read d18/d1b/d1f/d25/d2e/d32/f3d [1210899,106747] 0 2026-03-10T08:55:24.283 INFO:tasks.workunit.client.0.vm05.stdout:4/263: rmdir d0/d2e/d42 39 2026-03-10T08:55:24.285 INFO:tasks.workunit.client.0.vm05.stdout:7/232: dwrite d18/d1b/d1f/f3c [0,4194304] 0 2026-03-10T08:55:24.292 INFO:tasks.workunit.client.0.vm05.stdout:2/241: dwrite d0/d9/d1e/d20/f22 [0,4194304] 0 2026-03-10T08:55:24.297 INFO:tasks.workunit.client.1.vm08.stdout:0/436: mknod d6/dd/d13/d17/d1f/d2d/d39/c8d 0 2026-03-10T08:55:24.298 INFO:tasks.workunit.client.1.vm08.stdout:0/437: readlink d6/dd/d13/d17/l83 0 2026-03-10T08:55:24.304 INFO:tasks.workunit.client.0.vm05.stdout:2/242: dwrite d0/f36 [0,4194304] 0 2026-03-10T08:55:24.308 INFO:tasks.workunit.client.1.vm08.stdout:8/561: dread d1/d10/d9/dd/d18/d3c/fa9 [0,4194304] 0 2026-03-10T08:55:24.313 INFO:tasks.workunit.client.0.vm05.stdout:7/233: dread f4 [0,4194304] 0 2026-03-10T08:55:24.316 INFO:tasks.workunit.client.1.vm08.stdout:6/518: mknod d9/d10/d1e/d4c/d69/da2/cb3 0 2026-03-10T08:55:24.318 INFO:tasks.workunit.client.1.vm08.stdout:2/515: chown d1/da/d10/d2d/l54 16290727 1 2026-03-10T08:55:24.318 INFO:tasks.workunit.client.0.vm05.stdout:7/234: dwrite d18/d1b/d1f/f3f [0,4194304] 0 2026-03-10T08:55:24.320 INFO:tasks.workunit.client.0.vm05.stdout:7/235: fdatasync d18/d1b/d1f/f2d 0 2026-03-10T08:55:24.320 INFO:tasks.workunit.client.1.vm08.stdout:4/519: dwrite d5/de/f6d [0,4194304] 0 2026-03-10T08:55:24.322 INFO:tasks.workunit.client.1.vm08.stdout:6/519: write d9/dc/d11/f55 [957356,29084] 0 2026-03-10T08:55:24.326 
INFO:tasks.workunit.client.1.vm08.stdout:1/533: truncate d1/da/d18/d3a/f3c 1751754 0 2026-03-10T08:55:24.330 INFO:tasks.workunit.client.1.vm08.stdout:2/516: read d1/da/d10/f18 [1886504,62003] 0 2026-03-10T08:55:24.331 INFO:tasks.workunit.client.0.vm05.stdout:0/267: dwrite df/f12 [0,4194304] 0 2026-03-10T08:55:24.350 INFO:tasks.workunit.client.0.vm05.stdout:3/279: readlink d9/le 0 2026-03-10T08:55:24.352 INFO:tasks.workunit.client.0.vm05.stdout:1/358: write dd/d13/f42 [814771,47536] 0 2026-03-10T08:55:24.357 INFO:tasks.workunit.client.0.vm05.stdout:4/264: unlink d0/d1d/d30/d32/d41/c5a 0 2026-03-10T08:55:24.361 INFO:tasks.workunit.client.0.vm05.stdout:8/212: dread d2/dd/d2c/f2f [0,4194304] 0 2026-03-10T08:55:24.373 INFO:tasks.workunit.client.1.vm08.stdout:7/525: dread d0/d11/d1f/d2c/f33 [0,4194304] 0 2026-03-10T08:55:24.373 INFO:tasks.workunit.client.1.vm08.stdout:4/520: symlink d5/d23/d36/d99/lb6 0 2026-03-10T08:55:24.373 INFO:tasks.workunit.client.1.vm08.stdout:5/469: write d0/d11/d3e/f4d [1047144,117728] 0 2026-03-10T08:55:24.374 INFO:tasks.workunit.client.1.vm08.stdout:5/470: read d0/d11/d18/d52/f57 [3290949,82624] 0 2026-03-10T08:55:24.377 INFO:tasks.workunit.client.1.vm08.stdout:1/534: dread - d1/da/d20/d3f/d49/f61 zero size 2026-03-10T08:55:24.381 INFO:tasks.workunit.client.1.vm08.stdout:4/521: dread d5/d23/d36/d76/f82 [0,4194304] 0 2026-03-10T08:55:24.386 INFO:tasks.workunit.client.1.vm08.stdout:0/438: truncate d6/dd/d13/d17/d1f/d20/d2f/d57/f58 475702 0 2026-03-10T08:55:24.390 INFO:tasks.workunit.client.0.vm05.stdout:3/280: dread d9/f23 [0,4194304] 0 2026-03-10T08:55:24.397 INFO:tasks.workunit.client.1.vm08.stdout:7/526: creat d0/d11/d1f/d29/d3b/d80/fa2 x:0 0 0 2026-03-10T08:55:24.400 INFO:tasks.workunit.client.0.vm05.stdout:0/268: stat df/f13 0 2026-03-10T08:55:24.400 INFO:tasks.workunit.client.1.vm08.stdout:5/471: mknod d0/d11/d3e/d45/c8d 0 2026-03-10T08:55:24.402 INFO:tasks.workunit.client.0.vm05.stdout:9/193: unlink d6/d15/f18 0 2026-03-10T08:55:24.403 
INFO:tasks.workunit.client.0.vm05.stdout:9/194: dread - d6/d19/d21/f2f zero size 2026-03-10T08:55:24.404 INFO:tasks.workunit.client.1.vm08.stdout:3/472: dwrite d4/d15/d8/f68 [0,4194304] 0 2026-03-10T08:55:24.418 INFO:tasks.workunit.client.0.vm05.stdout:4/265: truncate d0/d2e/d42/f59 998793 0 2026-03-10T08:55:24.422 INFO:tasks.workunit.client.0.vm05.stdout:4/266: dwrite d0/d2e/f4e [0,4194304] 0 2026-03-10T08:55:24.425 INFO:tasks.workunit.client.1.vm08.stdout:8/562: rmdir d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dba 0 2026-03-10T08:55:24.426 INFO:tasks.workunit.client.0.vm05.stdout:8/213: rename d2/dd/c12 to d2/dd/d2c/d2e/d31/d3e/c50 0 2026-03-10T08:55:24.433 INFO:tasks.workunit.client.0.vm05.stdout:5/233: getdents d5/df/d12/d24/d2c 0 2026-03-10T08:55:24.434 INFO:tasks.workunit.client.1.vm08.stdout:7/527: mkdir d0/d11/d4a/da3 0 2026-03-10T08:55:24.436 INFO:tasks.workunit.client.1.vm08.stdout:9/450: dwrite d2/d41/d74/f3e [0,4194304] 0 2026-03-10T08:55:24.436 INFO:tasks.workunit.client.1.vm08.stdout:5/472: read d0/d11/d27/f2a [6566549,116109] 0 2026-03-10T08:55:24.441 INFO:tasks.workunit.client.1.vm08.stdout:1/535: link d1/da/d18/d3a/f57 d1/da/d18/d3a/da7/fba 0 2026-03-10T08:55:24.442 INFO:tasks.workunit.client.1.vm08.stdout:3/473: creat d4/d15/d8/d2c/d6d/f9d x:0 0 0 2026-03-10T08:55:24.443 INFO:tasks.workunit.client.0.vm05.stdout:9/195: mkdir d6/d12/d43 0 2026-03-10T08:55:24.444 INFO:tasks.workunit.client.0.vm05.stdout:9/196: chown d6/fe 3 1 2026-03-10T08:55:24.450 INFO:tasks.workunit.client.1.vm08.stdout:7/528: creat d0/d14/d43/fa4 x:0 0 0 2026-03-10T08:55:24.450 INFO:tasks.workunit.client.0.vm05.stdout:0/269: dread df/f1d [0,4194304] 0 2026-03-10T08:55:24.456 INFO:tasks.workunit.client.0.vm05.stdout:6/238: getdents d4/d7/d10 0 2026-03-10T08:55:24.457 INFO:tasks.workunit.client.0.vm05.stdout:6/239: truncate d4/d7/d10/d15/d38/f3c 405208 0 2026-03-10T08:55:24.466 INFO:tasks.workunit.client.0.vm05.stdout:6/240: dwrite d4/d7/d10/d15/d38/f49 [0,4194304] 0 
2026-03-10T08:55:24.467 INFO:tasks.workunit.client.0.vm05.stdout:6/241: chown d4/l42 503578845 1 2026-03-10T08:55:24.467 INFO:tasks.workunit.client.0.vm05.stdout:6/242: readlink d4/l42 0 2026-03-10T08:55:24.470 INFO:tasks.workunit.client.1.vm08.stdout:8/563: dread d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 [0,4194304] 0 2026-03-10T08:55:24.471 INFO:tasks.workunit.client.0.vm05.stdout:1/359: read dd/d13/f42 [73322,32015] 0 2026-03-10T08:55:24.471 INFO:tasks.workunit.client.0.vm05.stdout:1/360: write dd/d13/f40 [3229283,31035] 0 2026-03-10T08:55:24.475 INFO:tasks.workunit.client.0.vm05.stdout:7/236: link d18/d1b/d1f/d25/c2a d18/d1b/d1f/d25/c44 0 2026-03-10T08:55:24.484 INFO:tasks.workunit.client.0.vm05.stdout:9/197: creat d6/d27/f44 x:0 0 0 2026-03-10T08:55:24.484 INFO:tasks.workunit.client.1.vm08.stdout:7/529: dwrite d0/d11/d4a/d95/f9b [0,4194304] 0 2026-03-10T08:55:24.484 INFO:tasks.workunit.client.1.vm08.stdout:7/530: dwrite d0/d14/d43/d62/f9a [0,4194304] 0 2026-03-10T08:55:24.491 INFO:tasks.workunit.client.1.vm08.stdout:7/531: write d0/d51/f5d [4486679,60981] 0 2026-03-10T08:55:24.496 INFO:tasks.workunit.client.1.vm08.stdout:3/474: mknod d4/d6f/d85/c9e 0 2026-03-10T08:55:24.506 INFO:tasks.workunit.client.1.vm08.stdout:6/520: getdents d9/d10/d1e/d4c/d69 0 2026-03-10T08:55:24.506 INFO:tasks.workunit.client.1.vm08.stdout:8/564: rename d1/d10/d9/dd/d3d to d1/d10/d9/dd/d25/dca 0 2026-03-10T08:55:24.506 INFO:tasks.workunit.client.0.vm05.stdout:8/214: creat d2/db/d47/f51 x:0 0 0 2026-03-10T08:55:24.506 INFO:tasks.workunit.client.0.vm05.stdout:5/234: unlink d5/df/d12/d24/d2c/f35 0 2026-03-10T08:55:24.506 INFO:tasks.workunit.client.0.vm05.stdout:7/237: readlink d18/d1b/d1f/d25/d2e/l37 0 2026-03-10T08:55:24.509 INFO:tasks.workunit.client.1.vm08.stdout:1/536: mkdir d1/da/de/d24/d35/d6d/d82/da2/dbb 0 2026-03-10T08:55:24.509 INFO:tasks.workunit.client.1.vm08.stdout:4/522: getdents d5/d23 0 2026-03-10T08:55:24.511 INFO:tasks.workunit.client.0.vm05.stdout:3/281: rename d9/d2b/l38 to 
d9/l4e 0 2026-03-10T08:55:24.513 INFO:tasks.workunit.client.0.vm05.stdout:8/215: dread d2/dd/d2c/f34 [0,4194304] 0 2026-03-10T08:55:24.513 INFO:tasks.workunit.client.0.vm05.stdout:8/216: readlink d2/l1d 0 2026-03-10T08:55:24.514 INFO:tasks.workunit.client.0.vm05.stdout:8/217: truncate d2/db/f22 4431482 0 2026-03-10T08:55:24.515 INFO:tasks.workunit.client.1.vm08.stdout:6/521: fsync d9/d13/f6c 0 2026-03-10T08:55:24.518 INFO:tasks.workunit.client.1.vm08.stdout:7/532: creat d0/d11/d4a/fa5 x:0 0 0 2026-03-10T08:55:24.519 INFO:tasks.workunit.client.1.vm08.stdout:3/475: mkdir d4/d15/d8/d1d/d9f 0 2026-03-10T08:55:24.521 INFO:tasks.workunit.client.0.vm05.stdout:4/267: rename d0/d1d/d30/f1c to d0/d2e/d42/f5e 0 2026-03-10T08:55:24.525 INFO:tasks.workunit.client.0.vm05.stdout:4/268: dwrite d0/fb [4194304,4194304] 0 2026-03-10T08:55:24.530 INFO:tasks.workunit.client.0.vm05.stdout:3/282: mkdir d9/d2b/d3a/d43/d4f 0 2026-03-10T08:55:24.531 INFO:tasks.workunit.client.1.vm08.stdout:4/523: mknod d5/d23/cb7 0 2026-03-10T08:55:24.533 INFO:tasks.workunit.client.1.vm08.stdout:4/524: fdatasync d5/de/f41 0 2026-03-10T08:55:24.544 INFO:tasks.workunit.client.0.vm05.stdout:6/243: rename d4/d7/d10/d15/d38 to d4/d7/d10/d15/d20/d53 0 2026-03-10T08:55:24.552 INFO:tasks.workunit.client.1.vm08.stdout:7/533: creat d0/d11/d1f/d29/d3d/d89/fa6 x:0 0 0 2026-03-10T08:55:24.552 INFO:tasks.workunit.client.0.vm05.stdout:3/283: chown d9/c1b 49 1 2026-03-10T08:55:24.552 INFO:tasks.workunit.client.0.vm05.stdout:7/238: link d18/l2b d18/d38/l45 0 2026-03-10T08:55:24.553 INFO:tasks.workunit.client.1.vm08.stdout:6/522: fsync d9/fa 0 2026-03-10T08:55:24.554 INFO:tasks.workunit.client.0.vm05.stdout:9/198: rename d6/cc to d6/d15/d35/c45 0 2026-03-10T08:55:24.555 INFO:tasks.workunit.client.0.vm05.stdout:6/244: creat d4/d7/f54 x:0 0 0 2026-03-10T08:55:24.567 INFO:tasks.workunit.client.1.vm08.stdout:3/476: rmdir d4/d15/d8/d1d/d9f 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.1.vm08.stdout:6/523: mknod 
d9/d10/d1e/d4c/db1/cb4 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.0.vm05.stdout:0/270: rename df/d18/d19/d35/l3c to df/d1f/d48/l4b 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.0.vm05.stdout:9/199: mknod d6/d15/d37/c46 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.0.vm05.stdout:9/200: dwrite d6/f7 [0,4194304] 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.1.vm08.stdout:6/524: readlink d9/d13/l46 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.1.vm08.stdout:3/477: stat d4/d15/fa 0 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.1.vm08.stdout:3/478: dread - d4/d15/d8/d2c/d9b/d79/d8f/f91 zero size 2026-03-10T08:55:24.576 INFO:tasks.workunit.client.1.vm08.stdout:6/525: dread f5 [0,4194304] 0 2026-03-10T08:55:24.578 INFO:tasks.workunit.client.0.vm05.stdout:6/245: creat d4/d2c/f55 x:0 0 0 2026-03-10T08:55:24.589 INFO:tasks.workunit.client.0.vm05.stdout:0/271: dread fe [0,4194304] 0 2026-03-10T08:55:24.590 INFO:tasks.workunit.client.0.vm05.stdout:6/246: creat d4/d7/d10/d15/d1b/d22/f56 x:0 0 0 2026-03-10T08:55:24.591 INFO:tasks.workunit.client.1.vm08.stdout:2/517: dwrite d1/da/d10/d42/d93/d1e/f84 [0,4194304] 0 2026-03-10T08:55:24.591 INFO:tasks.workunit.client.0.vm05.stdout:4/269: getdents d0/d1d/d30/d49/d4f 0 2026-03-10T08:55:24.592 INFO:tasks.workunit.client.0.vm05.stdout:4/270: write d0/fe [904006,123597] 0 2026-03-10T08:55:24.592 INFO:tasks.workunit.client.0.vm05.stdout:4/271: readlink d0/d1d/d30/d32/d41/l4d 0 2026-03-10T08:55:24.600 INFO:tasks.workunit.client.0.vm05.stdout:4/272: creat d0/d2e/d42/d45/f5f x:0 0 0 2026-03-10T08:55:24.642 INFO:tasks.workunit.client.1.vm08.stdout:2/518: creat d1/da/d10/d2d/fa2 x:0 0 0 2026-03-10T08:55:24.642 INFO:tasks.workunit.client.1.vm08.stdout:2/519: symlink d1/la3 0 2026-03-10T08:55:24.642 INFO:tasks.workunit.client.1.vm08.stdout:2/520: dread - d1/da/d10/d42/d93/d1e/f90 zero size 2026-03-10T08:55:24.642 INFO:tasks.workunit.client.1.vm08.stdout:2/521: link d1/d5b/d66/c94 d1/d97/ca4 0 
2026-03-10T08:55:24.642 INFO:tasks.workunit.client.1.vm08.stdout:2/522: readlink d1/d5b/d66/l3a 0 2026-03-10T08:55:24.642 INFO:tasks.workunit.client.1.vm08.stdout:2/523: chown d1/d5b/d66/c94 640 1 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:4/273: stat d0 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:0/272: mknod df/c4c 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:4/274: read - d0/d1d/d30/f3a zero size 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/247: rename d4/d7/d10/d15/d1b/c4f to d4/c57 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:4/275: creat d0/d1d/d30/d32/d41/f60 x:0 0 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/248: symlink d4/d7/d10/d15/l58 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/249: creat d4/d7/f59 x:0 0 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/250: read d4/d7/d10/d15/d20/f47 [50209,75557] 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/251: symlink d4/d7/d10/d15/l5a 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/252: mknod d4/d7/d10/d15/d20/c5b 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/253: dwrite d4/d7/d10/d15/d1b/d22/f36 [0,4194304] 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/254: dwrite d4/fc [0,4194304] 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.0.vm05.stdout:6/255: dwrite d4/d2c/f55 [0,4194304] 0 2026-03-10T08:55:24.643 INFO:tasks.workunit.client.1.vm08.stdout:2/524: dread - d1/da/d78/f95 zero size 2026-03-10T08:55:24.645 INFO:tasks.workunit.client.0.vm05.stdout:6/256: read d4/d7/ff [4952881,129267] 0 2026-03-10T08:55:24.649 INFO:tasks.workunit.client.0.vm05.stdout:6/257: dwrite d4/d7/d10/d1a/f25 [0,4194304] 0 2026-03-10T08:55:24.653 INFO:tasks.workunit.client.0.vm05.stdout:6/258: creat d4/d7/d10/d15/d1b/d22/f5c x:0 0 0 2026-03-10T08:55:24.655 
INFO:tasks.workunit.client.0.vm05.stdout:6/259: creat d4/d7/f5d x:0 0 0 2026-03-10T08:55:24.661 INFO:tasks.workunit.client.0.vm05.stdout:6/260: dread d4/d7/f14 [0,4194304] 0 2026-03-10T08:55:24.663 INFO:tasks.workunit.client.0.vm05.stdout:6/261: creat d4/d7/d10/d15/f5e x:0 0 0 2026-03-10T08:55:24.664 INFO:tasks.workunit.client.0.vm05.stdout:6/262: unlink d4/d7/d10/d15/f5e 0 2026-03-10T08:55:24.805 INFO:tasks.workunit.client.0.vm05.stdout:1/361: sync 2026-03-10T08:55:24.805 INFO:tasks.workunit.client.0.vm05.stdout:0/273: sync 2026-03-10T08:55:24.805 INFO:tasks.workunit.client.0.vm05.stdout:4/276: sync 2026-03-10T08:55:24.806 INFO:tasks.workunit.client.0.vm05.stdout:1/362: stat dd/d21/f7f 0 2026-03-10T08:55:24.806 INFO:tasks.workunit.client.0.vm05.stdout:1/363: chown dd/d10/d18/d20/d52/d80 7901602 1 2026-03-10T08:55:24.809 INFO:tasks.workunit.client.0.vm05.stdout:1/364: unlink dd/d13/f33 0 2026-03-10T08:55:24.809 INFO:tasks.workunit.client.0.vm05.stdout:1/365: fsync dd/d10/d19/d27/f4e 0 2026-03-10T08:55:24.810 INFO:tasks.workunit.client.0.vm05.stdout:2/243: truncate d0/d9/f17 3425756 0 2026-03-10T08:55:24.815 INFO:tasks.workunit.client.0.vm05.stdout:7/239: rmdir d18/d1b/d1f/d25 39 2026-03-10T08:55:24.821 INFO:tasks.workunit.client.1.vm08.stdout:0/439: truncate d6/dd/d13/d17/f82 1398763 0 2026-03-10T08:55:24.821 INFO:tasks.workunit.client.0.vm05.stdout:4/277: creat d0/d1d/d30/f61 x:0 0 0 2026-03-10T08:55:24.827 INFO:tasks.workunit.client.0.vm05.stdout:1/366: symlink dd/d21/l81 0 2026-03-10T08:55:24.829 INFO:tasks.workunit.client.0.vm05.stdout:7/240: dread f15 [0,4194304] 0 2026-03-10T08:55:24.829 INFO:tasks.workunit.client.0.vm05.stdout:2/244: symlink d0/d9/d1e/l49 0 2026-03-10T08:55:24.830 INFO:tasks.workunit.client.0.vm05.stdout:2/245: chown d0/d9/d1e/l49 1 1 2026-03-10T08:55:24.833 INFO:tasks.workunit.client.0.vm05.stdout:4/278: rmdir d0 39 2026-03-10T08:55:24.835 INFO:tasks.workunit.client.0.vm05.stdout:7/241: chown d18/d1b/d1f/d25/d2e/d42 3697 1 
2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:4/279: creat d0/d2e/d42/d45/f62 x:0 0 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:4/280: fsync d0/f10 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:1/367: creat dd/d10/d18/f82 x:0 0 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:1/368: readlink dd/d10/l1b 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:7/242: creat d18/d1b/d1f/d25/d2e/d42/f46 x:0 0 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:7/243: write d18/d1b/d1f/f2d [5084057,3560] 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:2/246: rename d0/c11 to d0/d9/d1e/c4a 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:4/281: creat d0/d1d/d30/d32/f63 x:0 0 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:1/369: link dd/d10/f22 dd/d21/d3f/f83 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:1/370: creat dd/d10/d18/d2d/f84 x:0 0 0 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:1/371: dread - dd/d10/d18/d2d/f84 zero size 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:1/372: read - dd/d21/f6f zero size 2026-03-10T08:55:24.860 INFO:tasks.workunit.client.0.vm05.stdout:7/244: getdents d18 0 2026-03-10T08:55:24.861 INFO:tasks.workunit.client.0.vm05.stdout:1/373: dwrite fb [0,4194304] 0 2026-03-10T08:55:24.879 INFO:tasks.workunit.client.1.vm08.stdout:2/525: read d1/da/d10/d42/d93/f3b [3844119,85768] 0 2026-03-10T08:55:24.894 INFO:tasks.workunit.client.1.vm08.stdout:5/473: dwrite d0/d11/d18/f23 [0,4194304] 0 2026-03-10T08:55:24.895 INFO:tasks.workunit.client.1.vm08.stdout:9/451: dwrite d2/dd/d15/d1e/f48 [0,4194304] 0 2026-03-10T08:55:24.905 INFO:tasks.workunit.client.1.vm08.stdout:8/565: write d1/d2c/f47 [1966613,110337] 0 2026-03-10T08:55:24.906 INFO:tasks.workunit.client.0.vm05.stdout:8/218: write d2/db/d1f/f44 [958377,68009] 0 
2026-03-10T08:55:24.914 INFO:tasks.workunit.client.1.vm08.stdout:4/525: fdatasync d5/d23/d36/d99/db2/d5a/fad 0 2026-03-10T08:55:24.921 INFO:tasks.workunit.client.1.vm08.stdout:1/537: dwrite d1/da/de/d24/d3d/d40/d5b/f8d [0,4194304] 0 2026-03-10T08:55:24.921 INFO:tasks.workunit.client.0.vm05.stdout:5/235: write d5/fc [99812,3809] 0 2026-03-10T08:55:24.921 INFO:tasks.workunit.client.0.vm05.stdout:8/219: write d2/f5 [6479511,67797] 0 2026-03-10T08:55:24.921 INFO:tasks.workunit.client.0.vm05.stdout:6/263: rmdir d4/d7/d10/d15/d20/d53 39 2026-03-10T08:55:24.922 INFO:tasks.workunit.client.0.vm05.stdout:6/264: chown d4/d2c 894 1 2026-03-10T08:55:24.922 INFO:tasks.workunit.client.0.vm05.stdout:8/220: unlink d2/dd/d2c/d2e/d31/d3e/c42 0 2026-03-10T08:55:24.922 INFO:tasks.workunit.client.1.vm08.stdout:8/566: creat d1/d4f/fcb x:0 0 0 2026-03-10T08:55:24.923 INFO:tasks.workunit.client.0.vm05.stdout:8/221: write d2/f2a [568593,119435] 0 2026-03-10T08:55:24.924 INFO:tasks.workunit.client.0.vm05.stdout:8/222: write d2/db/d1f/f44 [937280,78230] 0 2026-03-10T08:55:24.924 INFO:tasks.workunit.client.0.vm05.stdout:8/223: fdatasync d2/dd/d2c/d2e/f37 0 2026-03-10T08:55:24.926 INFO:tasks.workunit.client.0.vm05.stdout:4/282: dread d0/f1e [4194304,4194304] 0 2026-03-10T08:55:24.930 INFO:tasks.workunit.client.0.vm05.stdout:5/236: mkdir d5/df/d37/d4b 0 2026-03-10T08:55:24.930 INFO:tasks.workunit.client.0.vm05.stdout:8/224: dwrite d2/f2a [0,4194304] 0 2026-03-10T08:55:24.931 INFO:tasks.workunit.client.0.vm05.stdout:6/265: mkdir d4/d2d/d5f 0 2026-03-10T08:55:24.932 INFO:tasks.workunit.client.0.vm05.stdout:3/284: write d9/f4a [334597,116270] 0 2026-03-10T08:55:24.933 INFO:tasks.workunit.client.1.vm08.stdout:2/526: dread d1/da/d10/d42/d93/d23/f70 [0,4194304] 0 2026-03-10T08:55:24.933 INFO:tasks.workunit.client.0.vm05.stdout:9/201: fsync d6/d19/f1a 0 2026-03-10T08:55:24.938 INFO:tasks.workunit.client.1.vm08.stdout:8/567: mknod d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/ccc 0 2026-03-10T08:55:24.943 
INFO:tasks.workunit.client.1.vm08.stdout:4/526: dwrite d5/de/f41 [0,4194304] 0 2026-03-10T08:55:24.952 INFO:tasks.workunit.client.1.vm08.stdout:7/534: dwrite d0/d11/d1f/d29/d3d/d40/f38 [4194304,4194304] 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.0.vm05.stdout:5/237: unlink d5/df/d37/f42 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.0.vm05.stdout:6/266: write d4/d7/ff [10824687,56059] 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.0.vm05.stdout:4/283: unlink d0/d2c/c46 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.1.vm08.stdout:4/527: readlink d5/d23/d36/l74 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.1.vm08.stdout:4/528: write d5/d23/d36/d76/fa7 [4407824,122395] 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.1.vm08.stdout:8/568: truncate d1/d10/d9/dd/d25/f93 121004 0 2026-03-10T08:55:24.953 INFO:tasks.workunit.client.0.vm05.stdout:9/202: fdatasync f4 0 2026-03-10T08:55:24.954 INFO:tasks.workunit.client.1.vm08.stdout:2/527: dread d1/da/d10/d42/d93/f8d [0,4194304] 0 2026-03-10T08:55:24.957 INFO:tasks.workunit.client.0.vm05.stdout:3/285: fdatasync d9/f19 0 2026-03-10T08:55:24.959 INFO:tasks.workunit.client.1.vm08.stdout:7/535: rename d0/d11/d1f/d29/d3b/f7d to d0/d11/d4a/d95/fa7 0 2026-03-10T08:55:24.959 INFO:tasks.workunit.client.0.vm05.stdout:8/225: symlink d2/dd/d2c/d2e/d31/d4c/l52 0 2026-03-10T08:55:24.961 INFO:tasks.workunit.client.1.vm08.stdout:8/569: creat d1/d10/d9/dd/d25/d27/d44/d89/fcd x:0 0 0 2026-03-10T08:55:24.963 INFO:tasks.workunit.client.0.vm05.stdout:5/238: creat d5/df/d12/d24/d2c/d41/f4c x:0 0 0 2026-03-10T08:55:24.969 INFO:tasks.workunit.client.0.vm05.stdout:3/286: write d9/f13 [39818,33227] 0 2026-03-10T08:55:24.972 INFO:tasks.workunit.client.1.vm08.stdout:8/570: mkdir d1/d10/d9/dd/d25/d27/d44/d21/dce 0 2026-03-10T08:55:24.973 INFO:tasks.workunit.client.0.vm05.stdout:8/226: creat d2/db/d1f/f53 x:0 0 0 2026-03-10T08:55:24.976 INFO:tasks.workunit.client.0.vm05.stdout:8/227: dwrite d2/dd/d2c/d2e/f37 
[0,4194304] 0 2026-03-10T08:55:24.978 INFO:tasks.workunit.client.0.vm05.stdout:5/239: rename d5/df/d12/d24/d2c/f2e to d5/df/d12/d24/d2c/d41/f4d 0 2026-03-10T08:55:24.979 INFO:tasks.workunit.client.0.vm05.stdout:9/203: link d6/f3f d6/d12/d43/f47 0 2026-03-10T08:55:24.979 INFO:tasks.workunit.client.0.vm05.stdout:5/240: chown d5/df/d12/d21/f30 26213488 1 2026-03-10T08:55:24.985 INFO:tasks.workunit.client.1.vm08.stdout:2/528: creat d1/da/d10/fa5 x:0 0 0 2026-03-10T08:55:24.986 INFO:tasks.workunit.client.1.vm08.stdout:4/529: getdents d5/d23/d36/d99 0 2026-03-10T08:55:24.987 INFO:tasks.workunit.client.1.vm08.stdout:4/530: chown d5/de/l9f 319 1 2026-03-10T08:55:24.988 INFO:tasks.workunit.client.0.vm05.stdout:8/228: rename d2/dd/l46 to d2/dd/d2c/d2e/d31/d4c/l54 0 2026-03-10T08:55:24.989 INFO:tasks.workunit.client.0.vm05.stdout:3/287: mkdir d9/d2b/d3a/d43/d4f/d50 0 2026-03-10T08:55:24.989 INFO:tasks.workunit.client.0.vm05.stdout:3/288: chown d9/d2b/c41 5 1 2026-03-10T08:55:24.992 INFO:tasks.workunit.client.0.vm05.stdout:9/204: mkdir d6/d12/d3a/d48 0 2026-03-10T08:55:24.992 INFO:tasks.workunit.client.0.vm05.stdout:8/229: mkdir d2/dd/d2c/d2e/d31/d4c/d55 0 2026-03-10T08:55:24.992 INFO:tasks.workunit.client.0.vm05.stdout:3/289: unlink f1 0 2026-03-10T08:55:24.995 INFO:tasks.workunit.client.1.vm08.stdout:2/529: dwrite d1/d43/f5d [4194304,4194304] 0 2026-03-10T08:55:24.996 INFO:tasks.workunit.client.0.vm05.stdout:8/230: dwrite d2/f5 [0,4194304] 0 2026-03-10T08:55:24.998 INFO:tasks.workunit.client.0.vm05.stdout:3/290: mkdir d9/d4d/d51 0 2026-03-10T08:55:25.008 INFO:tasks.workunit.client.1.vm08.stdout:2/530: symlink d1/d9b/la6 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.1.vm08.stdout:2/531: mkdir d1/d5b/da7 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.1.vm08.stdout:2/532: readlink d1/da/d10/d1b/l30 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.1.vm08.stdout:2/533: chown d1/d5b/d66/c76 6 1 2026-03-10T08:55:25.021 
INFO:tasks.workunit.client.1.vm08.stdout:2/534: link d1/da/d10/d42/d93/d22/f8a d1/da/d10/d1b/d6a/fa8 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.1.vm08.stdout:2/535: rename d1/l5 to d1/da/d10/d1b/d6a/la9 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/231: link d2/l6 d2/db/d47/l56 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/232: creat d2/db/d47/f57 x:0 0 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/233: getdents d2 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/234: getdents d2/dd/d2c 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/235: creat d2/db/d47/f58 x:0 0 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/236: getdents d2 0 2026-03-10T08:55:25.021 INFO:tasks.workunit.client.0.vm05.stdout:8/237: symlink d2/dd/d2c/d2e/d31/d3e/l59 0 2026-03-10T08:55:25.022 INFO:tasks.workunit.client.0.vm05.stdout:8/238: creat d2/dd/d2c/d2e/f5a x:0 0 0 2026-03-10T08:55:25.025 INFO:tasks.workunit.client.0.vm05.stdout:8/239: dwrite d2/f2a [0,4194304] 0 2026-03-10T08:55:25.027 INFO:tasks.workunit.client.0.vm05.stdout:8/240: symlink d2/d45/l5b 0 2026-03-10T08:55:25.028 INFO:tasks.workunit.client.0.vm05.stdout:8/241: symlink d2/dd/d2c/d2e/d31/d3e/l5c 0 2026-03-10T08:55:25.029 INFO:tasks.workunit.client.0.vm05.stdout:8/242: mkdir d2/dd/d2c/d2e/d31/d3e/d5d 0 2026-03-10T08:55:25.031 INFO:tasks.workunit.client.0.vm05.stdout:8/243: getdents d2/dd/d2c 0 2026-03-10T08:55:25.034 INFO:tasks.workunit.client.0.vm05.stdout:8/244: link d2/db/l24 d2/dd/d2c/d2e/d31/d3e/d5d/l5e 0 2026-03-10T08:55:25.034 INFO:tasks.workunit.client.0.vm05.stdout:8/245: write d2/dd/d2c/f4d [301132,125871] 0 2026-03-10T08:55:25.034 INFO:tasks.workunit.client.0.vm05.stdout:8/246: stat d2/d45/l5b 0 2026-03-10T08:55:25.034 INFO:tasks.workunit.client.0.vm05.stdout:8/247: fsync d2/dd/f1a 0 2026-03-10T08:55:25.036 INFO:tasks.workunit.client.0.vm05.stdout:8/248: symlink d2/dd/l5f 0 
2026-03-10T08:55:25.037 INFO:tasks.workunit.client.0.vm05.stdout:8/249: symlink d2/d45/l60 0 2026-03-10T08:55:25.059 INFO:tasks.workunit.client.0.vm05.stdout:4/284: read d0/d1d/f24 [602875,48084] 0 2026-03-10T08:55:25.064 INFO:tasks.workunit.client.0.vm05.stdout:4/285: dwrite d0/d1d/d30/f28 [0,4194304] 0 2026-03-10T08:55:25.070 INFO:tasks.workunit.client.0.vm05.stdout:4/286: truncate d0/f1e 420874 0 2026-03-10T08:55:25.071 INFO:tasks.workunit.client.0.vm05.stdout:4/287: truncate d0/d2c/f2f 4222761 0 2026-03-10T08:55:25.071 INFO:tasks.workunit.client.0.vm05.stdout:4/288: chown d0/d1d/d30/d32/d41/c54 1 1 2026-03-10T08:55:25.072 INFO:tasks.workunit.client.0.vm05.stdout:4/289: write d0/d1d/d30/d32/f3e [589965,48858] 0 2026-03-10T08:55:25.076 INFO:tasks.workunit.client.1.vm08.stdout:3/479: dwrite d4/d15/d8/d2c/d9b/d79/f59 [0,4194304] 0 2026-03-10T08:55:25.092 INFO:tasks.workunit.client.1.vm08.stdout:6/526: dwrite d9/d10/f53 [4194304,4194304] 0 2026-03-10T08:55:25.092 INFO:tasks.workunit.client.1.vm08.stdout:6/527: chown d9/d13/fa0 14265 1 2026-03-10T08:55:25.092 INFO:tasks.workunit.client.1.vm08.stdout:6/528: mknod d9/d50/d95/cb5 0 2026-03-10T08:55:25.092 INFO:tasks.workunit.client.1.vm08.stdout:6/529: creat d9/d13/fb6 x:0 0 0 2026-03-10T08:55:25.092 INFO:tasks.workunit.client.1.vm08.stdout:6/530: mknod d9/d50/d95/cb7 0 2026-03-10T08:55:25.092 INFO:tasks.workunit.client.1.vm08.stdout:6/531: dread - d9/d13/f6c zero size 2026-03-10T08:55:25.099 INFO:tasks.workunit.client.0.vm05.stdout:5/241: sync 2026-03-10T08:55:25.109 INFO:tasks.workunit.client.0.vm05.stdout:4/290: dread d0/f10 [0,4194304] 0 2026-03-10T08:55:25.145 INFO:tasks.workunit.client.0.vm05.stdout:0/274: truncate df/d18/d2b/d3a/f3f 1509293 0 2026-03-10T08:55:25.149 INFO:tasks.workunit.client.0.vm05.stdout:0/275: mkdir df/d18/d19/d39/d4d 0 2026-03-10T08:55:25.152 INFO:tasks.workunit.client.0.vm05.stdout:0/276: mkdir df/d18/d2b/d27/d32/d4e 0 2026-03-10T08:55:25.152 INFO:tasks.workunit.client.0.vm05.stdout:0/277: 
readlink df/d18/d19/l43 0 2026-03-10T08:55:25.153 INFO:tasks.workunit.client.0.vm05.stdout:0/278: write df/d18/d2b/f3b [13585,72988] 0 2026-03-10T08:55:25.156 INFO:tasks.workunit.client.0.vm05.stdout:0/279: creat df/d18/d2b/d27/f4f x:0 0 0 2026-03-10T08:55:25.158 INFO:tasks.workunit.client.0.vm05.stdout:2/247: dwrite d0/f2f [0,4194304] 0 2026-03-10T08:55:25.160 INFO:tasks.workunit.client.0.vm05.stdout:0/280: unlink c9 0 2026-03-10T08:55:25.161 INFO:tasks.workunit.client.0.vm05.stdout:5/242: sync 2026-03-10T08:55:25.162 INFO:tasks.workunit.client.0.vm05.stdout:5/243: fsync d5/d3a/f4a 0 2026-03-10T08:55:25.165 INFO:tasks.workunit.client.0.vm05.stdout:2/248: write d0/d9/d1e/f2a [211416,26591] 0 2026-03-10T08:55:25.168 INFO:tasks.workunit.client.0.vm05.stdout:2/249: write d0/d9/f12 [2482120,62985] 0 2026-03-10T08:55:25.172 INFO:tasks.workunit.client.0.vm05.stdout:0/281: rmdir df/d18/d19/d35 0 2026-03-10T08:55:25.172 INFO:tasks.workunit.client.0.vm05.stdout:0/282: stat df/c16 0 2026-03-10T08:55:25.177 INFO:tasks.workunit.client.0.vm05.stdout:0/283: dwrite df/d1f/f2d [0,4194304] 0 2026-03-10T08:55:25.179 INFO:tasks.workunit.client.0.vm05.stdout:0/284: mkdir df/d18/d19/d39/d4d/d50 0 2026-03-10T08:55:25.180 INFO:tasks.workunit.client.1.vm08.stdout:6/532: dread d9/dc/d11/f55 [0,4194304] 0 2026-03-10T08:55:25.183 INFO:tasks.workunit.client.0.vm05.stdout:0/285: dwrite df/d1f/f2d [0,4194304] 0 2026-03-10T08:55:25.194 INFO:tasks.workunit.client.1.vm08.stdout:6/533: link d9/dc/d11/f8d d9/d50/fb8 0 2026-03-10T08:55:25.200 INFO:tasks.workunit.client.1.vm08.stdout:6/534: creat d9/d10/d1e/d4c/fb9 x:0 0 0 2026-03-10T08:55:25.200 INFO:tasks.workunit.client.0.vm05.stdout:0/286: chown df/d18/d2b/d27/d32/l38 10 1 2026-03-10T08:55:25.200 INFO:tasks.workunit.client.0.vm05.stdout:0/287: dread - df/d18/d19/d39/f42 zero size 2026-03-10T08:55:25.200 INFO:tasks.workunit.client.0.vm05.stdout:0/288: mkdir df/d18/d2b/d51 0 2026-03-10T08:55:25.200 INFO:tasks.workunit.client.0.vm05.stdout:0/289: 
dread df/f1a [4194304,4194304] 0 2026-03-10T08:55:25.200 INFO:tasks.workunit.client.0.vm05.stdout:0/290: symlink df/d18/d2b/d27/d32/l52 0 2026-03-10T08:55:25.201 INFO:tasks.workunit.client.1.vm08.stdout:6/535: rename d9/dc/f1b to d9/d10/d1e/fba 0 2026-03-10T08:55:25.206 INFO:tasks.workunit.client.0.vm05.stdout:0/291: dwrite df/d1f/f21 [4194304,4194304] 0 2026-03-10T08:55:25.212 INFO:tasks.workunit.client.0.vm05.stdout:0/292: creat df/d18/f53 x:0 0 0 2026-03-10T08:55:25.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:24 vm05.local ceph-mon[49713]: pgmap v151: 65 pgs: 65 active+clean; 1.5 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail; 29 MiB/s rd, 126 MiB/s wr, 262 op/s 2026-03-10T08:55:25.221 INFO:tasks.workunit.client.0.vm05.stdout:2/250: sync 2026-03-10T08:55:25.280 INFO:tasks.workunit.client.1.vm08.stdout:1/538: dread d1/da/d18/f48 [0,4194304] 0 2026-03-10T08:55:25.280 INFO:tasks.workunit.client.1.vm08.stdout:1/539: stat d1/da 0 2026-03-10T08:55:25.284 INFO:tasks.workunit.client.0.vm05.stdout:7/245: dwrite f9 [0,4194304] 0 2026-03-10T08:55:25.286 INFO:tasks.workunit.client.1.vm08.stdout:0/440: sync 2026-03-10T08:55:25.286 INFO:tasks.workunit.client.1.vm08.stdout:4/531: sync 2026-03-10T08:55:25.287 INFO:tasks.workunit.client.0.vm05.stdout:7/246: dread f15 [0,4194304] 0 2026-03-10T08:55:25.290 INFO:tasks.workunit.client.1.vm08.stdout:1/540: symlink d1/da/de/d24/d3d/lbc 0 2026-03-10T08:55:25.297 INFO:tasks.workunit.client.0.vm05.stdout:7/247: getdents d18/d1b 0 2026-03-10T08:55:25.310 INFO:tasks.workunit.client.0.vm05.stdout:7/248: creat d18/d1b/d1f/d25/f47 x:0 0 0 2026-03-10T08:55:25.310 INFO:tasks.workunit.client.0.vm05.stdout:7/249: write d18/d1b/d1f/f3c [3253551,7308] 0 2026-03-10T08:55:25.310 INFO:tasks.workunit.client.1.vm08.stdout:4/532: rename d5/de/l9f to d5/d23/lb8 0 2026-03-10T08:55:25.310 INFO:tasks.workunit.client.1.vm08.stdout:4/533: truncate d5/d23/d36/d76/f9e 664864 0 2026-03-10T08:55:25.310 
INFO:tasks.workunit.client.1.vm08.stdout:4/534: chown d5/d23/d36/d99/db2 15643 1 2026-03-10T08:55:25.310 INFO:tasks.workunit.client.1.vm08.stdout:4/535: rename d5/d23/d36/d99 to d5/d23/d36/d99/db2/d5a/db9 22 2026-03-10T08:55:25.310 INFO:tasks.workunit.client.1.vm08.stdout:4/536: creat d5/fba x:0 0 0 2026-03-10T08:55:25.313 INFO:tasks.workunit.client.1.vm08.stdout:4/537: readlink d5/d23/d49/l4b 0 2026-03-10T08:55:25.315 INFO:tasks.workunit.client.1.vm08.stdout:4/538: creat d5/de/d96/fbb x:0 0 0 2026-03-10T08:55:25.330 INFO:tasks.workunit.client.1.vm08.stdout:6/536: sync 2026-03-10T08:55:25.331 INFO:tasks.workunit.client.1.vm08.stdout:4/539: creat d5/d23/d49/d8f/fbc x:0 0 0 2026-03-10T08:55:25.337 INFO:tasks.workunit.client.1.vm08.stdout:4/540: mkdir d5/d23/d36/d99/db2/dbd 0 2026-03-10T08:55:25.341 INFO:tasks.workunit.client.1.vm08.stdout:6/537: mknod d9/dc/d11/d23/d2c/d81/d63/cbb 0 2026-03-10T08:55:25.352 INFO:tasks.workunit.client.1.vm08.stdout:3/480: dread d4/d15/d8/d2c/f32 [0,4194304] 0 2026-03-10T08:55:25.353 INFO:tasks.workunit.client.1.vm08.stdout:3/481: write d4/d15/d8/f83 [219561,87683] 0 2026-03-10T08:55:25.357 INFO:tasks.workunit.client.1.vm08.stdout:3/482: creat d4/d15/d8/fa0 x:0 0 0 2026-03-10T08:55:25.361 INFO:tasks.workunit.client.0.vm05.stdout:2/251: fdatasync d0/f2f 0 2026-03-10T08:55:25.371 INFO:tasks.workunit.client.0.vm05.stdout:2/252: mkdir d0/d9/d1e/d20/d21/d45/d4b 0 2026-03-10T08:55:25.371 INFO:tasks.workunit.client.0.vm05.stdout:2/253: chown d0/d9/d1e/d20/d21/d45/d4b 11 1 2026-03-10T08:55:25.371 INFO:tasks.workunit.client.0.vm05.stdout:6/267: dread d4/f11 [0,4194304] 0 2026-03-10T08:55:25.371 INFO:tasks.workunit.client.0.vm05.stdout:6/268: rename d4/d7/d10/d1a/l40 to d4/d7/d10/d1a/d1f/l60 0 2026-03-10T08:55:25.374 INFO:tasks.workunit.client.0.vm05.stdout:6/269: dwrite d4/d7/f14 [0,4194304] 0 2026-03-10T08:55:25.378 INFO:tasks.workunit.client.0.vm05.stdout:6/270: chown d4/d7/d10/d15/d20/d53 81508 1 2026-03-10T08:55:25.379 
INFO:tasks.workunit.client.0.vm05.stdout:6/271: creat d4/f61 x:0 0 0 2026-03-10T08:55:25.380 INFO:tasks.workunit.client.0.vm05.stdout:6/272: chown d4/d7/d10/d15/d1b/d22/f36 51935826 1 2026-03-10T08:55:25.381 INFO:tasks.workunit.client.0.vm05.stdout:6/273: mkdir d4/d2d/d51/d62 0 2026-03-10T08:55:25.382 INFO:tasks.workunit.client.0.vm05.stdout:6/274: creat d4/d7/d10/d15/d20/d53/d4a/f63 x:0 0 0 2026-03-10T08:55:25.383 INFO:tasks.workunit.client.0.vm05.stdout:6/275: truncate d4/d7/f52 264394 0 2026-03-10T08:55:25.383 INFO:tasks.workunit.client.0.vm05.stdout:6/276: chown d4/d7/f54 94864 1 2026-03-10T08:55:25.385 INFO:tasks.workunit.client.0.vm05.stdout:6/277: getdents d4/d7/d10/d15/d20 0 2026-03-10T08:55:25.386 INFO:tasks.workunit.client.0.vm05.stdout:6/278: fsync d4/d7/d10/d1a/f1e 0 2026-03-10T08:55:25.390 INFO:tasks.workunit.client.0.vm05.stdout:6/279: getdents d4/d7/d10/d15/d1b/d22 0 2026-03-10T08:55:25.397 INFO:tasks.workunit.client.1.vm08.stdout:4/541: dread d5/d23/d36/d99/db2/f84 [0,4194304] 0 2026-03-10T08:55:25.401 INFO:tasks.workunit.client.1.vm08.stdout:5/474: write d0/d11/d27/d68/d7c/f42 [1541322,91668] 0 2026-03-10T08:55:25.401 INFO:tasks.workunit.client.1.vm08.stdout:5/475: write d0/d46/f81 [138271,104294] 0 2026-03-10T08:55:25.402 INFO:tasks.workunit.client.1.vm08.stdout:9/452: write d2/dd/d15/f44 [502842,68191] 0 2026-03-10T08:55:25.402 INFO:tasks.workunit.client.1.vm08.stdout:0/441: stat d6/dd/d13/d17/f6d 0 2026-03-10T08:55:25.402 INFO:tasks.workunit.client.1.vm08.stdout:5/476: chown d0/ff 13631 1 2026-03-10T08:55:25.403 INFO:tasks.workunit.client.1.vm08.stdout:9/453: write d2/d41/d4c/f8a [404603,11898] 0 2026-03-10T08:55:25.403 INFO:tasks.workunit.client.1.vm08.stdout:5/477: stat d0/d1b/f69 0 2026-03-10T08:55:25.404 INFO:tasks.workunit.client.1.vm08.stdout:4/542: rename d5/d23/d36/d99/db2/d5a/fad to d5/d23/d36/d99/db2/dbd/fbe 0 2026-03-10T08:55:25.405 INFO:tasks.workunit.client.1.vm08.stdout:8/571: getdents d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e 0 
2026-03-10T08:55:25.406 INFO:tasks.workunit.client.1.vm08.stdout:8/572: dread - d1/d10/d9/dd/d25/d27/d44/fa7 zero size 2026-03-10T08:55:25.407 INFO:tasks.workunit.client.1.vm08.stdout:0/442: unlink d6/dd/d13/d32/c36 0 2026-03-10T08:55:25.411 INFO:tasks.workunit.client.1.vm08.stdout:8/573: creat d1/d10/d9/dd/d18/fcf x:0 0 0 2026-03-10T08:55:25.412 INFO:tasks.workunit.client.1.vm08.stdout:4/543: read d5/d23/d36/f51 [1321874,79045] 0 2026-03-10T08:55:25.413 INFO:tasks.workunit.client.1.vm08.stdout:5/478: getdents d0/d11/d18/d52 0 2026-03-10T08:55:25.415 INFO:tasks.workunit.client.1.vm08.stdout:4/544: symlink d5/d23/d36/d99/db2/lbf 0 2026-03-10T08:55:25.417 INFO:tasks.workunit.client.1.vm08.stdout:8/574: unlink d1/d10/d9/dd/d25/d27/d44/d21/f32 0 2026-03-10T08:55:25.419 INFO:tasks.workunit.client.1.vm08.stdout:5/479: dwrite d0/d11/d27/d68/d7c/f42 [0,4194304] 0 2026-03-10T08:55:25.424 INFO:tasks.workunit.client.1.vm08.stdout:8/575: fdatasync d1/d10/f23 0 2026-03-10T08:55:25.425 INFO:tasks.workunit.client.1.vm08.stdout:7/536: truncate d0/d11/d1f/f90 523771 0 2026-03-10T08:55:25.426 INFO:tasks.workunit.client.1.vm08.stdout:5/480: write d0/d11/d18/d52/f57 [177771,83474] 0 2026-03-10T08:55:25.427 INFO:tasks.workunit.client.1.vm08.stdout:5/481: readlink d0/d11/d3e/l3f 0 2026-03-10T08:55:25.429 INFO:tasks.workunit.client.1.vm08.stdout:4/545: rename d5/de/l8e to d5/d23/d49/d8f/lc0 0 2026-03-10T08:55:25.430 INFO:tasks.workunit.client.1.vm08.stdout:5/482: mkdir d0/d11/d27/d68/d7c/d8e 0 2026-03-10T08:55:25.438 INFO:tasks.workunit.client.1.vm08.stdout:5/483: mknod d0/d11/d27/d68/d7c/d4b/d87/c8f 0 2026-03-10T08:55:25.439 INFO:tasks.workunit.client.1.vm08.stdout:0/443: dread d6/dd/d13/d32/f3d [0,4194304] 0 2026-03-10T08:55:25.439 INFO:tasks.workunit.client.1.vm08.stdout:4/546: creat d5/d23/fc1 x:0 0 0 2026-03-10T08:55:25.444 INFO:tasks.workunit.client.1.vm08.stdout:7/537: dwrite d0/d14/f72 [0,4194304] 0 2026-03-10T08:55:25.444 INFO:tasks.workunit.client.1.vm08.stdout:0/444: readlink 
d6/dd/d13/d17/d1f/d20/d2f/d26/l3c 0 2026-03-10T08:55:25.451 INFO:tasks.workunit.client.1.vm08.stdout:4/547: mknod d5/de/d96/cc2 0 2026-03-10T08:55:25.453 INFO:tasks.workunit.client.1.vm08.stdout:7/538: dwrite d0/d11/d4a/d5e/f93 [0,4194304] 0 2026-03-10T08:55:25.454 INFO:tasks.workunit.client.1.vm08.stdout:0/445: mknod d6/dd/d13/d17/d50/c8e 0 2026-03-10T08:55:25.459 INFO:tasks.workunit.client.1.vm08.stdout:0/446: write d6/f5f [720457,76237] 0 2026-03-10T08:55:25.462 INFO:tasks.workunit.client.1.vm08.stdout:7/539: dread - d0/d11/d1f/d29/d3b/f9f zero size 2026-03-10T08:55:25.475 INFO:tasks.workunit.client.0.vm05.stdout:3/291: rmdir d9 39 2026-03-10T08:55:25.480 INFO:tasks.workunit.client.1.vm08.stdout:4/548: rmdir d5/d23/d36/d99 39 2026-03-10T08:55:25.481 INFO:tasks.workunit.client.1.vm08.stdout:0/447: mkdir d6/dd/d13/d8f 0 2026-03-10T08:55:25.481 INFO:tasks.workunit.client.0.vm05.stdout:3/292: chown d9/d2b/c41 1736426 1 2026-03-10T08:55:25.481 INFO:tasks.workunit.client.0.vm05.stdout:3/293: chown d9/c3e 30456157 1 2026-03-10T08:55:25.488 INFO:tasks.workunit.client.0.vm05.stdout:3/294: rmdir d9/d2b/d3a 39 2026-03-10T08:55:25.488 INFO:tasks.workunit.client.1.vm08.stdout:7/540: chown d0/d11/d4a/l6b 70443 1 2026-03-10T08:55:25.491 INFO:tasks.workunit.client.0.vm05.stdout:3/295: creat d9/d4d/f52 x:0 0 0 2026-03-10T08:55:25.493 INFO:tasks.workunit.client.1.vm08.stdout:7/541: symlink d0/d14/d43/d62/la8 0 2026-03-10T08:55:25.493 INFO:tasks.workunit.client.1.vm08.stdout:4/549: creat d5/d23/d49/d8f/da4/fc3 x:0 0 0 2026-03-10T08:55:25.494 INFO:tasks.workunit.client.1.vm08.stdout:2/536: truncate d1/da/d10/d42/d93/d23/f70 144712 0 2026-03-10T08:55:25.497 INFO:tasks.workunit.client.0.vm05.stdout:1/374: read dd/d10/d18/f36 [3357438,53074] 0 2026-03-10T08:55:25.497 INFO:tasks.workunit.client.1.vm08.stdout:0/448: dwrite d6/dd/d13/d17/d1f/d2d/d39/f47 [0,4194304] 0 2026-03-10T08:55:25.507 INFO:tasks.workunit.client.1.vm08.stdout:7/542: creat d0/d11/d4a/da3/fa9 x:0 0 0 
2026-03-10T08:55:25.516 INFO:tasks.workunit.client.0.vm05.stdout:6/280: dread d4/d7/f34 [4194304,4194304] 0 2026-03-10T08:55:25.521 INFO:tasks.workunit.client.0.vm05.stdout:6/281: creat d4/d7/d10/d15/d20/f64 x:0 0 0 2026-03-10T08:55:25.523 INFO:tasks.workunit.client.1.vm08.stdout:0/449: truncate d6/dd/d13/d17/d1f/d20/d2f/f59 172838 0 2026-03-10T08:55:25.526 INFO:tasks.workunit.client.1.vm08.stdout:2/537: dread d1/d5b/f8c [0,4194304] 0 2026-03-10T08:55:25.526 INFO:tasks.workunit.client.0.vm05.stdout:6/282: read - d4/d7/d10/d15/d1b/f31 zero size 2026-03-10T08:55:25.527 INFO:tasks.workunit.client.0.vm05.stdout:6/283: fdatasync d4/d7/f4d 0 2026-03-10T08:55:25.528 INFO:tasks.workunit.client.1.vm08.stdout:2/538: dread d1/da/d10/d42/d93/f8f [0,4194304] 0 2026-03-10T08:55:25.528 INFO:tasks.workunit.client.1.vm08.stdout:2/539: fdatasync d1/d43/f4b 0 2026-03-10T08:55:25.529 INFO:tasks.workunit.client.0.vm05.stdout:6/284: creat d4/d7/d10/f65 x:0 0 0 2026-03-10T08:55:25.530 INFO:tasks.workunit.client.1.vm08.stdout:7/543: truncate d0/d11/d1f/d2c/f6c 516867 0 2026-03-10T08:55:25.533 INFO:tasks.workunit.client.0.vm05.stdout:4/291: fsync d0/f1e 0 2026-03-10T08:55:25.534 INFO:tasks.workunit.client.0.vm05.stdout:4/292: read - d0/d2e/d42/d45/f5f zero size 2026-03-10T08:55:25.537 INFO:tasks.workunit.client.0.vm05.stdout:6/285: dread d4/d7/f52 [0,4194304] 0 2026-03-10T08:55:25.538 INFO:tasks.workunit.client.0.vm05.stdout:8/250: dread d2/dd/d2c/f4d [0,4194304] 0 2026-03-10T08:55:25.542 INFO:tasks.workunit.client.1.vm08.stdout:2/540: mkdir d1/da/d10/d42/d93/daa 0 2026-03-10T08:55:25.553 INFO:tasks.workunit.client.1.vm08.stdout:7/544: mkdir d0/d11/d1f/d29/d3b/da1/daa 0 2026-03-10T08:55:25.553 INFO:tasks.workunit.client.1.vm08.stdout:2/541: fsync d1/da/d10/d42/d93/f8f 0 2026-03-10T08:55:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:25 vm08.local ceph-mon[57559]: pgmap v151: 65 pgs: 65 active+clean; 1.5 GiB data, 6.0 GiB used, 114 GiB / 120 GiB avail; 29 MiB/s rd, 126 MiB/s 
wr, 262 op/s 2026-03-10T08:55:25.553 INFO:tasks.workunit.client.0.vm05.stdout:6/286: unlink d4/d7/f59 0 2026-03-10T08:55:25.553 INFO:tasks.workunit.client.0.vm05.stdout:8/251: creat d2/d45/f61 x:0 0 0 2026-03-10T08:55:25.553 INFO:tasks.workunit.client.0.vm05.stdout:4/293: mknod d0/c64 0 2026-03-10T08:55:25.553 INFO:tasks.workunit.client.0.vm05.stdout:8/252: unlink d2/c17 0 2026-03-10T08:55:25.554 INFO:tasks.workunit.client.1.vm08.stdout:7/545: dwrite d0/d11/d1f/d29/d3d/d89/f96 [0,4194304] 0 2026-03-10T08:55:25.555 INFO:tasks.workunit.client.0.vm05.stdout:4/294: unlink d0/f18 0 2026-03-10T08:55:25.557 INFO:tasks.workunit.client.0.vm05.stdout:8/253: rename d2/dd/d2c/d2e/d31/l3c to d2/db/l62 0 2026-03-10T08:55:25.561 INFO:tasks.workunit.client.0.vm05.stdout:8/254: mkdir d2/dd/d2c/d2e/d31/d4c/d63 0 2026-03-10T08:55:25.561 INFO:tasks.workunit.client.1.vm08.stdout:7/546: unlink d0/d11/d4a/l6b 0 2026-03-10T08:55:25.562 INFO:tasks.workunit.client.1.vm08.stdout:7/547: fsync d0/d14/f98 0 2026-03-10T08:55:25.566 INFO:tasks.workunit.client.0.vm05.stdout:8/255: dwrite d2/db/d1f/f44 [0,4194304] 0 2026-03-10T08:55:25.576 INFO:tasks.workunit.client.1.vm08.stdout:2/542: dread d1/da/d10/d42/d93/d23/f31 [0,4194304] 0 2026-03-10T08:55:25.578 INFO:tasks.workunit.client.1.vm08.stdout:2/543: dread d1/d43/f6d [0,4194304] 0 2026-03-10T08:55:25.580 INFO:tasks.workunit.client.1.vm08.stdout:2/544: mknod d1/da/d10/d42/d93/d23/d9e/cab 0 2026-03-10T08:55:25.581 INFO:tasks.workunit.client.1.vm08.stdout:7/548: dread d0/d14/d2f/f81 [0,4194304] 0 2026-03-10T08:55:25.585 INFO:tasks.workunit.client.1.vm08.stdout:2/545: unlink d1/da/d10/d1b/d6a/f73 0 2026-03-10T08:55:25.593 INFO:tasks.workunit.client.1.vm08.stdout:7/549: creat d0/d11/d1f/d29/d36/d75/fab x:0 0 0 2026-03-10T08:55:25.593 INFO:tasks.workunit.client.1.vm08.stdout:7/550: creat d0/d11/d1f/d29/d3b/fac x:0 0 0 2026-03-10T08:55:25.593 INFO:tasks.workunit.client.1.vm08.stdout:7/551: symlink d0/d11/d1f/d29/d36/lad 0 2026-03-10T08:55:25.593 
INFO:tasks.workunit.client.1.vm08.stdout:7/552: truncate d0/d11/d1f/d29/d3d/d89/fa6 824980 0 2026-03-10T08:55:25.593 INFO:tasks.workunit.client.1.vm08.stdout:7/553: symlink d0/d11/d1f/lae 0 2026-03-10T08:55:25.595 INFO:tasks.workunit.client.1.vm08.stdout:7/554: dread d0/f25 [0,4194304] 0 2026-03-10T08:55:25.597 INFO:tasks.workunit.client.1.vm08.stdout:7/555: mkdir d0/d11/d1f/d29/d36/daf 0 2026-03-10T08:55:25.598 INFO:tasks.workunit.client.1.vm08.stdout:7/556: write d0/d11/d1f/d2c/f30 [4784066,127541] 0 2026-03-10T08:55:25.612 INFO:tasks.workunit.client.1.vm08.stdout:7/557: rename d0/d11/d1f/d29/d3d/f59 to d0/d11/d1f/d29/d3d/d40/fb0 0 2026-03-10T08:55:25.667 INFO:tasks.workunit.client.1.vm08.stdout:0/450: sync 2026-03-10T08:55:25.670 INFO:tasks.workunit.client.0.vm05.stdout:5/244: truncate d5/df/f31 3734707 0 2026-03-10T08:55:25.670 INFO:tasks.workunit.client.0.vm05.stdout:5/245: write d5/f23 [5633923,54842] 0 2026-03-10T08:55:25.671 INFO:tasks.workunit.client.0.vm05.stdout:5/246: truncate d5/fd 9180342 0 2026-03-10T08:55:25.672 INFO:tasks.workunit.client.0.vm05.stdout:4/295: sync 2026-03-10T08:55:25.672 INFO:tasks.workunit.client.0.vm05.stdout:4/296: read - d0/d1d/d30/d32/d41/f57 zero size 2026-03-10T08:55:25.676 INFO:tasks.workunit.client.0.vm05.stdout:4/297: dwrite d0/d2e/d42/d45/d4a/f26 [0,4194304] 0 2026-03-10T08:55:25.677 INFO:tasks.workunit.client.0.vm05.stdout:8/256: sync 2026-03-10T08:55:25.677 INFO:tasks.workunit.client.0.vm05.stdout:4/298: chown d0/d1d/d30/d32/d41/c54 162548882 1 2026-03-10T08:55:25.679 INFO:tasks.workunit.client.0.vm05.stdout:8/257: write d2/dd/d2c/d2e/f37 [1479900,82045] 0 2026-03-10T08:55:25.680 INFO:tasks.workunit.client.0.vm05.stdout:8/258: dread - d2/dd/d2c/d2e/f5a zero size 2026-03-10T08:55:25.684 INFO:tasks.workunit.client.1.vm08.stdout:2/546: sync 2026-03-10T08:55:25.685 INFO:tasks.workunit.client.1.vm08.stdout:7/558: sync 2026-03-10T08:55:25.687 INFO:tasks.workunit.client.1.vm08.stdout:0/451: mknod d6/dd/d13/d32/c90 0 
2026-03-10T08:55:25.693 INFO:tasks.workunit.client.1.vm08.stdout:0/452: creat d6/dd/d13/d17/d1f/d2d/d39/f91 x:0 0 0 2026-03-10T08:55:25.695 INFO:tasks.workunit.client.0.vm05.stdout:0/293: dwrite df/d18/f2a [0,4194304] 0 2026-03-10T08:55:25.696 INFO:tasks.workunit.client.0.vm05.stdout:4/299: mknod d0/d55/c65 0 2026-03-10T08:55:25.699 INFO:tasks.workunit.client.0.vm05.stdout:4/300: dwrite d0/d1d/f22 [0,4194304] 0 2026-03-10T08:55:25.700 INFO:tasks.workunit.client.0.vm05.stdout:7/250: rmdir d18/d1b/d1f 39 2026-03-10T08:55:25.701 INFO:tasks.workunit.client.0.vm05.stdout:7/251: write d18/f31 [1812662,93734] 0 2026-03-10T08:55:25.701 INFO:tasks.workunit.client.0.vm05.stdout:7/252: dread - d18/f24 zero size 2026-03-10T08:55:25.703 INFO:tasks.workunit.client.1.vm08.stdout:3/483: truncate d4/d15/d8/f68 7382946 0 2026-03-10T08:55:25.717 INFO:tasks.workunit.client.1.vm08.stdout:7/559: dwrite d0/d11/d4a/d95/f9b [0,4194304] 0 2026-03-10T08:55:25.717 INFO:tasks.workunit.client.0.vm05.stdout:2/254: truncate d0/d9/f19 1334919 0 2026-03-10T08:55:25.717 INFO:tasks.workunit.client.1.vm08.stdout:3/484: stat d4/d15/d8/d1d/f6e 0 2026-03-10T08:55:25.717 INFO:tasks.workunit.client.1.vm08.stdout:1/541: dwrite d1/da/de/f19 [0,4194304] 0 2026-03-10T08:55:25.717 INFO:tasks.workunit.client.1.vm08.stdout:1/542: write d1/da/de/d24/d35/fa9 [1906265,82330] 0 2026-03-10T08:55:25.717 INFO:tasks.workunit.client.1.vm08.stdout:6/538: dwrite d9/dc/d11/f31 [0,4194304] 0 2026-03-10T08:55:25.719 INFO:tasks.workunit.client.0.vm05.stdout:8/259: sync 2026-03-10T08:55:25.720 INFO:tasks.workunit.client.1.vm08.stdout:0/453: creat d6/dd/f92 x:0 0 0 2026-03-10T08:55:25.727 INFO:tasks.workunit.client.0.vm05.stdout:2/255: rmdir d0/d9/d1e 39 2026-03-10T08:55:25.730 INFO:tasks.workunit.client.1.vm08.stdout:7/560: mknod d0/d11/d1f/d29/d3d/d89/cb1 0 2026-03-10T08:55:25.731 INFO:tasks.workunit.client.1.vm08.stdout:9/454: write d2/dd/d15/d1e/d25/d32/f60 [516126,92973] 0 2026-03-10T08:55:25.734 
INFO:tasks.workunit.client.0.vm05.stdout:8/260: unlink d2/l20 0 2026-03-10T08:55:25.735 INFO:tasks.workunit.client.0.vm05.stdout:7/253: stat d18/d1b/d1f/d25/c44 0 2026-03-10T08:55:25.744 INFO:tasks.workunit.client.1.vm08.stdout:7/561: rename d0/d1c to d0/d11/db2 0 2026-03-10T08:55:25.744 INFO:tasks.workunit.client.1.vm08.stdout:9/455: dwrite d2/dd/f16 [0,4194304] 0 2026-03-10T08:55:25.744 INFO:tasks.workunit.client.0.vm05.stdout:8/261: readlink d2/dd/l13 0 2026-03-10T08:55:25.744 INFO:tasks.workunit.client.0.vm05.stdout:8/262: creat d2/dd/d2c/d2e/f64 x:0 0 0 2026-03-10T08:55:25.744 INFO:tasks.workunit.client.0.vm05.stdout:8/263: symlink d2/dd/d2c/l65 0 2026-03-10T08:55:25.744 INFO:tasks.workunit.client.0.vm05.stdout:8/264: mknod d2/dd/d2c/d2e/d31/d3e/d5d/c66 0 2026-03-10T08:55:25.748 INFO:tasks.workunit.client.0.vm05.stdout:8/265: mkdir d2/db/d1f/d67 0 2026-03-10T08:55:25.750 INFO:tasks.workunit.client.0.vm05.stdout:8/266: mknod d2/dd/d2c/c68 0 2026-03-10T08:55:25.751 INFO:tasks.workunit.client.0.vm05.stdout:2/256: sync 2026-03-10T08:55:25.761 INFO:tasks.workunit.client.0.vm05.stdout:8/267: mknod d2/dd/d2c/d2e/d31/c69 0 2026-03-10T08:55:25.762 INFO:tasks.workunit.client.1.vm08.stdout:9/456: dread d2/d41/d4c/f80 [0,4194304] 0 2026-03-10T08:55:25.763 INFO:tasks.workunit.client.0.vm05.stdout:2/257: write d0/d9/d1e/d20/d21/f44 [139842,69683] 0 2026-03-10T08:55:25.763 INFO:tasks.workunit.client.1.vm08.stdout:2/547: sync 2026-03-10T08:55:25.768 INFO:tasks.workunit.client.1.vm08.stdout:7/562: dread d0/d11/d1f/d29/d3d/d40/fb0 [4194304,4194304] 0 2026-03-10T08:55:25.769 INFO:tasks.workunit.client.0.vm05.stdout:2/258: truncate d0/d9/d27/f37 2385479 0 2026-03-10T08:55:25.771 INFO:tasks.workunit.client.0.vm05.stdout:8/268: rename d2/db/d47/f57 to d2/dd/d2c/d2e/f6a 0 2026-03-10T08:55:25.772 INFO:tasks.workunit.client.0.vm05.stdout:2/259: dread d0/d9/d1e/d20/f22 [0,4194304] 0 2026-03-10T08:55:25.772 INFO:tasks.workunit.client.0.vm05.stdout:2/260: chown d0/cc 35889 1 
2026-03-10T08:55:25.775 INFO:tasks.workunit.client.1.vm08.stdout:9/457: creat d2/d41/d4c/d89/f8f x:0 0 0 2026-03-10T08:55:25.775 INFO:tasks.workunit.client.1.vm08.stdout:9/458: chown d2/dd/d15/d1e/d39 2 1 2026-03-10T08:55:25.780 INFO:tasks.workunit.client.1.vm08.stdout:2/548: creat d1/da/d10/d1b/fac x:0 0 0 2026-03-10T08:55:25.786 INFO:tasks.workunit.client.1.vm08.stdout:8/576: write d1/d10/d9/f5b [2838994,116501] 0 2026-03-10T08:55:25.786 INFO:tasks.workunit.client.1.vm08.stdout:2/549: symlink d1/da/d10/d42/d93/d23/d9e/lad 0 2026-03-10T08:55:25.787 INFO:tasks.workunit.client.1.vm08.stdout:7/563: rename d0/d11/db2/c3e to d0/d14/cb3 0 2026-03-10T08:55:25.787 INFO:tasks.workunit.client.1.vm08.stdout:1/543: sync 2026-03-10T08:55:25.797 INFO:tasks.workunit.client.1.vm08.stdout:8/577: rename d1/d10/d9/dd/d25/d27/d44/d21/d51/d8e to d1/d10/d9/dd/d18/d34/dd0 0 2026-03-10T08:55:25.797 INFO:tasks.workunit.client.1.vm08.stdout:1/544: rmdir d1/da/de/d24/d3d/d40/d5b 39 2026-03-10T08:55:25.798 INFO:tasks.workunit.client.1.vm08.stdout:1/545: fsync d1/da/d20/d3f/d49/fb6 0 2026-03-10T08:55:25.798 INFO:tasks.workunit.client.1.vm08.stdout:7/564: link d0/d11/d1f/d29/d3d/d89/f96 d0/d11/d1f/d29/d36/fb4 0 2026-03-10T08:55:25.800 INFO:tasks.workunit.client.1.vm08.stdout:8/578: creat d1/d10/d9/dd/d18/d34/fd1 x:0 0 0 2026-03-10T08:55:25.800 INFO:tasks.workunit.client.1.vm08.stdout:7/565: creat d0/d14/d43/d62/fb5 x:0 0 0 2026-03-10T08:55:25.801 INFO:tasks.workunit.client.1.vm08.stdout:8/579: write d1/d10/d9/d8a/f99 [486824,1328] 0 2026-03-10T08:55:25.802 INFO:tasks.workunit.client.1.vm08.stdout:7/566: unlink d0/d14/d2f/c60 0 2026-03-10T08:55:25.813 INFO:tasks.workunit.client.1.vm08.stdout:8/580: symlink d1/d10/ld2 0 2026-03-10T08:55:25.813 INFO:tasks.workunit.client.1.vm08.stdout:1/546: link d1/da/d20/d91/c55 d1/da/de/d24/d3d/d40/d8e/cbd 0 2026-03-10T08:55:25.813 INFO:tasks.workunit.client.1.vm08.stdout:6/539: dread d9/dc/d11/d23/f6f [0,4194304] 0 2026-03-10T08:55:25.813 
INFO:tasks.workunit.client.1.vm08.stdout:1/547: creat d1/da/de/d24/d3d/d40/d56/d7a/fbe x:0 0 0 2026-03-10T08:55:25.813 INFO:tasks.workunit.client.1.vm08.stdout:8/581: creat d1/d10/d9/dd/d25/d27/fd3 x:0 0 0 2026-03-10T08:55:25.814 INFO:tasks.workunit.client.1.vm08.stdout:1/548: creat d1/da/d20/d3f/d49/d68/fbf x:0 0 0 2026-03-10T08:55:25.818 INFO:tasks.workunit.client.1.vm08.stdout:2/550: dread d1/da/d10/d42/d93/d1e/f1f [0,4194304] 0 2026-03-10T08:55:25.820 INFO:tasks.workunit.client.0.vm05.stdout:9/205: dread d6/f3f [0,4194304] 0 2026-03-10T08:55:25.821 INFO:tasks.workunit.client.0.vm05.stdout:9/206: chown d6/d12/f14 332748749 1 2026-03-10T08:55:25.824 INFO:tasks.workunit.client.0.vm05.stdout:9/207: symlink d6/d19/d2c/l49 0 2026-03-10T08:55:25.825 INFO:tasks.workunit.client.0.vm05.stdout:9/208: fdatasync d6/d27/f44 0 2026-03-10T08:55:25.826 INFO:tasks.workunit.client.1.vm08.stdout:1/549: creat d1/da/d20/d9e/fc0 x:0 0 0 2026-03-10T08:55:25.826 INFO:tasks.workunit.client.1.vm08.stdout:1/550: dread - d1/da/d20/d3f/d49/d68/fbf zero size 2026-03-10T08:55:25.826 INFO:tasks.workunit.client.1.vm08.stdout:1/551: fdatasync d1/da/de/d24/d35/fa9 0 2026-03-10T08:55:25.827 INFO:tasks.workunit.client.0.vm05.stdout:9/209: mkdir d6/d19/d2a/d4a 0 2026-03-10T08:55:25.827 INFO:tasks.workunit.client.1.vm08.stdout:6/540: dwrite d9/d10/d1e/d32/fa1 [0,4194304] 0 2026-03-10T08:55:25.828 INFO:tasks.workunit.client.1.vm08.stdout:6/541: read - d9/d10/d1e/d4c/fb9 zero size 2026-03-10T08:55:25.828 INFO:tasks.workunit.client.0.vm05.stdout:9/210: getdents d6 0 2026-03-10T08:55:25.832 INFO:tasks.workunit.client.1.vm08.stdout:2/551: creat d1/da/d10/d42/d93/d23/fae x:0 0 0 2026-03-10T08:55:25.833 INFO:tasks.workunit.client.1.vm08.stdout:6/542: read - d9/d13/d4e/fa8 zero size 2026-03-10T08:55:25.836 INFO:tasks.workunit.client.1.vm08.stdout:0/454: read d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c [1860022,40898] 0 2026-03-10T08:55:25.838 INFO:tasks.workunit.client.1.vm08.stdout:1/552: write d1/da/d18/f48 
[63189,25551] 0
2026-03-10T08:55:25.845 INFO:tasks.workunit.client.1.vm08.stdout:2/552: mknod d1/d9b/caf 0
2026-03-10T08:55:25.845 INFO:tasks.workunit.client.1.vm08.stdout:6/543: creat d9/d10/d1e/d7b/fbc x:0 0 0
2026-03-10T08:55:25.845 INFO:tasks.workunit.client.1.vm08.stdout:5/484: dwrite d0/d11/d27/d68/d7c/f75 [0,4194304] 0
2026-03-10T08:55:25.848 INFO:tasks.workunit.client.1.vm08.stdout:8/582: sync
2026-03-10T08:55:25.848 INFO:tasks.workunit.client.1.vm08.stdout:1/553: creat d1/da/de/d24/d3d/d40/d8e/fc1 x:0 0 0
2026-03-10T08:55:25.849 INFO:tasks.workunit.client.1.vm08.stdout:2/553: fdatasync d1/da/d10/d42/d93/f55 0
2026-03-10T08:55:25.850 INFO:tasks.workunit.client.1.vm08.stdout:0/455: rename d6/dd/d13/d17/d1f/d20/d2f/d57/d77 to d6/dd/d13/d17/d1f/d2d/d85/d93 0
2026-03-10T08:55:25.852 INFO:tasks.workunit.client.1.vm08.stdout:8/583: creat d1/d10/d9/dd/d25/d27/d44/d21/d5f/fd4 x:0 0 0
2026-03-10T08:55:25.870 INFO:tasks.workunit.client.1.vm08.stdout:1/554: creat d1/da/de/d24/d26/d86/fc2 x:0 0 0
2026-03-10T08:55:25.870 INFO:tasks.workunit.client.1.vm08.stdout:6/544: rename d9/dc/d11/d23/d2c/f79 to d9/dc/d11/fbd 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.0.vm05.stdout:1/375: write fc [3287460,14336] 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.0.vm05.stdout:3/296: write d9/d2b/d2f/f4b [823451,106390] 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:8/584: creat d1/d10/d9/d4d/d9f/fd5 x:0 0 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:8/585: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51/dd6 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:0/456: creat d6/d8b/f94 x:0 0 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:2/554: rename d1/da/d10/d1b/l87 to d1/da/d10/d42/lb0 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:0/457: mkdir d6/dd/d13/d17/d1f/d2d/d85/d95 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:8/586: mknod d1/d10/d9/dd/d25/dca/dc6/cd7 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:2/555: mkdir d1/db1 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:4/550: write d5/d23/d36/d99/db2/d5a/d69/f8c [474633,96824] 0
2026-03-10T08:55:25.871 INFO:tasks.workunit.client.1.vm08.stdout:7/567: dread d0/d11/d1f/d2c/f6c [0,4194304] 0
2026-03-10T08:55:25.875 INFO:tasks.workunit.client.1.vm08.stdout:7/568: dwrite d0/d11/d1f/d29/d3d/d40/ff [4194304,4194304] 0
2026-03-10T08:55:25.876 INFO:tasks.workunit.client.1.vm08.stdout:8/587: read d1/d10/d9/dd/d18/d3c/f4e [464741,59330] 0
2026-03-10T08:55:25.877 INFO:tasks.workunit.client.1.vm08.stdout:8/588: truncate d1/d10/d9/dd/fc5 947487 0
2026-03-10T08:55:25.880 INFO:tasks.workunit.client.1.vm08.stdout:4/551: chown d5/d23/d36/d99/db2/d5a/d69/fa3 2 1
2026-03-10T08:55:25.880 INFO:tasks.workunit.client.1.vm08.stdout:0/458: mknod d6/dd/d13/d8f/c96 0
2026-03-10T08:55:25.882 INFO:tasks.workunit.client.1.vm08.stdout:0/459: read f5 [4060977,102637] 0
2026-03-10T08:55:25.885 INFO:tasks.workunit.client.0.vm05.stdout:3/297: rmdir d9 39
2026-03-10T08:55:25.889 INFO:tasks.workunit.client.1.vm08.stdout:2/556: dread d1/d9b/f74 [0,4194304] 0
2026-03-10T08:55:25.895 INFO:tasks.workunit.client.0.vm05.stdout:3/298: write d9/d4d/f52 [800262,56405] 0
2026-03-10T08:55:25.939 INFO:tasks.workunit.client.1.vm08.stdout:8/589: creat d1/d10/d9/dd/d18/d3c/fd8 x:0 0 0
2026-03-10T08:55:25.939 INFO:tasks.workunit.client.1.vm08.stdout:1/555: dread d1/da/d20/f54 [0,4194304] 0
2026-03-10T08:55:25.939 INFO:tasks.workunit.client.1.vm08.stdout:4/552: symlink d5/d23/d36/d99/db2/dbd/lc4 0
2026-03-10T08:55:25.939 INFO:tasks.workunit.client.1.vm08.stdout:1/556: stat d1/da/de/d24/d3d/d40/d56 0
2026-03-10T08:55:25.939 INFO:tasks.workunit.client.1.vm08.stdout:0/460: unlink d6/dd/d13/d32/f3d 0
2026-03-10T08:55:25.939 INFO:tasks.workunit.client.1.vm08.stdout:4/553: dwrite d5/d23/d36/d99/db2/f3a [4194304,4194304] 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.1.vm08.stdout:2/557: creat d1/da/d10/d42/d93/d1e/fb2 x:0 0 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.1.vm08.stdout:2/558: dread - d1/da/f9c zero size
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.1.vm08.stdout:0/461: mknod d6/dd/d13/d17/d1f/c97 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.1.vm08.stdout:1/557: rename d1/da/de/d24/d3d/d40/d56/d7a/fbe to d1/da/d20/d3f/d49/fc3 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.1.vm08.stdout:8/590: mkdir d1/dd9 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.1.vm08.stdout:1/558: readlink d1/da/de/d24/d3d/d40/d92/lb3 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:1/376: link dd/d21/d37/f72 dd/d21/d37/f85 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:1/377: dwrite dd/d21/f48 [4194304,4194304] 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:6/287: creat d4/d7/d10/d1a/d1f/f66 x:0 0 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:6/288: readlink d4/d7/l19 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:1/378: symlink dd/d55/l86 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:1/379: fdatasync dd/d10/d19/d4d/f70 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/299: mkdir d9/d2b/d53 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/300: dwrite d9/ff [0,4194304] 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:1/380: rename dd/d13/c46 to dd/d21/d37/d45/c87 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/301: rename d9/d2b/c3d to d9/d2b/d3a/c54 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/302: stat d9/d4d 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/303: fdatasync d9/d2b/d3a/d43/f4c 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/304: fdatasync d9/f27 0
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/305: chown d9/c3e 719578203 1
2026-03-10T08:55:25.940 INFO:tasks.workunit.client.0.vm05.stdout:3/306: write d9/f3c [966201,95335] 0
2026-03-10T08:55:25.943 INFO:tasks.workunit.client.0.vm05.stdout:1/381: rmdir dd/d21/d3f/d41 0
2026-03-10T08:55:25.943 INFO:tasks.workunit.client.1.vm08.stdout:8/591: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fc2 [0,4194304] 0
2026-03-10T08:55:25.945 INFO:tasks.workunit.client.0.vm05.stdout:3/307: dwrite d9/ff [4194304,4194304] 0
2026-03-10T08:55:25.946 INFO:tasks.workunit.client.0.vm05.stdout:1/382: write dd/d10/d18/d2d/d51/d58/f5b [1401377,83227] 0
2026-03-10T08:55:25.946 INFO:tasks.workunit.client.0.vm05.stdout:1/383: chown dd/d13/c5f 37622891 1
2026-03-10T08:55:25.947 INFO:tasks.workunit.client.1.vm08.stdout:0/462: truncate d6/dd/d13/d17/d1f/d20/d2f/f59 1074884 0
2026-03-10T08:55:25.948 INFO:tasks.workunit.client.1.vm08.stdout:1/559: mknod d1/da/d20/d9e/cc4 0
2026-03-10T08:55:25.962 INFO:tasks.workunit.client.0.vm05.stdout:3/308: mkdir d9/d2b/d3a/d43/d4f/d55 0
2026-03-10T08:55:25.963 INFO:tasks.workunit.client.1.vm08.stdout:8/592: creat d1/d10/d9/d4d/db2/fda x:0 0 0
2026-03-10T08:55:25.964 INFO:tasks.workunit.client.0.vm05.stdout:3/309: unlink d9/c1e 0
2026-03-10T08:55:25.966 INFO:tasks.workunit.client.0.vm05.stdout:3/310: creat d9/d2b/d3a/f56 x:0 0 0
2026-03-10T08:55:25.966 INFO:tasks.workunit.client.1.vm08.stdout:1/560: getdents d1/da/d4b 0
2026-03-10T08:55:25.967 INFO:tasks.workunit.client.0.vm05.stdout:3/311: mkdir d9/d2b/d2f/d57 0
2026-03-10T08:55:25.969 INFO:tasks.workunit.client.0.vm05.stdout:3/312: unlink d9/l36 0
2026-03-10T08:55:25.969 INFO:tasks.workunit.client.0.vm05.stdout:3/313: chown d9/f19 1 1
2026-03-10T08:55:25.970 INFO:tasks.workunit.client.0.vm05.stdout:5/247: dread d5/df/d12/d21/f30 [0,4194304] 0
2026-03-10T08:55:25.972 INFO:tasks.workunit.client.1.vm08.stdout:1/561: mknod d1/da/de/d24/d26/cc5 0
2026-03-10T08:55:25.973 INFO:tasks.workunit.client.1.vm08.stdout:1/562: write d1/da/d18/d3b/faf [161401,104717] 0
2026-03-10T08:55:25.974 INFO:tasks.workunit.client.1.vm08.stdout:1/563: stat d1/da/d20/d3f/d49/d68/d7f/fb9 0
2026-03-10T08:55:25.976 INFO:tasks.workunit.client.1.vm08.stdout:1/564: symlink d1/da/de/d24/d35/d6d/d82/da2/dbb/lc6 0
2026-03-10T08:55:26.197 INFO:tasks.workunit.client.0.vm05.stdout:0/294: dwrite fe [0,4194304] 0
2026-03-10T08:55:26.198 INFO:tasks.workunit.client.0.vm05.stdout:4/301: write d0/d2e/d42/f5e [89572,118425] 0
2026-03-10T08:55:26.199 INFO:tasks.workunit.client.0.vm05.stdout:8/269: getdents d2 0
2026-03-10T08:55:26.199 INFO:tasks.workunit.client.0.vm05.stdout:7/254: write d18/f1d [1804898,11927] 0
2026-03-10T08:55:26.202 INFO:tasks.workunit.client.0.vm05.stdout:2/261: truncate d0/d9/f1b 2478471 0
2026-03-10T08:55:26.203 INFO:tasks.workunit.client.0.vm05.stdout:2/262: read d0/d9/d1e/d20/d21/f41 [1004512,10071] 0
2026-03-10T08:55:26.203 INFO:tasks.workunit.client.0.vm05.stdout:2/263: write d0/f40 [3739372,125087] 0
2026-03-10T08:55:26.204 INFO:tasks.workunit.client.0.vm05.stdout:2/264: fsync d0/d9/f3b 0
2026-03-10T08:55:26.205 INFO:tasks.workunit.client.0.vm05.stdout:2/265: stat d0/d9/d1e/d20/d21/f46 0
2026-03-10T08:55:26.205 INFO:tasks.workunit.client.0.vm05.stdout:2/266: readlink d0/l28 0
2026-03-10T08:55:26.206 INFO:tasks.workunit.client.0.vm05.stdout:2/267: write d0/d9/f1d [2770295,74152] 0
2026-03-10T08:55:26.207 INFO:tasks.workunit.client.0.vm05.stdout:2/268: chown d0/d9/d1e/d20/d21/f31 907097 1
2026-03-10T08:55:26.209 INFO:tasks.workunit.client.0.vm05.stdout:0/295: unlink df/d1f/f25 0
2026-03-10T08:55:26.210 INFO:tasks.workunit.client.1.vm08.stdout:3/485: dwrite d4/d15/d8/d2c/d9b/d79/f5c [0,4194304] 0
2026-03-10T08:55:26.213 INFO:tasks.workunit.client.0.vm05.stdout:7/255: rmdir d18/d38 39
2026-03-10T08:55:26.215 INFO:tasks.workunit.client.0.vm05.stdout:2/269: creat d0/d9/d1e/d20/d21/f4c x:0 0 0
2026-03-10T08:55:26.215 INFO:tasks.workunit.client.0.vm05.stdout:2/270: chown d0/f40 103 1
2026-03-10T08:55:26.216 INFO:tasks.workunit.client.0.vm05.stdout:0/296: write df/d18/f24 [2373314,111328] 0
2026-03-10T08:55:26.217 INFO:tasks.workunit.client.0.vm05.stdout:4/302: mkdir d0/d1d/d30/d49/d58/d66 0
2026-03-10T08:55:26.217 INFO:tasks.workunit.client.0.vm05.stdout:6/289: sync
2026-03-10T08:55:26.217 INFO:tasks.workunit.client.1.vm08.stdout:9/459: dwrite d2/dd/d15/d1e/d21/f3a [0,4194304] 0
2026-03-10T08:55:26.217 INFO:tasks.workunit.client.1.vm08.stdout:9/460: dread d2/d41/d4c/f80 [0,4194304] 0
2026-03-10T08:55:26.220 INFO:tasks.workunit.client.0.vm05.stdout:7/256: rmdir d18/d1b/d1f/d25/d2e/d32 39
2026-03-10T08:55:26.227 INFO:tasks.workunit.client.0.vm05.stdout:8/270: dread d2/db/f19 [0,4194304] 0
2026-03-10T08:55:26.229 INFO:tasks.workunit.client.0.vm05.stdout:3/314: sync
2026-03-10T08:55:26.230 INFO:tasks.workunit.client.1.vm08.stdout:9/461: rmdir d2/dd/d15/d1e/d25/d32/d79 39
2026-03-10T08:55:26.233 INFO:tasks.workunit.client.1.vm08.stdout:3/486: creat d4/d15/d8/d2c/d9b/d79/d8f/fa1 x:0 0 0
2026-03-10T08:55:26.236 INFO:tasks.workunit.client.0.vm05.stdout:2/271: dread d0/f2f [0,4194304] 0
2026-03-10T08:55:26.236 INFO:tasks.workunit.client.1.vm08.stdout:9/462: creat d2/dd/d15/d1e/d21/f90 x:0 0 0
2026-03-10T08:55:26.236 INFO:tasks.workunit.client.0.vm05.stdout:9/211: write f4 [4576097,91757] 0
2026-03-10T08:55:26.237 INFO:tasks.workunit.client.0.vm05.stdout:2/272: chown d0/d9/d1e/d20/d21/l2b 106 1
2026-03-10T08:55:26.237 INFO:tasks.workunit.client.0.vm05.stdout:2/273: readlink d0/d9/d1e/l49 0
2026-03-10T08:55:26.249 INFO:tasks.workunit.client.1.vm08.stdout:3/487: dwrite d4/d15/d8/d2c/d9b/f86 [0,4194304] 0
2026-03-10T08:55:26.249 INFO:tasks.workunit.client.0.vm05.stdout:4/303: mkdir d0/d1d/d30/d32/d41/d67 0
2026-03-10T08:55:26.252 INFO:tasks.workunit.client.0.vm05.stdout:4/304: dwrite d0/d2c/f2f [0,4194304] 0
2026-03-10T08:55:26.254 INFO:tasks.workunit.client.1.vm08.stdout:0/463: read d6/fa [728446,33178] 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:3/488: creat d4/d15/d8/d1d/d4f/fa2 x:0 0 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:9/463: creat d2/dd/d15/d1e/f91 x:0 0 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:3/489: read - d4/d15/d8/d2c/d6d/f9d zero size
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:3/490: readlink d4/d15/d8/d2c/d9b/d79/l81 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:0/464: mkdir d6/dd/d13/d17/d1f/d2d/d38/d98 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:9/464: mknod d2/d41/d4c/d66/c92 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.1.vm08.stdout:3/491: readlink d4/d15/d8/d2c/d55/l7f 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:8/271: creat d2/dd/d2c/d2e/d31/d3e/f6b x:0 0 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:3/315: dread d9/f23 [0,4194304] 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:5/248: truncate d5/df/d12/d21/f1f 3502020 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:4/305: write d0/f23 [911055,42886] 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:4/306: fdatasync d0/d1d/d30/d32/f63 0
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:4/307: dread - d0/d2e/d42/d45/f62 zero size
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:8/272: rmdir d2/db/d28 39
2026-03-10T08:55:26.271 INFO:tasks.workunit.client.0.vm05.stdout:3/316: truncate f2 115569 0
2026-03-10T08:55:26.273 INFO:tasks.workunit.client.1.vm08.stdout:5/485: dwrite d0/d11/d27/f3d [0,4194304] 0
2026-03-10T08:55:26.281 INFO:tasks.workunit.client.0.vm05.stdout:9/212: sync
2026-03-10T08:55:26.291 INFO:tasks.workunit.client.1.vm08.stdout:6/545: dwrite d9/dc/d11/d23/f6f [0,4194304] 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.1.vm08.stdout:0/465: dwrite d6/dd/d13/d17/d50/f71 [4194304,4194304] 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.1.vm08.stdout:5/486: link d0/d11/d27/d68/d7c/d4b/f82 d0/d11/d27/d68/d7c/d4b/d4e/d84/f90 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.0.vm05.stdout:0/297: symlink df/d18/d19/d39/d4d/d50/l54 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.0.vm05.stdout:5/249: creat d5/d3a/f4e x:0 0 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.0.vm05.stdout:7/257: creat d18/d1b/d1f/d25/d2e/f48 x:0 0 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.0.vm05.stdout:0/298: fsync df/f4a 0
2026-03-10T08:55:26.304 INFO:tasks.workunit.client.0.vm05.stdout:6/290: chown d4/d7/d10/d15/d20/f47 380171 1
2026-03-10T08:55:26.307 INFO:tasks.workunit.client.0.vm05.stdout:3/317: mknod d9/d2b/d3a/d43/c58 0
2026-03-10T08:55:26.309 INFO:tasks.workunit.client.0.vm05.stdout:3/318: write d9/d2b/d2f/f33 [449708,85848] 0
2026-03-10T08:55:26.310 INFO:tasks.workunit.client.1.vm08.stdout:6/546: mkdir d9/d10/d1e/d4c/dbe 0
2026-03-10T08:55:26.311 INFO:tasks.workunit.client.1.vm08.stdout:7/569: write d0/d11/db2/f67 [5776236,42354] 0
2026-03-10T08:55:26.312 INFO:tasks.workunit.client.1.vm08.stdout:6/547: chown d9/c87 36 1
2026-03-10T08:55:26.327 INFO:tasks.workunit.client.1.vm08.stdout:3/492: dread d4/d15/f7 [0,4194304] 0
2026-03-10T08:55:26.328 INFO:tasks.workunit.client.0.vm05.stdout:9/213: mkdir d6/d15/d3c/d4b 0
2026-03-10T08:55:26.329 INFO:tasks.workunit.client.1.vm08.stdout:2/559: write d1/da/d10/f18 [1674717,51254] 0
2026-03-10T08:55:26.330 INFO:tasks.workunit.client.0.vm05.stdout:7/258: fdatasync fd 0
2026-03-10T08:55:26.331 INFO:tasks.workunit.client.1.vm08.stdout:5/487: write d0/d11/f86 [345172,100933] 0
2026-03-10T08:55:26.331 INFO:tasks.workunit.client.0.vm05.stdout:5/250: symlink d5/d3a/l4f 0
2026-03-10T08:55:26.333 INFO:tasks.workunit.client.1.vm08.stdout:0/466: creat d6/dd/d13/d61/d6f/f99 x:0 0 0
2026-03-10T08:55:26.333 INFO:tasks.workunit.client.1.vm08.stdout:3/493: dwrite d4/d15/d8/d2c/f90 [0,4194304] 0
2026-03-10T08:55:26.335 INFO:tasks.workunit.client.0.vm05.stdout:8/273: creat d2/dd/d2c/d2e/d31/d4c/d63/f6c x:0 0 0
2026-03-10T08:55:26.336 INFO:tasks.workunit.client.1.vm08.stdout:2/560: mkdir d1/d9b/d52/db3 0
2026-03-10T08:55:26.339 INFO:tasks.workunit.client.1.vm08.stdout:6/548: symlink d9/dc/lbf 0
2026-03-10T08:55:26.339 INFO:tasks.workunit.client.0.vm05.stdout:2/274: getdents d0/d9/d1e 0
2026-03-10T08:55:26.339 INFO:tasks.workunit.client.1.vm08.stdout:3/494: dread d4/d15/d8/d2c/d9b/d79/f59 [0,4194304] 0
2026-03-10T08:55:26.351 INFO:tasks.workunit.client.1.vm08.stdout:5/488: creat d0/d11/d18/d52/f91 x:0 0 0
2026-03-10T08:55:26.354 INFO:tasks.workunit.client.1.vm08.stdout:3/495: dread d4/d15/d8/d2c/d9b/f86 [0,4194304] 0
2026-03-10T08:55:26.355 INFO:tasks.workunit.client.0.vm05.stdout:0/299: dread f5 [0,4194304] 0
2026-03-10T08:55:26.358 INFO:tasks.workunit.client.0.vm05.stdout:3/319: stat d9/d2b/d3a/l49 0
2026-03-10T08:55:26.360 INFO:tasks.workunit.client.0.vm05.stdout:9/214: creat d6/d15/d37/f4c x:0 0 0
2026-03-10T08:55:26.361 INFO:tasks.workunit.client.1.vm08.stdout:2/561: fdatasync d1/d5b/d66/f5e 0
2026-03-10T08:55:26.361 INFO:tasks.workunit.client.0.vm05.stdout:5/251: read - d5/df/d12/d24/d2c/d41/f4d zero size
2026-03-10T08:55:26.362 INFO:tasks.workunit.client.1.vm08.stdout:2/562: write d1/da/d10/d2d/fa2 [878105,12425] 0
2026-03-10T08:55:26.363 INFO:tasks.workunit.client.0.vm05.stdout:6/291: mknod d4/d2d/d51/d62/c67 0
2026-03-10T08:55:26.364 INFO:tasks.workunit.client.0.vm05.stdout:8/274: fsync d2/dd/d2c/d2e/f6a 0
2026-03-10T08:55:26.366 INFO:tasks.workunit.client.0.vm05.stdout:2/275: read d0/d9/d1e/d20/d21/f41 [518488,62519] 0
2026-03-10T08:55:26.367 INFO:tasks.workunit.client.0.vm05.stdout:5/252: dwrite d5/df/d12/d24/d2c/d41/f4c [0,4194304] 0
2026-03-10T08:55:26.370 INFO:tasks.workunit.client.0.vm05.stdout:8/275: dwrite d2/dd/d2c/d2e/d31/d4c/d63/f6c [0,4194304] 0
2026-03-10T08:55:26.373 INFO:tasks.workunit.client.1.vm08.stdout:6/549: mkdir d9/dc/d11/d23/d2c/dc0 0
2026-03-10T08:55:26.375 INFO:tasks.workunit.client.1.vm08.stdout:3/496: symlink d4/d15/d8/d71/la3 0
2026-03-10T08:55:26.379 INFO:tasks.workunit.client.1.vm08.stdout:0/467: symlink d6/dd/d13/d17/d1f/d2d/l9a 0
2026-03-10T08:55:26.381 INFO:tasks.workunit.client.1.vm08.stdout:6/550: creat d9/dc/d84/d80/fc1 x:0 0 0
2026-03-10T08:55:26.381 INFO:tasks.workunit.client.0.vm05.stdout:5/253: symlink d5/d3a/l50 0
2026-03-10T08:55:26.382 INFO:tasks.workunit.client.0.vm05.stdout:9/215: dread d6/d19/f1a [0,4194304] 0
2026-03-10T08:55:26.383 INFO:tasks.workunit.client.0.vm05.stdout:9/216: dread - d6/d27/f44 zero size
2026-03-10T08:55:26.385 INFO:tasks.workunit.client.0.vm05.stdout:2/276: mknod d0/d9/d1e/c4d 0
2026-03-10T08:55:26.386 INFO:tasks.workunit.client.0.vm05.stdout:9/217: dwrite d6/d19/d21/f2f [0,4194304] 0
2026-03-10T08:55:26.388 INFO:tasks.workunit.client.0.vm05.stdout:0/300: mkdir df/d18/d19/d55 0
2026-03-10T08:55:26.389 INFO:tasks.workunit.client.0.vm05.stdout:7/259: creat d18/d1b/d1f/d25/d2e/f49 x:0 0 0
2026-03-10T08:55:26.390 INFO:tasks.workunit.client.0.vm05.stdout:7/260: readlink d18/d1b/d1f/d25/d2e/l37 0
2026-03-10T08:55:26.392 INFO:tasks.workunit.client.1.vm08.stdout:0/468: symlink d6/dd/d13/d17/d1f/d20/d2f/d24/l9b 0
2026-03-10T08:55:26.393 INFO:tasks.workunit.client.0.vm05.stdout:6/292: mknod d4/d7/d10/d15/d20/d53/d4a/c68 0
2026-03-10T08:55:26.394 INFO:tasks.workunit.client.0.vm05.stdout:5/254: creat d5/df/d12/d24/f51 x:0 0 0
2026-03-10T08:55:26.396 INFO:tasks.workunit.client.1.vm08.stdout:3/497: mknod d4/d15/ca4 0
2026-03-10T08:55:26.398 INFO:tasks.workunit.client.1.vm08.stdout:6/551: creat d9/d10/d1e/d7e/fc2 x:0 0 0
2026-03-10T08:55:26.399 INFO:tasks.workunit.client.0.vm05.stdout:1/384: write dd/d10/f22 [4368261,4858] 0
2026-03-10T08:55:26.404 INFO:tasks.workunit.client.1.vm08.stdout:0/469: symlink d6/l9c 0
2026-03-10T08:55:26.405 INFO:tasks.workunit.client.0.vm05.stdout:7/261: dread d18/d1b/f30 [0,4194304] 0
2026-03-10T08:55:26.405 INFO:tasks.workunit.client.1.vm08.stdout:3/498: creat d4/d15/d8/d2c/d55/d93/fa5 x:0 0 0
2026-03-10T08:55:26.407 INFO:tasks.workunit.client.1.vm08.stdout:1/565: write d1/da/d20/d3f/d49/d68/d7f/f97 [315899,125070] 0
2026-03-10T08:55:26.407 INFO:tasks.workunit.client.0.vm05.stdout:6/293: symlink d4/d7/d10/d1a/d1f/l69 0
2026-03-10T08:55:26.409 INFO:tasks.workunit.client.1.vm08.stdout:6/552: truncate d9/d13/f2f 3799134 0
2026-03-10T08:55:26.409 INFO:tasks.workunit.client.1.vm08.stdout:6/553: stat d9/fa 0
2026-03-10T08:55:26.409 INFO:tasks.workunit.client.0.vm05.stdout:5/255: symlink d5/df/d12/d24/d2c/d41/l52 0
2026-03-10T08:55:26.409 INFO:tasks.workunit.client.0.vm05.stdout:5/256: read d5/fc [2740754,96262] 0
2026-03-10T08:55:26.411 INFO:tasks.workunit.client.1.vm08.stdout:3/499: symlink d4/d15/d8/d2c/d9b/d79/d8f/la6 0
2026-03-10T08:55:26.414 INFO:tasks.workunit.client.1.vm08.stdout:1/566: fdatasync d1/da/de/d5c/fa1 0
2026-03-10T08:55:26.414 INFO:tasks.workunit.client.0.vm05.stdout:7/262: chown d18/d1b/d1f/d25/d2e/d32 3173277 1
2026-03-10T08:55:26.414 INFO:tasks.workunit.client.1.vm08.stdout:6/554: rename d9/d13/fa0 to d9/d10/d1e/d7b/fc3 0
2026-03-10T08:55:26.415 INFO:tasks.workunit.client.1.vm08.stdout:6/555: readlink d9/dc/d11/d23/d2c/d41/l38 0
2026-03-10T08:55:26.416 INFO:tasks.workunit.client.1.vm08.stdout:3/500: symlink d4/d15/d8/d1d/la7 0
2026-03-10T08:55:26.419 INFO:tasks.workunit.client.1.vm08.stdout:6/556: unlink d9/dc/d11/c3f 0
2026-03-10T08:55:26.419 INFO:tasks.workunit.client.0.vm05.stdout:2/277: creat d0/d9/f4e x:0 0 0
2026-03-10T08:55:26.425 INFO:tasks.workunit.client.0.vm05.stdout:7/263: truncate f4 2304457 0
2026-03-10T08:55:26.425 INFO:tasks.workunit.client.0.vm05.stdout:0/301: creat df/d18/d2b/d27/d32/d4e/f56 x:0 0 0
2026-03-10T08:55:26.425 INFO:tasks.workunit.client.1.vm08.stdout:3/501: dwrite d4/d15/d8/d2c/d9b/d79/d20/f84 [0,4194304] 0
2026-03-10T08:55:26.426 INFO:tasks.workunit.client.0.vm05.stdout:7/264: readlink d18/d1b/d1f/d25/d2e/d32/l41 0
2026-03-10T08:55:26.435 INFO:tasks.workunit.client.0.vm05.stdout:6/294: creat d4/f6a x:0 0 0
2026-03-10T08:55:26.439 INFO:tasks.workunit.client.1.vm08.stdout:6/557: write f5 [1535748,46890] 0
2026-03-10T08:55:26.444 INFO:tasks.workunit.client.0.vm05.stdout:3/320: read d9/f29 [384984,999] 0
2026-03-10T08:55:26.444 INFO:tasks.workunit.client.0.vm05.stdout:6/295: creat d4/d7/d10/d15/d20/d53/f6b x:0 0 0
2026-03-10T08:55:26.450 INFO:tasks.workunit.client.0.vm05.stdout:0/302: dread df/f15 [0,4194304] 0
2026-03-10T08:55:26.450 INFO:tasks.workunit.client.0.vm05.stdout:0/303: chown df/d18/d2b/d27/d32 1987631494 1
2026-03-10T08:55:26.453 INFO:tasks.workunit.client.0.vm05.stdout:7/265: rename d18/d1b/d1f/d25/f36 to d18/f4a 0
2026-03-10T08:55:26.454 INFO:tasks.workunit.client.0.vm05.stdout:0/304: creat df/d18/d2b/d3a/f57 x:0 0 0
2026-03-10T08:55:26.456 INFO:tasks.workunit.client.0.vm05.stdout:6/296: creat d4/f6c x:0 0 0
2026-03-10T08:55:26.463 INFO:tasks.workunit.client.1.vm08.stdout:8/593: dread d1/d10/d9/dd/d18/d34/f57 [0,4194304] 0
2026-03-10T08:55:26.463 INFO:tasks.workunit.client.0.vm05.stdout:0/305: symlink df/d18/d2b/d27/d32/l58 0
2026-03-10T08:55:26.463 INFO:tasks.workunit.client.0.vm05.stdout:0/306: dwrite df/d18/f53 [0,4194304] 0
2026-03-10T08:55:26.464 INFO:tasks.workunit.client.0.vm05.stdout:0/307: dread fe [0,4194304] 0
2026-03-10T08:55:26.467 INFO:tasks.workunit.client.1.vm08.stdout:8/594: creat d1/d2c/fdb x:0 0 0
2026-03-10T08:55:26.490 INFO:tasks.workunit.client.1.vm08.stdout:6/558: dread d9/dc/d11/d23/d2c/f3d [0,4194304] 0
2026-03-10T08:55:26.492 INFO:tasks.workunit.client.1.vm08.stdout:6/559: truncate d9/dc/d11/f29 1224855 0
2026-03-10T08:55:26.493 INFO:tasks.workunit.client.1.vm08.stdout:6/560: truncate d9/f77 319642 0
2026-03-10T08:55:26.495 INFO:tasks.workunit.client.0.vm05.stdout:6/297: dread d4/d7/d10/d15/f2e [0,4194304] 0
2026-03-10T08:55:26.497 INFO:tasks.workunit.client.0.vm05.stdout:6/298: creat d4/d2d/d5f/f6d x:0 0 0
2026-03-10T08:55:26.501 INFO:tasks.workunit.client.1.vm08.stdout:6/561: rename d9/fac to d9/d10/d1e/d4c/d69/da2/fc4 0
2026-03-10T08:55:26.503 INFO:tasks.workunit.client.1.vm08.stdout:6/562: dread - d9/d10/d1e/d32/fb2 zero size
2026-03-10T08:55:26.505 INFO:tasks.workunit.client.1.vm08.stdout:6/563: fdatasync d9/dc/d11/d23/d2c/f5c 0
2026-03-10T08:55:26.506 INFO:tasks.workunit.client.1.vm08.stdout:6/564: fsync d9/d10/d1e/d4c/fb9 0
2026-03-10T08:55:26.509 INFO:tasks.workunit.client.1.vm08.stdout:6/565: link d9/d10/d1e/d7e/fc2 d9/fc5 0
2026-03-10T08:55:26.536 INFO:tasks.workunit.client.0.vm05.stdout:6/299: sync
2026-03-10T08:55:26.537 INFO:tasks.workunit.client.0.vm05.stdout:6/300: read - d4/d7/d10/d1a/d1f/f66 zero size
2026-03-10T08:55:26.537 INFO:tasks.workunit.client.0.vm05.stdout:6/301: write d4/f6a [64976,25114] 0
2026-03-10T08:55:26.542 INFO:tasks.workunit.client.0.vm05.stdout:6/302: dwrite d4/d7/d10/d15/d20/d53/f49 [4194304,4194304] 0
2026-03-10T08:55:26.543 INFO:tasks.workunit.client.0.vm05.stdout:6/303: fsync d4/f30 0
2026-03-10T08:55:26.552 INFO:tasks.workunit.client.0.vm05.stdout:6/304: getdents d4/d7/d10/d15 0
2026-03-10T08:55:26.555 INFO:tasks.workunit.client.0.vm05.stdout:6/305: symlink d4/d2d/d5f/l6e 0
2026-03-10T08:55:26.556 INFO:tasks.workunit.client.0.vm05.stdout:6/306: rmdir d4/d2d/d51 39
2026-03-10T08:55:26.557 INFO:tasks.workunit.client.0.vm05.stdout:6/307: symlink d4/d2d/d51/l6f 0
2026-03-10T08:55:26.563 INFO:tasks.workunit.client.1.vm08.stdout:4/554: dread d5/f9d [0,4194304] 0
2026-03-10T08:55:26.586 INFO:tasks.workunit.client.0.vm05.stdout:0/308: dread df/d18/d2b/d27/f2e [0,4194304] 0
2026-03-10T08:55:26.607 INFO:tasks.workunit.client.1.vm08.stdout:9/465: write d2/dd/d15/d1e/d39/d4e/f71 [534175,7292] 0
2026-03-10T08:55:26.610 INFO:tasks.workunit.client.0.vm05.stdout:4/308: write d0/d1d/f24 [1715887,105288] 0
2026-03-10T08:55:26.610 INFO:tasks.workunit.client.0.vm05.stdout:4/309: read - d0/d2e/d42/d45/f62 zero size
2026-03-10T08:55:26.612 INFO:tasks.workunit.client.0.vm05.stdout:4/310: rmdir d0 39
2026-03-10T08:55:26.624 INFO:tasks.workunit.client.0.vm05.stdout:4/311: dread - d0/d2e/d42/d45/d4a/d36/f3d zero size
2026-03-10T08:55:26.632 INFO:tasks.workunit.client.0.vm05.stdout:4/312: dread d0/f9 [0,4194304] 0
2026-03-10T08:55:26.633 INFO:tasks.workunit.client.0.vm05.stdout:4/313: creat d0/d2e/d42/d45/d4a/d36/d37/f68 x:0 0 0
2026-03-10T08:55:26.634 INFO:tasks.workunit.client.0.vm05.stdout:4/314: symlink d0/d1d/l69 0
2026-03-10T08:55:26.638 INFO:tasks.workunit.client.0.vm05.stdout:4/315: getdents d0/d1d/d30/d49 0
2026-03-10T08:55:26.638 INFO:tasks.workunit.client.0.vm05.stdout:4/316: chown d0/d2e/d42/d45 9817 1
2026-03-10T08:55:26.640 INFO:tasks.workunit.client.0.vm05.stdout:4/317: fdatasync d0/fc 0
2026-03-10T08:55:26.641 INFO:tasks.workunit.client.0.vm05.stdout:4/318: mkdir d0/d2c/d6a 0
2026-03-10T08:55:26.642 INFO:tasks.workunit.client.0.vm05.stdout:4/319: dread - d0/d1d/d30/d32/f63 zero size
2026-03-10T08:55:26.643 INFO:tasks.workunit.client.0.vm05.stdout:4/320: creat d0/d1d/d30/d32/d41/d67/f6b x:0 0 0
2026-03-10T08:55:26.645 INFO:tasks.workunit.client.0.vm05.stdout:4/321: symlink d0/d1d/d30/d49/d58/l6c 0
2026-03-10T08:55:26.646 INFO:tasks.workunit.client.1.vm08.stdout:3/502: truncate d4/d15/d8/d2c/d9b/d79/f59 2784164 0
2026-03-10T08:55:26.647 INFO:tasks.workunit.client.0.vm05.stdout:4/322: link d0/d55/c65 d0/d1d/d30/d32/d41/d67/c6d 0
2026-03-10T08:55:26.647 INFO:tasks.workunit.client.1.vm08.stdout:2/563: write d1/d43/f7f [1112995,4228] 0
2026-03-10T08:55:26.649 INFO:tasks.workunit.client.1.vm08.stdout:5/489: dwrite d0/f7f [0,4194304] 0
2026-03-10T08:55:26.650 INFO:tasks.workunit.client.0.vm05.stdout:4/323: creat d0/d1d/d30/d49/d58/f6e x:0 0 0
2026-03-10T08:55:26.650 INFO:tasks.workunit.client.1.vm08.stdout:5/490: stat d0/d46/c54 0
2026-03-10T08:55:26.651 INFO:tasks.workunit.client.0.vm05.stdout:4/324: stat d0/f1e 0
2026-03-10T08:55:26.651 INFO:tasks.workunit.client.1.vm08.stdout:3/503: mkdir d4/d15/d8/d1d/da8 0
2026-03-10T08:55:26.661 INFO:tasks.workunit.client.1.vm08.stdout:5/491: creat d0/f92 x:0 0 0
2026-03-10T08:55:26.662 INFO:tasks.workunit.client.0.vm05.stdout:8/276: write d2/fa [5653088,75127] 0
2026-03-10T08:55:26.664 INFO:tasks.workunit.client.0.vm05.stdout:4/325: mknod d0/d1d/d30/c6f 0
2026-03-10T08:55:26.666 INFO:tasks.workunit.client.0.vm05.stdout:8/277: dwrite d2/f5 [0,4194304] 0
2026-03-10T08:55:26.668 INFO:tasks.workunit.client.1.vm08.stdout:7/570: truncate d0/d11/d1f/d29/d3b/f86 888615 0
2026-03-10T08:55:26.669 INFO:tasks.workunit.client.0.vm05.stdout:2/278: write d0/d9/d1e/d20/d21/f46 [358977,63202] 0
2026-03-10T08:55:26.670 INFO:tasks.workunit.client.1.vm08.stdout:5/492: mkdir d0/d11/d3e/d45/d93 0
2026-03-10T08:55:26.674 INFO:tasks.workunit.client.1.vm08.stdout:7/571: truncate d0/d11/d1f/d29/d36/d75/f85 4569028 0
2026-03-10T08:55:26.674 INFO:tasks.workunit.client.1.vm08.stdout:5/493: creat d0/d11/d27/d68/d7c/d4b/d4e/f94 x:0 0 0
2026-03-10T08:55:26.674 INFO:tasks.workunit.client.0.vm05.stdout:8/278: fdatasync d2/dd/d2c/f4d 0
2026-03-10T08:55:26.676 INFO:tasks.workunit.client.0.vm05.stdout:2/279: dread d0/d9/d1e/d20/f32 [0,4194304] 0
2026-03-10T08:55:26.677 INFO:tasks.workunit.client.0.vm05.stdout:2/280: fdatasync d0/d9/f4e 0
2026-03-10T08:55:26.678 INFO:tasks.workunit.client.0.vm05.stdout:2/281: dread d0/d9/d27/f37 [0,4194304] 0
2026-03-10T08:55:26.679 INFO:tasks.workunit.client.0.vm05.stdout:2/282: fsync d0/d9/f12 0
2026-03-10T08:55:26.681 INFO:tasks.workunit.client.0.vm05.stdout:9/218: write d6/d15/f24 [242721,129185] 0
2026-03-10T08:55:26.682 INFO:tasks.workunit.client.0.vm05.stdout:9/219: write d6/d27/f44 [737266,110814] 0
2026-03-10T08:55:26.685 INFO:tasks.workunit.client.1.vm08.stdout:7/572: dwrite d0/d14/d2f/f81 [0,4194304] 0
2026-03-10T08:55:26.685 INFO:tasks.workunit.client.0.vm05.stdout:4/326: truncate d0/f10 1186506 0
2026-03-10T08:55:26.686 INFO:tasks.workunit.client.0.vm05.stdout:4/327: dread d0/fb [4194304,4194304] 0
2026-03-10T08:55:26.687 INFO:tasks.workunit.client.0.vm05.stdout:4/328: stat d0/d2e/f4e 0
2026-03-10T08:55:26.696 INFO:tasks.workunit.client.1.vm08.stdout:3/504: dread d4/d15/d8/d2c/d55/f75 [0,4194304] 0
2026-03-10T08:55:26.697 INFO:tasks.workunit.client.1.vm08.stdout:3/505: chown d4/d15/d8/fa0 469 1
2026-03-10T08:55:26.698 INFO:tasks.workunit.client.0.vm05.stdout:8/279: rmdir d2/dd/d2c/d2e 39
2026-03-10T08:55:26.700 INFO:tasks.workunit.client.1.vm08.stdout:7/573: mknod d0/d14/d43/cb6 0
2026-03-10T08:55:26.702 INFO:tasks.workunit.client.0.vm05.stdout:2/283: fdatasync d0/d9/d27/f38 0
2026-03-10T08:55:26.704 INFO:tasks.workunit.client.0.vm05.stdout:9/220: fdatasync d6/fe 0
2026-03-10T08:55:26.714 INFO:tasks.workunit.client.0.vm05.stdout:9/221: dwrite d6/f30 [0,4194304] 0
2026-03-10T08:55:26.714 INFO:tasks.workunit.client.1.vm08.stdout:0/470: dwrite d6/dd/d13/d17/d1f/d20/d2f/d24/f6e [0,4194304] 0
2026-03-10T08:55:26.714 INFO:tasks.workunit.client.1.vm08.stdout:0/471: fdatasync d6/dd/d13/d61/d6f/f99 0
2026-03-10T08:55:26.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:26 vm05.local ceph-mon[49713]: pgmap v152: 65 pgs: 65 active+clean; 1.7 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 33 MiB/s rd, 134 MiB/s wr, 231 op/s
2026-03-10T08:55:26.719 INFO:tasks.workunit.client.1.vm08.stdout:9/466: fsync d2/dd/d15/d1e/d39/d4e/f71 0
2026-03-10T08:55:26.732 INFO:tasks.workunit.client.0.vm05.stdout:1/385: write dd/d10/d19/d27/f31 [950917,82097] 0
2026-03-10T08:55:26.733 INFO:tasks.workunit.client.0.vm05.stdout:1/386: fdatasync dd/d21/f48 0
2026-03-10T08:55:26.738 INFO:tasks.workunit.client.1.vm08.stdout:5/494: link d0/d1b/d67/d7a/c8c d0/d11/c95 0
2026-03-10T08:55:26.742 INFO:tasks.workunit.client.0.vm05.stdout:2/284: truncate d0/d9/d27/f37 818841 0
2026-03-10T08:55:26.742 INFO:tasks.workunit.client.0.vm05.stdout:2/285: write d0/d9/f1d [1969710,99047] 0
2026-03-10T08:55:26.744 INFO:tasks.workunit.client.1.vm08.stdout:3/506: mkdir d4/da9 0
2026-03-10T08:55:26.747 INFO:tasks.workunit.client.1.vm08.stdout:1/567: dwrite d1/da/d20/f54 [4194304,4194304] 0
2026-03-10T08:55:26.749 INFO:tasks.workunit.client.1.vm08.stdout:5/495: mknod d0/d46/c96 0
2026-03-10T08:55:26.756 INFO:tasks.workunit.client.0.vm05.stdout:7/266: write d18/d1b/d1f/d25/d2e/d32/f3d [222967,82003] 0
2026-03-10T08:55:26.757 INFO:tasks.workunit.client.0.vm05.stdout:7/267: dread - d18/d1b/d1f/d25/d2e/f48 zero size
2026-03-10T08:55:26.758 INFO:tasks.workunit.client.1.vm08.stdout:3/507: mknod d4/d15/d8/d1d/caa 0
2026-03-10T08:55:26.758 INFO:tasks.workunit.client.0.vm05.stdout:7/268: rename d18/d1b/d1f/d25/d2e to d18/d1b/d1f/d25/d2e/d42/d4b 22
2026-03-10T08:55:26.761 INFO:tasks.workunit.client.1.vm08.stdout:7/574: creat d0/d11/d1f/fb7 x:0 0 0
2026-03-10T08:55:26.761 INFO:tasks.workunit.client.1.vm08.stdout:7/575: read - d0/d14/f98 zero size
2026-03-10T08:55:26.763 INFO:tasks.workunit.client.0.vm05.stdout:1/387: mkdir dd/d10/d19/d4d/d88 0
2026-03-10T08:55:26.764 INFO:tasks.workunit.client.1.vm08.stdout:8/595: truncate d1/d10/d9/dd/f91 1697525 0
2026-03-10T08:55:26.766 INFO:tasks.workunit.client.0.vm05.stdout:4/329: creat d0/d1d/d30/d49/d4f/d5b/f70 x:0 0 0
2026-03-10T08:55:26.767 INFO:tasks.workunit.client.1.vm08.stdout:1/568: write d1/da/d18/d3b/d62/f76 [401720,78691] 0
2026-03-10T08:55:26.767 INFO:tasks.workunit.client.0.vm05.stdout:4/330: readlink d0/l5 0
2026-03-10T08:55:26.769 INFO:tasks.workunit.client.1.vm08.stdout:5/496: mknod d0/d11/c97 0
2026-03-10T08:55:26.770 INFO:tasks.workunit.client.1.vm08.stdout:5/497: chown d0/d11/f1e 158760 1
2026-03-10T08:55:26.771 INFO:tasks.workunit.client.1.vm08.stdout:3/508: symlink d4/d15/d8/d2c/d55/d93/lab 0
2026-03-10T08:55:26.772 INFO:tasks.workunit.client.1.vm08.stdout:6/566: write d9/dc/d11/f8d [861483,130048] 0
2026-03-10T08:55:26.773 INFO:tasks.workunit.client.1.vm08.stdout:7/576: write d0/d11/d1f/d29/d3d/d40/f38 [5601265,31534] 0
2026-03-10T08:55:26.783 INFO:tasks.workunit.client.0.vm05.stdout:9/222: link d6/d19/d2c/f3d d6/d19/d2a/f4d 0
2026-03-10T08:55:26.788 INFO:tasks.workunit.client.1.vm08.stdout:1/569: fdatasync d1/da/d20/d3f/d49/f9a 0
2026-03-10T08:55:26.788 INFO:tasks.workunit.client.0.vm05.stdout:6/308: write d4/d7/d10/d15/f2e [4281250,129867] 0
2026-03-10T08:55:26.789 INFO:tasks.workunit.client.1.vm08.stdout:1/570: chown d1/da/de/d24/d35 84 1
2026-03-10T08:55:26.790 INFO:tasks.workunit.client.0.vm05.stdout:7/269: creat d18/d1b/d1f/d25/d2e/d32/f4c x:0 0 0
2026-03-10T08:55:26.791 INFO:tasks.workunit.client.0.vm05.stdout:0/309: write df/f15 [2165059,26856] 0
2026-03-10T08:55:26.791 INFO:tasks.workunit.client.1.vm08.stdout:4/555: write d5/f95 [240018,74982] 0
2026-03-10T08:55:26.793 INFO:tasks.workunit.client.0.vm05.stdout:0/310: dread df/d18/d2b/d27/f2e [0,4194304] 0
2026-03-10T08:55:26.793 INFO:tasks.workunit.client.1.vm08.stdout:4/556: chown d5/d23/d36/d99/db2/d5a/d69 3545074 1
2026-03-10T08:55:26.796 INFO:tasks.workunit.client.0.vm05.stdout:1/388: creat dd/d10/d18/d20/f89 x:0 0 0
2026-03-10T08:55:26.798 INFO:tasks.workunit.client.1.vm08.stdout:5/498: rename d0/d11/d27/d68/d7c/d4b/d4e/c8b to d0/d1b/d67/d80/c98 0
2026-03-10T08:55:26.800 INFO:tasks.workunit.client.0.vm05.stdout:5/257: write d5/df/d12/d21/f1f [3749962,28224] 0
2026-03-10T08:55:26.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:26 vm08.local ceph-mon[57559]: pgmap v152: 65 pgs: 65 active+clean; 1.7 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 33 MiB/s rd, 134 MiB/s wr, 231 op/s
2026-03-10T08:55:26.805 INFO:tasks.workunit.client.1.vm08.stdout:6/567: unlink d9/d13/fb6 0
2026-03-10T08:55:26.806 INFO:tasks.workunit.client.0.vm05.stdout:8/280: mknod d2/dd/d2c/d2e/c6d 0
2026-03-10T08:55:26.806 INFO:tasks.workunit.client.1.vm08.stdout:9/467: dread d2/f13 [0,4194304] 0
2026-03-10T08:55:26.812 INFO:tasks.workunit.client.0.vm05.stdout:2/286: dwrite d0/d9/d1e/f39 [0,4194304] 0
2026-03-10T08:55:26.813 INFO:tasks.workunit.client.1.vm08.stdout:1/571: dread d1/da/de/f12 [0,4194304] 0
2026-03-10T08:55:26.815 INFO:tasks.workunit.client.1.vm08.stdout:7/577: mknod d0/d11/d4a/cb8 0
2026-03-10T08:55:26.817 INFO:tasks.workunit.client.0.vm05.stdout:9/223: rmdir d6/d19/d21 39
2026-03-10T08:55:26.819 INFO:tasks.workunit.client.0.vm05.stdout:9/224: dread d6/d12/f1c [0,4194304] 0
2026-03-10T08:55:26.819 INFO:tasks.workunit.client.1.vm08.stdout:1/572: dwrite d1/da/d20/d3f/d49/d68/fbf [0,4194304] 0
2026-03-10T08:55:26.821 INFO:tasks.workunit.client.1.vm08.stdout:2/564: write d1/da/d10/d42/d93/f3b [1504337,19408] 0
2026-03-10T08:55:26.834 INFO:tasks.workunit.client.0.vm05.stdout:6/309: mknod d4/d2d/d51/d62/c70 0
2026-03-10T08:55:26.834 INFO:tasks.workunit.client.0.vm05.stdout:6/310: readlink d4/d7/d10/d1a/l1c 0
2026-03-10T08:55:26.842 INFO:tasks.workunit.client.1.vm08.stdout:0/472: write d6/dd/d13/d17/d1f/d20/d2f/d57/f65 [2609392,112912] 0
2026-03-10T08:55:26.842 INFO:tasks.workunit.client.0.vm05.stdout:4/331: mkdir d0/d2e/d71 0
2026-03-10T08:55:26.844 INFO:tasks.workunit.client.1.vm08.stdout:5/499: mknod d0/d11/d27/d68/d7c/c99 0
2026-03-10T08:55:26.845 INFO:tasks.workunit.client.1.vm08.stdout:6/568: write d9/dc/d11/f55 [1607165,45633] 0
2026-03-10T08:55:26.845 INFO:tasks.workunit.client.1.vm08.stdout:5/500: write d0/d11/d18/d52/f57 [4774731,66531] 0
2026-03-10T08:55:26.849 INFO:tasks.workunit.client.0.vm05.stdout:3/321: creat d9/d4d/d51/f59 x:0 0 0
2026-03-10T08:55:26.852 INFO:tasks.workunit.client.0.vm05.stdout:8/281: mkdir d2/dd/d2c/d2e/d31/d4c/d6e 0
2026-03-10T08:55:26.855 INFO:tasks.workunit.client.1.vm08.stdout:7/578: creat d0/d11/d1f/d29/d36/d75/fb9 x:0 0 0
2026-03-10T08:55:26.856 INFO:tasks.workunit.client.0.vm05.stdout:2/287: mknod d0/d9/d27/c4f 0
2026-03-10T08:55:26.859 INFO:tasks.workunit.client.0.vm05.stdout:9/225: truncate d6/fb 820609 0
2026-03-10T08:55:26.863 INFO:tasks.workunit.client.1.vm08.stdout:5/501: rename d0/d11/d27/d68/d7c/d4b/d4e/c65 to d0/d11/d27/d68/d7c/d4b/d4e/c9a 0
2026-03-10T08:55:26.863 INFO:tasks.workunit.client.1.vm08.stdout:8/596: link d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 d1/fdc 0
2026-03-10T08:55:26.865 INFO:tasks.workunit.client.1.vm08.stdout:9/468: creat d2/dd/d15/d1e/d39/d4e/d87/f93 x:0 0 0
2026-03-10T08:55:26.866 INFO:tasks.workunit.client.0.vm05.stdout:6/311: symlink d4/d2d/d5f/l71 0
2026-03-10T08:55:26.866 INFO:tasks.workunit.client.1.vm08.stdout:5/502: read d0/d11/d18/f23 [3642158,53334] 0
2026-03-10T08:55:26.867 INFO:tasks.workunit.client.1.vm08.stdout:9/469: chown d2/d41/d74/f6a 32612 1
2026-03-10T08:55:26.867 INFO:tasks.workunit.client.1.vm08.stdout:8/597: unlink d1/d10/d9/dd/d25/d27/d44/c81 0
2026-03-10T08:55:26.867 INFO:tasks.workunit.client.0.vm05.stdout:7/270: readlink d18/d38/l45 0
2026-03-10T08:55:26.870 INFO:tasks.workunit.client.1.vm08.stdout:3/509: dwrite d4/d6f/d85/f87 [0,4194304] 0
2026-03-10T08:55:26.870 INFO:tasks.workunit.client.1.vm08.stdout:3/510: chown d4 79007 1
2026-03-10T08:55:26.873 INFO:tasks.workunit.client.0.vm05.stdout:4/332: truncate d0/f1e 850942 0
2026-03-10T08:55:26.873 INFO:tasks.workunit.client.0.vm05.stdout:4/333: chown d0/d2e/d42/d45/f62 93 1
2026-03-10T08:55:26.874 INFO:tasks.workunit.client.1.vm08.stdout:5/503: dwrite d0/f92 [0,4194304] 0
2026-03-10T08:55:26.874 INFO:tasks.workunit.client.1.vm08.stdout:9/470: read - d2/d41/d4c/d89/f8f zero size
2026-03-10T08:55:26.876 INFO:tasks.workunit.client.0.vm05.stdout:4/334: dwrite d0/d2e/f4e [0,4194304] 0
2026-03-10T08:55:26.877 INFO:tasks.workunit.client.1.vm08.stdout:5/504: write d0/d11/f86 [4768400,25238] 0
2026-03-10T08:55:26.878 INFO:tasks.workunit.client.1.vm08.stdout:3/511: dread - d4/d15/d8/d71/f8d zero size
2026-03-10T08:55:26.883 INFO:tasks.workunit.client.1.vm08.stdout:8/598: rename d1/d10/d9/dd/d25/d27/d44/d97/d7d/f82 to d1/d10/d9/dd/d13/d40/fdd 0
2026-03-10T08:55:26.883 INFO:tasks.workunit.client.0.vm05.stdout:8/282: rmdir d2/dd/d2c/d2e/d31/d3e/d5d 39
2026-03-10T08:55:26.883 INFO:tasks.workunit.client.1.vm08.stdout:3/512: write d4/d15/d8/d71/f8d [474129,68253] 0
2026-03-10T08:55:26.887 INFO:tasks.workunit.client.1.vm08.stdout:0/473: dread d6/dd/d13/d17/f6d [0,4194304] 0
2026-03-10T08:55:26.889 INFO:tasks.workunit.client.0.vm05.stdout:9/226: creat d6/f4e x:0 0 0
2026-03-10T08:55:26.890 INFO:tasks.workunit.client.1.vm08.stdout:2/565: dread d1/d5b/d66/f63 [0,4194304] 0
2026-03-10T08:55:26.891 INFO:tasks.workunit.client.1.vm08.stdout:2/566: dread - d1/da/f9c zero size
2026-03-10T08:55:26.893 INFO:tasks.workunit.client.0.vm05.stdout:6/312: dwrite d4/f11 [0,4194304] 0
2026-03-10T08:55:26.895 INFO:tasks.workunit.client.0.vm05.stdout:6/313: fsync d4/d7/d10/d15/f2a 0
2026-03-10T08:55:26.896 INFO:tasks.workunit.client.0.vm05.stdout:7/271: dwrite d18/f4a [0,4194304] 0
2026-03-10T08:55:26.908 INFO:tasks.workunit.client.0.vm05.stdout:0/311: rename df/d18/d2b/d3a to df/d59 0
2026-03-10T08:55:26.909 INFO:tasks.workunit.client.0.vm05.stdout:0/312: fsync df/f4a 0
2026-03-10T08:55:26.911 INFO:tasks.workunit.client.1.vm08.stdout:6/569: getdents d9/d10/d1e/d92 0
2026-03-10T08:55:26.921 INFO:tasks.workunit.client.1.vm08.stdout:9/471: mkdir d2/dd/d15/d1e/d94 0
2026-03-10T08:55:26.921 INFO:tasks.workunit.client.0.vm05.stdout:2/288: mknod d0/c50 0
2026-03-10T08:55:26.921 INFO:tasks.workunit.client.0.vm05.stdout:2/289: dwrite d0/d9/d1e/d20/d21/f46 [0,4194304] 0
2026-03-10T08:55:26.923 INFO:tasks.workunit.client.0.vm05.stdout:2/290: dwrite d0/d9/d1e/f39 [0,4194304] 0
2026-03-10T08:55:26.932 INFO:tasks.workunit.client.1.vm08.stdout:3/513: symlink d4/d15/d8/lac 0
2026-03-10T08:55:26.933 INFO:tasks.workunit.client.1.vm08.stdout:6/570: mknod d9/dc/d84/cc6 0
2026-03-10T08:55:26.933 INFO:tasks.workunit.client.0.vm05.stdout:6/314: unlink d4/d2d/d51/l6f 0
2026-03-10T08:55:26.933 INFO:tasks.workunit.client.0.vm05.stdout:6/315: dread - d4/d7/d10/d15/d20/d53/d4a/f63 zero size
2026-03-10T08:55:26.936 INFO:tasks.workunit.client.1.vm08.stdout:6/571: dwrite d9/d10/fab [0,4194304] 0
2026-03-10T08:55:26.938 INFO:tasks.workunit.client.0.vm05.stdout:1/389: link dd/d21/d37/f72 dd/d10/d18/f8a
0 2026-03-10T08:55:26.941 INFO:tasks.workunit.client.0.vm05.stdout:4/335: unlink d0/d2e/d42/d45/d4a/l27 0 2026-03-10T08:55:26.945 INFO:tasks.workunit.client.0.vm05.stdout:4/336: dwrite d0/fe [0,4194304] 0 2026-03-10T08:55:26.953 INFO:tasks.workunit.client.1.vm08.stdout:0/474: link d6/dd/d13/d17/d50/c8e d6/dd/d13/d17/d1f/d20/d2f/d26/c9d 0 2026-03-10T08:55:26.954 INFO:tasks.workunit.client.1.vm08.stdout:2/567: creat d1/da/d10/d2d/fb4 x:0 0 0 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:6/572: symlink d9/dc/d11/d23/lc7 0 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:2/568: chown d1/d43/d5c/l65 285771290 1 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:9/472: link d2/l7 d2/l95 0 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:9/473: readlink d2/d54/l58 0 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:0/475: mknod d6/dd/c9e 0 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:2/569: rmdir d1/d5b 39 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:9/474: symlink d2/d41/d4c/d66/l96 0 2026-03-10T08:55:26.972 INFO:tasks.workunit.client.1.vm08.stdout:0/476: dread d6/f5f [0,4194304] 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:6/316: mknod d4/d2d/d51/c72 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:0/313: unlink df/f37 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:1/390: mkdir dd/d10/d18/d2d/d51/d58/d71/d73/d8b 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:6/317: symlink d4/d7/d10/d15/d1b/l73 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:6/318: dread d4/fc [0,4194304] 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:6/319: write d4/d7/d10/f65 [479711,117498] 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:1/391: rename dd/d10/d18/d2d/f6d to dd/d21/d37/f8c 0 2026-03-10T08:55:26.973 
INFO:tasks.workunit.client.0.vm05.stdout:1/392: write dd/d10/d18/d20/f89 [627147,36995] 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:8/283: getdents d2/dd/d2c/d2e/d31/d3e 0 2026-03-10T08:55:26.973 INFO:tasks.workunit.client.0.vm05.stdout:6/320: dwrite d4/d7/d10/d1a/f25 [0,4194304] 0 2026-03-10T08:55:26.974 INFO:tasks.workunit.client.0.vm05.stdout:3/322: link d9/ff d9/d2b/d53/f5a 0 2026-03-10T08:55:26.974 INFO:tasks.workunit.client.0.vm05.stdout:2/291: getdents d0/d9/d1e 0 2026-03-10T08:55:26.979 INFO:tasks.workunit.client.0.vm05.stdout:2/292: dwrite d0/d9/d1e/d20/d21/f44 [0,4194304] 0 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:9/475: unlink d2/d54/l58 0 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:0/477: symlink d6/dd/d13/d32/l9f 0 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:9/476: symlink d2/dd/d15/d1e/d39/d69/l97 0 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:2/570: dread d1/da/d10/d1b/f28 [0,4194304] 0 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:0/478: truncate d6/dd/d13/d17/d1f/d20/d2f/d57/f5c 296593 0 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:9/477: dread - d2/d41/d4c/f62 zero size 2026-03-10T08:55:27.009 INFO:tasks.workunit.client.1.vm08.stdout:0/479: dread - d6/dd/d13/d17/d1f/d20/d2f/d26/f73 zero size 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:0/314: unlink ca 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:0/315: dread df/d1f/f2d [0,4194304] 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:0/316: write df/f15 [5178903,19783] 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:3/323: unlink d9/d2b/d3a/d43/f4c 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:8/284: rename d2/db/d1f/c40 to d2/db/d1f/c6f 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:8/285: dread - d2/d45/f43 zero size 
2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:6/321: rename d4/d2c/l3e to d4/d2d/d5f/l74 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:6/322: truncate d4/d7/d10/d15/d20/d53/f6b 633220 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:2/293: symlink d0/d9/l51 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:2/294: dread d0/d9/d1e/d20/f32 [0,4194304] 0 2026-03-10T08:55:27.010 INFO:tasks.workunit.client.0.vm05.stdout:0/317: truncate df/f17 850407 0 2026-03-10T08:55:27.013 INFO:tasks.workunit.client.0.vm05.stdout:8/286: symlink d2/db/d1f/d67/l70 0 2026-03-10T08:55:27.016 INFO:tasks.workunit.client.0.vm05.stdout:2/295: mknod d0/d9/d1e/d20/d21/d45/c52 0 2026-03-10T08:55:27.018 INFO:tasks.workunit.client.0.vm05.stdout:5/258: dwrite d5/df/d12/f2a [0,4194304] 0 2026-03-10T08:55:27.020 INFO:tasks.workunit.client.1.vm08.stdout:4/557: dwrite d5/d23/d49/f4d [0,4194304] 0 2026-03-10T08:55:27.020 INFO:tasks.workunit.client.0.vm05.stdout:5/259: chown d5/fe 1936 1 2026-03-10T08:55:27.021 INFO:tasks.workunit.client.1.vm08.stdout:4/558: write d5/d23/d36/d99/db2/fab [506516,8077] 0 2026-03-10T08:55:27.032 INFO:tasks.workunit.client.0.vm05.stdout:6/323: dread d4/d7/d10/d1a/f1e [0,4194304] 0 2026-03-10T08:55:27.033 INFO:tasks.workunit.client.0.vm05.stdout:6/324: chown d4/d7/f14 122557 1 2026-03-10T08:55:27.033 INFO:tasks.workunit.client.0.vm05.stdout:6/325: dread - d4/f6c zero size 2026-03-10T08:55:27.035 INFO:tasks.workunit.client.0.vm05.stdout:4/337: sync 2026-03-10T08:55:27.038 INFO:tasks.workunit.client.0.vm05.stdout:8/287: creat d2/dd/d2c/d2e/d31/d4c/f71 x:0 0 0 2026-03-10T08:55:27.049 INFO:tasks.workunit.client.0.vm05.stdout:2/296: mknod d0/d9/d1e/d20/d21/c53 0 2026-03-10T08:55:27.049 INFO:tasks.workunit.client.0.vm05.stdout:5/260: unlink d5/d3a/c3f 0 2026-03-10T08:55:27.049 INFO:tasks.workunit.client.0.vm05.stdout:4/338: write d0/d2e/d42/d45/d4a/f47 [5937852,112066] 0 2026-03-10T08:55:27.049 
INFO:tasks.workunit.client.0.vm05.stdout:2/297: creat d0/d9/d27/f54 x:0 0 0 2026-03-10T08:55:27.049 INFO:tasks.workunit.client.0.vm05.stdout:4/339: dwrite d0/d2e/d42/d45/d4a/f47 [0,4194304] 0 2026-03-10T08:55:27.049 INFO:tasks.workunit.client.0.vm05.stdout:4/340: dread - d0/d1d/d30/d32/f63 zero size 2026-03-10T08:55:27.049 INFO:tasks.workunit.client.0.vm05.stdout:5/261: rmdir d5/df/d12/d24/d2c/d41 39 2026-03-10T08:55:27.054 INFO:tasks.workunit.client.0.vm05.stdout:4/341: unlink d0/f40 0 2026-03-10T08:55:27.057 INFO:tasks.workunit.client.0.vm05.stdout:8/288: creat d2/dd/f72 x:0 0 0 2026-03-10T08:55:27.058 INFO:tasks.workunit.client.0.vm05.stdout:4/342: rename d0/d1d/d30/d32/f63 to d0/d1d/d30/d32/f72 0 2026-03-10T08:55:27.063 INFO:tasks.workunit.client.0.vm05.stdout:6/326: link d4/d7/d10/d15/d20/d53/d4a/c68 d4/d2d/c75 0 2026-03-10T08:55:27.065 INFO:tasks.workunit.client.0.vm05.stdout:8/289: creat d2/dd/d2c/d2e/d31/d3e/f73 x:0 0 0 2026-03-10T08:55:27.066 INFO:tasks.workunit.client.0.vm05.stdout:8/290: fdatasync d2/dd/f3f 0 2026-03-10T08:55:27.068 INFO:tasks.workunit.client.0.vm05.stdout:4/343: write d0/f9 [655389,23142] 0 2026-03-10T08:55:27.069 INFO:tasks.workunit.client.0.vm05.stdout:5/262: rmdir d5/df/d37/d4b 0 2026-03-10T08:55:27.071 INFO:tasks.workunit.client.0.vm05.stdout:4/344: write d0/f1 [4482672,99933] 0 2026-03-10T08:55:27.075 INFO:tasks.workunit.client.0.vm05.stdout:5/263: rename d5/fe to d5/df/f53 0 2026-03-10T08:55:27.076 INFO:tasks.workunit.client.0.vm05.stdout:8/291: mkdir d2/dd/d74 0 2026-03-10T08:55:27.077 INFO:tasks.workunit.client.0.vm05.stdout:8/292: chown d2/dd/f72 136384 1 2026-03-10T08:55:27.077 INFO:tasks.workunit.client.0.vm05.stdout:8/293: write d2/dd/f3f [843795,54093] 0 2026-03-10T08:55:27.077 INFO:tasks.workunit.client.0.vm05.stdout:8/294: stat d2/fa 0 2026-03-10T08:55:27.078 INFO:tasks.workunit.client.0.vm05.stdout:8/295: fdatasync d2/db/d47/f51 0 2026-03-10T08:55:27.078 INFO:tasks.workunit.client.0.vm05.stdout:8/296: fdatasync d2/db/f22 
0 2026-03-10T08:55:27.079 INFO:tasks.workunit.client.0.vm05.stdout:4/345: mknod d0/d1d/c73 0 2026-03-10T08:55:27.080 INFO:tasks.workunit.client.0.vm05.stdout:4/346: write d0/d1d/f3c [597280,27646] 0 2026-03-10T08:55:27.083 INFO:tasks.workunit.client.0.vm05.stdout:8/297: creat d2/db/d1f/d67/f75 x:0 0 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:4/347: creat d0/d2c/f74 x:0 0 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:4/348: dwrite d0/d1d/d30/f61 [0,4194304] 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:6/327: getdents d4 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:3/324: dread d9/f3c [0,4194304] 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:6/328: dwrite d4/d7/d10/d15/d20/d53/d4a/f63 [0,4194304] 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:3/325: stat d9/f29 0 2026-03-10T08:55:27.098 INFO:tasks.workunit.client.0.vm05.stdout:4/349: creat d0/d2c/d6a/f75 x:0 0 0 2026-03-10T08:55:27.099 INFO:tasks.workunit.client.0.vm05.stdout:4/350: write d0/d2e/d42/d45/f5f [210860,59521] 0 2026-03-10T08:55:27.103 INFO:tasks.workunit.client.0.vm05.stdout:4/351: dwrite d0/d1d/f24 [0,4194304] 0 2026-03-10T08:55:27.105 INFO:tasks.workunit.client.0.vm05.stdout:6/329: creat d4/d7/d10/d15/d20/d53/d4a/f76 x:0 0 0 2026-03-10T08:55:27.106 INFO:tasks.workunit.client.0.vm05.stdout:4/352: truncate d0/d1d/d30/d32/f72 29639 0 2026-03-10T08:55:27.107 INFO:tasks.workunit.client.0.vm05.stdout:4/353: write d0/fe [3988628,64845] 0 2026-03-10T08:55:27.116 INFO:tasks.workunit.client.0.vm05.stdout:4/354: creat d0/d1d/d30/d32/f76 x:0 0 0 2026-03-10T08:55:27.119 INFO:tasks.workunit.client.1.vm08.stdout:2/571: dread d1/da/d10/d42/d93/d1e/f84 [0,4194304] 0 2026-03-10T08:55:27.120 INFO:tasks.workunit.client.0.vm05.stdout:4/355: dwrite d0/d2e/d42/f5e [0,4194304] 0 2026-03-10T08:55:27.125 INFO:tasks.workunit.client.0.vm05.stdout:4/356: symlink d0/d2c/d6a/l77 0 2026-03-10T08:55:27.125 
INFO:tasks.workunit.client.0.vm05.stdout:3/326: dread d9/d2b/d2f/f4b [0,4194304] 0 2026-03-10T08:55:27.125 INFO:tasks.workunit.client.0.vm05.stdout:4/357: truncate d0/f9 4801482 0 2026-03-10T08:55:27.129 INFO:tasks.workunit.client.0.vm05.stdout:4/358: mkdir d0/d78 0 2026-03-10T08:55:27.132 INFO:tasks.workunit.client.1.vm08.stdout:2/572: creat d1/d5b/da7/fb5 x:0 0 0 2026-03-10T08:55:27.133 INFO:tasks.workunit.client.0.vm05.stdout:4/359: fsync d0/d1d/d30/f29 0 2026-03-10T08:55:27.135 INFO:tasks.workunit.client.1.vm08.stdout:2/573: fsync d1/d43/f6d 0 2026-03-10T08:55:27.136 INFO:tasks.workunit.client.0.vm05.stdout:6/330: read d4/d7/ff [1094079,87288] 0 2026-03-10T08:55:27.136 INFO:tasks.workunit.client.1.vm08.stdout:2/574: dread - d1/da/d10/d42/d93/d1e/fb2 zero size 2026-03-10T08:55:27.137 INFO:tasks.workunit.client.0.vm05.stdout:6/331: write d4/d7/f5d [196071,103377] 0 2026-03-10T08:55:27.138 INFO:tasks.workunit.client.0.vm05.stdout:4/360: dwrite d0/d2e/d42/d45/d4a/f26 [0,4194304] 0 2026-03-10T08:55:27.140 INFO:tasks.workunit.client.0.vm05.stdout:6/332: fdatasync d4/d7/d10/d15/d1b/d22/f56 0 2026-03-10T08:55:27.144 INFO:tasks.workunit.client.0.vm05.stdout:4/361: mkdir d0/d1d/d30/d49/d58/d66/d79 0 2026-03-10T08:55:27.144 INFO:tasks.workunit.client.1.vm08.stdout:2/575: mkdir d1/da/d10/d2d/db6 0 2026-03-10T08:55:27.149 INFO:tasks.workunit.client.1.vm08.stdout:2/576: dread d1/da/d10/d2d/f4c [0,4194304] 0 2026-03-10T08:55:27.156 INFO:tasks.workunit.client.0.vm05.stdout:4/362: dread d0/d1d/d30/d49/d4f/f51 [0,4194304] 0 2026-03-10T08:55:27.167 INFO:tasks.workunit.client.1.vm08.stdout:1/573: dwrite d1/f65 [0,4194304] 0 2026-03-10T08:55:27.182 INFO:tasks.workunit.client.1.vm08.stdout:1/574: creat d1/da/d18/d3b/d62/fc7 x:0 0 0 2026-03-10T08:55:27.182 INFO:tasks.workunit.client.1.vm08.stdout:1/575: write d1/da/d20/f54 [1617899,59645] 0 2026-03-10T08:55:27.182 INFO:tasks.workunit.client.1.vm08.stdout:1/576: creat d1/da/de/d24/d35/d6d/fc8 x:0 0 0 2026-03-10T08:55:27.182 
INFO:tasks.workunit.client.1.vm08.stdout:7/579: dwrite d0/d11/d1f/f90 [0,4194304] 0 2026-03-10T08:55:27.184 INFO:tasks.workunit.client.1.vm08.stdout:1/577: dwrite d1/da/d20/f67 [0,4194304] 0 2026-03-10T08:55:27.185 INFO:tasks.workunit.client.0.vm05.stdout:9/227: rmdir d6/d19/d21 39 2026-03-10T08:55:27.186 INFO:tasks.workunit.client.0.vm05.stdout:9/228: stat d6/d27/f44 0 2026-03-10T08:55:27.186 INFO:tasks.workunit.client.1.vm08.stdout:8/599: write d1/d10/d9/dd/d9a/f9d [358685,95694] 0 2026-03-10T08:55:27.191 INFO:tasks.workunit.client.0.vm05.stdout:9/229: dwrite d6/f7 [4194304,4194304] 0 2026-03-10T08:55:27.193 INFO:tasks.workunit.client.1.vm08.stdout:5/505: dwrite d0/d11/d27/d68/d7c/f6a [0,4194304] 0 2026-03-10T08:55:27.198 INFO:tasks.workunit.client.1.vm08.stdout:8/600: mknod d1/d2c/cde 0 2026-03-10T08:55:27.200 INFO:tasks.workunit.client.0.vm05.stdout:9/230: dread - d6/d15/d35/f38 zero size 2026-03-10T08:55:27.201 INFO:tasks.workunit.client.0.vm05.stdout:9/231: chown d6/d15/d37/l41 100 1 2026-03-10T08:55:27.202 INFO:tasks.workunit.client.1.vm08.stdout:7/580: creat d0/d11/d1f/d29/fba x:0 0 0 2026-03-10T08:55:27.213 INFO:tasks.workunit.client.1.vm08.stdout:8/601: rename d1/d10/d9/dd/d25/d27/d44/c74 to d1/d10/d9/dd/d25/d27/d44/d97/cdf 0 2026-03-10T08:55:27.217 INFO:tasks.workunit.client.1.vm08.stdout:7/581: mkdir d0/d14/d43/d9d/dbb 0 2026-03-10T08:55:27.217 INFO:tasks.workunit.client.1.vm08.stdout:5/506: rmdir d0/d11/d3e/d45/d93 0 2026-03-10T08:55:27.219 INFO:tasks.workunit.client.1.vm08.stdout:8/602: link d1/d10/d9/dd/d25/d27/d44/fb0 d1/da8/fe0 0 2026-03-10T08:55:27.220 INFO:tasks.workunit.client.1.vm08.stdout:8/603: chown d1/d10/d9/dd/f62 66974163 1 2026-03-10T08:55:27.231 INFO:tasks.workunit.client.1.vm08.stdout:1/578: read d1/da/d20/d3f/d49/f96 [155060,89711] 0 2026-03-10T08:55:27.242 INFO:tasks.workunit.client.0.vm05.stdout:1/393: fsync dd/d21/d37/f85 0 2026-03-10T08:55:27.243 INFO:tasks.workunit.client.0.vm05.stdout:1/394: rmdir dd/d21/d37/d45 39 
2026-03-10T08:55:27.290 INFO:tasks.workunit.client.0.vm05.stdout:4/363: dread d0/d1d/f3c [0,4194304] 0 2026-03-10T08:55:27.342 INFO:tasks.workunit.client.1.vm08.stdout:3/514: write d4/d15/d8/d2c/f5a [1862728,114282] 0 2026-03-10T08:55:27.361 INFO:tasks.workunit.client.0.vm05.stdout:6/333: rmdir d4/d2d/d51 39 2026-03-10T08:55:27.361 INFO:tasks.workunit.client.0.vm05.stdout:7/272: truncate d18/d1b/d1f/d25/d2e/d32/f3d 1042981 0 2026-03-10T08:55:27.365 INFO:tasks.workunit.client.0.vm05.stdout:6/334: dwrite d4/f61 [0,4194304] 0 2026-03-10T08:55:27.368 INFO:tasks.workunit.client.0.vm05.stdout:7/273: mknod d18/c4d 0 2026-03-10T08:55:27.368 INFO:tasks.workunit.client.0.vm05.stdout:6/335: mknod d4/d7/d10/d15/d1b/d22/c77 0 2026-03-10T08:55:27.373 INFO:tasks.workunit.client.0.vm05.stdout:7/274: dwrite d18/d1b/d1f/d25/d2e/f49 [0,4194304] 0 2026-03-10T08:55:27.374 INFO:tasks.workunit.client.0.vm05.stdout:7/275: readlink l1 0 2026-03-10T08:55:27.374 INFO:tasks.workunit.client.0.vm05.stdout:6/336: chown d4/d2d/d51/d62 2061678024 1 2026-03-10T08:55:27.378 INFO:tasks.workunit.client.0.vm05.stdout:6/337: fdatasync d4/d7/d10/d15/d20/d53/f3a 0 2026-03-10T08:55:27.385 INFO:tasks.workunit.client.0.vm05.stdout:6/338: fdatasync d4/d7/f34 0 2026-03-10T08:55:27.387 INFO:tasks.workunit.client.0.vm05.stdout:6/339: creat d4/d7/d10/d1a/f78 x:0 0 0 2026-03-10T08:55:27.388 INFO:tasks.workunit.client.0.vm05.stdout:6/340: read d4/d7/d10/d15/f2e [2414345,126946] 0 2026-03-10T08:55:27.389 INFO:tasks.workunit.client.0.vm05.stdout:6/341: unlink f2 0 2026-03-10T08:55:27.390 INFO:tasks.workunit.client.0.vm05.stdout:6/342: truncate d4/d7/d10/d15/d20/f48 508827 0 2026-03-10T08:55:27.392 INFO:tasks.workunit.client.0.vm05.stdout:6/343: creat d4/d2d/f79 x:0 0 0 2026-03-10T08:55:27.392 INFO:tasks.workunit.client.0.vm05.stdout:6/344: chown d4/d7/d10/d15/d1b/l73 32 1 2026-03-10T08:55:27.393 INFO:tasks.workunit.client.0.vm05.stdout:6/345: readlink d4/d7/d10/d15/d1b/l73 0 2026-03-10T08:55:27.410 
INFO:tasks.workunit.client.0.vm05.stdout:1/395: dread dd/d21/f3e [0,4194304] 0 2026-03-10T08:55:27.411 INFO:tasks.workunit.client.0.vm05.stdout:1/396: mkdir dd/d21/d37/d45/d8d 0 2026-03-10T08:55:27.412 INFO:tasks.workunit.client.0.vm05.stdout:1/397: write dd/d10/d19/d4d/f74 [808611,70488] 0 2026-03-10T08:55:27.413 INFO:tasks.workunit.client.0.vm05.stdout:1/398: creat dd/d10/d18/d2d/d5c/f8e x:0 0 0 2026-03-10T08:55:27.414 INFO:tasks.workunit.client.0.vm05.stdout:1/399: write dd/d10/d18/f8a [89398,74828] 0 2026-03-10T08:55:27.415 INFO:tasks.workunit.client.0.vm05.stdout:1/400: chown dd/d10/d18/d2d 802075 1 2026-03-10T08:55:27.416 INFO:tasks.workunit.client.0.vm05.stdout:1/401: creat dd/d10/f8f x:0 0 0 2026-03-10T08:55:27.418 INFO:tasks.workunit.client.0.vm05.stdout:1/402: symlink dd/d10/d18/d2d/d51/d58/d71/d62/l90 0 2026-03-10T08:55:27.418 INFO:tasks.workunit.client.0.vm05.stdout:1/403: chown dd/d10/d18/f82 16201194 1 2026-03-10T08:55:27.422 INFO:tasks.workunit.client.0.vm05.stdout:1/404: dread dd/d10/d19/d27/f31 [0,4194304] 0 2026-03-10T08:55:27.429 INFO:tasks.workunit.client.0.vm05.stdout:1/405: read dd/d21/d37/d45/f47 [3032440,48347] 0 2026-03-10T08:55:27.430 INFO:tasks.workunit.client.0.vm05.stdout:1/406: rename dd/d10/d19 to dd/d10/d19/d4d/d91 22 2026-03-10T08:55:27.433 INFO:tasks.workunit.client.0.vm05.stdout:1/407: mknod dd/d21/d37/d7c/d60/c92 0 2026-03-10T08:55:27.435 INFO:tasks.workunit.client.0.vm05.stdout:1/408: dwrite dd/d21/d3f/f83 [0,4194304] 0 2026-03-10T08:55:27.443 INFO:tasks.workunit.client.0.vm05.stdout:1/409: creat dd/d10/d18/d2d/f93 x:0 0 0 2026-03-10T08:55:27.466 INFO:tasks.workunit.client.0.vm05.stdout:1/410: fdatasync dd/d21/d37/f85 0 2026-03-10T08:55:27.466 INFO:tasks.workunit.client.0.vm05.stdout:1/411: read - dd/d21/d37/f8c zero size 2026-03-10T08:55:27.467 INFO:tasks.workunit.client.0.vm05.stdout:1/412: chown dd/d21/d3f/c6a 431180 1 2026-03-10T08:55:27.469 INFO:tasks.workunit.client.0.vm05.stdout:1/413: rename dd/d10/c14 to 
dd/d10/d18/d2d/d51/d58/d71/d73/d8b/c94 0 2026-03-10T08:55:27.473 INFO:tasks.workunit.client.0.vm05.stdout:6/346: chown d4/d2d/d5f/l74 17 1 2026-03-10T08:55:27.474 INFO:tasks.workunit.client.0.vm05.stdout:1/414: read dd/d21/d3f/f57 [468830,49055] 0 2026-03-10T08:55:27.477 INFO:tasks.workunit.client.1.vm08.stdout:0/480: truncate d6/dd/d13/d17/d1f/d20/d2f/d24/f68 1771857 0 2026-03-10T08:55:27.478 INFO:tasks.workunit.client.1.vm08.stdout:9/478: dwrite d2/dd/d15/d1e/d24/f30 [8388608,4194304] 0 2026-03-10T08:55:27.478 INFO:tasks.workunit.client.0.vm05.stdout:1/415: link dd/d21/d3f/f57 dd/d10/d19/f95 0 2026-03-10T08:55:27.481 INFO:tasks.workunit.client.0.vm05.stdout:1/416: creat dd/d10/d18/d2d/d51/d58/d71/d62/f96 x:0 0 0 2026-03-10T08:55:27.482 INFO:tasks.workunit.client.1.vm08.stdout:9/479: write d2/dd/d15/d1e/d39/f57 [737518,106920] 0 2026-03-10T08:55:27.487 INFO:tasks.workunit.client.0.vm05.stdout:0/318: truncate df/f15 1315131 0 2026-03-10T08:55:27.488 INFO:tasks.workunit.client.0.vm05.stdout:1/417: rename dd/d21/d3f/c6a to dd/d10/d19/d4d/c97 0 2026-03-10T08:55:27.489 INFO:tasks.workunit.client.0.vm05.stdout:1/418: write dd/d10/d18/d20/f89 [134147,87436] 0 2026-03-10T08:55:27.493 INFO:tasks.workunit.client.1.vm08.stdout:4/559: write d5/d23/d36/f7d [268627,97887] 0 2026-03-10T08:55:27.495 INFO:tasks.workunit.client.1.vm08.stdout:4/560: write d5/d23/d36/f44 [2363524,39109] 0 2026-03-10T08:55:27.496 INFO:tasks.workunit.client.1.vm08.stdout:0/481: creat d6/dd/d13/d17/d1f/d2d/fa0 x:0 0 0 2026-03-10T08:55:27.498 INFO:tasks.workunit.client.0.vm05.stdout:1/419: symlink dd/d21/d37/l98 0 2026-03-10T08:55:27.502 INFO:tasks.workunit.client.0.vm05.stdout:1/420: dwrite dd/d21/d37/f72 [0,4194304] 0 2026-03-10T08:55:27.502 INFO:tasks.workunit.client.1.vm08.stdout:9/480: dread d2/dd/f2e [0,4194304] 0 2026-03-10T08:55:27.505 INFO:tasks.workunit.client.1.vm08.stdout:4/561: creat d5/d23/d36/d99/db2/d5d/fc5 x:0 0 0 2026-03-10T08:55:27.508 INFO:tasks.workunit.client.0.vm05.stdout:2/298: 
dwrite d0/d9/d1e/d20/d21/f41 [0,4194304] 0 2026-03-10T08:55:27.518 INFO:tasks.workunit.client.1.vm08.stdout:0/482: symlink d6/dd/d13/d17/d1f/d20/d2f/d26/d56/la1 0 2026-03-10T08:55:27.522 INFO:tasks.workunit.client.1.vm08.stdout:4/562: mkdir d5/d23/d36/d99/dc6 0 2026-03-10T08:55:27.525 INFO:tasks.workunit.client.0.vm05.stdout:5/264: getdents d5/df/d37 0 2026-03-10T08:55:27.526 INFO:tasks.workunit.client.1.vm08.stdout:9/481: dread d2/dd/d15/d1e/d24/f3f [0,4194304] 0 2026-03-10T08:55:27.528 INFO:tasks.workunit.client.1.vm08.stdout:9/482: chown d2/dd/d15/d1e/d21/f75 0 1 2026-03-10T08:55:27.539 INFO:tasks.workunit.client.1.vm08.stdout:4/563: rename d5/f14 to d5/d23/d36/d76/fc7 0 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.1.vm08.stdout:9/483: write d2/dd/d15/d1e/d21/f90 [889619,28934] 0 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.1.vm08.stdout:0/483: dwrite d6/dd/d13/d17/d1f/d20/d2f/d26/f80 [0,4194304] 0 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.0.vm05.stdout:1/421: creat dd/d21/d37/d45/d8d/f99 x:0 0 0 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.0.vm05.stdout:1/422: chown dd/d21/f48 193 1 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.0.vm05.stdout:8/298: stat d2/dd/d2c/d2e/d31/d3e/c50 0 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.0.vm05.stdout:8/299: dread - d2/dd/d2c/d2e/f64 zero size 2026-03-10T08:55:27.540 INFO:tasks.workunit.client.0.vm05.stdout:1/423: mknod dd/d10/d19/d4d/c9a 0 2026-03-10T08:55:27.541 INFO:tasks.workunit.client.0.vm05.stdout:8/300: symlink d2/dd/l76 0 2026-03-10T08:55:27.541 INFO:tasks.workunit.client.1.vm08.stdout:9/484: mkdir d2/dd/d15/d1e/d25/d98 0 2026-03-10T08:55:27.542 INFO:tasks.workunit.client.0.vm05.stdout:1/424: mkdir dd/d10/d19/d9b 0 2026-03-10T08:55:27.542 INFO:tasks.workunit.client.0.vm05.stdout:1/425: chown dd/d21/d37/l3d 454 1 2026-03-10T08:55:27.543 INFO:tasks.workunit.client.0.vm05.stdout:1/426: write dd/d21/d37/f85 [88229,6444] 0 2026-03-10T08:55:27.545 
INFO:tasks.workunit.client.1.vm08.stdout:0/484: dread d6/dd/d13/d17/d1f/d20/d2f/d26/f80 [0,4194304] 0 2026-03-10T08:55:27.547 INFO:tasks.workunit.client.0.vm05.stdout:1/427: creat dd/d10/d19/d27/f9c x:0 0 0 2026-03-10T08:55:27.553 INFO:tasks.workunit.client.0.vm05.stdout:1/428: rmdir dd/d10/d18/d2d/d51/d7b 0 2026-03-10T08:55:27.555 INFO:tasks.workunit.client.0.vm05.stdout:1/429: symlink dd/d10/d19/d4d/d88/l9d 0 2026-03-10T08:55:27.556 INFO:tasks.workunit.client.1.vm08.stdout:4/564: getdents d5/d23/d49 0 2026-03-10T08:55:27.557 INFO:tasks.workunit.client.0.vm05.stdout:1/430: creat dd/f9e x:0 0 0 2026-03-10T08:55:27.557 INFO:tasks.workunit.client.1.vm08.stdout:9/485: dwrite f1 [4194304,4194304] 0 2026-03-10T08:55:27.559 INFO:tasks.workunit.client.1.vm08.stdout:9/486: stat d2/dd/d61/c6e 0 2026-03-10T08:55:27.565 INFO:tasks.workunit.client.0.vm05.stdout:1/431: dwrite dd/d21/f6f [0,4194304] 0 2026-03-10T08:55:27.575 INFO:tasks.workunit.client.0.vm05.stdout:3/327: dwrite f2 [0,4194304] 0 2026-03-10T08:55:27.585 INFO:tasks.workunit.client.1.vm08.stdout:0/485: getdents d6/dd/d13/d32 0 2026-03-10T08:55:27.586 INFO:tasks.workunit.client.0.vm05.stdout:3/328: symlink d9/d4d/d51/l5b 0 2026-03-10T08:55:27.587 INFO:tasks.workunit.client.1.vm08.stdout:4/565: mkdir d5/d23/d36/d99/dc6/dc8 0 2026-03-10T08:55:27.588 INFO:tasks.workunit.client.1.vm08.stdout:0/486: mknod d6/dd/d13/d17/d50/ca2 0 2026-03-10T08:55:27.588 INFO:tasks.workunit.client.1.vm08.stdout:2/577: rmdir d1/da/d10/d2d 39 2026-03-10T08:55:27.588 INFO:tasks.workunit.client.1.vm08.stdout:2/578: stat d1/da/d10/d42/d93/d23/d9e 0 2026-03-10T08:55:27.589 INFO:tasks.workunit.client.1.vm08.stdout:2/579: dread - d1/da/d10/d42/d93/d23/f37 zero size 2026-03-10T08:55:27.591 INFO:tasks.workunit.client.0.vm05.stdout:3/329: dwrite d9/f4a [0,4194304] 0 2026-03-10T08:55:27.595 INFO:tasks.workunit.client.0.vm05.stdout:3/330: dread d9/d2b/f40 [0,4194304] 0 2026-03-10T08:55:27.597 INFO:tasks.workunit.client.1.vm08.stdout:6/573: write 
d9/dc/d11/f29 [307297,78126] 0 2026-03-10T08:55:27.601 INFO:tasks.workunit.client.1.vm08.stdout:4/566: fdatasync d5/f8 0 2026-03-10T08:55:27.602 INFO:tasks.workunit.client.0.vm05.stdout:9/232: write d6/d19/f1a [622901,122701] 0 2026-03-10T08:55:27.607 INFO:tasks.workunit.client.0.vm05.stdout:3/331: symlink d9/d2b/d3a/d43/d4f/l5c 0 2026-03-10T08:55:27.607 INFO:tasks.workunit.client.0.vm05.stdout:3/332: dread - d9/d2b/d3a/f44 zero size 2026-03-10T08:55:27.610 INFO:tasks.workunit.client.0.vm05.stdout:1/432: getdents dd/d10/d18 0 2026-03-10T08:55:27.612 INFO:tasks.workunit.client.0.vm05.stdout:9/233: truncate d6/d15/f25 469631 0 2026-03-10T08:55:27.612 INFO:tasks.workunit.client.0.vm05.stdout:9/234: chown d6/d19/d2c/l49 0 1 2026-03-10T08:55:27.615 INFO:tasks.workunit.client.0.vm05.stdout:3/333: rename d9/d2b/d3a/f56 to d9/d2b/d2f/f5d 0 2026-03-10T08:55:27.616 INFO:tasks.workunit.client.0.vm05.stdout:1/433: dread f6 [0,4194304] 0 2026-03-10T08:55:27.620 INFO:tasks.workunit.client.0.vm05.stdout:3/334: unlink d9/d2b/d3a/f45 0 2026-03-10T08:55:27.621 INFO:tasks.workunit.client.0.vm05.stdout:1/434: mknod dd/d21/d37/d45/d8d/c9f 0 2026-03-10T08:55:27.622 INFO:tasks.workunit.client.0.vm05.stdout:7/276: truncate d18/f1d 854302 0 2026-03-10T08:55:27.622 INFO:tasks.workunit.client.1.vm08.stdout:3/515: truncate d4/d15/d8/d2c/f5a 1321819 0 2026-03-10T08:55:27.624 INFO:tasks.workunit.client.0.vm05.stdout:3/335: write d9/f3c [1298872,17421] 0 2026-03-10T08:55:27.624 INFO:tasks.workunit.client.0.vm05.stdout:6/347: rmdir d4/d7 39 2026-03-10T08:55:27.628 INFO:tasks.workunit.client.0.vm05.stdout:7/277: write d18/d1b/f30 [667436,55444] 0 2026-03-10T08:55:27.632 INFO:tasks.workunit.client.1.vm08.stdout:2/580: rename d1/d5b/f8c to d1/da/d10/d2d/fb7 0 2026-03-10T08:55:27.643 INFO:tasks.workunit.client.1.vm08.stdout:7/582: dwrite d0/d11/d1f/d29/d3d/d89/f96 [0,4194304] 0 2026-03-10T08:55:27.643 INFO:tasks.workunit.client.1.vm08.stdout:5/507: dwrite d0/d11/d3e/d45/f5b [0,4194304] 0 
2026-03-10T08:55:27.643 INFO:tasks.workunit.client.0.vm05.stdout:3/336: rename d9/f3c to d9/d4d/f5e 0 2026-03-10T08:55:27.644 INFO:tasks.workunit.client.0.vm05.stdout:1/435: chown dd/d21/d3f/f57 16012114 1 2026-03-10T08:55:27.644 INFO:tasks.workunit.client.1.vm08.stdout:8/604: dwrite d1/d10/d9/dd/d25/d27/d44/f22 [0,4194304] 0 2026-03-10T08:55:27.644 INFO:tasks.workunit.client.1.vm08.stdout:1/579: dwrite d1/f8 [4194304,4194304] 0 2026-03-10T08:55:27.644 INFO:tasks.workunit.client.1.vm08.stdout:2/581: chown d1/da/d78 600329124 1 2026-03-10T08:55:27.648 INFO:tasks.workunit.client.0.vm05.stdout:6/348: truncate d4/f6a 703202 0 2026-03-10T08:55:27.650 INFO:tasks.workunit.client.0.vm05.stdout:0/319: write df/f1d [1334540,55915] 0 2026-03-10T08:55:27.651 INFO:tasks.workunit.client.0.vm05.stdout:0/320: fdatasync df/d18/d19/d39/f42 0 2026-03-10T08:55:27.652 INFO:tasks.workunit.client.1.vm08.stdout:5/508: creat d0/d1b/d67/f9b x:0 0 0 2026-03-10T08:55:27.652 INFO:tasks.workunit.client.0.vm05.stdout:1/436: creat dd/d10/d18/d2d/d51/d58/fa0 x:0 0 0 2026-03-10T08:55:27.654 INFO:tasks.workunit.client.0.vm05.stdout:9/235: link d6/d12/f1c d6/d15/f4f 0 2026-03-10T08:55:27.654 INFO:tasks.workunit.client.1.vm08.stdout:8/605: creat d1/d10/d9/dd/d18/d34/dd0/fe1 x:0 0 0 2026-03-10T08:55:27.655 INFO:tasks.workunit.client.0.vm05.stdout:6/349: fsync d4/d7/d10/f65 0 2026-03-10T08:55:27.655 INFO:tasks.workunit.client.1.vm08.stdout:8/606: read - d1/d10/d9/dd/d18/fcf zero size 2026-03-10T08:55:27.657 INFO:tasks.workunit.client.1.vm08.stdout:7/583: rmdir d0/d11/d1f/d29/d36 39 2026-03-10T08:55:27.669 INFO:tasks.workunit.client.0.vm05.stdout:2/299: dwrite d0/d9/d1e/f34 [0,4194304] 0 2026-03-10T08:55:27.671 INFO:tasks.workunit.client.0.vm05.stdout:5/265: truncate d5/df/d12/d24/f25 2460405 0 2026-03-10T08:55:27.674 INFO:tasks.workunit.client.1.vm08.stdout:4/567: dread d5/d23/d36/d99/db2/d5a/d69/f6e [0,4194304] 0 2026-03-10T08:55:27.674 INFO:tasks.workunit.client.1.vm08.stdout:6/574: dread 
d9/d10/d1e/d32/f27 [0,4194304] 0 2026-03-10T08:55:27.674 INFO:tasks.workunit.client.0.vm05.stdout:0/321: fdatasync df/f1a 0 2026-03-10T08:55:27.674 INFO:tasks.workunit.client.0.vm05.stdout:3/337: getdents d9/d2b/d3a/d43/d4f/d50 0 2026-03-10T08:55:27.674 INFO:tasks.workunit.client.0.vm05.stdout:3/338: read d9/d2b/d2f/f4b [510676,131069] 0 2026-03-10T08:55:27.674 INFO:tasks.workunit.client.1.vm08.stdout:8/607: mknod d1/d10/d9/dd/d25/ce2 0 2026-03-10T08:55:27.676 INFO:tasks.workunit.client.0.vm05.stdout:1/437: creat dd/d10/d18/d20/fa1 x:0 0 0 2026-03-10T08:55:27.678 INFO:tasks.workunit.client.1.vm08.stdout:2/582: creat d1/da/d10/d42/d93/d1e/d7b/fb8 x:0 0 0 2026-03-10T08:55:27.679 INFO:tasks.workunit.client.1.vm08.stdout:2/583: dread - d1/da/d10/d2d/f67 zero size 2026-03-10T08:55:27.679 INFO:tasks.workunit.client.1.vm08.stdout:2/584: dread - d1/da/d10/d1b/fac zero size 2026-03-10T08:55:27.680 INFO:tasks.workunit.client.1.vm08.stdout:2/585: chown d1/fd 628 1 2026-03-10T08:55:27.682 INFO:tasks.workunit.client.1.vm08.stdout:9/487: chown d2/dd/d15/d1e/d25/d32/f8c 403279 1 2026-03-10T08:55:27.683 INFO:tasks.workunit.client.0.vm05.stdout:2/300: stat d0/d9/d1e/d20/d24/c25 0 2026-03-10T08:55:27.683 INFO:tasks.workunit.client.1.vm08.stdout:4/568: symlink d5/d23/d36/d99/db2/dbd/lc9 0 2026-03-10T08:55:27.684 INFO:tasks.workunit.client.0.vm05.stdout:9/236: dread d6/f30 [0,4194304] 0 2026-03-10T08:55:27.685 INFO:tasks.workunit.client.0.vm05.stdout:9/237: read - d6/d19/d2c/f3d zero size 2026-03-10T08:55:27.685 INFO:tasks.workunit.client.0.vm05.stdout:5/266: symlink d5/df/d37/l54 0 2026-03-10T08:55:27.694 INFO:tasks.workunit.client.0.vm05.stdout:2/301: dwrite d0/d9/d27/f54 [0,4194304] 0 2026-03-10T08:55:27.708 INFO:tasks.workunit.client.0.vm05.stdout:8/301: truncate d2/f2a 3380698 0 2026-03-10T08:55:27.709 INFO:tasks.workunit.client.1.vm08.stdout:3/516: rename d4/d15/d8/d2c/f67 to d4/d15/d8/fad 0 2026-03-10T08:55:27.716 INFO:tasks.workunit.client.1.vm08.stdout:8/608: creat 
d1/d10/d9/d4d/fe3 x:0 0 0 2026-03-10T08:55:27.717 INFO:tasks.workunit.client.1.vm08.stdout:5/509: dread d0/d11/d18/d52/f57 [0,4194304] 0 2026-03-10T08:55:27.718 INFO:tasks.workunit.client.1.vm08.stdout:0/487: write d6/f16 [946743,76479] 0 2026-03-10T08:55:27.718 INFO:tasks.workunit.client.0.vm05.stdout:6/350: rename d4/d7/d10/d15/f2e to d4/d2c/f7a 0 2026-03-10T08:55:27.719 INFO:tasks.workunit.client.1.vm08.stdout:0/488: dread - d6/dd/f92 zero size 2026-03-10T08:55:27.721 INFO:tasks.workunit.client.1.vm08.stdout:9/488: truncate d2/f86 851938 0 2026-03-10T08:55:27.721 INFO:tasks.workunit.client.0.vm05.stdout:9/238: chown d6/d27/f2b 196365 1 2026-03-10T08:55:27.722 INFO:tasks.workunit.client.1.vm08.stdout:1/580: rename d1/da/d20/d3f/d49/d68/d7f/f97 to d1/da/d20/d3f/d49/d63/fc9 0 2026-03-10T08:55:27.723 INFO:tasks.workunit.client.1.vm08.stdout:8/609: symlink d1/d10/d9/d8a/le4 0 2026-03-10T08:55:27.726 INFO:tasks.workunit.client.0.vm05.stdout:3/339: mkdir d9/d2b/d3a/d43/d4f/d50/d5f 0 2026-03-10T08:55:27.726 INFO:tasks.workunit.client.0.vm05.stdout:3/340: fsync d9/d2b/d2f/f33 0 2026-03-10T08:55:27.727 INFO:tasks.workunit.client.0.vm05.stdout:3/341: dread - d9/d2b/d2f/f5d zero size 2026-03-10T08:55:27.727 INFO:tasks.workunit.client.1.vm08.stdout:1/581: fsync d1/da/d20/d3f/d49/fc3 0 2026-03-10T08:55:27.727 INFO:tasks.workunit.client.0.vm05.stdout:9/239: rmdir d6/d12 39 2026-03-10T08:55:27.728 INFO:tasks.workunit.client.1.vm08.stdout:4/569: rename d5/d23/d36/d99/db2/d5a/c6c to d5/d23/d49/d8f/cca 0 2026-03-10T08:55:27.728 INFO:tasks.workunit.client.0.vm05.stdout:5/267: symlink d5/df/d12/d39/l55 0 2026-03-10T08:55:27.729 INFO:tasks.workunit.client.0.vm05.stdout:2/302: mkdir d0/d55 0 2026-03-10T08:55:27.729 INFO:tasks.workunit.client.0.vm05.stdout:0/322: link df/d18/d19/l22 df/d18/d2b/d27/d32/d4e/l5a 0 2026-03-10T08:55:27.730 INFO:tasks.workunit.client.0.vm05.stdout:0/323: fsync df/f12 0 2026-03-10T08:55:27.730 INFO:tasks.workunit.client.0.vm05.stdout:0/324: readlink 
df/d18/d2b/d27/d32/l38 0 2026-03-10T08:55:27.731 INFO:tasks.workunit.client.0.vm05.stdout:0/325: readlink df/d18/d19/d39/d4d/d50/l54 0 2026-03-10T08:55:27.733 INFO:tasks.workunit.client.0.vm05.stdout:0/326: dread - df/d18/d19/d39/f42 zero size 2026-03-10T08:55:27.733 INFO:tasks.workunit.client.1.vm08.stdout:1/582: fsync d1/da/de/f12 0 2026-03-10T08:55:27.735 INFO:tasks.workunit.client.0.vm05.stdout:3/342: creat d9/d2b/d53/f60 x:0 0 0 2026-03-10T08:55:27.752 INFO:tasks.workunit.client.1.vm08.stdout:5/510: rename d0/d11/d27/d68/d7c/d4b/d4e/c78 to d0/d1b/d67/c9c 0 2026-03-10T08:55:27.752 INFO:tasks.workunit.client.1.vm08.stdout:5/511: chown d0/f8a 251239369 1 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:5/268: symlink d5/l56 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:2/303: creat d0/f56 x:0 0 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:3/343: dwrite d9/f27 [0,4194304] 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:2/304: dwrite d0/d9/d1e/f39 [0,4194304] 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:5/269: rename l3 to d5/df/d12/d21/l57 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:5/270: stat d5/d3a/f4a 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:3/344: mkdir d9/d2b/d53/d61 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:2/305: unlink d0/d9/d1e/f2a 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:2/306: stat d0/f36 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:2/307: stat d0/f36 0 2026-03-10T08:55:27.753 INFO:tasks.workunit.client.0.vm05.stdout:3/345: rename d9/d2b/d2f/c35 to d9/d2b/d53/c62 0 2026-03-10T08:55:27.756 INFO:tasks.workunit.client.0.vm05.stdout:0/327: read df/f12 [1625340,126875] 0 2026-03-10T08:55:27.761 INFO:tasks.workunit.client.1.vm08.stdout:0/489: getdents d6/d8b 0 2026-03-10T08:55:27.762 INFO:tasks.workunit.client.1.vm08.stdout:5/512: write 
d0/d11/d27/f3b [3184308,76302] 0 2026-03-10T08:55:27.769 INFO:tasks.workunit.client.1.vm08.stdout:0/490: mkdir d6/dd/d13/d17/d1f/da3 0 2026-03-10T08:55:27.769 INFO:tasks.workunit.client.1.vm08.stdout:0/491: readlink d6/dd/d13/d17/d1f/d20/d2f/l7d 0 2026-03-10T08:55:27.770 INFO:tasks.workunit.client.1.vm08.stdout:5/513: dread d0/f6c [0,4194304] 0 2026-03-10T08:55:27.772 INFO:tasks.workunit.client.0.vm05.stdout:3/346: dread d9/f20 [0,4194304] 0 2026-03-10T08:55:27.773 INFO:tasks.workunit.client.0.vm05.stdout:3/347: fsync d9/d2b/f34 0 2026-03-10T08:55:27.775 INFO:tasks.workunit.client.0.vm05.stdout:3/348: mknod d9/d4d/c63 0 2026-03-10T08:55:27.779 INFO:tasks.workunit.client.1.vm08.stdout:5/514: creat d0/d11/d27/d50/f9d x:0 0 0 2026-03-10T08:55:27.780 INFO:tasks.workunit.client.1.vm08.stdout:0/492: dread d6/dd/d13/d17/d1f/d20/f21 [0,4194304] 0 2026-03-10T08:55:27.781 INFO:tasks.workunit.client.1.vm08.stdout:5/515: chown d0/d11/d18/f4f 3 1 2026-03-10T08:55:27.783 INFO:tasks.workunit.client.1.vm08.stdout:0/493: mknod d6/dd/d13/d61/ca4 0 2026-03-10T08:55:27.818 INFO:tasks.workunit.client.0.vm05.stdout:1/438: sync 2026-03-10T08:55:27.818 INFO:tasks.workunit.client.0.vm05.stdout:8/302: sync 2026-03-10T08:55:27.821 INFO:tasks.workunit.client.0.vm05.stdout:1/439: creat dd/d10/d18/d2d/d5c/fa2 x:0 0 0 2026-03-10T08:55:27.821 INFO:tasks.workunit.client.0.vm05.stdout:8/303: mknod d2/dd/d2c/d2e/d31/d4c/d63/c77 0 2026-03-10T08:55:27.823 INFO:tasks.workunit.client.0.vm05.stdout:1/440: unlink dd/d21/d3f/f5a 0 2026-03-10T08:55:27.828 INFO:tasks.workunit.client.0.vm05.stdout:1/441: mknod dd/d21/d37/d45/d8d/ca3 0 2026-03-10T08:55:27.828 INFO:tasks.workunit.client.0.vm05.stdout:1/442: dwrite dd/d10/d19/d4d/f70 [0,4194304] 0 2026-03-10T08:55:27.830 INFO:tasks.workunit.client.0.vm05.stdout:1/443: write dd/d10/d18/d2d/f93 [886095,127189] 0 2026-03-10T08:55:27.831 INFO:tasks.workunit.client.0.vm05.stdout:1/444: dread - dd/d10/d18/d2d/d51/f6e zero size 2026-03-10T08:55:27.845 
INFO:tasks.workunit.client.0.vm05.stdout:1/445: fsync dd/d10/d18/f36 0 2026-03-10T08:55:27.846 INFO:tasks.workunit.client.0.vm05.stdout:1/446: chown dd/d10/d18/d20/d69 359 1 2026-03-10T08:55:27.852 INFO:tasks.workunit.client.0.vm05.stdout:1/447: mknod dd/d10/d18/d2d/d51/ca4 0 2026-03-10T08:55:27.862 INFO:tasks.workunit.client.0.vm05.stdout:1/448: stat dd/d21/f3a 0 2026-03-10T08:55:27.865 INFO:tasks.workunit.client.1.vm08.stdout:5/516: dread d0/fb [0,4194304] 0 2026-03-10T08:55:27.866 INFO:tasks.workunit.client.1.vm08.stdout:5/517: write d0/d11/d3e/d45/f5b [742207,103453] 0 2026-03-10T08:55:27.872 INFO:tasks.workunit.client.1.vm08.stdout:5/518: mknod d0/d1b/d67/d80/c9e 0 2026-03-10T08:55:27.872 INFO:tasks.workunit.client.1.vm08.stdout:5/519: mknod d0/d11/d27/d68/c9f 0 2026-03-10T08:55:27.873 INFO:tasks.workunit.client.1.vm08.stdout:5/520: creat d0/d11/d27/d68/d7c/d4b/fa0 x:0 0 0 2026-03-10T08:55:27.873 INFO:tasks.workunit.client.1.vm08.stdout:5/521: creat d0/d11/d27/d50/fa1 x:0 0 0 2026-03-10T08:55:27.886 INFO:tasks.workunit.client.0.vm05.stdout:8/304: dread d2/db/f19 [0,4194304] 0 2026-03-10T08:55:27.887 INFO:tasks.workunit.client.0.vm05.stdout:4/364: dread d0/d1d/d30/f29 [0,4194304] 0 2026-03-10T08:55:27.889 INFO:tasks.workunit.client.0.vm05.stdout:7/278: write d18/d1b/f2c [4016273,40732] 0 2026-03-10T08:55:27.895 INFO:tasks.workunit.client.1.vm08.stdout:0/494: fsync d6/dd/d13/d17/d1f/d20/d2f/d57/f5c 0 2026-03-10T08:55:27.896 INFO:tasks.workunit.client.0.vm05.stdout:4/365: read - d0/d1d/d30/d32/d41/f60 zero size 2026-03-10T08:55:27.897 INFO:tasks.workunit.client.1.vm08.stdout:5/522: unlink d0/d11/d27/d68/d7c/d4b/l6e 0 2026-03-10T08:55:27.898 INFO:tasks.workunit.client.1.vm08.stdout:4/570: getdents d5/d23/d36/d99/db2/dbd 0 2026-03-10T08:55:27.899 INFO:tasks.workunit.client.0.vm05.stdout:4/366: creat d0/d1d/d30/d49/f7a x:0 0 0 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:7/279: symlink d18/l4e 0 2026-03-10T08:55:27.942 
INFO:tasks.workunit.client.0.vm05.stdout:7/280: dread - d18/f24 zero size 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:4/367: dread - d0/d2c/f74 zero size 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:4/368: write d0/d1d/d30/d49/d4f/d5b/f70 [1044962,50510] 0 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:7/281: mknod d18/d38/d43/c4f 0 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:4/369: mkdir d0/d1d/d30/d32/d41/d67/d7b 0 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:7/282: rename d18/d1b/d1f/f3c to d18/d1b/f50 0 2026-03-10T08:55:27.942 INFO:tasks.workunit.client.0.vm05.stdout:4/370: mkdir d0/d2e/d71/d7c 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:5/523: creat d0/d11/d27/d68/d7c/d4b/fa2 x:0 0 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:5/524: readlink d0/d11/d27/l6b 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:5/525: dread - d0/d11/d27/d68/d7c/d4b/d4e/f89 zero size 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:5/526: chown d0/d11/d3e/d45/f5b 1 1 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:4/571: truncate d5/d23/d36/d76/f82 1032478 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:7/584: dwrite d0/d11/d1f/d29/d3b/f86 [0,4194304] 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/575: dwrite d9/dc/d11/d23/d2c/d81/f62 [0,4194304] 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/576: fdatasync d9/d10/d1e/d32/f4d 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/577: chown d9/d10/d1e/d32/f12 127838774 1 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:7/585: link d0/d14/f72 d0/d14/d43/fbc 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:4/572: creat d5/d23/fcb x:0 0 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/578: read d9/d13/d4e/f57 
[10318,71598] 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/579: write d9/dc/d11/d23/f8b [1057456,12289] 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/580: read - d9/fc5 zero size 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/581: fdatasync d9/dc/d11/f8d 0 2026-03-10T08:55:27.943 INFO:tasks.workunit.client.1.vm08.stdout:6/582: symlink d9/d50/lc8 0 2026-03-10T08:55:27.944 INFO:tasks.workunit.client.1.vm08.stdout:6/583: dwrite d9/d10/fab [0,4194304] 0 2026-03-10T08:55:28.166 INFO:tasks.workunit.client.0.vm05.stdout:8/305: sync 2026-03-10T08:55:28.169 INFO:tasks.workunit.client.0.vm05.stdout:8/306: mkdir d2/dd/d74/d78 0 2026-03-10T08:55:28.169 INFO:tasks.workunit.client.0.vm05.stdout:8/307: dread - d2/dd/d2c/d2e/f6a zero size 2026-03-10T08:55:28.172 INFO:tasks.workunit.client.0.vm05.stdout:8/308: getdents d2/dd/d2c/d2e/d31/d3e/d5d 0 2026-03-10T08:55:28.174 INFO:tasks.workunit.client.0.vm05.stdout:8/309: rename d2/f49 to d2/db/d1f/d67/f79 0 2026-03-10T08:55:28.179 INFO:tasks.workunit.client.0.vm05.stdout:6/351: read d4/d2c/f7a [55975,74639] 0 2026-03-10T08:55:28.184 INFO:tasks.workunit.client.0.vm05.stdout:6/352: dread d4/f6a [0,4194304] 0 2026-03-10T08:55:28.185 INFO:tasks.workunit.client.1.vm08.stdout:3/517: write d4/d15/d8/d1d/f6e [4740685,108032] 0 2026-03-10T08:55:28.188 INFO:tasks.workunit.client.1.vm08.stdout:8/610: write d1/d10/d9/dd/d25/d27/d44/d21/d5f/fbd [616291,44250] 0 2026-03-10T08:55:28.188 INFO:tasks.workunit.client.0.vm05.stdout:6/353: symlink d4/d2c/l7b 0 2026-03-10T08:55:28.188 INFO:tasks.workunit.client.0.vm05.stdout:6/354: dread - d4/d7/d10/d1a/f78 zero size 2026-03-10T08:55:28.189 INFO:tasks.workunit.client.1.vm08.stdout:8/611: write d1/d10/d9/dd/fc5 [823317,85911] 0 2026-03-10T08:55:28.191 INFO:tasks.workunit.client.1.vm08.stdout:2/586: dwrite d1/da/d10/d42/d93/d22/f8a [0,4194304] 0 2026-03-10T08:55:28.192 INFO:tasks.workunit.client.0.vm05.stdout:6/355: dwrite d4/d7/f54 
[0,4194304] 0 2026-03-10T08:55:28.194 INFO:tasks.workunit.client.1.vm08.stdout:9/489: truncate d2/dd/d15/d1e/f48 2774178 0 2026-03-10T08:55:28.199 INFO:tasks.workunit.client.0.vm05.stdout:9/240: write d6/d19/d21/f32 [277204,75863] 0 2026-03-10T08:55:28.210 INFO:tasks.workunit.client.0.vm05.stdout:5/271: write d5/df/d12/d21/f36 [986780,3727] 0 2026-03-10T08:55:28.210 INFO:tasks.workunit.client.1.vm08.stdout:8/612: creat d1/d10/d9/dd/d18/fe5 x:0 0 0 2026-03-10T08:55:28.210 INFO:tasks.workunit.client.0.vm05.stdout:0/328: write f5 [1811956,96114] 0 2026-03-10T08:55:28.214 INFO:tasks.workunit.client.0.vm05.stdout:9/241: truncate d6/d15/d37/f4c 253213 0 2026-03-10T08:55:28.216 INFO:tasks.workunit.client.1.vm08.stdout:2/587: rmdir d1/da/d10/d1b/d6a 39 2026-03-10T08:55:28.217 INFO:tasks.workunit.client.0.vm05.stdout:2/308: stat d0/d9/d1e/d20/c2e 0 2026-03-10T08:55:28.221 INFO:tasks.workunit.client.0.vm05.stdout:3/349: truncate d9/d2b/f34 995908 0 2026-03-10T08:55:28.222 INFO:tasks.workunit.client.1.vm08.stdout:2/588: creat d1/da/d10/d42/d93/d1e/d7b/fb9 x:0 0 0 2026-03-10T08:55:28.222 INFO:tasks.workunit.client.0.vm05.stdout:6/356: truncate d4/d7/d10/d15/d20/d53/f41 3808298 0 2026-03-10T08:55:28.223 INFO:tasks.workunit.client.1.vm08.stdout:9/490: unlink d2/dd/d15/d1e/d25/c3d 0 2026-03-10T08:55:28.230 INFO:tasks.workunit.client.1.vm08.stdout:9/491: mkdir d2/d41/d4c/d66/d99 0 2026-03-10T08:55:28.231 INFO:tasks.workunit.client.0.vm05.stdout:5/272: mknod d5/df/d12/d24/d2c/d41/c58 0 2026-03-10T08:55:28.232 INFO:tasks.workunit.client.0.vm05.stdout:5/273: write d5/df/d12/d24/d2c/d41/f4c [4516351,129305] 0 2026-03-10T08:55:28.234 INFO:tasks.workunit.client.0.vm05.stdout:4/371: truncate d0/d1d/f50 641565 0 2026-03-10T08:55:28.234 INFO:tasks.workunit.client.0.vm05.stdout:4/372: fsync d0/d1d/d30/f29 0 2026-03-10T08:55:28.237 INFO:tasks.workunit.client.0.vm05.stdout:1/449: write dd/f44 [530830,103677] 0 2026-03-10T08:55:28.240 INFO:tasks.workunit.client.1.vm08.stdout:9/492: unlink 
d2/dd/d15/d4f/f72 0 2026-03-10T08:55:28.241 INFO:tasks.workunit.client.0.vm05.stdout:6/357: chown d4/d2d/d51 116369 1 2026-03-10T08:55:28.244 INFO:tasks.workunit.client.1.vm08.stdout:9/493: creat d2/d41/d74/f9a x:0 0 0 2026-03-10T08:55:28.246 INFO:tasks.workunit.client.1.vm08.stdout:2/589: getdents d1/da/d10/d42/d93/d1e 0 2026-03-10T08:55:28.247 INFO:tasks.workunit.client.0.vm05.stdout:5/274: creat d5/df/d12/f59 x:0 0 0 2026-03-10T08:55:28.248 INFO:tasks.workunit.client.0.vm05.stdout:4/373: symlink d0/d2e/d42/d45/d4a/d36/l7d 0 2026-03-10T08:55:28.248 INFO:tasks.workunit.client.0.vm05.stdout:4/374: chown d0/d2e/f4e 0 1 2026-03-10T08:55:28.251 INFO:tasks.workunit.client.1.vm08.stdout:2/590: creat d1/d5b/fba x:0 0 0 2026-03-10T08:55:28.251 INFO:tasks.workunit.client.1.vm08.stdout:9/494: mkdir d2/dd/d15/d1e/d25/d9b 0 2026-03-10T08:55:28.252 INFO:tasks.workunit.client.1.vm08.stdout:9/495: fdatasync d2/d41/d4c/f7c 0 2026-03-10T08:55:28.258 INFO:tasks.workunit.client.1.vm08.stdout:1/583: sync 2026-03-10T08:55:28.259 INFO:tasks.workunit.client.1.vm08.stdout:0/495: sync 2026-03-10T08:55:28.259 INFO:tasks.workunit.client.1.vm08.stdout:3/518: sync 2026-03-10T08:55:28.263 INFO:tasks.workunit.client.1.vm08.stdout:9/496: creat d2/dd/d61/f9c x:0 0 0 2026-03-10T08:55:28.263 INFO:tasks.workunit.client.1.vm08.stdout:2/591: creat d1/da/d10/d1b/d6a/fbb x:0 0 0 2026-03-10T08:55:28.266 INFO:tasks.workunit.client.0.vm05.stdout:4/375: symlink d0/d1d/d30/d49/l7e 0 2026-03-10T08:55:28.268 INFO:tasks.workunit.client.1.vm08.stdout:1/584: dwrite d1/da/d20/f67 [0,4194304] 0 2026-03-10T08:55:28.272 INFO:tasks.workunit.client.1.vm08.stdout:0/496: dwrite d6/f5f [0,4194304] 0 2026-03-10T08:55:28.272 INFO:tasks.workunit.client.0.vm05.stdout:1/450: creat dd/d10/d18/d20/d52/d80/fa5 x:0 0 0 2026-03-10T08:55:28.274 INFO:tasks.workunit.client.1.vm08.stdout:0/497: stat d6/dd/d13/d17/d1f/d20/d2f/d57/f58 0 2026-03-10T08:55:28.274 INFO:tasks.workunit.client.1.vm08.stdout:3/519: symlink d4/d15/d8/d71/lae 0 
2026-03-10T08:55:28.275 INFO:tasks.workunit.client.1.vm08.stdout:9/497: rmdir d2/dd/d15/d1e/d24 39 2026-03-10T08:55:28.282 INFO:tasks.workunit.client.1.vm08.stdout:9/498: truncate d2/dd/d15/d1e/f91 448370 0 2026-03-10T08:55:28.284 INFO:tasks.workunit.client.0.vm05.stdout:6/358: rename d4/f21 to d4/d2d/d51/f7c 0 2026-03-10T08:55:28.287 INFO:tasks.workunit.client.1.vm08.stdout:5/527: dwrite d0/d11/d18/d52/f7d [0,4194304] 0 2026-03-10T08:55:28.298 INFO:tasks.workunit.client.0.vm05.stdout:9/242: getdents d6/d19/d2c 0 2026-03-10T08:55:28.298 INFO:tasks.workunit.client.0.vm05.stdout:7/283: write f4 [1789200,108913] 0 2026-03-10T08:55:28.298 INFO:tasks.workunit.client.0.vm05.stdout:7/284: chown d18/d1b/f2c 133075395 1 2026-03-10T08:55:28.300 INFO:tasks.workunit.client.1.vm08.stdout:2/592: rename d1/d5b/d66/f63 to d1/da/d10/d42/d93/d22/fbc 0 2026-03-10T08:55:28.301 INFO:tasks.workunit.client.1.vm08.stdout:2/593: read d1/da/d10/d1b/d6a/fa8 [1043969,9597] 0 2026-03-10T08:55:28.302 INFO:tasks.workunit.client.0.vm05.stdout:4/376: chown d0/d1d/d30/d32/d41/l56 93892 1 2026-03-10T08:55:28.302 INFO:tasks.workunit.client.1.vm08.stdout:2/594: truncate d1/da/d10/d42/d93/d23/fae 783181 0 2026-03-10T08:55:28.303 INFO:tasks.workunit.client.1.vm08.stdout:1/585: dread - d1/da/de/fad zero size 2026-03-10T08:55:28.303 INFO:tasks.workunit.client.1.vm08.stdout:2/595: dread - d1/da/f9c zero size 2026-03-10T08:55:28.304 INFO:tasks.workunit.client.1.vm08.stdout:2/596: read d1/da/d10/d42/f58 [588095,88151] 0 2026-03-10T08:55:28.304 INFO:tasks.workunit.client.1.vm08.stdout:0/498: fsync d6/f11 0 2026-03-10T08:55:28.305 INFO:tasks.workunit.client.1.vm08.stdout:4/573: write d5/d23/d36/d99/db2/f45 [1671618,9645] 0 2026-03-10T08:55:28.308 INFO:tasks.workunit.client.1.vm08.stdout:3/520: creat d4/d15/d8/d71/faf x:0 0 0 2026-03-10T08:55:28.308 INFO:tasks.workunit.client.0.vm05.stdout:6/359: dread d4/f11 [0,4194304] 0 2026-03-10T08:55:28.308 INFO:tasks.workunit.client.1.vm08.stdout:7/586: dwrite 
d0/d11/d1f/d2c/f6c [0,4194304] 0 2026-03-10T08:55:28.311 INFO:tasks.workunit.client.1.vm08.stdout:3/521: truncate d4/d15/d8/d71/faf 234903 0 2026-03-10T08:55:28.320 INFO:tasks.workunit.client.1.vm08.stdout:5/528: rmdir d0/d46 39 2026-03-10T08:55:28.321 INFO:tasks.workunit.client.0.vm05.stdout:7/285: mknod d18/d1b/d1f/d25/d2e/d42/c51 0 2026-03-10T08:55:28.321 INFO:tasks.workunit.client.0.vm05.stdout:7/286: fdatasync d18/d1b/f2c 0 2026-03-10T08:55:28.322 INFO:tasks.workunit.client.1.vm08.stdout:6/584: write d9/dc/d11/d23/f40 [111820,12392] 0 2026-03-10T08:55:28.325 INFO:tasks.workunit.client.1.vm08.stdout:1/586: mknod d1/da/de/d24/d3d/d40/d8e/cca 0 2026-03-10T08:55:28.326 INFO:tasks.workunit.client.0.vm05.stdout:4/377: rename d0/d2e/d42/d45/d4a/d36/l5d to d0/d1d/d30/d32/d41/d67/d7b/l7f 0 2026-03-10T08:55:28.330 INFO:tasks.workunit.client.0.vm05.stdout:6/360: truncate d4/d7/d10/d15/d1b/f31 260405 0 2026-03-10T08:55:28.332 INFO:tasks.workunit.client.1.vm08.stdout:0/499: rmdir d6/dd/d13/d8f 39 2026-03-10T08:55:28.334 INFO:tasks.workunit.client.0.vm05.stdout:7/287: readlink l13 0 2026-03-10T08:55:28.335 INFO:tasks.workunit.client.1.vm08.stdout:7/587: symlink d0/d11/d1f/d29/d3b/lbd 0 2026-03-10T08:55:28.337 INFO:tasks.workunit.client.1.vm08.stdout:7/588: write d0/d11/d1f/d29/d3d/d40/ff [1451475,56068] 0 2026-03-10T08:55:28.340 INFO:tasks.workunit.client.0.vm05.stdout:4/378: write d0/d1d/d30/d32/f72 [889367,45214] 0 2026-03-10T08:55:28.343 INFO:tasks.workunit.client.1.vm08.stdout:1/587: symlink d1/da/d20/d91/lcb 0 2026-03-10T08:55:28.344 INFO:tasks.workunit.client.0.vm05.stdout:6/361: creat d4/d2d/d51/f7d x:0 0 0 2026-03-10T08:55:28.346 INFO:tasks.workunit.client.0.vm05.stdout:7/288: creat d18/d1b/d1f/d25/d2e/d42/f52 x:0 0 0 2026-03-10T08:55:28.349 INFO:tasks.workunit.client.1.vm08.stdout:0/500: creat d6/dd/d13/d61/d6f/fa5 x:0 0 0 2026-03-10T08:55:28.350 INFO:tasks.workunit.client.0.vm05.stdout:7/289: mkdir d18/d1b/d1f/d25/d2e/d42/d53 0 2026-03-10T08:55:28.351 
INFO:tasks.workunit.client.1.vm08.stdout:0/501: chown d6/dd/d13/d17/d50/f71 16407 1 2026-03-10T08:55:28.353 INFO:tasks.workunit.client.1.vm08.stdout:1/588: dwrite d1/da/d20/d3f/d49/d68/d7f/fb9 [0,4194304] 0 2026-03-10T08:55:28.354 INFO:tasks.workunit.client.0.vm05.stdout:7/290: dwrite fd [0,4194304] 0 2026-03-10T08:55:28.375 INFO:tasks.workunit.client.0.vm05.stdout:8/310: dwrite d2/db/f19 [0,4194304] 0 2026-03-10T08:55:28.379 INFO:tasks.workunit.client.1.vm08.stdout:6/585: mkdir d9/dc/dc9 0 2026-03-10T08:55:28.380 INFO:tasks.workunit.client.0.vm05.stdout:8/311: dwrite d2/f5 [4194304,4194304] 0 2026-03-10T08:55:28.381 INFO:tasks.workunit.client.0.vm05.stdout:8/312: readlink d2/db/l36 0 2026-03-10T08:55:28.381 INFO:tasks.workunit.client.0.vm05.stdout:8/313: write d2/db/f1b [3130876,80715] 0 2026-03-10T08:55:28.382 INFO:tasks.workunit.client.0.vm05.stdout:8/314: fdatasync d2/dd/d2c/d2e/d31/d3e/f73 0 2026-03-10T08:55:28.387 INFO:tasks.workunit.client.0.vm05.stdout:6/362: truncate d4/f6a 1398531 0 2026-03-10T08:55:28.399 INFO:tasks.workunit.client.0.vm05.stdout:8/315: rename d2/dd/l76 to d2/dd/d2c/d2e/d31/d4c/d63/l7a 0 2026-03-10T08:55:28.399 INFO:tasks.workunit.client.0.vm05.stdout:8/316: write d2/dd/d2c/d2e/f37 [97948,78910] 0 2026-03-10T08:55:28.400 INFO:tasks.workunit.client.1.vm08.stdout:6/586: rmdir d9/d10 39 2026-03-10T08:55:28.400 INFO:tasks.workunit.client.1.vm08.stdout:1/589: creat d1/da/de/d5c/fcc x:0 0 0 2026-03-10T08:55:28.400 INFO:tasks.workunit.client.1.vm08.stdout:3/522: getdents d4/d6f/d85 0 2026-03-10T08:55:28.400 INFO:tasks.workunit.client.1.vm08.stdout:0/502: unlink d6/cf 0 2026-03-10T08:55:28.400 INFO:tasks.workunit.client.1.vm08.stdout:7/589: sync 2026-03-10T08:55:28.403 INFO:tasks.workunit.client.0.vm05.stdout:6/363: symlink d4/d2d/l7e 0 2026-03-10T08:55:28.404 INFO:tasks.workunit.client.0.vm05.stdout:6/364: fdatasync d4/d7/f5d 0 2026-03-10T08:55:28.405 INFO:tasks.workunit.client.0.vm05.stdout:6/365: truncate d4/d7/d10/f65 1361120 0 
2026-03-10T08:55:28.405 INFO:tasks.workunit.client.0.vm05.stdout:6/366: readlink d4/d2d/d5f/l74 0 2026-03-10T08:55:28.418 INFO:tasks.workunit.client.0.vm05.stdout:6/367: chown d4/fc 977643 1 2026-03-10T08:55:28.419 INFO:tasks.workunit.client.1.vm08.stdout:1/590: creat d1/da/de/d24/d35/d6d/d82/da2/fcd x:0 0 0 2026-03-10T08:55:28.419 INFO:tasks.workunit.client.1.vm08.stdout:3/523: creat d4/d15/d8/d1d/d4f/fb0 x:0 0 0 2026-03-10T08:55:28.420 INFO:tasks.workunit.client.0.vm05.stdout:6/368: mkdir d4/d2d/d7f 0 2026-03-10T08:55:28.420 INFO:tasks.workunit.client.1.vm08.stdout:8/613: write d1/d10/d9/dd/d18/f80 [2414505,109313] 0 2026-03-10T08:55:28.422 INFO:tasks.workunit.client.1.vm08.stdout:0/503: mknod d6/dd/d13/d8f/ca6 0 2026-03-10T08:55:28.422 INFO:tasks.workunit.client.1.vm08.stdout:7/590: fdatasync d0/d11/f66 0 2026-03-10T08:55:28.422 INFO:tasks.workunit.client.0.vm05.stdout:6/369: creat d4/d7/f80 x:0 0 0 2026-03-10T08:55:28.423 INFO:tasks.workunit.client.0.vm05.stdout:6/370: read d4/d7/d10/d15/d20/d53/f6b [71426,125528] 0 2026-03-10T08:55:28.427 INFO:tasks.workunit.client.1.vm08.stdout:7/591: creat d0/d14/d43/d9d/fbe x:0 0 0 2026-03-10T08:55:28.427 INFO:tasks.workunit.client.0.vm05.stdout:6/371: rename d4/d7/d10/d1a/f78 to d4/d2d/d5f/f81 0 2026-03-10T08:55:28.428 INFO:tasks.workunit.client.0.vm05.stdout:6/372: dread d4/d7/d10/d1a/f1e [0,4194304] 0 2026-03-10T08:55:28.429 INFO:tasks.workunit.client.1.vm08.stdout:3/524: dread d4/d15/d8/f1e [0,4194304] 0 2026-03-10T08:55:28.431 INFO:tasks.workunit.client.1.vm08.stdout:0/504: rename d6/f16 to d6/dd/d13/d17/d1f/da3/fa7 0 2026-03-10T08:55:28.431 INFO:tasks.workunit.client.1.vm08.stdout:3/525: write d4/d15/d8/d1d/d4f/fb0 [361268,71199] 0 2026-03-10T08:55:28.432 INFO:tasks.workunit.client.0.vm05.stdout:6/373: rename d4/d7/d10/d15/d1b/d22/l27 to d4/d2d/d51/l82 0 2026-03-10T08:55:28.433 INFO:tasks.workunit.client.1.vm08.stdout:0/505: chown d6/dd/d13/d17/d1f/d20/f6a 30 1 2026-03-10T08:55:28.433 
INFO:tasks.workunit.client.0.vm05.stdout:6/374: mkdir d4/d7/d10/d15/d20/d53/d83 0 2026-03-10T08:55:28.434 INFO:tasks.workunit.client.0.vm05.stdout:6/375: readlink d4/d2d/d5f/l71 0 2026-03-10T08:55:28.436 INFO:tasks.workunit.client.1.vm08.stdout:8/614: rename d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/cc3 to d1/d10/d9/dd/d25/d27/d44/d21/dce/ce6 0 2026-03-10T08:55:28.453 INFO:tasks.workunit.client.1.vm08.stdout:0/506: creat d6/dd/d13/d17/d1f/d20/d2f/d24/fa8 x:0 0 0 2026-03-10T08:55:28.453 INFO:tasks.workunit.client.1.vm08.stdout:3/526: creat d4/d15/d8/d1d/da8/fb1 x:0 0 0 2026-03-10T08:55:28.453 INFO:tasks.workunit.client.1.vm08.stdout:8/615: symlink d1/d10/d9/le7 0 2026-03-10T08:55:28.453 INFO:tasks.workunit.client.1.vm08.stdout:3/527: mknod d4/d15/d8/d1d/da8/cb2 0 2026-03-10T08:55:28.453 INFO:tasks.workunit.client.1.vm08.stdout:8/616: creat d1/d10/d9/dd/d25/dca/dc6/fe8 x:0 0 0 2026-03-10T08:55:28.453 INFO:tasks.workunit.client.1.vm08.stdout:3/528: rename d4/d15/d8 to d4/d15/d8/d2c/d9b/db3 22 2026-03-10T08:55:28.454 INFO:tasks.workunit.client.1.vm08.stdout:3/529: creat d4/d15/d8/d2c/d89/fb4 x:0 0 0 2026-03-10T08:55:28.454 INFO:tasks.workunit.client.1.vm08.stdout:8/617: dwrite d1/d10/d9/dd/fc5 [0,4194304] 0 2026-03-10T08:55:28.457 INFO:tasks.workunit.client.1.vm08.stdout:1/591: sync 2026-03-10T08:55:28.473 INFO:tasks.workunit.client.1.vm08.stdout:8/618: rename d1/d10/d9/dd/d18/d34/l37 to d1/d10/d9/d4d/le9 0 2026-03-10T08:55:28.475 INFO:tasks.workunit.client.1.vm08.stdout:8/619: dread - d1/d10/d9/d8a/f95 zero size 2026-03-10T08:55:28.477 INFO:tasks.workunit.client.1.vm08.stdout:8/620: mknod d1/d10/d9/dd/d25/dca/cea 0 2026-03-10T08:55:28.479 INFO:tasks.workunit.client.1.vm08.stdout:3/530: getdents d4/d15/d8/d2c/d9b/d79/d8f 0 2026-03-10T08:55:28.480 INFO:tasks.workunit.client.1.vm08.stdout:1/592: getdents d1/da/de/d24/d3d/d40/d56 0 2026-03-10T08:55:28.481 INFO:tasks.workunit.client.1.vm08.stdout:8/621: read d1/d10/f3b [2685289,56946] 0 2026-03-10T08:55:28.482 
INFO:tasks.workunit.client.1.vm08.stdout:8/622: chown d1/d10/d9/dd/d25/d27/d44/l7a 1 1 2026-03-10T08:55:28.484 INFO:tasks.workunit.client.1.vm08.stdout:1/593: mknod d1/da/de/d24/d3d/d40/d56/d6b/cce 0 2026-03-10T08:55:28.486 INFO:tasks.workunit.client.1.vm08.stdout:8/623: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/feb x:0 0 0 2026-03-10T08:55:28.489 INFO:tasks.workunit.client.1.vm08.stdout:8/624: fsync d1/d10/d9/dd/d25/d27/d44/d21/d5f/fbd 0 2026-03-10T08:55:28.499 INFO:tasks.workunit.client.1.vm08.stdout:1/594: dread d1/da/d20/d3f/d49/f96 [0,4194304] 0 2026-03-10T08:55:28.501 INFO:tasks.workunit.client.0.vm05.stdout:2/309: write d0/d9/d27/f38 [447984,51134] 0 2026-03-10T08:55:28.506 INFO:tasks.workunit.client.0.vm05.stdout:0/329: dwrite df/d1f/f36 [0,4194304] 0 2026-03-10T08:55:28.518 INFO:tasks.workunit.client.0.vm05.stdout:3/350: write d9/d2b/f2d [2909416,40034] 0 2026-03-10T08:55:28.520 INFO:tasks.workunit.client.0.vm05.stdout:5/275: getdents d5/df/d12 0 2026-03-10T08:55:28.520 INFO:tasks.workunit.client.0.vm05.stdout:5/276: fdatasync d5/df/d37/f47 0 2026-03-10T08:55:28.526 INFO:tasks.workunit.client.1.vm08.stdout:8/625: rmdir d1/d10/d9/dd/d18/d34/dd0 39 2026-03-10T08:55:28.529 INFO:tasks.workunit.client.1.vm08.stdout:1/595: mkdir d1/da/de/dcf 0 2026-03-10T08:55:28.533 INFO:tasks.workunit.client.1.vm08.stdout:8/626: dread d1/d10/d9/dd/d25/d27/d44/d97/f79 [0,4194304] 0 2026-03-10T08:55:28.533 INFO:tasks.workunit.client.0.vm05.stdout:0/330: rmdir df/d18 39 2026-03-10T08:55:28.536 INFO:tasks.workunit.client.1.vm08.stdout:1/596: rmdir d1/da/de/d24/d3d/d40/d56/d7a 39 2026-03-10T08:55:28.537 INFO:tasks.workunit.client.0.vm05.stdout:5/277: creat d5/df/d12/d21/f5a x:0 0 0 2026-03-10T08:55:28.539 INFO:tasks.workunit.client.0.vm05.stdout:9/243: write d6/d19/d21/f31 [836533,768] 0 2026-03-10T08:55:28.540 INFO:tasks.workunit.client.0.vm05.stdout:0/331: readlink df/d1f/d48/l4b 0 2026-03-10T08:55:28.540 INFO:tasks.workunit.client.1.vm08.stdout:0/507: dwrite d6/f11 
[4194304,4194304] 0 2026-03-10T08:55:28.543 INFO:tasks.workunit.client.1.vm08.stdout:1/597: rename d1/da/d20/c59 to d1/da/d18/d3a/da7/cd0 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.1.vm08.stdout:2/597: write d1/da/d78/f95 [989869,38790] 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.1.vm08.stdout:1/598: creat d1/da/d20/d3f/d49/d9c/fd1 x:0 0 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:3/351: unlink d9/f23 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:4/379: chown d0/d1d/d30/d32/d41/d67/d7b/l7f 1 1 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:4/380: stat d0/d1d/d30/d32/d41/l56 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:5/278: rmdir d5/df/d12/d39 39 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:9/244: symlink d6/d15/d3c/l50 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:9/245: readlink d6/d15/d3c/l3e 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:1/451: truncate dd/d10/d18/f8a 579255 0 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:5/279: dread - d5/df/f2f zero size 2026-03-10T08:55:28.560 INFO:tasks.workunit.client.0.vm05.stdout:1/452: dwrite dd/d10/d19/d27/f31 [0,4194304] 0 2026-03-10T08:55:28.561 INFO:tasks.workunit.client.1.vm08.stdout:5/529: dwrite d0/d11/f1e [0,4194304] 0 2026-03-10T08:55:28.563 INFO:tasks.workunit.client.0.vm05.stdout:9/246: mknod d6/d19/d2a/d4a/c51 0 2026-03-10T08:55:28.568 INFO:tasks.workunit.client.1.vm08.stdout:2/598: chown d1/da/f50 8 1 2026-03-10T08:55:28.573 INFO:tasks.workunit.client.1.vm08.stdout:2/599: write d1/da/f9c [904529,108823] 0 2026-03-10T08:55:28.586 INFO:tasks.workunit.client.1.vm08.stdout:1/599: rename d1/da/d20/d3f/d49/d68 to d1/da/de/d24/d3d/d40/d8e/dd2 0 2026-03-10T08:55:28.590 INFO:tasks.workunit.client.0.vm05.stdout:7/291: truncate f9 592864 0 2026-03-10T08:55:28.590 INFO:tasks.workunit.client.0.vm05.stdout:7/292: chown 
d18/d1b/d1f/d25/d2e/d42 24 1
2026-03-10T08:55:28.590 INFO:tasks.workunit.client.1.vm08.stdout:1/600: readlink d1/da/de/d24/d3d/d40/d92/lb3 0
2026-03-10T08:55:28.590 INFO:tasks.workunit.client.1.vm08.stdout:9/499: dwrite d2/dd/d15/d1e/d24/f2b [0,4194304] 0
2026-03-10T08:55:28.593 INFO:tasks.workunit.client.0.vm05.stdout:1/453: symlink dd/d10/d19/d9b/la6 0
2026-03-10T08:55:28.593 INFO:tasks.workunit.client.0.vm05.stdout:1/454: write dd/f5e [3236017,22936] 0
2026-03-10T08:55:28.596 INFO:tasks.workunit.client.0.vm05.stdout:4/381: getdents d0/d1d/d30/d49/d4f/d5b 0
2026-03-10T08:55:28.600 INFO:tasks.workunit.client.0.vm05.stdout:1/455: truncate dd/f1c 8909061 0
2026-03-10T08:55:28.603 INFO:tasks.workunit.client.1.vm08.stdout:9/500: fdatasync d2/dd/d15/f1b 0
2026-03-10T08:55:28.603 INFO:tasks.workunit.client.1.vm08.stdout:9/501: readlink d2/dd/l5d 0
2026-03-10T08:55:28.603 INFO:tasks.workunit.client.0.vm05.stdout:4/382: chown d0/f23 2 1
2026-03-10T08:55:28.603 INFO:tasks.workunit.client.0.vm05.stdout:2/310: sync
2026-03-10T08:55:28.603 INFO:tasks.workunit.client.0.vm05.stdout:3/352: sync
2026-03-10T08:55:28.604 INFO:tasks.workunit.client.0.vm05.stdout:9/247: creat d6/d12/d43/f52 x:0 0 0
2026-03-10T08:55:28.605 INFO:tasks.workunit.client.0.vm05.stdout:6/376: dread d4/d2d/d51/f7c [0,4194304] 0
2026-03-10T08:55:28.605 INFO:tasks.workunit.client.0.vm05.stdout:6/377: write d4/d7/f54 [1002840,29120] 0
2026-03-10T08:55:28.605 INFO:tasks.workunit.client.0.vm05.stdout:8/317: truncate d2/db/f1b 1325492 0
2026-03-10T08:55:28.606 INFO:tasks.workunit.client.0.vm05.stdout:6/378: chown d4 4171 1
2026-03-10T08:55:28.606 INFO:tasks.workunit.client.0.vm05.stdout:6/379: write d4/d7/f34 [3817215,32309] 0
2026-03-10T08:55:28.609 INFO:tasks.workunit.client.0.vm05.stdout:4/383: chown d0/d2e/d42/d45/d4a/c4c 27 1
2026-03-10T08:55:28.610 INFO:tasks.workunit.client.1.vm08.stdout:6/587: write d9/d10/d1e/d7e/fc2 [504314,25328] 0
2026-03-10T08:55:28.615 INFO:tasks.workunit.client.0.vm05.stdout:3/353: fdatasync d9/d2b/f2c 0
2026-03-10T08:55:28.615 INFO:tasks.workunit.client.0.vm05.stdout:3/354: read - d9/d2b/f3b zero size
2026-03-10T08:55:28.617 INFO:tasks.workunit.client.0.vm05.stdout:9/248: read d6/d15/f4f [275330,130670] 0
2026-03-10T08:55:28.627 INFO:tasks.workunit.client.1.vm08.stdout:6/588: creat d9/dc/d11/d23/d2c/fca x:0 0 0
2026-03-10T08:55:28.632 INFO:tasks.workunit.client.1.vm08.stdout:7/592: write d0/d14/d43/f6e [491105,9016] 0
2026-03-10T08:55:28.633 INFO:tasks.workunit.client.0.vm05.stdout:4/384: chown d0/c1b 1207 1
2026-03-10T08:55:28.637 INFO:tasks.workunit.client.0.vm05.stdout:6/380: rename d4/d7/d10/d15/d20/d53 to d4/d2c/d84 0
2026-03-10T08:55:28.638 INFO:tasks.workunit.client.1.vm08.stdout:7/593: mknod d0/d11/d1f/d29/d3d/d40/cbf 0
2026-03-10T08:55:28.645 INFO:tasks.workunit.client.0.vm05.stdout:3/355: mkdir d9/d4d/d51/d64 0
2026-03-10T08:55:28.645 INFO:tasks.workunit.client.1.vm08.stdout:7/594: dread - d0/d11/d1f/d29/d3d/f76 zero size
2026-03-10T08:55:28.653 INFO:tasks.workunit.client.1.vm08.stdout:6/589: dread d9/dc/d11/d23/d2c/f5c [0,4194304] 0
2026-03-10T08:55:28.654 INFO:tasks.workunit.client.0.vm05.stdout:5/280: dread d5/df/f31 [0,4194304] 0
2026-03-10T08:55:28.657 INFO:tasks.workunit.client.0.vm05.stdout:5/281: dwrite d5/f23 [0,4194304] 0
2026-03-10T08:55:28.679 INFO:tasks.workunit.client.1.vm08.stdout:0/508: dread d6/dd/d13/d17/f29 [0,4194304] 0
2026-03-10T08:55:28.679 INFO:tasks.workunit.client.0.vm05.stdout:8/318: mkdir d2/dd/d2c/d2e/d31/d4f/d7b 0
2026-03-10T08:55:28.679 INFO:tasks.workunit.client.0.vm05.stdout:9/249: link d6/d19/d21/f31 d6/d19/d2a/f53 0
2026-03-10T08:55:28.679 INFO:tasks.workunit.client.0.vm05.stdout:6/381: symlink d4/d7/d10/l85 0
2026-03-10T08:55:28.679 INFO:tasks.workunit.client.0.vm05.stdout:6/382: creat d4/d2c/f86 x:0 0 0
2026-03-10T08:55:28.682 INFO:tasks.workunit.client.0.vm05.stdout:6/383: readlink d4/d7/d10/d15/d1b/l73 0
2026-03-10T08:55:28.683 INFO:tasks.workunit.client.0.vm05.stdout:6/384: chown d4/d7/f34 865259 1
2026-03-10T08:55:28.685 INFO:tasks.workunit.client.0.vm05.stdout:6/385: fdatasync d4/d7/d10/d15/d1b/f31 0
2026-03-10T08:55:28.685 INFO:tasks.workunit.client.0.vm05.stdout:6/386: read d4/d2c/f7a [3432266,24533] 0
2026-03-10T08:55:28.688 INFO:tasks.workunit.client.0.vm05.stdout:6/387: truncate d4/d2c/d84/f41 2566624 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:6/388: write d4/d2c/d84/f6b [24070,95672] 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:6/389: rename d4/d2c/d84/d83 to d4/d2d/d51/d87 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:6/390: creat d4/d2d/d5f/f88 x:0 0 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:6/391: fsync d4/d7/d10/d15/d1b/d22/f56 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:6/392: dwrite d4/d7/d10/f65 [0,4194304] 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:0/332: rmdir df 39
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:0/333: write df/f15 [1350523,122979] 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:0/334: dread df/d1f/f21 [4194304,4194304] 0
2026-03-10T08:55:28.729 INFO:tasks.workunit.client.0.vm05.stdout:0/335: write df/d1f/f21 [4401849,74375] 0
2026-03-10T08:55:28.729 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:28 vm05.local ceph-mon[49713]: pgmap v153: 65 pgs: 65 active+clean; 1.9 GiB data, 6.8 GiB used, 113 GiB / 120 GiB avail; 42 MiB/s rd, 157 MiB/s wr, 304 op/s
2026-03-10T08:55:28.735 INFO:tasks.workunit.client.1.vm08.stdout:0/509: dread d6/dd/d13/d17/d1f/d2d/d39/f47 [0,4194304] 0
2026-03-10T08:55:28.736 INFO:tasks.workunit.client.1.vm08.stdout:0/510: rmdir d6/dd/d13/d17/d1f/d20/d2f 39
2026-03-10T08:55:28.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:28 vm08.local ceph-mon[57559]: pgmap v153: 65 pgs: 65 active+clean; 1.9 GiB data, 6.8 GiB used, 113 GiB / 120 GiB avail; 42 MiB/s rd, 157 MiB/s wr, 304 op/s
2026-03-10T08:55:28.850 INFO:tasks.workunit.client.0.vm05.stdout:0/336: dread df/f12 [0,4194304] 0
2026-03-10T08:55:28.854 INFO:tasks.workunit.client.0.vm05.stdout:0/337: mkdir df/d18/d19/d5b 0
2026-03-10T08:55:28.854 INFO:tasks.workunit.client.0.vm05.stdout:0/338: stat df/d18/d2b/d27/d32/l52 0
2026-03-10T08:55:28.855 INFO:tasks.workunit.client.0.vm05.stdout:0/339: mkdir df/d18/d2b/d27/d32/d4e/d5c 0
2026-03-10T08:55:28.855 INFO:tasks.workunit.client.0.vm05.stdout:0/340: chown df/d18/d2b/d27/d32 16054948 1
2026-03-10T08:55:28.856 INFO:tasks.workunit.client.0.vm05.stdout:0/341: readlink df/d18/d19/l43 0
2026-03-10T08:55:28.857 INFO:tasks.workunit.client.0.vm05.stdout:0/342: rename df/d1f/f36 to df/d18/d2b/d27/d32/f5d 0
2026-03-10T08:55:28.859 INFO:tasks.workunit.client.0.vm05.stdout:0/343: mknod df/d18/d19/d39/d4d/d50/c5e 0
2026-03-10T08:55:28.884 INFO:tasks.workunit.client.0.vm05.stdout:0/344: symlink df/d18/d19/d39/d4d/d50/l5f 0
2026-03-10T08:55:28.889 INFO:tasks.workunit.client.0.vm05.stdout:0/345: chown df/d18/l46 101 1
2026-03-10T08:55:28.895 INFO:tasks.workunit.client.0.vm05.stdout:0/346: unlink df/c14 0
2026-03-10T08:55:28.898 INFO:tasks.workunit.client.0.vm05.stdout:0/347: dread df/d1f/f21 [0,4194304] 0
2026-03-10T08:55:28.904 INFO:tasks.workunit.client.0.vm05.stdout:0/348: dwrite df/d1f/f2d [0,4194304] 0
2026-03-10T08:55:28.915 INFO:tasks.workunit.client.1.vm08.stdout:4/574: link d5/d23/d36/f51 d5/d5f/fcc 0
2026-03-10T08:55:28.918 INFO:tasks.workunit.client.1.vm08.stdout:4/575: rmdir d5/d23/d49 39
2026-03-10T08:55:28.919 INFO:tasks.workunit.client.1.vm08.stdout:5/530: mknod d0/d11/d18/ca3 0
2026-03-10T08:55:28.919 INFO:tasks.workunit.client.1.vm08.stdout:8/627: dwrite d1/d10/d9/dd/f62 [0,4194304] 0
2026-03-10T08:55:28.921 INFO:tasks.workunit.client.1.vm08.stdout:8/628: chown d1/d10/f2a 26 1
2026-03-10T08:55:28.923 INFO:tasks.workunit.client.1.vm08.stdout:8/629: chown d1/d10/d9/dd/d25/d27/d44/d21/d51/dd6 24 1
2026-03-10T08:55:28.926 INFO:tasks.workunit.client.1.vm08.stdout:4/576: symlink d5/d23/d36/d99/db2/lcd 0
2026-03-10T08:55:28.938 INFO:tasks.workunit.client.1.vm08.stdout:8/630: getdents d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e 0
2026-03-10T08:55:28.939 INFO:tasks.workunit.client.1.vm08.stdout:2/600: truncate d1/d43/f5d 182564 0
2026-03-10T08:55:28.945 INFO:tasks.workunit.client.0.vm05.stdout:6/393: mkdir d4/d7/d10/d1a/d89 0
2026-03-10T08:55:28.945 INFO:tasks.workunit.client.1.vm08.stdout:8/631: creat d1/d10/fec x:0 0 0
2026-03-10T08:55:28.945 INFO:tasks.workunit.client.1.vm08.stdout:9/502: truncate d2/dd/d15/f1b 475899 0
2026-03-10T08:55:28.949 INFO:tasks.workunit.client.0.vm05.stdout:2/311: dwrite d0/d9/f19 [0,4194304] 0
2026-03-10T08:55:28.951 INFO:tasks.workunit.client.0.vm05.stdout:6/394: dread d4/d2c/f7a [0,4194304] 0
2026-03-10T08:55:28.960 INFO:tasks.workunit.client.1.vm08.stdout:9/503: mkdir d2/dd/d15/d1e/d25/d98/d9d 0
2026-03-10T08:55:28.965 INFO:tasks.workunit.client.1.vm08.stdout:8/632: creat d1/d10/d9/dd/d25/d27/d44/d89/fed x:0 0 0
2026-03-10T08:55:28.965 INFO:tasks.workunit.client.0.vm05.stdout:2/312: dwrite d0/d9/d1e/d20/f22 [4194304,4194304] 0
2026-03-10T08:55:28.965 INFO:tasks.workunit.client.0.vm05.stdout:2/313: stat d0/d9/d27/f54 0
2026-03-10T08:55:28.965 INFO:tasks.workunit.client.0.vm05.stdout:2/314: chown d0/d9/d27/c3f 90 1
2026-03-10T08:55:28.974 INFO:tasks.workunit.client.0.vm05.stdout:6/395: rename d4/d7/d10/d15/l5a to d4/d2c/d84/d4a/l8a 0
2026-03-10T08:55:28.976 INFO:tasks.workunit.client.0.vm05.stdout:2/315: rmdir d0/d9/d1e/d20/d24 39
2026-03-10T08:55:28.976 INFO:tasks.workunit.client.0.vm05.stdout:2/316: write d0/d9/d27/f38 [1407357,42973] 0
2026-03-10T08:55:28.979 INFO:tasks.workunit.client.0.vm05.stdout:6/396: dread d4/f61 [0,4194304] 0
2026-03-10T08:55:28.984 INFO:tasks.workunit.client.0.vm05.stdout:2/317: truncate d0/d9/d1e/d20/d21/f3d 702366 0
2026-03-10T08:55:28.985 INFO:tasks.workunit.client.1.vm08.stdout:7/595: write d0/d11/d1f/d29/d3b/f4c [363536,116119] 0
2026-03-10T08:55:28.985 INFO:tasks.workunit.client.1.vm08.stdout:8/633: link d1/d10/d9/d4d/l69 d1/d10/d9/dd/d25/lee 0
2026-03-10T08:55:28.986 INFO:tasks.workunit.client.0.vm05.stdout:6/397: readlink d4/l42 0
2026-03-10T08:55:28.987 INFO:tasks.workunit.client.0.vm05.stdout:6/398: dread - d4/d7/f4d zero size
2026-03-10T08:55:28.990 INFO:tasks.workunit.client.0.vm05.stdout:6/399: dwrite d4/d2d/d51/f7d [0,4194304] 0
2026-03-10T08:55:28.999 INFO:tasks.workunit.client.0.vm05.stdout:2/318: unlink d0/d9/d1e/c4d 0
2026-03-10T08:55:29.001 INFO:tasks.workunit.client.1.vm08.stdout:8/634: unlink d1/d10/fec 0
2026-03-10T08:55:29.001 INFO:tasks.workunit.client.1.vm08.stdout:8/635: chown d1/d10/d9/dd/d18/f80 962007341 1
2026-03-10T08:55:29.004 INFO:tasks.workunit.client.0.vm05.stdout:2/319: dwrite d0/d9/d1e/f39 [0,4194304] 0
2026-03-10T08:55:29.007 INFO:tasks.workunit.client.0.vm05.stdout:6/400: chown d4/f30 234844 1
2026-03-10T08:55:29.009 INFO:tasks.workunit.client.0.vm05.stdout:4/385: truncate d0/d1d/d30/f61 4019333 0
2026-03-10T08:55:29.012 INFO:tasks.workunit.client.0.vm05.stdout:4/386: dwrite d0/d1d/d30/d49/d4f/d5b/f70 [0,4194304] 0
2026-03-10T08:55:29.025 INFO:tasks.workunit.client.0.vm05.stdout:6/401: truncate d4/f6c 22616 0
2026-03-10T08:55:29.025 INFO:tasks.workunit.client.0.vm05.stdout:2/320: mknod d0/d55/c57 0
2026-03-10T08:55:29.025 INFO:tasks.workunit.client.0.vm05.stdout:2/321: chown d0/d55/c57 992 1
2026-03-10T08:55:29.026 INFO:tasks.workunit.client.0.vm05.stdout:2/322: write d0/fa [4092944,120908] 0
2026-03-10T08:55:29.030 INFO:tasks.workunit.client.0.vm05.stdout:8/319: dwrite d2/f2a [0,4194304] 0
2026-03-10T08:55:29.031 INFO:tasks.workunit.client.0.vm05.stdout:8/320: fdatasync d2/f5 0
2026-03-10T08:55:29.031 INFO:tasks.workunit.client.0.vm05.stdout:6/402: symlink d4/d7/l8b 0
2026-03-10T08:55:29.034 INFO:tasks.workunit.client.1.vm08.stdout:8/636: mknod d1/d10/d9/dd/d18/d34/dd0/cef 0
2026-03-10T08:55:29.035 INFO:tasks.workunit.client.0.vm05.stdout:6/403: dread d4/d2c/d84/d4a/f63 [0,4194304] 0
2026-03-10T08:55:29.036 INFO:tasks.workunit.client.0.vm05.stdout:9/250: write d6/fe [351048,27131] 0
2026-03-10T08:55:29.040 INFO:tasks.workunit.client.0.vm05.stdout:4/387: dread d0/f9 [0,4194304] 0
2026-03-10T08:55:29.049 INFO:tasks.workunit.client.0.vm05.stdout:4/388: chown d0/d2e/l44 89 1
2026-03-10T08:55:29.049 INFO:tasks.workunit.client.0.vm05.stdout:8/321: unlink d2/dd/d2c/d2e/d31/d4c/f71 0
2026-03-10T08:55:29.050 INFO:tasks.workunit.client.0.vm05.stdout:6/404: mkdir d4/d7/d10/d1a/d8c 0
2026-03-10T08:55:29.050 INFO:tasks.workunit.client.0.vm05.stdout:9/251: creat d6/d19/d2c/f54 x:0 0 0
2026-03-10T08:55:29.052 INFO:tasks.workunit.client.0.vm05.stdout:9/252: chown d6/f7 754 1
2026-03-10T08:55:29.053 INFO:tasks.workunit.client.0.vm05.stdout:9/253: write d6/d19/f1a [1301607,104240] 0
2026-03-10T08:55:29.061 INFO:tasks.workunit.client.0.vm05.stdout:8/322: rename d2/dd/d2c/d2e/d31/d3e/d5d/l5e to d2/dd/d2c/d2e/d31/d4f/l7c 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:0/511: creat d6/dd/fa9 x:0 0 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:3/531: symlink d4/lb5 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:0/512: unlink d6/dd/d13/d17/d1f/d20/d2f/d26/d56/c8c 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:3/532: getdents d4/d15/d8/d2c/d9b/d79 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:3/533: chown d4/d15/d8/d1d/da8/cb2 0 1
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:0/513: truncate d6/dd/d13/d17/f82 2007861 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:4/577: creat d5/d23/d36/fce x:0 0 0
2026-03-10T08:55:29.105 INFO:tasks.workunit.client.1.vm08.stdout:1/601: rename d1/da/de/l44 to d1/da/de/d24/d3d/d40/ld3 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:1/602: dread - d1/da/de/d24/d35/d6d/fa8 zero size
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:3/534: stat d4/d15/d8/cb 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:4/578: dwrite d5/d23/d36/d76/fa7 [0,4194304] 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:0/514: chown f3 5643740 1
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:0/515: dread - d6/dd/fa9 zero size
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:6/590: rename d9/dc/d11/d23/d2c/d81/d63/f90 to d9/d13/d4e/fcb 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.1.vm08.stdout:0/516: read d6/f11 [92324,117248] 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:8/323: chown d2/dd/d2c/d2e/f37 1 1
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:8/324: chown d2/db/d1f/d67 22770 1
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:8/325: dwrite d2/fa [4194304,4194304] 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:1/456: rmdir dd/d13 39
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:1/457: symlink dd/d10/d18/d20/d69/la7 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:5/282: symlink d5/df/l5b 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:5/283: chown d5/df/d12/d21 0 1
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:5/284: symlink d5/df/d37/l5c 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:5/285: rename d5/df to d5/df/d12/d24/d2c/d41/d5d 22
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:3/356: mknod d9/d2b/c65 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:5/286: symlink d5/df/d12/l5e 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:5/287: chown d5/df/d12/f1b 829 1
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:0/349: write f6 [2467976,54842] 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:0/350: creat df/d18/d2b/d27/f60 x:0 0 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:0/351: link df/d18/d2b/f3b df/d18/d19/d39/f61 0
2026-03-10T08:55:29.106 INFO:tasks.workunit.client.0.vm05.stdout:0/352: stat df/d1f/f2d 0
2026-03-10T08:55:29.107 INFO:tasks.workunit.client.1.vm08.stdout:3/535: creat d4/d15/d8/d71/fb6 x:0 0 0
2026-03-10T08:55:29.107 INFO:tasks.workunit.client.0.vm05.stdout:0/353: mkdir df/d18/d19/d62 0
2026-03-10T08:55:29.108 INFO:tasks.workunit.client.1.vm08.stdout:4/579: stat d5/d23/d36/d99/db2/d5d/f66 0
2026-03-10T08:55:29.108 INFO:tasks.workunit.client.0.vm05.stdout:0/354: readlink df/d18/d2b/d27/d32/l52 0
2026-03-10T08:55:29.109 INFO:tasks.workunit.client.1.vm08.stdout:4/580: dread - d5/fba zero size
2026-03-10T08:55:29.111 INFO:tasks.workunit.client.0.vm05.stdout:0/355: creat df/d18/d19/d39/f63 x:0 0 0
2026-03-10T08:55:29.112 INFO:tasks.workunit.client.0.vm05.stdout:5/288: dread d5/df/d12/d24/f25 [0,4194304] 0
2026-03-10T08:55:29.116 INFO:tasks.workunit.client.1.vm08.stdout:0/517: creat d6/d8b/faa x:0 0 0
2026-03-10T08:55:29.121 INFO:tasks.workunit.client.0.vm05.stdout:0/356: dread df/d18/f53 [0,4194304] 0
2026-03-10T08:55:29.125 INFO:tasks.workunit.client.1.vm08.stdout:7/596: rename d0/d14/d2f/l55 to d0/d11/d1f/d29/lc0 0
2026-03-10T08:55:29.127 INFO:tasks.workunit.client.1.vm08.stdout:7/597: creat d0/d11/db2/fc1 x:0 0 0
2026-03-10T08:55:29.128 INFO:tasks.workunit.client.1.vm08.stdout:1/603: rename d1/da/de/d24/c2a to d1/da/de/d24/d3d/d40/d56/d7a/cd4 0
2026-03-10T08:55:29.130 INFO:tasks.workunit.client.1.vm08.stdout:7/598: link d0/d14/d43/d62/f9a d0/d14/d43/fc2 0
2026-03-10T08:55:29.131 INFO:tasks.workunit.client.1.vm08.stdout:1/604: truncate d1/da/de/d24/d35/d6d/d82/f7b 580632 0
2026-03-10T08:55:29.132 INFO:tasks.workunit.client.1.vm08.stdout:7/599: mkdir d0/d11/d4a/d5e/dc3 0
2026-03-10T08:55:29.133 INFO:tasks.workunit.client.1.vm08.stdout:1/605: symlink d1/da/d18/d3b/ld5 0
2026-03-10T08:55:29.133 INFO:tasks.workunit.client.1.vm08.stdout:7/600: fsync d0/d11/d1f/d29/d36/d75/fab 0
2026-03-10T08:55:29.134 INFO:tasks.workunit.client.1.vm08.stdout:7/601: mknod d0/d14/d43/d9d/cc4 0
2026-03-10T08:55:29.135 INFO:tasks.workunit.client.1.vm08.stdout:1/606: creat d1/da/fd6 x:0 0 0
2026-03-10T08:55:29.135 INFO:tasks.workunit.client.1.vm08.stdout:7/602: unlink d0/cb 0
2026-03-10T08:55:29.136 INFO:tasks.workunit.client.1.vm08.stdout:7/603: dread - d0/d11/d1f/fb7 zero size
2026-03-10T08:55:29.137 INFO:tasks.workunit.client.1.vm08.stdout:7/604: write d0/d14/d43/fc2 [2221780,1509] 0
2026-03-10T08:55:29.138 INFO:tasks.workunit.client.1.vm08.stdout:7/605: readlink d0/d51/l6d 0
2026-03-10T08:55:29.139 INFO:tasks.workunit.client.1.vm08.stdout:7/606: dread - d0/d11/d4a/fa5 zero size
2026-03-10T08:55:29.140 INFO:tasks.workunit.client.1.vm08.stdout:7/607: fsync d0/d14/d2f/f81 0
2026-03-10T08:55:29.140 INFO:tasks.workunit.client.1.vm08.stdout:1/607: dwrite d1/da/d18/f48 [0,4194304] 0
2026-03-10T08:55:29.147 INFO:tasks.workunit.client.1.vm08.stdout:7/608: chown d0/d11/d1f/l84 6538 1
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:7/609: write d0/d11/d4a/f87 [6297152,73350] 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:1/608: link d1/da/f22 d1/fd7 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:7/610: mkdir d0/d11/d4a/d95/dc5 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:1/609: creat d1/da/de/d24/d35/d6d/d82/da2/dbb/fd8 x:0 0 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:1/610: truncate d1/da/d20/d3f/d49/d9c/fd1 993033 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:1/611: creat d1/da/d18/d3b/d62/fd9 x:0 0 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:1/612: dread d1/da/de/d24/d3d/d40/d8e/dd2/fbf [0,4194304] 0
2026-03-10T08:55:29.187 INFO:tasks.workunit.client.1.vm08.stdout:1/613: stat d1/da/d20/d3f/d49/f9a 0
2026-03-10T08:55:29.279 INFO:tasks.workunit.client.0.vm05.stdout:9/254: sync
2026-03-10T08:55:29.280 INFO:tasks.workunit.client.0.vm05.stdout:9/255: truncate d6/d15/d37/f4c 740430 0
2026-03-10T08:55:29.291 INFO:tasks.workunit.client.1.vm08.stdout:5/531: creat d0/fa4 x:0 0 0
2026-03-10T08:55:29.334 INFO:tasks.workunit.client.1.vm08.stdout:9/504: sync
2026-03-10T08:55:29.342 INFO:tasks.workunit.client.1.vm08.stdout:4/581: sync
2026-03-10T08:55:29.346 INFO:tasks.workunit.client.1.vm08.stdout:4/582: creat d5/d23/d36/d76/fcf x:0 0 0
2026-03-10T08:55:29.354 INFO:tasks.workunit.client.1.vm08.stdout:4/583: fsync d5/d23/d36/f51 0
2026-03-10T08:55:29.356 INFO:tasks.workunit.client.0.vm05.stdout:3/357: dread d9/d4d/f5e [0,4194304] 0
2026-03-10T08:55:29.360 INFO:tasks.workunit.client.0.vm05.stdout:3/358: dwrite d9/d4d/f5e [0,4194304] 0
2026-03-10T08:55:29.363 INFO:tasks.workunit.client.1.vm08.stdout:4/584: mknod d5/d23/d49/cd0 0
2026-03-10T08:55:29.374 INFO:tasks.workunit.client.0.vm05.stdout:0/357: dread f6 [0,4194304] 0
2026-03-10T08:55:29.377 INFO:tasks.workunit.client.0.vm05.stdout:0/358: dwrite f5 [0,4194304] 0
2026-03-10T08:55:29.383 INFO:tasks.workunit.client.0.vm05.stdout:0/359: chown df/d1f/c2c 436 1
2026-03-10T08:55:29.383 INFO:tasks.workunit.client.0.vm05.stdout:7/293: mknod d18/d1b/d1f/d25/d2e/c54 0
2026-03-10T08:55:29.383 INFO:tasks.workunit.client.0.vm05.stdout:7/294: creat d18/d38/f55 x:0 0 0
2026-03-10T08:55:29.383 INFO:tasks.workunit.client.0.vm05.stdout:7/295: write d18/f24 [1046255,90679] 0
2026-03-10T08:55:29.389 INFO:tasks.workunit.client.1.vm08.stdout:4/585: dread d5/d23/f68 [0,4194304] 0
2026-03-10T08:55:29.389 INFO:tasks.workunit.client.0.vm05.stdout:7/296: fsync d18/f1d 0
2026-03-10T08:55:29.389 INFO:tasks.workunit.client.1.vm08.stdout:4/586: chown d5/d23/d36/d76/fa5 53 1
2026-03-10T08:55:29.390 INFO:tasks.workunit.client.1.vm08.stdout:4/587: fdatasync d5/fba 0
2026-03-10T08:55:29.390 INFO:tasks.workunit.client.1.vm08.stdout:4/588: chown d5/d23/d36/c88 5814 1
2026-03-10T08:55:29.393 INFO:tasks.workunit.client.1.vm08.stdout:4/589: creat d5/d23/d36/d99/db2/d5d/dae/fd1 x:0 0 0
2026-03-10T08:55:29.394 INFO:tasks.workunit.client.0.vm05.stdout:7/297: getdents d18/d1b/d1f 0
2026-03-10T08:55:29.394 INFO:tasks.workunit.client.0.vm05.stdout:7/298: stat d18/d1b/d1f/d25/d2e/d32/c3e 0
2026-03-10T08:55:29.397 INFO:tasks.workunit.client.0.vm05.stdout:7/299: dwrite d18/d1b/d1f/f3f [0,4194304] 0
2026-03-10T08:55:29.398 INFO:tasks.workunit.client.0.vm05.stdout:7/300: chown d18/d38 0 1
2026-03-10T08:55:29.399 INFO:tasks.workunit.client.0.vm05.stdout:7/301: write d18/d1b/d1f/d25/d2e/f49 [3398389,55905] 0
2026-03-10T08:55:29.403 INFO:tasks.workunit.client.0.vm05.stdout:7/302: dwrite d18/f24 [0,4194304] 0
2026-03-10T08:55:29.405 INFO:tasks.workunit.client.0.vm05.stdout:7/303: chown d18/d1b/d1f/d25/c2a 203984 1
2026-03-10T08:55:29.405 INFO:tasks.workunit.client.0.vm05.stdout:7/304: read - d18/d1b/d1f/d25/d2e/f48 zero size
2026-03-10T08:55:29.409 INFO:tasks.workunit.client.0.vm05.stdout:7/305: chown d18/d1b/c29 607666 1
2026-03-10T08:55:29.415 INFO:tasks.workunit.client.1.vm08.stdout:4/590: dread d5/f77 [0,4194304] 0
2026-03-10T08:55:29.416 INFO:tasks.workunit.client.1.vm08.stdout:4/591: fsync d5/d23/d49/f4d 0
2026-03-10T08:55:29.463 INFO:tasks.workunit.client.0.vm05.stdout:8/326: creat d2/dd/d2c/d2e/f7d x:0 0 0
2026-03-10T08:55:29.463 INFO:tasks.workunit.client.0.vm05.stdout:8/327: fdatasync d2/dd/d2c/f34 0
2026-03-10T08:55:29.465 INFO:tasks.workunit.client.0.vm05.stdout:8/328: dread d2/fa [4194304,4194304] 0
2026-03-10T08:55:29.466 INFO:tasks.workunit.client.0.vm05.stdout:8/329: chown d2/db/c10 255 1
2026-03-10T08:55:29.468 INFO:tasks.workunit.client.1.vm08.stdout:0/518: rename d6/dd/d13/f5e to d6/dd/d13/d17/d1f/d20/d2f/d24/fab 0
2026-03-10T08:55:29.469 INFO:tasks.workunit.client.0.vm05.stdout:6/405: mkdir d4/d8d 0
2026-03-10T08:55:29.469 INFO:tasks.workunit.client.1.vm08.stdout:0/519: write d6/f5f [3858937,25520] 0
2026-03-10T08:55:29.474 INFO:tasks.workunit.client.0.vm05.stdout:8/330: mknod d2/dd/d74/c7e 0
2026-03-10T08:55:29.475 INFO:tasks.workunit.client.1.vm08.stdout:2/601: dwrite d1/d5b/f80 [0,4194304] 0
2026-03-10T08:55:29.476 INFO:tasks.workunit.client.1.vm08.stdout:7/611: rename d0/d14/d43/l7f to d0/d14/d43/d9d/lc6 0
2026-03-10T08:55:29.477 INFO:tasks.workunit.client.1.vm08.stdout:0/520: write d6/dd/d13/d17/d1f/da3/fa7 [1874602,26387] 0
2026-03-10T08:55:29.479 INFO:tasks.workunit.client.0.vm05.stdout:8/331: dwrite d2/dd/d2c/d2e/f3b [4194304,4194304] 0
2026-03-10T08:55:29.479 INFO:tasks.workunit.client.1.vm08.stdout:7/612: write d0/d11/d1f/d29/d3d/d89/f8b [4546461,53399] 0
2026-03-10T08:55:29.480 INFO:tasks.workunit.client.1.vm08.stdout:7/613: stat d0/d11/d4a/c52 0
2026-03-10T08:55:29.495 INFO:tasks.workunit.client.1.vm08.stdout:7/614: creat d0/d11/d1f/d29/d3b/fc7 x:0 0 0
2026-03-10T08:55:29.496 INFO:tasks.workunit.client.0.vm05.stdout:8/332: rmdir d2/dd/d2c/d2e/d31/d4c/d55 0
2026-03-10T08:55:29.496 INFO:tasks.workunit.client.0.vm05.stdout:8/333: chown d2/dd/d74 440 1
2026-03-10T08:55:29.497 INFO:tasks.workunit.client.0.vm05.stdout:8/334: write d2/dd/d2c/d2e/f5a [958158,98997] 0
2026-03-10T08:55:29.497 INFO:tasks.workunit.client.0.vm05.stdout:8/335: chown d2 95598080 1
2026-03-10T08:55:29.499 INFO:tasks.workunit.client.1.vm08.stdout:7/615: rename d0/d11/d1f/d29/d36/d75/fab to d0/d11/d1f/d29/d36/d75/fc8 0
2026-03-10T08:55:29.499 INFO:tasks.workunit.client.0.vm05.stdout:8/336: rmdir d2/dd/d2c/d2e/d31/d4f 39
2026-03-10T08:55:29.500 INFO:tasks.workunit.client.0.vm05.stdout:8/337: creat d2/dd/d2c/d2e/d31/d4c/f7f x:0 0 0
2026-03-10T08:55:29.501 INFO:tasks.workunit.client.0.vm05.stdout:8/338: fdatasync d2/dd/d2c/d2e/d31/d3e/f6b 0
2026-03-10T08:55:29.504 INFO:tasks.workunit.client.1.vm08.stdout:7/616: dwrite d0/d11/d1f/d29/d3d/d40/f38 [4194304,4194304] 0
2026-03-10T08:55:29.505 INFO:tasks.workunit.client.0.vm05.stdout:8/339: dwrite d2/dd/d2c/d2e/d31/d3e/f73 [0,4194304] 0
2026-03-10T08:55:29.506 INFO:tasks.workunit.client.0.vm05.stdout:8/340: dread - d2/dd/d2c/d2e/f64 zero size
2026-03-10T08:55:29.507 INFO:tasks.workunit.client.0.vm05.stdout:8/341: dread - d2/dd/d2c/d2e/d31/d3e/f6b zero size
2026-03-10T08:55:29.510 INFO:tasks.workunit.client.0.vm05.stdout:8/342: rmdir d2/dd/d2c/d2e/d31/d4c/d6e 0
2026-03-10T08:55:29.512 INFO:tasks.workunit.client.1.vm08.stdout:7/617: dread - d0/d11/d1f/d29/d3b/fc7 zero size
2026-03-10T08:55:29.524 INFO:tasks.workunit.client.0.vm05.stdout:8/343: mkdir d2/dd/d2c/d2e/d31/d4f/d80 0
2026-03-10T08:55:29.524 INFO:tasks.workunit.client.0.vm05.stdout:8/344: dread - d2/db/d1f/f53 zero size
2026-03-10T08:55:29.526 INFO:tasks.workunit.client.1.vm08.stdout:7/618: mknod d0/d11/cc9 0
2026-03-10T08:55:29.528 INFO:tasks.workunit.client.1.vm08.stdout:8/637: write d1/d10/fac [4858763,104816] 0
2026-03-10T08:55:29.529 INFO:tasks.workunit.client.0.vm05.stdout:2/323: write d0/d9/f1b [2237720,113096] 0
2026-03-10T08:55:29.532 INFO:tasks.workunit.client.1.vm08.stdout:8/638: symlink d1/d10/d9/dd/d25/lf0 0
2026-03-10T08:55:29.533 INFO:tasks.workunit.client.1.vm08.stdout:8/639: dread - d1/d10/d9/dd/d25/d27/d44/fa7 zero size
2026-03-10T08:55:29.535 INFO:tasks.workunit.client.0.vm05.stdout:4/389: dwrite d0/f10 [0,4194304] 0
2026-03-10T08:55:29.539 INFO:tasks.workunit.client.0.vm05.stdout:4/390: dwrite d0/d2e/d42/d45/d4a/f47 [4194304,4194304] 0
2026-03-10T08:55:29.541 INFO:tasks.workunit.client.0.vm05.stdout:4/391: chown d0/d2c/d6a/l77 11971 1
2026-03-10T08:55:29.542 INFO:tasks.workunit.client.0.vm05.stdout:8/345: dread d2/db/f1b [0,4194304] 0
2026-03-10T08:55:29.547 INFO:tasks.workunit.client.1.vm08.stdout:8/640: readlink d1/l87 0
2026-03-10T08:55:29.550 INFO:tasks.workunit.client.0.vm05.stdout:8/346: dwrite d2/dd/d2c/d2e/f37 [0,4194304] 0
2026-03-10T08:55:29.554 INFO:tasks.workunit.client.0.vm05.stdout:4/392: chown d0/l5 150198 1
2026-03-10T08:55:29.560 INFO:tasks.workunit.client.1.vm08.stdout:8/641: getdents d1/d10 0
2026-03-10T08:55:29.560 INFO:tasks.workunit.client.0.vm05.stdout:2/324: link d0/d9/d1e/d20/d21/f35 d0/d9/d1e/d20/d21/d45/d4b/f58 0
2026-03-10T08:55:29.560 INFO:tasks.workunit.client.0.vm05.stdout:2/325: read d0/d9/f19 [2988200,45951] 0
2026-03-10T08:55:29.560 INFO:tasks.workunit.client.0.vm05.stdout:8/347: mknod d2/db/d1f/d67/c81 0
2026-03-10T08:55:29.560 INFO:tasks.workunit.client.0.vm05.stdout:4/393: unlink d0/d1d/d30/d49/d58/f6e 0
2026-03-10T08:55:29.562 INFO:tasks.workunit.client.0.vm05.stdout:8/348: symlink d2/db/d1f/l82 0
2026-03-10T08:55:29.563 INFO:tasks.workunit.client.0.vm05.stdout:8/349: write d2/db/d47/f51 [20270,92972] 0
2026-03-10T08:55:29.563 INFO:tasks.workunit.client.1.vm08.stdout:8/642: chown d1/d10/d9/d4d/le9 795 1
2026-03-10T08:55:29.563 INFO:tasks.workunit.client.1.vm08.stdout:8/643: readlink d1/d10/ld2 0
2026-03-10T08:55:29.569 INFO:tasks.workunit.client.0.vm05.stdout:8/350: dwrite d2/dd/f1a [0,4194304] 0
2026-03-10T08:55:29.570 INFO:tasks.workunit.client.0.vm05.stdout:8/351: fdatasync d2/dd/d2c/d2e/d31/d4c/d63/f6c 0
2026-03-10T08:55:29.575 INFO:tasks.workunit.client.1.vm08.stdout:8/644: rename d1/d10/l20 to d1/d4f/d60/dbf/lf1 0
2026-03-10T08:55:29.579 INFO:tasks.workunit.client.1.vm08.stdout:0/521: dread d6/dd/d13/d17/f1d [0,4194304] 0
2026-03-10T08:55:29.587 INFO:tasks.workunit.client.1.vm08.stdout:0/522: chown d6/dd/d13/d61/f86 56298898 1
2026-03-10T08:55:29.587 INFO:tasks.workunit.client.1.vm08.stdout:0/523: dwrite d6/fa [0,4194304] 0
2026-03-10T08:55:29.587 INFO:tasks.workunit.client.0.vm05.stdout:4/394: symlink d0/d55/l80 0
2026-03-10T08:55:29.587 INFO:tasks.workunit.client.0.vm05.stdout:2/326: creat d0/d9/d1e/f59 x:0 0 0
2026-03-10T08:55:29.587 INFO:tasks.workunit.client.0.vm05.stdout:4/395: dwrite d0/d1d/d30/d32/f72 [0,4194304] 0
2026-03-10T08:55:29.592 INFO:tasks.workunit.client.0.vm05.stdout:8/352: dread d2/dd/d2c/f30 [0,4194304] 0
2026-03-10T08:55:29.593 INFO:tasks.workunit.client.0.vm05.stdout:8/353: write d2/dd/d2c/d2e/f64 [611128,24604] 0
2026-03-10T08:55:29.601 INFO:tasks.workunit.client.1.vm08.stdout:0/524: creat d6/dd/d13/d17/d50/fac x:0 0 0
2026-03-10T08:55:29.602 INFO:tasks.workunit.client.0.vm05.stdout:4/396: rename d0/d1d/d30/d32/d41/c54 to d0/d1d/d30/d49/d4f/d5b/c81 0
2026-03-10T08:55:29.602 INFO:tasks.workunit.client.0.vm05.stdout:4/397: stat d0/d55 0
2026-03-10T08:55:29.603 INFO:tasks.workunit.client.1.vm08.stdout:8/645: dread d1/d10/d9/dd/d9a/f9d [0,4194304] 0
2026-03-10T08:55:29.605 INFO:tasks.workunit.client.0.vm05.stdout:4/398: dread d0/d1d/d30/d49/d4f/d5b/f70 [0,4194304] 0
2026-03-10T08:55:29.606 INFO:tasks.workunit.client.0.vm05.stdout:2/327: symlink d0/d9/d1e/d20/d21/d45/d4b/l5a 0
2026-03-10T08:55:29.613 INFO:tasks.workunit.client.1.vm08.stdout:0/525: creat d6/dd/d13/d17/d1f/d2d/d39/fad x:0 0 0
2026-03-10T08:55:29.613 INFO:tasks.workunit.client.0.vm05.stdout:2/328: truncate d0/f10 3303868 0
2026-03-10T08:55:29.613 INFO:tasks.workunit.client.0.vm05.stdout:1/458: write dd/d10/d18/d20/f6c [1692693,14873] 0
2026-03-10T08:55:29.613 INFO:tasks.workunit.client.0.vm05.stdout:1/459: write dd/d10/d18/d2d/f84 [578654,111059] 0
2026-03-10T08:55:29.615 INFO:tasks.workunit.client.1.vm08.stdout:0/526: creat d6/dd/d13/d17/d1f/d2d/d38/fae x:0 0 0
2026-03-10T08:55:29.617 INFO:tasks.workunit.client.0.vm05.stdout:4/399: mkdir d0/d2e/d71/d7c/d82 0
2026-03-10T08:55:29.621 INFO:tasks.workunit.client.0.vm05.stdout:2/329: unlink d0/d9/d1e/c4a 0
2026-03-10T08:55:29.626 INFO:tasks.workunit.client.1.vm08.stdout:0/527: creat d6/dd/d13/d61/d6f/faf x:0 0 0
2026-03-10T08:55:29.627 INFO:tasks.workunit.client.1.vm08.stdout:0/528: stat d6/dd/d13/d17/d1f/d20/d2f/l33 0
2026-03-10T08:55:29.629 INFO:tasks.workunit.client.0.vm05.stdout:1/460: mknod dd/d10/ca8 0
2026-03-10T08:55:29.631 INFO:tasks.workunit.client.0.vm05.stdout:4/400: mknod d0/d2e/d42/d45/c83 0
2026-03-10T08:55:29.631 INFO:tasks.workunit.client.0.vm05.stdout:4/401: chown d0/d2e/d42/d45/d4a 0 1
2026-03-10T08:55:29.633 INFO:tasks.workunit.client.1.vm08.stdout:6/591: write d9/dc/d11/d23/f8a [1146889,8953] 0
2026-03-10T08:55:29.633 INFO:tasks.workunit.client.1.vm08.stdout:3/536: write d4/d15/d8/d2c/d9b/d79/d20/f8b [986148,72093] 0
2026-03-10T08:55:29.634 INFO:tasks.workunit.client.0.vm05.stdout:5/289: dwrite d5/df/d12/d24/d2c/d41/f4d [0,4194304] 0
2026-03-10T08:55:29.638 INFO:tasks.workunit.client.1.vm08.stdout:3/537: chown d4/d15/d8/d2c/d9b/d79/d8f/f91 119586509 1
2026-03-10T08:55:29.638 INFO:tasks.workunit.client.0.vm05.stdout:2/330: dread d0/f10 [0,4194304] 0
2026-03-10T08:55:29.644 INFO:tasks.workunit.client.1.vm08.stdout:3/538: mknod d4/d15/d8/d2c/d55/cb7 0
2026-03-10T08:55:29.645 INFO:tasks.workunit.client.0.vm05.stdout:1/461: mkdir dd/d55/da9 0
2026-03-10T08:55:29.646 INFO:tasks.workunit.client.0.vm05.stdout:4/402: creat d0/d1d/d30/d49/d4f/d5b/f84 x:0 0 0
2026-03-10T08:55:29.648 INFO:tasks.workunit.client.1.vm08.stdout:6/592: rename d9/d10/d1e/d4c/d69/da2/fc4 to d9/d10/d1e/d92/fcc 0
2026-03-10T08:55:29.652 INFO:tasks.workunit.client.0.vm05.stdout:1/462: creat dd/d21/d37/d7c/faa x:0 0 0
2026-03-10T08:55:29.654 INFO:tasks.workunit.client.1.vm08.stdout:1/614: write d1/da/de/d24/d35/f64 [1573501,120826] 0
2026-03-10T08:55:29.655 INFO:tasks.workunit.client.0.vm05.stdout:5/290: getdents d5/d3a/d43 0
2026-03-10T08:55:29.658 INFO:tasks.workunit.client.0.vm05.stdout:9/256: dwrite d6/d19/f29 [0,4194304] 0
2026-03-10T08:55:29.658 INFO:tasks.workunit.client.0.vm05.stdout:1/463: write dd/d21/d37/f8c [85667,95285] 0
2026-03-10T08:55:29.664 INFO:tasks.workunit.client.1.vm08.stdout:5/532: dwrite d0/d1b/f77 [0,4194304] 0
2026-03-10T08:55:29.668 INFO:tasks.workunit.client.1.vm08.stdout:1/615: creat d1/da/de/d24/d26/fda x:0 0 0
2026-03-10T08:55:29.670 INFO:tasks.workunit.client.1.vm08.stdout:9/505: dwrite d2/dd/d15/d1e/d25/d32/d5c/f7f [0,4194304] 0
2026-03-10T08:55:29.674 INFO:tasks.workunit.client.0.vm05.stdout:4/403: getdents d0/d1d/d30/d49/d4f 0
2026-03-10T08:55:29.675 INFO:tasks.workunit.client.0.vm05.stdout:5/291: symlink d5/df/d12/d24/d2c/l5f 0
2026-03-10T08:55:29.677 INFO:tasks.workunit.client.0.vm05.stdout:4/404: creat d0/d1d/d30/d49/d58/d66/d79/f85 x:0 0 0
2026-03-10T08:55:29.678 INFO:tasks.workunit.client.0.vm05.stdout:5/292: mkdir d5/d3a/d43/d60 0
2026-03-10T08:55:29.682 INFO:tasks.workunit.client.1.vm08.stdout:5/533: mkdir d0/d11/d27/d68/d7c/d4b/d4e/da5 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:1/616: creat d1/da/de/d5c/fdb x:0 0 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:9/506: creat d2/dd/d15/d1e/d24/f9e x:0 0 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:1/617: creat d1/da/de/d24/d3d/d40/d8e/dd2/fdc x:0 0 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:9/507: creat d2/f9f x:0 0 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:1/618: unlink d1/da/de/d24/d35/f64 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:1/619: dread d1/da/d20/d3f/d49/d9c/fd1 [0,4194304] 0
2026-03-10T08:55:29.702 INFO:tasks.workunit.client.1.vm08.stdout:1/620: rename d1/da/de/d24/d3d/d40/d8e/dd2/fbf to d1/da/de/dcf/fdd 0
2026-03-10T08:55:29.736 INFO:tasks.workunit.client.0.vm05.stdout:9/257: sync
2026-03-10T08:55:29.738 INFO:tasks.workunit.client.0.vm05.stdout:9/258: fsync d6/d15/f4f 0
2026-03-10T08:55:29.739 INFO:tasks.workunit.client.1.vm08.stdout:5/534: dread d0/d11/d27/d50/f55 [0,4194304] 0
2026-03-10T08:55:29.739 INFO:tasks.workunit.client.0.vm05.stdout:9/259: mkdir d6/d15/d3c/d4b/d55 0
2026-03-10T08:55:29.751 INFO:tasks.workunit.client.0.vm05.stdout:9/260: dread d6/fb [0,4194304] 0
2026-03-10T08:55:29.756 INFO:tasks.workunit.client.1.vm08.stdout:5/535: read d0/d11/d3e/f4d [1163648,69710] 0
2026-03-10T08:55:29.757 INFO:tasks.workunit.client.1.vm08.stdout:5/536: dread - d0/d11/d27/d50/f9d zero size
2026-03-10T08:55:29.761 INFO:tasks.workunit.client.1.vm08.stdout:5/537: write d0/d11/d18/d52/f7d [3890526,72102] 0
2026-03-10T08:55:29.761 INFO:tasks.workunit.client.1.vm08.stdout:5/538: dread - d0/fa4 zero size
2026-03-10T08:55:29.765 INFO:tasks.workunit.client.1.vm08.stdout:5/539: symlink d0/d11/d3e/la6 0
2026-03-10T08:55:29.768 INFO:tasks.workunit.client.1.vm08.stdout:5/540: unlink d0/d11/f2d 0
2026-03-10T08:55:29.777 INFO:tasks.workunit.client.1.vm08.stdout:5/541: dwrite d0/d11/d18/d52/f7d [0,4194304] 0
2026-03-10T08:55:29.784 INFO:tasks.workunit.client.0.vm05.stdout:3/359: truncate d9/d4d/f52 422610 0
2026-03-10T08:55:29.786 INFO:tasks.workunit.client.0.vm05.stdout:3/360: symlink d9/d4d/d51/d64/l66 0
2026-03-10T08:55:29.787 INFO:tasks.workunit.client.0.vm05.stdout:3/361: stat d9/c30 0
2026-03-10T08:55:29.790 INFO:tasks.workunit.client.0.vm05.stdout:3/362: link d9/f28 d9/d4d/d51/f67 0
2026-03-10T08:55:29.791 INFO:tasks.workunit.client.0.vm05.stdout:3/363: fdatasync d9/d2b/d2f/f3f 0
2026-03-10T08:55:29.792 INFO:tasks.workunit.client.0.vm05.stdout:3/364: creat d9/d2b/d3a/f68 x:0 0 0
2026-03-10T08:55:29.793 INFO:tasks.workunit.client.0.vm05.stdout:3/365: chown d9/c21 24423 1
2026-03-10T08:55:29.793 INFO:tasks.workunit.client.0.vm05.stdout:3/366: fsync d9/d2b/f2d 0
2026-03-10T08:55:29.796 INFO:tasks.workunit.client.0.vm05.stdout:3/367: creat d9/d2b/d53/d61/f69 x:0 0 0
2026-03-10T08:55:29.798 INFO:tasks.workunit.client.0.vm05.stdout:3/368: dread d9/f29 [0,4194304] 0
2026-03-10T08:55:29.805 INFO:tasks.workunit.client.0.vm05.stdout:0/360: write df/f17 [1326705,40884] 0
2026-03-10T08:55:29.808 INFO:tasks.workunit.client.0.vm05.stdout:0/361: symlink df/d18/d19/l64 0
2026-03-10T08:55:29.854 INFO:tasks.workunit.client.0.vm05.stdout:7/306: truncate f15 3206289 0
2026-03-10T08:55:29.858 INFO:tasks.workunit.client.0.vm05.stdout:7/307: dwrite d18/d1b/d1f/d25/d2e/d42/f46 [0,4194304] 0
2026-03-10T08:55:29.862 INFO:tasks.workunit.client.1.vm08.stdout:4/592: dwrite d5/d23/d36/d99/db2/d5a/d69/f6e [0,4194304] 0
2026-03-10T08:55:29.866 INFO:tasks.workunit.client.0.vm05.stdout:7/308: creat d18/d1b/d1f/d25/f56 x:0 0 0
2026-03-10T08:55:29.866 INFO:tasks.workunit.client.0.vm05.stdout:7/309: readlink l13 0
2026-03-10T08:55:29.866 INFO:tasks.workunit.client.0.vm05.stdout:7/310: fsync d18/d1b/f30 0
2026-03-10T08:55:29.869 INFO:tasks.workunit.client.0.vm05.stdout:8/354: getdents d2/dd/d74 0
2026-03-10T08:55:29.872 INFO:tasks.workunit.client.0.vm05.stdout:7/311: mknod d18/d1b/d1f/d25/d2e/d32/c57 0
2026-03-10T08:55:29.873 INFO:tasks.workunit.client.0.vm05.stdout:7/312: chown d18/d1b/d1f/d25/d2e/d42/f52 1911263 1
2026-03-10T08:55:29.875 INFO:tasks.workunit.client.0.vm05.stdout:6/406: write d4/d2d/f2f [1635179,71359] 0
2026-03-10T08:55:29.878 INFO:tasks.workunit.client.1.vm08.stdout:2/602: truncate d1/da/d10/d2d/f4c 4147701 0
2026-03-10T08:55:29.879 INFO:tasks.workunit.client.0.vm05.stdout:8/355: rmdir d2/dd/d2c/d2e/d31/d4f 39
2026-03-10T08:55:29.883 INFO:tasks.workunit.client.1.vm08.stdout:4/593: creat d5/d23/d36/fd2 x:0 0 0
2026-03-10T08:55:29.884 INFO:tasks.workunit.client.0.vm05.stdout:7/313: unlink f4 0
2026-03-10T08:55:29.885 INFO:tasks.workunit.client.0.vm05.stdout:7/314: chown d18/d1b/d1f/d25/d2e/d42 1257 1
2026-03-10T08:55:29.886 INFO:tasks.workunit.client.0.vm05.stdout:7/315: dread - d18/d38/f55 zero size
2026-03-10T08:55:29.889 INFO:tasks.workunit.client.0.vm05.stdout:6/407: rename d4/d7/d10/d1a/d1f/l69 to d4/d2c/d84/d4a/l8e 0
2026-03-10T08:55:29.893 INFO:tasks.workunit.client.0.vm05.stdout:6/408: dwrite d4/d2c/d84/d4a/f76 [0,4194304] 0
2026-03-10T08:55:29.898 INFO:tasks.workunit.client.0.vm05.stdout:6/409: stat d4/d7/d10/d15/d1b/d22/f56 0
2026-03-10T08:55:29.898 INFO:tasks.workunit.client.0.vm05.stdout:6/410: read - d4/d7/f4d zero size
2026-03-10T08:55:29.898 INFO:tasks.workunit.client.0.vm05.stdout:8/356: fsync d2/db/d1f/d67/f79 0
2026-03-10T08:55:29.903 INFO:tasks.workunit.client.1.vm08.stdout:2/603: dread d1/da/d10/d42/d93/d22/f45 [0,4194304] 0
2026-03-10T08:55:29.908 INFO:tasks.workunit.client.1.vm08.stdout:7/619: dwrite d0/d14/f72 [4194304,4194304] 0
2026-03-10T08:55:29.926 INFO:tasks.workunit.client.1.vm08.stdout:7/620: chown d0/d14/d43/fa4
14839282 1 2026-03-10T08:55:29.926 INFO:tasks.workunit.client.1.vm08.stdout:2/604: symlink d1/da/d10/d1b/d6a/lbd 0 2026-03-10T08:55:29.926 INFO:tasks.workunit.client.1.vm08.stdout:4/594: link d5/d23/d49/d83/l86 d5/d5f/ld3 0 2026-03-10T08:55:29.926 INFO:tasks.workunit.client.1.vm08.stdout:7/621: symlink d0/d11/d1f/d29/d3d/d40/lca 0 2026-03-10T08:55:29.926 INFO:tasks.workunit.client.1.vm08.stdout:4/595: write d5/d23/d36/d99/db2/fab [363433,85053] 0 2026-03-10T08:55:29.926 INFO:tasks.workunit.client.1.vm08.stdout:4/596: rename d5/d23/d36/d99/db2/fab to d5/d23/d49/d83/fd4 0 2026-03-10T08:55:29.928 INFO:tasks.workunit.client.1.vm08.stdout:8/646: dwrite d1/d10/d9/dd/d25/d27/f3a [0,4194304] 0 2026-03-10T08:55:29.932 INFO:tasks.workunit.client.1.vm08.stdout:4/597: fsync d5/d23/d36/d76/fa5 0 2026-03-10T08:55:29.934 INFO:tasks.workunit.client.0.vm05.stdout:6/411: sync 2026-03-10T08:55:29.934 INFO:tasks.workunit.client.1.vm08.stdout:0/529: dwrite d6/dd/d13/d17/d1f/d20/d2f/d26/f73 [0,4194304] 0 2026-03-10T08:55:29.937 INFO:tasks.workunit.client.0.vm05.stdout:4/405: dread d0/d1d/d30/f61 [0,4194304] 0 2026-03-10T08:55:29.938 INFO:tasks.workunit.client.1.vm08.stdout:0/530: truncate d6/dd/d13/d61/d6f/fa5 728558 0 2026-03-10T08:55:29.939 INFO:tasks.workunit.client.0.vm05.stdout:6/412: dwrite d4/f6c [0,4194304] 0 2026-03-10T08:55:29.947 INFO:tasks.workunit.client.1.vm08.stdout:4/598: creat d5/d23/d49/d83/fd5 x:0 0 0 2026-03-10T08:55:29.952 INFO:tasks.workunit.client.1.vm08.stdout:4/599: chown d5/d23/d36/d99/db2/f45 59628 1 2026-03-10T08:55:29.952 INFO:tasks.workunit.client.1.vm08.stdout:3/539: write d4/d15/fa [2593795,17293] 0 2026-03-10T08:55:29.954 INFO:tasks.workunit.client.1.vm08.stdout:0/531: creat d6/dd/d13/d17/d1f/da3/fb0 x:0 0 0 2026-03-10T08:55:29.956 INFO:tasks.workunit.client.0.vm05.stdout:6/413: truncate d4/d2c/d84/f3a 382607 0 2026-03-10T08:55:29.960 INFO:tasks.workunit.client.1.vm08.stdout:4/600: fsync d5/d23/d36/d99/db2/d5a/d69/f97 0 2026-03-10T08:55:29.964 
INFO:tasks.workunit.client.0.vm05.stdout:4/406: truncate d0/f1e 584602 0 2026-03-10T08:55:29.968 INFO:tasks.workunit.client.1.vm08.stdout:4/601: fsync d5/d23/d36/fa9 0 2026-03-10T08:55:29.968 INFO:tasks.workunit.client.1.vm08.stdout:3/540: getdents d4/da9 0 2026-03-10T08:55:29.969 INFO:tasks.workunit.client.0.vm05.stdout:6/414: dread d4/d7/d10/d15/d20/f47 [0,4194304] 0 2026-03-10T08:55:29.969 INFO:tasks.workunit.client.0.vm05.stdout:6/415: chown d4/d7/f5d 109836838 1 2026-03-10T08:55:29.970 INFO:tasks.workunit.client.0.vm05.stdout:6/416: write d4/d2c/d84/d4a/f76 [2284715,97914] 0 2026-03-10T08:55:29.971 INFO:tasks.workunit.client.0.vm05.stdout:4/407: creat d0/d2e/d42/d45/d4a/f86 x:0 0 0 2026-03-10T08:55:29.972 INFO:tasks.workunit.client.1.vm08.stdout:4/602: symlink d5/d23/d49/ld6 0 2026-03-10T08:55:29.986 INFO:tasks.workunit.client.1.vm08.stdout:3/541: dread d4/d15/fc [0,4194304] 0 2026-03-10T08:55:29.989 INFO:tasks.workunit.client.1.vm08.stdout:3/542: stat d4/d15/d8/d1d/c70 0 2026-03-10T08:55:29.990 INFO:tasks.workunit.client.1.vm08.stdout:3/543: symlink d4/d15/lb8 0 2026-03-10T08:55:29.993 INFO:tasks.workunit.client.1.vm08.stdout:3/544: dwrite d4/d15/d8/d1d/d4f/fa2 [0,4194304] 0 2026-03-10T08:55:30.001 INFO:tasks.workunit.client.1.vm08.stdout:3/545: dwrite d4/d15/d8/d2c/d55/d93/fa5 [0,4194304] 0 2026-03-10T08:55:30.005 INFO:tasks.workunit.client.0.vm05.stdout:6/417: sync 2026-03-10T08:55:30.007 INFO:tasks.workunit.client.0.vm05.stdout:6/418: write d4/d7/f14 [78550,121236] 0 2026-03-10T08:55:30.011 INFO:tasks.workunit.client.0.vm05.stdout:6/419: mkdir d4/d7/d10/d8f 0 2026-03-10T08:55:30.011 INFO:tasks.workunit.client.0.vm05.stdout:6/420: readlink d4/l45 0 2026-03-10T08:55:30.012 INFO:tasks.workunit.client.0.vm05.stdout:6/421: dread - d4/d2d/d5f/f6d zero size 2026-03-10T08:55:30.015 INFO:tasks.workunit.client.1.vm08.stdout:6/593: write d9/dc/d11/d23/d2c/f97 [479719,104799] 0 2026-03-10T08:55:30.022 INFO:tasks.workunit.client.1.vm08.stdout:6/594: fsync 
d9/d10/d1e/d32/f12 0 2026-03-10T08:55:30.023 INFO:tasks.workunit.client.0.vm05.stdout:6/422: rename d4/d2d/c75 to d4/d2d/d51/d87/c90 0 2026-03-10T08:55:30.023 INFO:tasks.workunit.client.0.vm05.stdout:6/423: symlink d4/d2d/d51/d62/l91 0 2026-03-10T08:55:30.023 INFO:tasks.workunit.client.0.vm05.stdout:2/331: truncate d0/f10 3064304 0 2026-03-10T08:55:30.029 INFO:tasks.workunit.client.1.vm08.stdout:6/595: dread f5 [0,4194304] 0 2026-03-10T08:55:30.036 INFO:tasks.workunit.client.0.vm05.stdout:6/424: mkdir d4/d92 0 2026-03-10T08:55:30.036 INFO:tasks.workunit.client.0.vm05.stdout:6/425: creat d4/d7/d10/d1a/d89/f93 x:0 0 0 2026-03-10T08:55:30.036 INFO:tasks.workunit.client.0.vm05.stdout:6/426: write d4/d2c/d84/f3c [1303943,92359] 0 2026-03-10T08:55:30.036 INFO:tasks.workunit.client.0.vm05.stdout:6/427: fsync d4/d7/d10/d15/d1b/f31 0 2026-03-10T08:55:30.036 INFO:tasks.workunit.client.0.vm05.stdout:6/428: dread - d4/d7/f80 zero size 2026-03-10T08:55:30.036 INFO:tasks.workunit.client.1.vm08.stdout:6/596: link d9/dc/d11/d23/d2c/l93 d9/dc/d11/d23/d2c/d81/lcd 0 2026-03-10T08:55:30.038 INFO:tasks.workunit.client.0.vm05.stdout:6/429: creat d4/d7/d10/d15/f94 x:0 0 0 2026-03-10T08:55:30.038 INFO:tasks.workunit.client.0.vm05.stdout:6/430: fsync d4/d2d/d5f/f88 0 2026-03-10T08:55:30.040 INFO:tasks.workunit.client.0.vm05.stdout:6/431: mknod d4/d2c/d84/d4a/c95 0 2026-03-10T08:55:30.045 INFO:tasks.workunit.client.0.vm05.stdout:6/432: dwrite d4/d7/d10/f12 [0,4194304] 0 2026-03-10T08:55:30.047 INFO:tasks.workunit.client.0.vm05.stdout:6/433: write d4/d7/d10/d1a/d89/f93 [696380,130846] 0 2026-03-10T08:55:30.053 INFO:tasks.workunit.client.1.vm08.stdout:3/546: dread d4/d15/d8/d2c/d9b/d79/f80 [0,4194304] 0 2026-03-10T08:55:30.057 INFO:tasks.workunit.client.1.vm08.stdout:6/597: rename d9/d10/d1e/d4c to d9/dc/d11/d23/d2c/d7a/dce 0 2026-03-10T08:55:30.058 INFO:tasks.workunit.client.0.vm05.stdout:6/434: read d4/d7/d10/d15/d1b/f3f [819574,88286] 0 2026-03-10T08:55:30.060 
INFO:tasks.workunit.client.0.vm05.stdout:6/435: creat d4/d92/f96 x:0 0 0 2026-03-10T08:55:30.068 INFO:tasks.workunit.client.0.vm05.stdout:1/464: dwrite dd/d21/d37/f85 [0,4194304] 0 2026-03-10T08:55:30.073 INFO:tasks.workunit.client.1.vm08.stdout:9/508: write d2/dd/d15/d1e/d21/f2d [215216,20183] 0 2026-03-10T08:55:30.073 INFO:tasks.workunit.client.0.vm05.stdout:5/293: write d5/f3b [563598,118057] 0 2026-03-10T08:55:30.073 INFO:tasks.workunit.client.0.vm05.stdout:6/436: dread d4/fc [0,4194304] 0 2026-03-10T08:55:30.076 INFO:tasks.workunit.client.0.vm05.stdout:6/437: dwrite d4/d2d/d51/f7d [0,4194304] 0 2026-03-10T08:55:30.081 INFO:tasks.workunit.client.0.vm05.stdout:9/261: dwrite d6/fb [0,4194304] 0 2026-03-10T08:55:30.090 INFO:tasks.workunit.client.0.vm05.stdout:1/465: write dd/d10/d18/f82 [935608,113564] 0 2026-03-10T08:55:30.091 INFO:tasks.workunit.client.1.vm08.stdout:5/542: write d0/d11/d3e/d45/f4a [3258346,24161] 0 2026-03-10T08:55:30.091 INFO:tasks.workunit.client.1.vm08.stdout:1/621: dwrite d1/da/de/d24/d3d/d40/f42 [0,4194304] 0 2026-03-10T08:55:30.092 INFO:tasks.workunit.client.0.vm05.stdout:6/438: mknod d4/d2d/d7f/c97 0 2026-03-10T08:55:30.093 INFO:tasks.workunit.client.0.vm05.stdout:9/262: truncate d6/f16 470408 0 2026-03-10T08:55:30.096 INFO:tasks.workunit.client.0.vm05.stdout:5/294: getdents d5/df/d12/d21 0 2026-03-10T08:55:30.098 INFO:tasks.workunit.client.1.vm08.stdout:6/598: dread d9/dc/d11/d23/d2c/f49 [0,4194304] 0 2026-03-10T08:55:30.098 INFO:tasks.workunit.client.1.vm08.stdout:5/543: dwrite d0/d11/d27/d68/d7c/d4b/d4e/f89 [0,4194304] 0 2026-03-10T08:55:30.098 INFO:tasks.workunit.client.0.vm05.stdout:5/295: creat d5/df/d12/d24/f61 x:0 0 0 2026-03-10T08:55:30.098 INFO:tasks.workunit.client.0.vm05.stdout:5/296: chown d5/df/l5b 3246 1 2026-03-10T08:55:30.099 INFO:tasks.workunit.client.0.vm05.stdout:5/297: write d5/f3b [3047700,105189] 0 2026-03-10T08:55:30.116 INFO:tasks.workunit.client.1.vm08.stdout:6/599: mkdir d9/dc/d11/d23/d2c/d81/d63/dcf 0 
2026-03-10T08:55:30.116 INFO:tasks.workunit.client.1.vm08.stdout:5/544: dwrite d0/d11/d18/d52/f7d [0,4194304] 0 2026-03-10T08:55:30.116 INFO:tasks.workunit.client.1.vm08.stdout:6/600: mkdir d9/d10/dd0 0 2026-03-10T08:55:30.116 INFO:tasks.workunit.client.1.vm08.stdout:6/601: readlink d9/dc/d11/d23/d2c/d41/l38 0 2026-03-10T08:55:30.116 INFO:tasks.workunit.client.1.vm08.stdout:5/545: unlink d0/ff 0 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:5/298: dread - d5/df/d12/f59 zero size 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:9/263: read d6/d19/d21/f31 [570185,63893] 0 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:9/264: read - d6/d15/d35/f38 zero size 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:9/265: creat d6/d19/d2a/d4a/f56 x:0 0 0 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:9/266: fdatasync d6/d19/f1a 0 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:9/267: rename d6/d15/d37/l41 to d6/d15/d35/l57 0 2026-03-10T08:55:30.117 INFO:tasks.workunit.client.0.vm05.stdout:9/268: write d6/fb [1211390,49285] 0 2026-03-10T08:55:30.283 INFO:tasks.workunit.client.0.vm05.stdout:3/369: dwrite d9/f29 [0,4194304] 0 2026-03-10T08:55:30.285 INFO:tasks.workunit.client.0.vm05.stdout:0/362: dwrite df/d18/d2b/d27/d32/f44 [0,4194304] 0 2026-03-10T08:55:30.287 INFO:tasks.workunit.client.0.vm05.stdout:0/363: stat df/d18/d2b/d27/f4f 0 2026-03-10T08:55:30.292 INFO:tasks.workunit.client.0.vm05.stdout:0/364: chown df/d18/d19/d39/d4d/d50/c5e 118868717 1 2026-03-10T08:55:30.292 INFO:tasks.workunit.client.0.vm05.stdout:3/370: symlink d9/d2b/d53/l6a 0 2026-03-10T08:55:30.307 INFO:tasks.workunit.client.0.vm05.stdout:7/316: truncate d18/d1b/d1f/d25/d2e/f49 417955 0 2026-03-10T08:55:30.309 INFO:tasks.workunit.client.0.vm05.stdout:8/357: truncate d2/db/d1f/f44 1527473 0 2026-03-10T08:55:30.309 INFO:tasks.workunit.client.0.vm05.stdout:8/358: stat d2/dd/d2c/d2e 0 2026-03-10T08:55:30.316 
INFO:tasks.workunit.client.0.vm05.stdout:3/371: dwrite d9/d2b/d2f/f4b [0,4194304] 0 2026-03-10T08:55:30.318 INFO:tasks.workunit.client.0.vm05.stdout:3/372: dread - d9/d2b/d3a/f68 zero size 2026-03-10T08:55:30.322 INFO:tasks.workunit.client.0.vm05.stdout:7/317: mknod d18/d38/d43/c58 0 2026-03-10T08:55:30.323 INFO:tasks.workunit.client.0.vm05.stdout:8/359: rename d2/dd/c3d to d2/dd/d2c/d2e/d31/d4c/d63/c83 0 2026-03-10T08:55:30.324 INFO:tasks.workunit.client.1.vm08.stdout:7/622: rmdir d0 39 2026-03-10T08:55:30.324 INFO:tasks.workunit.client.1.vm08.stdout:2/605: write d1/da/d10/d42/d93/d23/f99 [183511,43788] 0 2026-03-10T08:55:30.331 INFO:tasks.workunit.client.0.vm05.stdout:8/360: creat d2/db/d1f/f84 x:0 0 0 2026-03-10T08:55:30.331 INFO:tasks.workunit.client.0.vm05.stdout:8/361: readlink d2/db/d1f/d67/l70 0 2026-03-10T08:55:30.333 INFO:tasks.workunit.client.1.vm08.stdout:2/606: mknod d1/da/d10/d42/d93/d1e/d7b/cbe 0 2026-03-10T08:55:30.334 INFO:tasks.workunit.client.1.vm08.stdout:8/647: write d1/d10/d9/dd/d13/d40/fdd [990776,112508] 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.1.vm08.stdout:7/623: rmdir d0/d11/d1f/d29/d3b/da1 39 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.1.vm08.stdout:7/624: chown d0/d11/d4a/c52 841142 1 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.1.vm08.stdout:8/648: creat d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/ff2 x:0 0 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.1.vm08.stdout:4/603: dwrite d5/f77 [0,4194304] 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.1.vm08.stdout:0/532: dwrite d6/dd/d13/d17/d1f/d20/f3e [0,4194304] 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.0.vm05.stdout:8/362: dwrite d2/f2a [0,4194304] 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.0.vm05.stdout:3/373: rename d9/f13 to d9/d2b/d3a/d43/d4f/d55/f6b 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.0.vm05.stdout:3/374: fdatasync d9/f4a 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.0.vm05.stdout:4/408: getdents 
d0/d2e/d42/d45/d4a 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.0.vm05.stdout:8/363: dread d2/dd/d2c/d2e/d31/d3e/f73 [0,4194304] 0 2026-03-10T08:55:30.348 INFO:tasks.workunit.client.0.vm05.stdout:3/375: readlink d9/l4e 0 2026-03-10T08:55:30.349 INFO:tasks.workunit.client.1.vm08.stdout:0/533: write d6/d8b/faa [895018,87804] 0 2026-03-10T08:55:30.349 INFO:tasks.workunit.client.0.vm05.stdout:3/376: mkdir d9/d2b/d3a/d6c 0 2026-03-10T08:55:30.349 INFO:tasks.workunit.client.0.vm05.stdout:4/409: getdents d0/d2e/d71/d7c/d82 0 2026-03-10T08:55:30.351 INFO:tasks.workunit.client.0.vm05.stdout:3/377: symlink d9/d2b/l6d 0 2026-03-10T08:55:30.352 INFO:tasks.workunit.client.0.vm05.stdout:3/378: stat d9/d2b/d2f/d57 0 2026-03-10T08:55:30.352 INFO:tasks.workunit.client.0.vm05.stdout:3/379: dread - d9/d2b/d2f/f5d zero size 2026-03-10T08:55:30.353 INFO:tasks.workunit.client.0.vm05.stdout:4/410: rename d0/d2e/d42/f5e to d0/d78/f87 0 2026-03-10T08:55:30.354 INFO:tasks.workunit.client.0.vm05.stdout:4/411: write d0/d2e/d42/d45/d4a/f86 [100447,108567] 0 2026-03-10T08:55:30.362 INFO:tasks.workunit.client.0.vm05.stdout:3/380: mkdir d9/d2b/d3a/d43/d6e 0 2026-03-10T08:55:30.366 INFO:tasks.workunit.client.1.vm08.stdout:7/625: symlink d0/d11/d1f/d29/lcb 0 2026-03-10T08:55:30.372 INFO:tasks.workunit.client.1.vm08.stdout:2/607: dread d1/da/d10/f7e [0,4194304] 0 2026-03-10T08:55:30.378 INFO:tasks.workunit.client.1.vm08.stdout:2/608: mknod d1/da/d10/d42/d93/d23/d9e/cbf 0 2026-03-10T08:55:30.378 INFO:tasks.workunit.client.1.vm08.stdout:2/609: truncate d1/da/d10/d1b/fac 585261 0 2026-03-10T08:55:30.378 INFO:tasks.workunit.client.1.vm08.stdout:7/626: creat d0/d11/d1f/d29/fcc x:0 0 0 2026-03-10T08:55:30.378 INFO:tasks.workunit.client.1.vm08.stdout:2/610: stat d1/da/d10/d42/d93/d22/l33 0 2026-03-10T08:55:30.380 INFO:tasks.workunit.client.1.vm08.stdout:4/604: dread d5/f8 [4194304,4194304] 0 2026-03-10T08:55:30.383 INFO:tasks.workunit.client.1.vm08.stdout:4/605: mknod d5/d23/d36/d99/cd7 0 
2026-03-10T08:55:30.384 INFO:tasks.workunit.client.1.vm08.stdout:7/627: link d0/d11/d1f/d29/d3d/d89/cb1 d0/d14/d43/d9d/ccd 0 2026-03-10T08:55:30.384 INFO:tasks.workunit.client.1.vm08.stdout:2/611: link d1/da/d10/d42/d93/d1e/c26 d1/da/d10/d42/d93/d22/cc0 0 2026-03-10T08:55:30.384 INFO:tasks.workunit.client.1.vm08.stdout:4/606: mknod d5/de/d96/cd8 0 2026-03-10T08:55:30.384 INFO:tasks.workunit.client.1.vm08.stdout:2/612: stat d1/da/d10/d2d/fb4 0 2026-03-10T08:55:30.385 INFO:tasks.workunit.client.1.vm08.stdout:4/607: mknod d5/de/cd9 0 2026-03-10T08:55:30.386 INFO:tasks.workunit.client.1.vm08.stdout:2/613: creat d1/db1/fc1 x:0 0 0 2026-03-10T08:55:30.387 INFO:tasks.workunit.client.1.vm08.stdout:4/608: creat d5/d23/d36/d99/db2/fda x:0 0 0 2026-03-10T08:55:30.388 INFO:tasks.workunit.client.1.vm08.stdout:2/614: unlink d1/da/d10/d1b/d6a/fa8 0 2026-03-10T08:55:30.388 INFO:tasks.workunit.client.1.vm08.stdout:4/609: mkdir d5/d23/d36/d99/db2/d5a/ddb 0 2026-03-10T08:55:30.389 INFO:tasks.workunit.client.1.vm08.stdout:2/615: symlink d1/d9b/d52/db3/lc2 0 2026-03-10T08:55:30.390 INFO:tasks.workunit.client.1.vm08.stdout:4/610: dread - d5/d23/d36/d99/db2/d5a/d69/fb3 zero size 2026-03-10T08:55:30.392 INFO:tasks.workunit.client.1.vm08.stdout:2/616: rename d1/da/d10/d42/d93/f8d to d1/da/fc3 0 2026-03-10T08:55:30.415 INFO:tasks.workunit.client.0.vm05.stdout:2/332: write d0/d9/d1e/d20/d21/f23 [4104824,16516] 0 2026-03-10T08:55:30.415 INFO:tasks.workunit.client.0.vm05.stdout:1/466: write dd/d21/f4c [532690,124821] 0 2026-03-10T08:55:30.415 INFO:tasks.workunit.client.1.vm08.stdout:2/617: readlink d1/da/d10/d42/lb0 0 2026-03-10T08:55:30.415 INFO:tasks.workunit.client.1.vm08.stdout:2/618: link d1/da/d10/d42/d93/d23/d9e/cbf d1/da/d10/d42/d93/cc4 0 2026-03-10T08:55:30.415 INFO:tasks.workunit.client.1.vm08.stdout:2/619: mkdir d1/d5b/dc5 0 2026-03-10T08:55:30.415 INFO:tasks.workunit.client.1.vm08.stdout:9/509: write d2/dd/d15/d1e/d39/d4e/f55 [2715323,72576] 0 2026-03-10T08:55:30.415 
INFO:tasks.workunit.client.1.vm08.stdout:3/547: dwrite d4/f44 [4194304,4194304] 0 2026-03-10T08:55:30.417 INFO:tasks.workunit.client.0.vm05.stdout:1/467: mkdir dd/d21/d37/d7c/dab 0 2026-03-10T08:55:30.417 INFO:tasks.workunit.client.0.vm05.stdout:1/468: chown dd/d21/d3f 225601 1 2026-03-10T08:55:30.419 INFO:tasks.workunit.client.1.vm08.stdout:9/510: dwrite d2/dd/d15/d1e/d21/f90 [0,4194304] 0 2026-03-10T08:55:30.419 INFO:tasks.workunit.client.0.vm05.stdout:1/469: rmdir dd/d10/d18/d2d/d5c 39 2026-03-10T08:55:30.420 INFO:tasks.workunit.client.0.vm05.stdout:1/470: write dd/d10/d18/d2d/d51/d58/f5b [1414588,56985] 0 2026-03-10T08:55:30.421 INFO:tasks.workunit.client.0.vm05.stdout:1/471: read dd/d10/d18/f8a [1323986,1906] 0 2026-03-10T08:55:30.427 INFO:tasks.workunit.client.0.vm05.stdout:2/333: link d0/d9/d1e/d20/c2e d0/d9/d1e/c5b 0 2026-03-10T08:55:30.428 INFO:tasks.workunit.client.1.vm08.stdout:3/548: rename d4/d15/d8/c13 to d4/d15/d8/d2c/d55/cb9 0 2026-03-10T08:55:30.428 INFO:tasks.workunit.client.1.vm08.stdout:9/511: unlink d2/c8b 0 2026-03-10T08:55:30.430 INFO:tasks.workunit.client.0.vm05.stdout:2/334: rename d0/l15 to d0/l5c 0 2026-03-10T08:55:30.431 INFO:tasks.workunit.client.0.vm05.stdout:2/335: symlink d0/d55/l5d 0 2026-03-10T08:55:30.432 INFO:tasks.workunit.client.1.vm08.stdout:9/512: truncate d2/f77 195989 0 2026-03-10T08:55:30.433 INFO:tasks.workunit.client.1.vm08.stdout:3/549: rename d4/d15/d8/d2c/d9b/d79/d8f/la6 to d4/d15/d8/lba 0 2026-03-10T08:55:30.435 INFO:tasks.workunit.client.1.vm08.stdout:3/550: creat d4/d15/d8/fbb x:0 0 0 2026-03-10T08:55:30.437 INFO:tasks.workunit.client.1.vm08.stdout:3/551: link d4/d15/d8/d2c/d9b/c50 d4/d15/cbc 0 2026-03-10T08:55:30.556 INFO:tasks.workunit.client.0.vm05.stdout:3/381: sync 2026-03-10T08:55:30.558 INFO:tasks.workunit.client.0.vm05.stdout:3/382: mkdir d9/d4d/d51/d6f 0 2026-03-10T08:55:30.559 INFO:tasks.workunit.client.0.vm05.stdout:3/383: write d9/f4a [4494382,118468] 0 2026-03-10T08:55:30.560 
INFO:tasks.workunit.client.0.vm05.stdout:3/384: write d9/d2b/d3a/f44 [44617,105057] 0 2026-03-10T08:55:30.569 INFO:tasks.workunit.client.0.vm05.stdout:1/472: dread dd/d10/d18/f82 [0,4194304] 0 2026-03-10T08:55:30.577 INFO:tasks.workunit.client.0.vm05.stdout:1/473: mkdir dd/d10/d18/d2d/d5c/dac 0 2026-03-10T08:55:30.686 INFO:tasks.workunit.client.0.vm05.stdout:9/269: dread d6/f3f [0,4194304] 0 2026-03-10T08:55:30.686 INFO:tasks.workunit.client.0.vm05.stdout:9/270: rmdir d6/d12 39 2026-03-10T08:55:30.687 INFO:tasks.workunit.client.0.vm05.stdout:9/271: dread d6/f3f [0,4194304] 0 2026-03-10T08:55:30.712 INFO:tasks.workunit.client.0.vm05.stdout:3/385: dread d9/d2b/f2c [0,4194304] 0 2026-03-10T08:55:30.753 INFO:tasks.workunit.client.0.vm05.stdout:9/272: dwrite d6/d19/d21/f31 [0,4194304] 0 2026-03-10T08:55:30.753 INFO:tasks.workunit.client.0.vm05.stdout:6/439: dwrite d4/f6a [0,4194304] 0 2026-03-10T08:55:30.753 INFO:tasks.workunit.client.1.vm08.stdout:1/622: sync 2026-03-10T08:55:30.755 INFO:tasks.workunit.client.0.vm05.stdout:6/440: fdatasync d4/d2d/d51/f7d 0 2026-03-10T08:55:30.760 INFO:tasks.workunit.client.0.vm05.stdout:3/386: dread f2 [0,4194304] 0 2026-03-10T08:55:30.764 INFO:tasks.workunit.client.1.vm08.stdout:1/623: fdatasync d1/da/f25 0 2026-03-10T08:55:30.768 INFO:tasks.workunit.client.0.vm05.stdout:6/441: creat d4/d7/d10/d1a/d1f/f98 x:0 0 0 2026-03-10T08:55:30.768 INFO:tasks.workunit.client.0.vm05.stdout:3/387: write d9/d2b/d3a/d43/d4f/d55/f6b [8030643,67320] 0 2026-03-10T08:55:30.770 INFO:tasks.workunit.client.0.vm05.stdout:6/442: stat d4/d2d/d5f/l74 0 2026-03-10T08:55:30.771 INFO:tasks.workunit.client.0.vm05.stdout:6/443: write d4/d2c/f86 [302889,85311] 0 2026-03-10T08:55:30.779 INFO:tasks.workunit.client.1.vm08.stdout:1/624: dwrite d1/da/de/d5c/fcc [0,4194304] 0 2026-03-10T08:55:30.789 INFO:tasks.workunit.client.1.vm08.stdout:1/625: truncate d1/da/de/d24/d35/d6d/d82/f7b 539390 0 2026-03-10T08:55:30.789 INFO:tasks.workunit.client.1.vm08.stdout:1/626: fsync 
d1/f8 0 2026-03-10T08:55:30.792 INFO:tasks.workunit.client.1.vm08.stdout:9/513: sync 2026-03-10T08:55:30.794 INFO:tasks.workunit.client.1.vm08.stdout:7/628: sync 2026-03-10T08:55:30.794 INFO:tasks.workunit.client.1.vm08.stdout:8/649: sync 2026-03-10T08:55:30.798 INFO:tasks.workunit.client.1.vm08.stdout:1/627: mkdir d1/dde 0 2026-03-10T08:55:30.799 INFO:tasks.workunit.client.1.vm08.stdout:1/628: chown d1/da/d18/f1d 14 1 2026-03-10T08:55:30.800 INFO:tasks.workunit.client.1.vm08.stdout:8/650: mknod d1/d4f/cf3 0 2026-03-10T08:55:30.803 INFO:tasks.workunit.client.1.vm08.stdout:9/514: creat d2/d41/d4c/d66/d99/fa0 x:0 0 0 2026-03-10T08:55:30.805 INFO:tasks.workunit.client.1.vm08.stdout:1/629: truncate d1/f1f 3221417 0 2026-03-10T08:55:30.807 INFO:tasks.workunit.client.1.vm08.stdout:7/629: creat d0/d11/d4a/d5e/dc3/fce x:0 0 0 2026-03-10T08:55:30.818 INFO:tasks.workunit.client.1.vm08.stdout:1/630: rmdir d1/da/d4b/d4e 39 2026-03-10T08:55:30.818 INFO:tasks.workunit.client.1.vm08.stdout:8/651: getdents d1/d10/d9/dd/d25/d27/d44/d97/d7d 0 2026-03-10T08:55:30.818 INFO:tasks.workunit.client.1.vm08.stdout:7/630: creat d0/d11/d1f/d29/fcf x:0 0 0 2026-03-10T08:55:30.819 INFO:tasks.workunit.client.1.vm08.stdout:9/515: dread d2/f35 [0,4194304] 0 2026-03-10T08:55:30.819 INFO:tasks.workunit.client.1.vm08.stdout:7/631: dread - d0/d11/d1f/fb7 zero size 2026-03-10T08:55:30.819 INFO:tasks.workunit.client.1.vm08.stdout:8/652: rmdir d1/d10/d9/dd/d13 39 2026-03-10T08:55:30.819 INFO:tasks.workunit.client.1.vm08.stdout:9/516: mknod d2/d41/d53/ca1 0 2026-03-10T08:55:30.819 INFO:tasks.workunit.client.1.vm08.stdout:9/517: chown d2/dd/l5d 7157 1 2026-03-10T08:55:30.819 INFO:tasks.workunit.client.1.vm08.stdout:7/632: dwrite d0/d11/d4a/da3/fa9 [0,4194304] 0 2026-03-10T08:55:30.822 INFO:tasks.workunit.client.0.vm05.stdout:3/388: dread d9/d2b/f2d [0,4194304] 0 2026-03-10T08:55:30.823 INFO:tasks.workunit.client.0.vm05.stdout:3/389: write d9/d4d/d51/f59 [624236,71790] 0 2026-03-10T08:55:30.825 
INFO:tasks.workunit.client.1.vm08.stdout:7/633: chown d0/d11/d1f/d2c/f30 59809 1 2026-03-10T08:55:30.825 INFO:tasks.workunit.client.1.vm08.stdout:7/634: fsync d0/d14/d43/d62/f9a 0 2026-03-10T08:55:30.827 INFO:tasks.workunit.client.1.vm08.stdout:9/518: unlink d2/dd/f16 0 2026-03-10T08:55:30.828 INFO:tasks.workunit.client.1.vm08.stdout:8/653: creat d1/d10/d9/dd/d18/d34/ff4 x:0 0 0 2026-03-10T08:55:30.828 INFO:tasks.workunit.client.0.vm05.stdout:3/390: dwrite d9/d2b/d3a/d43/d4f/d55/f6b [4194304,4194304] 0 2026-03-10T08:55:30.830 INFO:tasks.workunit.client.0.vm05.stdout:3/391: chown d9/d2b/d53/f5a 421079 1 2026-03-10T08:55:30.832 INFO:tasks.workunit.client.0.vm05.stdout:3/392: fdatasync d9/f20 0 2026-03-10T08:55:30.837 INFO:tasks.workunit.client.1.vm08.stdout:8/654: chown d1/d10/d9/dd/d25/d27/d44/d97/f9c 21081305 1 2026-03-10T08:55:30.838 INFO:tasks.workunit.client.1.vm08.stdout:8/655: stat d1/d10/d9/dd/f62 0 2026-03-10T08:55:30.840 INFO:tasks.workunit.client.1.vm08.stdout:9/519: mknod d2/dd/d15/d1e/d25/d98/d9d/ca2 0 2026-03-10T08:55:30.843 INFO:tasks.workunit.client.1.vm08.stdout:9/520: fsync d2/d41/d74/f6a 0 2026-03-10T08:55:30.845 INFO:tasks.workunit.client.1.vm08.stdout:8/656: dwrite d1/d10/d9/dd/d25/d27/d44/d89/fcd [0,4194304] 0 2026-03-10T08:55:30.847 INFO:tasks.workunit.client.1.vm08.stdout:7/635: dread d0/d11/f6a [0,4194304] 0 2026-03-10T08:55:30.863 INFO:tasks.workunit.client.1.vm08.stdout:8/657: rename d1/d10/d9/dd/d25/dca/dc6/fe8 to d1/d10/d9/dd/d13/ff5 0 2026-03-10T08:55:30.863 INFO:tasks.workunit.client.1.vm08.stdout:9/521: getdents d2/d41/d4c 0 2026-03-10T08:55:30.865 INFO:tasks.workunit.client.1.vm08.stdout:8/658: dread - d1/d10/d9/dd/d13/fa4 zero size 2026-03-10T08:55:30.865 INFO:tasks.workunit.client.1.vm08.stdout:9/522: dread - d2/d41/d4c/d66/d99/fa0 zero size 2026-03-10T08:55:30.867 INFO:tasks.workunit.client.1.vm08.stdout:9/523: truncate d2/d41/d4c/d66/d99/fa0 901293 0 2026-03-10T08:55:30.867 INFO:tasks.workunit.client.1.vm08.stdout:8/659: creat 
d1/d10/d9/dd/d18/ff6 x:0 0 0 2026-03-10T08:55:30.870 INFO:tasks.workunit.client.1.vm08.stdout:8/660: link d1/d10/d9/dd/d18/d34/dd0/cef d1/d10/d9/dd/d18/d34/dd0/cf7 0 2026-03-10T08:55:30.871 INFO:tasks.workunit.client.1.vm08.stdout:8/661: write d1/d10/d9/d8a/f99 [655748,119695] 0 2026-03-10T08:55:30.875 INFO:tasks.workunit.client.1.vm08.stdout:8/662: symlink d1/d10/d9/lf8 0 2026-03-10T08:55:30.876 INFO:tasks.workunit.client.1.vm08.stdout:8/663: mknod d1/d4f/d60/dbf/cf9 0 2026-03-10T08:55:30.877 INFO:tasks.workunit.client.1.vm08.stdout:8/664: fsync d1/d2c/fdb 0 2026-03-10T08:55:30.878 INFO:tasks.workunit.client.1.vm08.stdout:8/665: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/dd6/ffa x:0 0 0 2026-03-10T08:55:30.880 INFO:tasks.workunit.client.1.vm08.stdout:8/666: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb 0 2026-03-10T08:55:30.910 INFO:tasks.workunit.client.0.vm05.stdout:5/299: truncate d5/df/d12/d21/f1f 2726926 0 2026-03-10T08:55:30.912 INFO:tasks.workunit.client.0.vm05.stdout:5/300: creat d5/df/d12/d24/d2c/d41/f62 x:0 0 0 2026-03-10T08:55:30.912 INFO:tasks.workunit.client.0.vm05.stdout:5/301: read - d5/d3a/f4e zero size 2026-03-10T08:55:30.913 INFO:tasks.workunit.client.0.vm05.stdout:5/302: fsync d5/df/d12/d24/f25 0 2026-03-10T08:55:30.915 INFO:tasks.workunit.client.0.vm05.stdout:5/303: symlink d5/df/d12/d21/l63 0 2026-03-10T08:55:30.917 INFO:tasks.workunit.client.0.vm05.stdout:5/304: mkdir d5/d48/d64 0 2026-03-10T08:55:30.920 INFO:tasks.workunit.client.1.vm08.stdout:6/602: write d9/d13/f36 [217144,121165] 0 2026-03-10T08:55:30.920 INFO:tasks.workunit.client.1.vm08.stdout:5/546: write d0/d11/f29 [1698764,77779] 0 2026-03-10T08:55:30.921 INFO:tasks.workunit.client.0.vm05.stdout:5/305: dread d5/df/d12/d24/f25 [0,4194304] 0 2026-03-10T08:55:30.927 INFO:tasks.workunit.client.0.vm05.stdout:0/365: dwrite df/d59/f45 [0,4194304] 0 2026-03-10T08:55:30.929 INFO:tasks.workunit.client.0.vm05.stdout:0/366: dread df/d1f/f2d [0,4194304] 0 2026-03-10T08:55:30.930 
INFO:tasks.workunit.client.0.vm05.stdout:0/367: fdatasync df/d18/d2b/d27/f60 0 2026-03-10T08:55:30.934 INFO:tasks.workunit.client.1.vm08.stdout:1/631: dread d1/da/d18/d3b/faf [0,4194304] 0 2026-03-10T08:55:30.943 INFO:tasks.workunit.client.0.vm05.stdout:3/393: dwrite d9/d2b/d2f/f4b [4194304,4194304] 0 2026-03-10T08:55:30.950 INFO:tasks.workunit.client.0.vm05.stdout:8/364: truncate d2/dd/f1a 2623076 0 2026-03-10T08:55:30.955 INFO:tasks.workunit.client.0.vm05.stdout:4/412: truncate d0/d78/f87 1199030 0 2026-03-10T08:55:30.955 INFO:tasks.workunit.client.1.vm08.stdout:0/534: write d6/dd/d13/d17/f6d [1410872,126756] 0 2026-03-10T08:55:30.962 INFO:tasks.workunit.client.1.vm08.stdout:0/535: rmdir d6/dd 39 2026-03-10T08:55:30.973 INFO:tasks.workunit.client.1.vm08.stdout:1/632: rename d1/da/de/d24/d3d/d40/d56/d6b/f8f to d1/da/d18/d3a/d77/fdf 0 2026-03-10T08:55:30.974 INFO:tasks.workunit.client.1.vm08.stdout:1/633: stat d1/da/d20/d91/daa 0 2026-03-10T08:55:30.974 INFO:tasks.workunit.client.1.vm08.stdout:4/611: dwrite d5/de/f50 [0,4194304] 0 2026-03-10T08:55:30.974 INFO:tasks.workunit.client.0.vm05.stdout:0/368: chown df/d59/f3f 100867672 1 2026-03-10T08:55:30.974 INFO:tasks.workunit.client.0.vm05.stdout:0/369: readlink df/d18/d19/d39/d4d/d50/l5f 0 2026-03-10T08:55:30.974 INFO:tasks.workunit.client.0.vm05.stdout:0/370: fsync df/f12 0 2026-03-10T08:55:30.974 INFO:tasks.workunit.client.0.vm05.stdout:8/365: rmdir d2/d45 39 2026-03-10T08:55:30.979 INFO:tasks.workunit.client.0.vm05.stdout:5/306: mkdir d5/df/d12/d24/d2c/d65 0 2026-03-10T08:55:30.980 INFO:tasks.workunit.client.0.vm05.stdout:5/307: dread - d5/df/f2f zero size 2026-03-10T08:55:30.982 INFO:tasks.workunit.client.0.vm05.stdout:4/413: creat d0/d2e/d42/d45/d4a/d36/f88 x:0 0 0 2026-03-10T08:55:30.985 INFO:tasks.workunit.client.1.vm08.stdout:4/612: creat d5/d23/d49/fdc x:0 0 0 2026-03-10T08:55:30.993 INFO:tasks.workunit.client.0.vm05.stdout:4/414: creat d0/d1d/f89 x:0 0 0 2026-03-10T08:55:30.996 
INFO:tasks.workunit.client.0.vm05.stdout:2/336: write d0/d9/d1e/d20/d21/f3d [1559500,125191] 0 2026-03-10T08:55:30.996 INFO:tasks.workunit.client.0.vm05.stdout:2/337: chown d0/d9/f12 965150 1 2026-03-10T08:55:31.001 INFO:tasks.workunit.client.0.vm05.stdout:3/394: link d9/l4e d9/d4d/l70 0 2026-03-10T08:55:31.003 INFO:tasks.workunit.client.1.vm08.stdout:1/634: dread d1/da/d18/d3a/da7/fba [0,4194304] 0 2026-03-10T08:55:31.006 INFO:tasks.workunit.client.1.vm08.stdout:3/552: truncate d4/d15/d8/d1d/f6e 1554533 0 2026-03-10T08:55:31.006 INFO:tasks.workunit.client.0.vm05.stdout:1/474: dwrite dd/d10/d19/d27/f4e [0,4194304] 0 2026-03-10T08:55:31.018 INFO:tasks.workunit.client.1.vm08.stdout:3/553: truncate d4/d15/d8/d2c/d9b/f63 4513370 0 2026-03-10T08:55:31.020 INFO:tasks.workunit.client.1.vm08.stdout:3/554: fdatasync d4/d15/d8/d2c/d55/d93/fa5 0 2026-03-10T08:55:31.022 INFO:tasks.workunit.client.0.vm05.stdout:3/395: fsync d9/d2b/f34 0 2026-03-10T08:55:31.023 INFO:tasks.workunit.client.0.vm05.stdout:3/396: fsync d9/d2b/d2f/f4b 0 2026-03-10T08:55:31.025 INFO:tasks.workunit.client.0.vm05.stdout:9/273: write d6/f30 [2932640,123391] 0 2026-03-10T08:55:31.025 INFO:tasks.workunit.client.0.vm05.stdout:9/274: chown d6/d19/d2c/f54 353 1 2026-03-10T08:55:31.027 INFO:tasks.workunit.client.1.vm08.stdout:1/635: link d1/da/de/d24/d3d/d40/d56/d7a/cb7 d1/da/d20/d91/ce0 0 2026-03-10T08:55:31.028 INFO:tasks.workunit.client.0.vm05.stdout:2/338: mkdir d0/d9/d1e/d20/d24/d5e 0 2026-03-10T08:55:31.029 INFO:tasks.workunit.client.1.vm08.stdout:3/555: dwrite d4/d15/d8/d2c/d6d/f9d [0,4194304] 0 2026-03-10T08:55:31.029 INFO:tasks.workunit.client.0.vm05.stdout:6/444: dwrite d4/f11 [0,4194304] 0 2026-03-10T08:55:31.031 INFO:tasks.workunit.client.0.vm05.stdout:6/445: write d4/d7/d10/d1a/d89/f93 [1056308,105030] 0 2026-03-10T08:55:31.046 INFO:tasks.workunit.client.1.vm08.stdout:3/556: mknod d4/d6f/d85/cbd 0 2026-03-10T08:55:31.052 INFO:tasks.workunit.client.1.vm08.stdout:1/636: link d1/da/f25 
d1/da/de/d24/d26/d5d/fe1 0 2026-03-10T08:55:31.053 INFO:tasks.workunit.client.0.vm05.stdout:3/397: mkdir d9/d2b/d3a/d43/d71 0 2026-03-10T08:55:31.053 INFO:tasks.workunit.client.0.vm05.stdout:4/415: getdents d0/d1d/d30/d32 0 2026-03-10T08:55:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:30 vm08.local ceph-mon[57559]: pgmap v154: 65 pgs: 65 active+clean; 2.0 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 44 MiB/s rd, 133 MiB/s wr, 258 op/s 2026-03-10T08:55:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:30 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:31.053 INFO:tasks.workunit.client.0.vm05.stdout:4/416: chown d0/d1d/d30/d49/d4f/d5b/f70 0 1 2026-03-10T08:55:31.053 INFO:tasks.workunit.client.1.vm08.stdout:3/557: fsync d4/d15/d8/d2c/d55/f75 0 2026-03-10T08:55:31.053 INFO:tasks.workunit.client.0.vm05.stdout:2/339: stat d0/d9/f17 0 2026-03-10T08:55:31.054 INFO:tasks.workunit.client.0.vm05.stdout:2/340: chown d0/d9/d1e/d20/d21/f31 31303 1 2026-03-10T08:55:31.056 INFO:tasks.workunit.client.0.vm05.stdout:6/446: chown d4/c57 8 1 2026-03-10T08:55:31.059 INFO:tasks.workunit.client.1.vm08.stdout:1/637: mkdir d1/da/de/d24/d3d/d40/de2 0 2026-03-10T08:55:31.060 INFO:tasks.workunit.client.0.vm05.stdout:3/398: creat d9/d2b/d3a/d43/d4f/d50/f72 x:0 0 0 2026-03-10T08:55:31.061 INFO:tasks.workunit.client.0.vm05.stdout:3/399: chown d9/d2b/d2f/f3f 40106 1 2026-03-10T08:55:31.063 INFO:tasks.workunit.client.1.vm08.stdout:3/558: symlink d4/lbe 0 2026-03-10T08:55:31.063 INFO:tasks.workunit.client.0.vm05.stdout:3/400: dread d9/d4d/d51/f59 [0,4194304] 0 2026-03-10T08:55:31.066 INFO:tasks.workunit.client.0.vm05.stdout:4/417: mknod d0/d2e/d42/d45/c8a 0 2026-03-10T08:55:31.069 INFO:tasks.workunit.client.0.vm05.stdout:4/418: dread d0/d1d/f3c [0,4194304] 0 2026-03-10T08:55:31.072 INFO:tasks.workunit.client.0.vm05.stdout:2/341: mkdir 
d0/d9/d27/d5f 0 2026-03-10T08:55:31.074 INFO:tasks.workunit.client.0.vm05.stdout:8/366: truncate d2/db/d47/f58 1022356 0 2026-03-10T08:55:31.075 INFO:tasks.workunit.client.0.vm05.stdout:3/401: readlink d9/l18 0 2026-03-10T08:55:31.075 INFO:tasks.workunit.client.0.vm05.stdout:3/402: write d9/f29 [3232476,62738] 0 2026-03-10T08:55:31.076 INFO:tasks.workunit.client.0.vm05.stdout:3/403: read d9/f20 [7626006,90296] 0 2026-03-10T08:55:31.079 INFO:tasks.workunit.client.0.vm05.stdout:4/419: fdatasync d0/d1d/d30/d32/f3e 0 2026-03-10T08:55:31.081 INFO:tasks.workunit.client.0.vm05.stdout:8/367: creat d2/dd/d2c/d2e/d31/d4c/f85 x:0 0 0 2026-03-10T08:55:31.086 INFO:tasks.workunit.client.0.vm05.stdout:8/368: dwrite d2/db/d1f/f53 [0,4194304] 0 2026-03-10T08:55:31.091 INFO:tasks.workunit.client.1.vm08.stdout:7/636: truncate d0/d11/d1f/d29/d36/d75/f85 4149670 0 2026-03-10T08:55:31.097 INFO:tasks.workunit.client.1.vm08.stdout:9/524: write d2/dd/d15/d1e/d24/f3f [3563526,60335] 0 2026-03-10T08:55:31.107 INFO:tasks.workunit.client.1.vm08.stdout:8/667: dwrite d1/d4f/d60/fc4 [0,4194304] 0 2026-03-10T08:55:31.107 INFO:tasks.workunit.client.0.vm05.stdout:2/342: rmdir d0/d9/d27/d5f 0 2026-03-10T08:55:31.107 INFO:tasks.workunit.client.0.vm05.stdout:8/369: creat d2/dd/d2c/f86 x:0 0 0 2026-03-10T08:55:31.113 INFO:tasks.workunit.client.0.vm05.stdout:2/343: link d0/d9/d1e/d20/d21/d45/d4b/f58 d0/d55/f60 0 2026-03-10T08:55:31.113 INFO:tasks.workunit.client.1.vm08.stdout:5/547: write d0/d11/d27/d68/d7c/f42 [4340560,72369] 0 2026-03-10T08:55:31.114 INFO:tasks.workunit.client.0.vm05.stdout:2/344: fsync d0/d9/d1e/d20/d21/f44 0 2026-03-10T08:55:31.115 INFO:tasks.workunit.client.1.vm08.stdout:0/536: write d6/dd/f3f [834811,42079] 0 2026-03-10T08:55:31.115 INFO:tasks.workunit.client.0.vm05.stdout:7/318: truncate f15 784859 0 2026-03-10T08:55:31.119 INFO:tasks.workunit.client.1.vm08.stdout:5/548: chown d0/d1b/d67/d7a 738 1 2026-03-10T08:55:31.147 INFO:tasks.workunit.client.0.vm05.stdout:2/345: creat d0/f61 
x:0 0 0 2026-03-10T08:55:31.147 INFO:tasks.workunit.client.0.vm05.stdout:7/319: link d18/d1b/d1f/d25/d2e/d32/f3d d18/d1b/d1f/d25/d2e/d2f/f59 0 2026-03-10T08:55:31.147 INFO:tasks.workunit.client.0.vm05.stdout:7/320: creat d18/d1b/d1f/d25/d2e/d42/f5a x:0 0 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:4/613: write d5/d23/d36/d99/db2/d5a/d69/f97 [4986288,87164] 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:6/603: dwrite d9/dc/d84/d80/f94 [0,4194304] 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:2/620: dwrite d1/da/fc3 [0,4194304] 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:7/637: stat d0/d11/db2/l8a 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:5/549: mknod d0/d1b/d67/ca7 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:4/614: mknod d5/de/cdd 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:5/550: dread - d0/d1b/f69 zero size 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:5/551: write d0/d1b/f77 [4418344,15446] 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:7/638: rename d0/d14/d2f/l9e to d0/d14/d43/d9d/dbb/ld0 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:9/525: link d2/dd/d15/d1e/d24/l27 d2/dd/d15/d4f/la3 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:6/604: rmdir d9/dc/d11/d23/d2c/d7a/dce/dbe 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:2/621: creat d1/da/d10/d1b/fc6 x:0 0 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:7/639: mkdir d0/d11/d1f/d29/d3d/dd1 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:5/552: rename d0/d11/d27/d68/d7c/d4b/d87/c8f to d0/d11/ca8 0 2026-03-10T08:55:31.148 INFO:tasks.workunit.client.1.vm08.stdout:9/526: mkdir d2/dd/d15/d1e/d21/da4 0 2026-03-10T08:55:31.150 INFO:tasks.workunit.client.1.vm08.stdout:6/605: creat d9/dc/d11/d23/d2c/d7a/fd1 x:0 0 0 
2026-03-10T08:55:31.151 INFO:tasks.workunit.client.1.vm08.stdout:9/527: creat d2/dd/d15/d4f/fa5 x:0 0 0 2026-03-10T08:55:31.151 INFO:tasks.workunit.client.1.vm08.stdout:6/606: read d9/dc/d11/d23/d2c/f3d [1313234,13279] 0 2026-03-10T08:55:31.154 INFO:tasks.workunit.client.1.vm08.stdout:9/528: fsync d2/dd/d15/f44 0 2026-03-10T08:55:31.154 INFO:tasks.workunit.client.1.vm08.stdout:6/607: mknod d9/d13/cd2 0 2026-03-10T08:55:31.155 INFO:tasks.workunit.client.1.vm08.stdout:9/529: stat d2/d41/d4c/f80 0 2026-03-10T08:55:31.155 INFO:tasks.workunit.client.1.vm08.stdout:6/608: fdatasync d9/d50/fb8 0 2026-03-10T08:55:31.155 INFO:tasks.workunit.client.1.vm08.stdout:8/668: dread d1/fdc [0,4194304] 0 2026-03-10T08:55:31.156 INFO:tasks.workunit.client.1.vm08.stdout:9/530: mkdir d2/d54/d8e/da6 0 2026-03-10T08:55:31.157 INFO:tasks.workunit.client.1.vm08.stdout:6/609: creat d9/dc/d11/d23/d2c/d7a/fd3 x:0 0 0 2026-03-10T08:55:31.159 INFO:tasks.workunit.client.1.vm08.stdout:8/669: rename d1/d10/d9/lf8 to d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/lfc 0 2026-03-10T08:55:31.164 INFO:tasks.workunit.client.1.vm08.stdout:9/531: link d2/dd/d15/d1e/d25/d32/c6f d2/dd/d15/d1e/d25/d98/d9d/ca7 0 2026-03-10T08:55:31.164 INFO:tasks.workunit.client.1.vm08.stdout:6/610: dwrite d9/dc/d11/d23/f40 [0,4194304] 0 2026-03-10T08:55:31.164 INFO:tasks.workunit.client.1.vm08.stdout:9/532: unlink d2/dd/d15/c1f 0 2026-03-10T08:55:31.164 INFO:tasks.workunit.client.1.vm08.stdout:6/611: fsync d9/d13/f36 0 2026-03-10T08:55:31.165 INFO:tasks.workunit.client.1.vm08.stdout:9/533: creat d2/d41/d4c/d66/d82/fa8 x:0 0 0 2026-03-10T08:55:31.166 INFO:tasks.workunit.client.1.vm08.stdout:6/612: mknod d9/dc/d11/d23/d2c/d81/cd4 0 2026-03-10T08:55:31.167 INFO:tasks.workunit.client.1.vm08.stdout:9/534: mknod d2/dd/d15/d1e/ca9 0 2026-03-10T08:55:31.167 INFO:tasks.workunit.client.1.vm08.stdout:9/535: stat d2/dd/d15 0 2026-03-10T08:55:31.169 INFO:tasks.workunit.client.1.vm08.stdout:6/613: creat d9/dc/d84/d80/fd5 x:0 0 0 
2026-03-10T08:55:31.174 INFO:tasks.workunit.client.0.vm05.stdout:8/370: dread d2/db/d47/f51 [0,4194304] 0 2026-03-10T08:55:31.178 INFO:tasks.workunit.client.1.vm08.stdout:9/536: mknod d2/dd/d15/d1e/d24/caa 0 2026-03-10T08:55:31.179 INFO:tasks.workunit.client.0.vm05.stdout:8/371: mkdir d2/d87 0 2026-03-10T08:55:31.182 INFO:tasks.workunit.client.0.vm05.stdout:8/372: getdents d2/db/d1f/d67 0 2026-03-10T08:55:31.182 INFO:tasks.workunit.client.1.vm08.stdout:9/537: truncate d2/f4 2723852 0 2026-03-10T08:55:31.182 INFO:tasks.workunit.client.1.vm08.stdout:9/538: fdatasync d2/dd/d15/f44 0 2026-03-10T08:55:31.184 INFO:tasks.workunit.client.0.vm05.stdout:6/447: sync 2026-03-10T08:55:31.184 INFO:tasks.workunit.client.0.vm05.stdout:3/404: sync 2026-03-10T08:55:31.185 INFO:tasks.workunit.client.0.vm05.stdout:8/373: dread d2/dd/d2c/f34 [0,4194304] 0 2026-03-10T08:55:31.189 INFO:tasks.workunit.client.0.vm05.stdout:3/405: readlink d9/d2b/l32 0 2026-03-10T08:55:31.189 INFO:tasks.workunit.client.0.vm05.stdout:6/448: symlink d4/d2d/d5f/l99 0 2026-03-10T08:55:31.189 INFO:tasks.workunit.client.1.vm08.stdout:2/622: dread d1/da/d10/d42/d93/f8f [0,4194304] 0 2026-03-10T08:55:31.189 INFO:tasks.workunit.client.0.vm05.stdout:3/406: fsync d9/f20 0 2026-03-10T08:55:31.191 INFO:tasks.workunit.client.0.vm05.stdout:3/407: truncate d9/d2b/f3b 435374 0 2026-03-10T08:55:31.203 INFO:tasks.workunit.client.1.vm08.stdout:9/539: creat d2/dd/d15/d1e/d25/d32/d5c/fab x:0 0 0 2026-03-10T08:55:31.203 INFO:tasks.workunit.client.1.vm08.stdout:9/540: rename d2/dd/lf to d2/d54/d8e/da6/lac 0 2026-03-10T08:55:31.203 INFO:tasks.workunit.client.1.vm08.stdout:2/623: mknod d1/da/d10/d1b/cc7 0 2026-03-10T08:55:31.203 INFO:tasks.workunit.client.0.vm05.stdout:3/408: dread d9/d2b/f34 [0,4194304] 0 2026-03-10T08:55:31.203 INFO:tasks.workunit.client.0.vm05.stdout:3/409: mknod d9/d4d/d51/d6f/c73 0 2026-03-10T08:55:31.203 INFO:tasks.workunit.client.0.vm05.stdout:3/410: creat d9/d2b/d3a/d6c/f74 x:0 0 0 2026-03-10T08:55:31.203 
INFO:tasks.workunit.client.0.vm05.stdout:3/411: stat d9/d2b/d3a/d43/d6e 0 2026-03-10T08:55:31.205 INFO:tasks.workunit.client.1.vm08.stdout:2/624: creat d1/d9b/fc8 x:0 0 0 2026-03-10T08:55:31.205 INFO:tasks.workunit.client.0.vm05.stdout:3/412: dread d9/d2b/f2d [0,4194304] 0 2026-03-10T08:55:31.207 INFO:tasks.workunit.client.0.vm05.stdout:3/413: write d9/d2b/d3a/d43/d4f/d50/f72 [670470,129308] 0 2026-03-10T08:55:31.208 INFO:tasks.workunit.client.1.vm08.stdout:2/625: truncate d1/da/d10/d42/d93/d23/d9e/fa1 77928 0 2026-03-10T08:55:31.208 INFO:tasks.workunit.client.0.vm05.stdout:3/414: mknod d9/d4d/c75 0 2026-03-10T08:55:31.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:30 vm05.local ceph-mon[49713]: pgmap v154: 65 pgs: 65 active+clean; 2.0 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 44 MiB/s rd, 133 MiB/s wr, 258 op/s 2026-03-10T08:55:31.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:30 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:31.213 INFO:tasks.workunit.client.0.vm05.stdout:3/415: symlink d9/l76 0 2026-03-10T08:55:31.214 INFO:tasks.workunit.client.0.vm05.stdout:5/308: write d5/f9 [5305325,57145] 0 2026-03-10T08:55:31.217 INFO:tasks.workunit.client.0.vm05.stdout:3/416: creat d9/d2b/d2f/d57/f77 x:0 0 0 2026-03-10T08:55:31.219 INFO:tasks.workunit.client.0.vm05.stdout:0/371: truncate df/d18/d2b/d27/d32/f44 191383 0 2026-03-10T08:55:31.223 INFO:tasks.workunit.client.0.vm05.stdout:5/309: mkdir d5/df/d12/d66 0 2026-03-10T08:55:31.224 INFO:tasks.workunit.client.0.vm05.stdout:0/372: mkdir df/d18/d2b/d65 0 2026-03-10T08:55:31.224 INFO:tasks.workunit.client.0.vm05.stdout:0/373: chown df/d18/f24 24 1 2026-03-10T08:55:31.226 INFO:tasks.workunit.client.0.vm05.stdout:5/310: mknod d5/df/d12/d21/c67 0 2026-03-10T08:55:31.235 INFO:tasks.workunit.client.0.vm05.stdout:1/475: dwrite dd/d21/d37/f39 [0,4194304] 0 
2026-03-10T08:55:31.236 INFO:tasks.workunit.client.0.vm05.stdout:5/311: mkdir d5/df/d37/d68 0 2026-03-10T08:55:31.241 INFO:tasks.workunit.client.0.vm05.stdout:3/417: dread d9/f27 [0,4194304] 0 2026-03-10T08:55:31.242 INFO:tasks.workunit.client.0.vm05.stdout:3/418: dread - d9/d2b/d3a/d6c/f74 zero size 2026-03-10T08:55:31.261 INFO:tasks.workunit.client.0.vm05.stdout:1/476: dread dd/d21/d37/d45/f47 [0,4194304] 0 2026-03-10T08:55:31.262 INFO:tasks.workunit.client.0.vm05.stdout:1/477: chown dd/d10/d18/d2d/d51/d58/fa0 15 1 2026-03-10T08:55:31.262 INFO:tasks.workunit.client.0.vm05.stdout:1/478: chown dd/d10/d19/d9b/la6 2356 1 2026-03-10T08:55:31.264 INFO:tasks.workunit.client.0.vm05.stdout:5/312: creat d5/d48/f69 x:0 0 0 2026-03-10T08:55:31.264 INFO:tasks.workunit.client.0.vm05.stdout:5/313: fdatasync d5/df/d12/f59 0 2026-03-10T08:55:31.265 INFO:tasks.workunit.client.0.vm05.stdout:0/374: rmdir df/d18/d2b/d27/d32/d4e/d5c 0 2026-03-10T08:55:31.268 INFO:tasks.workunit.client.0.vm05.stdout:5/314: mknod d5/df/c6a 0 2026-03-10T08:55:31.273 INFO:tasks.workunit.client.0.vm05.stdout:1/479: symlink dd/lad 0 2026-03-10T08:55:31.273 INFO:tasks.workunit.client.0.vm05.stdout:0/375: link df/d59/f45 df/d18/d19/d39/d4d/d50/f66 0 2026-03-10T08:55:31.273 INFO:tasks.workunit.client.0.vm05.stdout:0/376: write df/d18/f24 [5106107,646] 0 2026-03-10T08:55:31.275 INFO:tasks.workunit.client.0.vm05.stdout:0/377: truncate df/d18/d2b/d27/d32/d4e/f56 41733 0 2026-03-10T08:55:31.279 INFO:tasks.workunit.client.0.vm05.stdout:0/378: write df/f15 [1353444,7817] 0 2026-03-10T08:55:31.279 INFO:tasks.workunit.client.0.vm05.stdout:0/379: stat df/d18/d2b/d27/d32/l52 0 2026-03-10T08:55:31.279 INFO:tasks.workunit.client.0.vm05.stdout:0/380: dread df/d18/d2b/d27/d32/d4e/f56 [0,4194304] 0 2026-03-10T08:55:31.282 INFO:tasks.workunit.client.0.vm05.stdout:9/275: truncate d6/f8 1602465 0 2026-03-10T08:55:31.289 INFO:tasks.workunit.client.1.vm08.stdout:3/559: dwrite d4/d15/d8/d2c/d9b/d79/d8f/f91 [0,4194304] 0 
2026-03-10T08:55:31.289 INFO:tasks.workunit.client.0.vm05.stdout:1/480: dread dd/d10/d19/f1d [0,4194304] 0 2026-03-10T08:55:31.289 INFO:tasks.workunit.client.0.vm05.stdout:4/420: dwrite d0/fc [0,4194304] 0 2026-03-10T08:55:31.290 INFO:tasks.workunit.client.0.vm05.stdout:4/421: dread - d0/d1d/d30/d49/d58/d66/d79/f85 zero size 2026-03-10T08:55:31.293 INFO:tasks.workunit.client.0.vm05.stdout:5/315: creat d5/df/d37/d68/f6b x:0 0 0 2026-03-10T08:55:31.294 INFO:tasks.workunit.client.0.vm05.stdout:3/419: sync 2026-03-10T08:55:31.295 INFO:tasks.workunit.client.0.vm05.stdout:3/420: dread - d9/d2b/d2f/d57/f77 zero size 2026-03-10T08:55:31.308 INFO:tasks.workunit.client.0.vm05.stdout:3/421: sync 2026-03-10T08:55:31.311 INFO:tasks.workunit.client.0.vm05.stdout:0/381: fsync df/d18/f2a 0 2026-03-10T08:55:31.315 INFO:tasks.workunit.client.0.vm05.stdout:0/382: chown df 501 1 2026-03-10T08:55:31.315 INFO:tasks.workunit.client.0.vm05.stdout:9/276: rmdir d6/d19/d2a 39 2026-03-10T08:55:31.318 INFO:tasks.workunit.client.1.vm08.stdout:3/560: rename d4/d15/d8/d2c/d9b/d79/d20/c25 to d4/d15/d8/d2c/d9b/d79/cbf 0 2026-03-10T08:55:31.322 INFO:tasks.workunit.client.0.vm05.stdout:1/481: dread fa [0,4194304] 0 2026-03-10T08:55:31.326 INFO:tasks.workunit.client.0.vm05.stdout:4/422: write d0/f1 [5862646,97588] 0 2026-03-10T08:55:31.331 INFO:tasks.workunit.client.1.vm08.stdout:3/561: rmdir d4 39 2026-03-10T08:55:31.335 INFO:tasks.workunit.client.0.vm05.stdout:5/316: symlink d5/df/d12/d21/l6c 0 2026-03-10T08:55:31.342 INFO:tasks.workunit.client.0.vm05.stdout:4/423: sync 2026-03-10T08:55:31.346 INFO:tasks.workunit.client.0.vm05.stdout:9/277: mkdir d6/d19/d2c/d58 0 2026-03-10T08:55:31.346 INFO:tasks.workunit.client.0.vm05.stdout:9/278: stat f4 0 2026-03-10T08:55:31.353 INFO:tasks.workunit.client.0.vm05.stdout:1/482: creat dd/d21/d37/d45/d8d/fae x:0 0 0 2026-03-10T08:55:31.356 INFO:tasks.workunit.client.0.vm05.stdout:5/317: rename d5/df/d12/d24/d2c/d41/f62 to d5/d3a/d43/f6d 0 2026-03-10T08:55:31.358 
INFO:tasks.workunit.client.0.vm05.stdout:0/383: mkdir df/d18/d2b/d51/d67 0 2026-03-10T08:55:31.363 INFO:tasks.workunit.client.0.vm05.stdout:2/346: dwrite d0/f10 [0,4194304] 0 2026-03-10T08:55:31.368 INFO:tasks.workunit.client.0.vm05.stdout:4/424: mknod d0/d2e/d71/c8b 0 2026-03-10T08:55:31.368 INFO:tasks.workunit.client.0.vm05.stdout:4/425: chown d0/d1d/d30/d32/f3e 3 1 2026-03-10T08:55:31.369 INFO:tasks.workunit.client.0.vm05.stdout:9/279: symlink d6/d15/d3c/d4b/l59 0 2026-03-10T08:55:31.374 INFO:tasks.workunit.client.0.vm05.stdout:7/321: dwrite d18/d1b/d1f/f2d [0,4194304] 0 2026-03-10T08:55:31.386 INFO:tasks.workunit.client.0.vm05.stdout:3/422: creat d9/f78 x:0 0 0 2026-03-10T08:55:31.387 INFO:tasks.workunit.client.0.vm05.stdout:3/423: truncate d9/d2b/d3a/f44 195953 0 2026-03-10T08:55:31.387 INFO:tasks.workunit.client.0.vm05.stdout:3/424: fsync d9/d2b/f3b 0 2026-03-10T08:55:31.390 INFO:tasks.workunit.client.0.vm05.stdout:4/426: symlink d0/d2e/d71/l8c 0 2026-03-10T08:55:31.396 INFO:tasks.workunit.client.0.vm05.stdout:5/318: symlink d5/df/d12/d24/d2c/d65/l6e 0 2026-03-10T08:55:31.396 INFO:tasks.workunit.client.0.vm05.stdout:5/319: stat d5/df/d12/d21/f5a 0 2026-03-10T08:55:31.400 INFO:tasks.workunit.client.0.vm05.stdout:5/320: dwrite d5/d48/f69 [0,4194304] 0 2026-03-10T08:55:31.406 INFO:tasks.workunit.client.1.vm08.stdout:0/537: dwrite d6/dd/d13/d17/d1f/d20/d2f/f59 [0,4194304] 0 2026-03-10T08:55:31.406 INFO:tasks.workunit.client.1.vm08.stdout:0/538: fsync d6/dd/d13/d17/f1d 0 2026-03-10T08:55:31.406 INFO:tasks.workunit.client.1.vm08.stdout:0/539: dread - d6/dd/d13/d17/d50/fac zero size 2026-03-10T08:55:31.407 INFO:tasks.workunit.client.0.vm05.stdout:7/322: sync 2026-03-10T08:55:31.412 INFO:tasks.workunit.client.1.vm08.stdout:0/540: dread d6/dd/d13/d17/f6d [0,4194304] 0 2026-03-10T08:55:31.415 INFO:tasks.workunit.client.0.vm05.stdout:7/323: dread d18/f4a [0,4194304] 0 2026-03-10T08:55:31.418 INFO:tasks.workunit.client.0.vm05.stdout:5/321: dread d5/df/d12/f2a [0,4194304] 
0 2026-03-10T08:55:31.419 INFO:tasks.workunit.client.1.vm08.stdout:0/541: chown d6/dd/d13/d17/d1f/d2d/d85/d93/f7e 6149045 1 2026-03-10T08:55:31.423 INFO:tasks.workunit.client.1.vm08.stdout:4/615: write d5/d23/d36/d99/db2/d5a/d69/fb3 [538472,108527] 0 2026-03-10T08:55:31.424 INFO:tasks.workunit.client.1.vm08.stdout:5/553: write d0/f6c [1273719,2604] 0 2026-03-10T08:55:31.429 INFO:tasks.workunit.client.1.vm08.stdout:7/640: dwrite d0/d14/d43/f7b [0,4194304] 0 2026-03-10T08:55:31.433 INFO:tasks.workunit.client.0.vm05.stdout:3/425: creat d9/d2b/d3a/d43/d4f/d55/f79 x:0 0 0 2026-03-10T08:55:31.439 INFO:tasks.workunit.client.0.vm05.stdout:4/427: symlink d0/d1d/d30/d32/l8d 0 2026-03-10T08:55:31.441 INFO:tasks.workunit.client.1.vm08.stdout:4/616: symlink d5/de/lde 0 2026-03-10T08:55:31.443 INFO:tasks.workunit.client.1.vm08.stdout:4/617: stat d5/d23/d36/d76 0 2026-03-10T08:55:31.444 INFO:tasks.workunit.client.1.vm08.stdout:7/641: dwrite d0/d14/d43/fbc [0,4194304] 0 2026-03-10T08:55:31.451 INFO:tasks.workunit.client.1.vm08.stdout:8/670: write d1/d10/f23 [645891,2837] 0 2026-03-10T08:55:31.451 INFO:tasks.workunit.client.1.vm08.stdout:5/554: stat d0/d1b/l2c 0 2026-03-10T08:55:31.452 INFO:tasks.workunit.client.1.vm08.stdout:9/541: getdents d2/d41/d4c/d66/d82 0 2026-03-10T08:55:31.459 INFO:tasks.workunit.client.0.vm05.stdout:8/374: write d2/fa [2714436,67648] 0 2026-03-10T08:55:31.460 INFO:tasks.workunit.client.1.vm08.stdout:6/614: dwrite d9/d10/d1e/d32/f48 [0,4194304] 0 2026-03-10T08:55:31.461 INFO:tasks.workunit.client.1.vm08.stdout:6/615: truncate d9/d13/f70 2189834 0 2026-03-10T08:55:31.463 INFO:tasks.workunit.client.1.vm08.stdout:7/642: rename d0/d51/l6d to d0/d11/ld2 0 2026-03-10T08:55:31.468 INFO:tasks.workunit.client.0.vm05.stdout:6/449: write d4/d7/d10/d15/d1b/f3f [1287435,8150] 0 2026-03-10T08:55:31.468 INFO:tasks.workunit.client.0.vm05.stdout:7/324: mknod d18/d38/c5b 0 2026-03-10T08:55:31.468 INFO:tasks.workunit.client.1.vm08.stdout:7/643: chown d0/d14/c20 904561285 1 
2026-03-10T08:55:31.474 INFO:tasks.workunit.client.0.vm05.stdout:5/322: stat d5/f40 0 2026-03-10T08:55:31.474 INFO:tasks.workunit.client.0.vm05.stdout:5/323: write d5/df/d37/d68/f6b [826917,46026] 0 2026-03-10T08:55:31.475 INFO:tasks.workunit.client.1.vm08.stdout:9/542: creat d2/d41/d4c/d66/fad x:0 0 0 2026-03-10T08:55:31.476 INFO:tasks.workunit.client.0.vm05.stdout:3/426: chown d9/le 190823 1 2026-03-10T08:55:31.478 INFO:tasks.workunit.client.1.vm08.stdout:5/555: dwrite d0/d11/d3e/f4d [0,4194304] 0 2026-03-10T08:55:31.483 INFO:tasks.workunit.client.0.vm05.stdout:3/427: dwrite d9/f29 [0,4194304] 0 2026-03-10T08:55:31.486 INFO:tasks.workunit.client.1.vm08.stdout:2/626: dwrite d1/da/d10/d1b/f14 [0,4194304] 0 2026-03-10T08:55:31.488 INFO:tasks.workunit.client.0.vm05.stdout:2/347: rename d0/d9/d27/c4f to d0/d9/c62 0 2026-03-10T08:55:31.488 INFO:tasks.workunit.client.1.vm08.stdout:2/627: chown d1/da/d10/d42/d93/d23/d9e/cab 28 1 2026-03-10T08:55:31.489 INFO:tasks.workunit.client.0.vm05.stdout:4/428: fsync d0/d1d/d30/d49/d4f/f51 0 2026-03-10T08:55:31.493 INFO:tasks.workunit.client.1.vm08.stdout:4/618: rename d5/d23/d36/d99/db2/dbd to d5/d23/d36/d99/db2/d5d/dae/ddf 0 2026-03-10T08:55:31.494 INFO:tasks.workunit.client.0.vm05.stdout:9/280: rmdir d6/d15/d3c/d4b/d55 0 2026-03-10T08:55:31.501 INFO:tasks.workunit.client.0.vm05.stdout:8/375: stat d2/db/d28/f2d 0 2026-03-10T08:55:31.501 INFO:tasks.workunit.client.0.vm05.stdout:8/376: dread - d2/dd/f72 zero size 2026-03-10T08:55:31.504 INFO:tasks.workunit.client.1.vm08.stdout:2/628: dwrite d1/da/d10/d1b/fc6 [0,4194304] 0 2026-03-10T08:55:31.505 INFO:tasks.workunit.client.0.vm05.stdout:7/325: truncate d18/f1d 246242 0 2026-03-10T08:55:31.508 INFO:tasks.workunit.client.0.vm05.stdout:5/324: rename d5/df/d12/d24/f61 to d5/df/d12/d24/f6f 0 2026-03-10T08:55:31.508 INFO:tasks.workunit.client.0.vm05.stdout:0/384: write df/d18/f2a [3814444,109291] 0 2026-03-10T08:55:31.510 INFO:tasks.workunit.client.1.vm08.stdout:8/671: rename 
d1/d10/d9/dd/d18/d3c/f83 to d1/d10/d9/dd/d25/d27/d44/d21/dce/ffd 0 2026-03-10T08:55:31.512 INFO:tasks.workunit.client.0.vm05.stdout:3/428: chown d9/c22 514183361 1 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.1.vm08.stdout:5/556: creat d0/d11/d27/d68/d7c/d4b/d4e/d84/fa9 x:0 0 0 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.1.vm08.stdout:2/629: dread - d1/da/d10/d42/d93/f8e zero size 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.1.vm08.stdout:2/630: write d1/da/d10/d1b/f14 [378627,67563] 0 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.1.vm08.stdout:3/562: stat d4/d15/d8/d2c/d9b/d79/cbf 0 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.0.vm05.stdout:4/429: chown d0/l11 102737431 1 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.0.vm05.stdout:1/483: truncate dd/d21/f3a 8018153 0 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.0.vm05.stdout:7/326: mkdir d18/d38/d43/d5c 0 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.0.vm05.stdout:6/450: truncate d4/d7/d10/d1a/f1e 3071576 0 2026-03-10T08:55:31.522 INFO:tasks.workunit.client.0.vm05.stdout:6/451: write d4/d7/f14 [852995,54193] 0 2026-03-10T08:55:31.524 INFO:tasks.workunit.client.0.vm05.stdout:5/325: fdatasync d5/df/f2f 0 2026-03-10T08:55:31.525 INFO:tasks.workunit.client.1.vm08.stdout:5/557: creat d0/d11/d27/d68/d7c/d4b/d4e/d84/faa x:0 0 0 2026-03-10T08:55:31.525 INFO:tasks.workunit.client.0.vm05.stdout:0/385: rmdir df/d18/d2b 39 2026-03-10T08:55:31.526 INFO:tasks.workunit.client.0.vm05.stdout:3/429: mkdir d9/d2b/d3a/d43/d7a 0 2026-03-10T08:55:31.526 INFO:tasks.workunit.client.0.vm05.stdout:3/430: readlink d9/d2b/d3a/l49 0 2026-03-10T08:55:31.527 INFO:tasks.workunit.client.1.vm08.stdout:2/631: unlink d1/da/d10/d42/d93/f8e 0 2026-03-10T08:55:31.529 INFO:tasks.workunit.client.1.vm08.stdout:3/563: truncate d4/d15/d8/d2c/d9b/f4d 229926 0 2026-03-10T08:55:31.529 INFO:tasks.workunit.client.0.vm05.stdout:9/281: readlink d6/d27/l40 0 2026-03-10T08:55:31.532 
INFO:tasks.workunit.client.0.vm05.stdout:1/484: rename dd/d21/c77 to dd/d21/d37/d7c/caf 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:9/543: rename d2/d41/d4c/d89 to d2/dd/d15/d1e/d25/dae 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:2/632: unlink d1/da/d10/d42/f96 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:9/544: creat d2/dd/faf x:0 0 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:3/564: rename d4/c19 to d4/d15/d8/d2c/d9b/d79/d20/cc0 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:3/565: stat d4/d15/d8/d2c/d9b/d79/d20/l72 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:9/545: creat d2/d41/d4c/d66/fb0 x:0 0 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:2/633: symlink d1/lc9 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:3/566: mknod d4/d15/cc1 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:2/634: mkdir d1/da/d10/dca 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:3/567: dwrite d4/d15/fa [0,4194304] 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:9/546: symlink d2/dd/d15/d1e/d25/d32/d79/d85/lb1 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.1.vm08.stdout:3/568: creat d4/d15/d8/d2c/d9b/fc2 x:0 0 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:1/485: write dd/d10/d18/d2d/d51/d58/fa0 [533951,12514] 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:1/486: fsync dd/d21/d37/d45/d8d/f99 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:7/327: creat d18/d38/f5d x:0 0 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:6/452: readlink d4/d7/d10/d1a/d1f/l60 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:4/430: mknod d0/c8e 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:4/431: chown d0/d1d/d30/d49/d58/d66 44408 1 2026-03-10T08:55:31.556 
INFO:tasks.workunit.client.0.vm05.stdout:4/432: stat d0/d1d/d30/d49/d58 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:9/282: creat d6/d19/d2c/f5a x:0 0 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:6/453: unlink d4/f6a 0 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:6/454: dread - d4/d2d/d5f/f81 zero size 2026-03-10T08:55:31.556 INFO:tasks.workunit.client.0.vm05.stdout:6/455: readlink d4/d7/d10/d15/l3b 0 2026-03-10T08:55:31.557 INFO:tasks.workunit.client.0.vm05.stdout:6/456: read d4/fc [2355405,95985] 0 2026-03-10T08:55:31.557 INFO:tasks.workunit.client.0.vm05.stdout:2/348: rmdir d0/d9/d1e/d20/d24/d5e 0 2026-03-10T08:55:31.557 INFO:tasks.workunit.client.0.vm05.stdout:2/349: write d0/d9/d1e/f59 [404246,61814] 0 2026-03-10T08:55:31.557 INFO:tasks.workunit.client.0.vm05.stdout:9/283: creat d6/d15/d3c/d4b/f5b x:0 0 0 2026-03-10T08:55:31.557 INFO:tasks.workunit.client.0.vm05.stdout:8/377: getdents d2 0 2026-03-10T08:55:31.557 INFO:tasks.workunit.client.1.vm08.stdout:9/547: dwrite d2/d41/d74/f9a [0,4194304] 0 2026-03-10T08:55:31.558 INFO:tasks.workunit.client.0.vm05.stdout:7/328: symlink d18/d1b/d1f/d25/d2e/d42/d53/l5e 0 2026-03-10T08:55:31.559 INFO:tasks.workunit.client.0.vm05.stdout:4/433: dread d0/d78/f87 [0,4194304] 0 2026-03-10T08:55:31.559 INFO:tasks.workunit.client.0.vm05.stdout:4/434: chown d0/d2e/d71/c8b 62259952 1 2026-03-10T08:55:31.560 INFO:tasks.workunit.client.0.vm05.stdout:4/435: dread - d0/d2e/d42/d45/d4a/d36/f3d zero size 2026-03-10T08:55:31.562 INFO:tasks.workunit.client.0.vm05.stdout:2/350: creat d0/d9/d27/f63 x:0 0 0 2026-03-10T08:55:31.585 INFO:tasks.workunit.client.1.vm08.stdout:3/569: readlink d4/d15/d8/d2c/d9b/d79/d20/l96 0 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:3/570: chown d4/d6f/d85/f87 3836676 1 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:2/635: rename d1/da/d10/f18 to d1/da/d10/d42/d93/fcb 0 2026-03-10T08:55:31.608 
INFO:tasks.workunit.client.1.vm08.stdout:3/571: chown d4/d15/c26 71741 1 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:9/548: link d2/dd/d15/d1e/d24/caa d2/d54/cb2 0 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:9/549: fdatasync d2/dd/d15/d1e/d25/f5f 0 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:9/550: fdatasync d2/dd/d15/d1e/d24/f9e 0 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:9/551: chown d2/dd/l1d 4 1 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:9/552: dread - d2/dd/d15/d1e/d25/d32/d5c/fab zero size 2026-03-10T08:55:31.608 INFO:tasks.workunit.client.1.vm08.stdout:2/636: link d1/da/d10/d1b/d6a/la9 d1/d9b/d52/db3/lcc 0 2026-03-10T08:55:31.609 INFO:tasks.workunit.client.1.vm08.stdout:2/637: dread - d1/d5b/fba zero size 2026-03-10T08:55:31.609 INFO:tasks.workunit.client.1.vm08.stdout:9/553: mkdir d2/dd/d15/d1e/d25/d98/d9d/db3 0 2026-03-10T08:55:31.609 INFO:tasks.workunit.client.1.vm08.stdout:9/554: link d2/dd/l5d d2/dd/d15/d1e/d21/lb4 0 2026-03-10T08:55:31.609 INFO:tasks.workunit.client.1.vm08.stdout:9/555: rename d2/dd/l1d to d2/dd/d15/d1e/d39/lb5 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:2/351: fdatasync d0/d9/d27/f54 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:9/284: write d6/f3f [683457,38272] 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:5/326: getdents d5/df 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:4/436: unlink d0/d1d/d30/c6f 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/431: getdents d9/d2b/d3a/d43/d4f/d50 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:2/352: unlink d0/d9/d1e/d20/d21/f4c 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:2/353: truncate d0/d9/d1e/d20/d21/f41 4459237 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:1/487: link dd/d21/f7f dd/d10/fb0 0 2026-03-10T08:55:31.610 
INFO:tasks.workunit.client.0.vm05.stdout:7/329: creat d18/d38/d43/d5c/f5f x:0 0 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:7/330: chown d18/d1b/d1f/d25/d2e/d42/f5a 12 1 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:2/354: dread - d0/d9/d1e/d20/d21/f35 zero size 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:1/488: creat dd/d10/d18/d20/d69/fb1 x:0 0 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:1/489: truncate dd/f9e 415469 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:1/490: chown dd/f9e 223934402 1 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:4/437: fdatasync d0/fb 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/432: mkdir d9/d2b/d3a/d43/d4f/d50/d5f/d7b 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:9/285: rename d6/d19/f1a to d6/d19/f5c 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/433: creat d9/d2b/d3a/d43/d4f/d50/f7c x:0 0 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/434: dwrite d9/d2b/d53/f60 [0,4194304] 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:2/355: getdents d0/d55 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:2/356: dwrite d0/f40 [4194304,4194304] 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/435: creat d9/d2b/d3a/d43/d6e/f7d x:0 0 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/436: readlink d9/l76 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:9/286: getdents d6/d19/d2a 0 2026-03-10T08:55:31.610 INFO:tasks.workunit.client.0.vm05.stdout:3/437: mknod d9/d4d/d51/d64/c7e 0 2026-03-10T08:55:31.611 INFO:tasks.workunit.client.1.vm08.stdout:9/556: symlink d2/dd/d15/d1e/d25/d32/d5c/lb6 0 2026-03-10T08:55:31.612 INFO:tasks.workunit.client.1.vm08.stdout:2/638: dread d1/da/d10/d42/d93/d23/f37 [0,4194304] 0 2026-03-10T08:55:31.612 
INFO:tasks.workunit.client.0.vm05.stdout:9/287: unlink d6/d19/d2c/f5a 0
2026-03-10T08:55:31.612 INFO:tasks.workunit.client.1.vm08.stdout:9/557: chown d2/dd/d15/d1e/d25/d98/d9d 1604633 1
2026-03-10T08:55:31.613 INFO:tasks.workunit.client.1.vm08.stdout:9/558: write d2/dd/d15/d4f/fa5 [37099,122859] 0
2026-03-10T08:55:31.622 INFO:tasks.workunit.client.0.vm05.stdout:3/438: creat d9/d2b/d3a/d43/d7a/f7f x:0 0 0
2026-03-10T08:55:31.629 INFO:tasks.workunit.client.1.vm08.stdout:2/639: mkdir d1/d43/dcd 0
2026-03-10T08:55:31.629 INFO:tasks.workunit.client.0.vm05.stdout:3/439: dread d9/f29 [0,4194304] 0
2026-03-10T08:55:31.629 INFO:tasks.workunit.client.0.vm05.stdout:3/440: read d9/d2b/f2c [279341,26606] 0
2026-03-10T08:55:31.629 INFO:tasks.workunit.client.0.vm05.stdout:3/441: readlink d9/d2b/l47 0
2026-03-10T08:55:31.629 INFO:tasks.workunit.client.0.vm05.stdout:3/442: dread d9/d2b/d53/f60 [0,4194304] 0
2026-03-10T08:55:31.630 INFO:tasks.workunit.client.0.vm05.stdout:3/443: read d9/d2b/d53/f60 [1147276,82365] 0
2026-03-10T08:55:31.755 INFO:tasks.workunit.client.1.vm08.stdout:0/542: sync
2026-03-10T08:55:31.755 INFO:tasks.workunit.client.1.vm08.stdout:0/543: readlink d6/l9c 0
2026-03-10T08:55:31.756 INFO:tasks.workunit.client.1.vm08.stdout:0/544: chown d6 5388585 1
2026-03-10T08:55:31.760 INFO:tasks.workunit.client.1.vm08.stdout:0/545: fsync d6/dd/d13/d17/d1f/f48 0
2026-03-10T08:55:31.780 INFO:tasks.workunit.client.0.vm05.stdout:5/327: sync
2026-03-10T08:55:31.830 INFO:tasks.workunit.client.1.vm08.stdout:5/558: dread d0/fb [0,4194304] 0
2026-03-10T08:55:31.833 INFO:tasks.workunit.client.0.vm05.stdout:9/288: dread d6/d19/d21/f2f [0,4194304] 0
2026-03-10T08:55:31.836 INFO:tasks.workunit.client.0.vm05.stdout:2/357: sync
2026-03-10T08:55:31.845 INFO:tasks.workunit.client.1.vm08.stdout:5/559: dread d0/d11/d3e/d45/f4a [0,4194304] 0
2026-03-10T08:55:31.845 INFO:tasks.workunit.client.1.vm08.stdout:5/560: dread - d0/d1b/f69 zero size
2026-03-10T08:55:31.845 INFO:tasks.workunit.client.0.vm05.stdout:9/289: rmdir d6/d19/d2c/d2e 0
2026-03-10T08:55:31.845 INFO:tasks.workunit.client.0.vm05.stdout:9/290: dread d6/fe [0,4194304] 0
2026-03-10T08:55:31.845 INFO:tasks.workunit.client.0.vm05.stdout:2/358: read d0/d9/d1e/d20/d21/f31 [3728046,52038] 0
2026-03-10T08:55:31.845 INFO:tasks.workunit.client.0.vm05.stdout:9/291: mknod d6/d15/d3c/c5d 0
2026-03-10T08:55:31.850 INFO:tasks.workunit.client.0.vm05.stdout:9/292: dwrite d6/d12/d43/f47 [0,4194304] 0
2026-03-10T08:55:31.854 INFO:tasks.workunit.client.0.vm05.stdout:9/293: dread d6/d19/f5c [0,4194304] 0
2026-03-10T08:55:31.865 INFO:tasks.workunit.client.0.vm05.stdout:9/294: rmdir d6/d19 39
2026-03-10T08:55:31.867 INFO:tasks.workunit.client.0.vm05.stdout:9/295: creat d6/d12/d3a/f5e x:0 0 0
2026-03-10T08:55:31.867 INFO:tasks.workunit.client.0.vm05.stdout:9/296: fdatasync d6/f4e 0
2026-03-10T08:55:31.868 INFO:tasks.workunit.client.0.vm05.stdout:9/297: mknod d6/d19/c5f 0
2026-03-10T08:55:31.870 INFO:tasks.workunit.client.0.vm05.stdout:9/298: dread d6/d15/d37/f4c [0,4194304] 0
2026-03-10T08:55:31.870 INFO:tasks.workunit.client.0.vm05.stdout:9/299: read d6/fe [400547,47321] 0
2026-03-10T08:55:31.872 INFO:tasks.workunit.client.0.vm05.stdout:9/300: creat d6/d27/f60 x:0 0 0
2026-03-10T08:55:31.873 INFO:tasks.workunit.client.0.vm05.stdout:9/301: creat d6/d19/d2c/f61 x:0 0 0
2026-03-10T08:55:31.874 INFO:tasks.workunit.client.0.vm05.stdout:9/302: creat d6/d12/d3a/f62 x:0 0 0
2026-03-10T08:55:31.876 INFO:tasks.workunit.client.0.vm05.stdout:9/303: dread d6/d12/d43/f47 [0,4194304] 0
2026-03-10T08:55:31.907 INFO:tasks.workunit.client.0.vm05.stdout:5/328: sync
2026-03-10T08:55:31.911 INFO:tasks.workunit.client.0.vm05.stdout:5/329: link d5/l2b d5/d48/l70 0
2026-03-10T08:55:31.912 INFO:tasks.workunit.client.0.vm05.stdout:5/330: mkdir d5/df/d12/d21/d71 0
2026-03-10T08:55:31.914 INFO:tasks.workunit.client.1.vm08.stdout:7/644: fsync d0/d14/f72 0
2026-03-10T08:55:31.917 INFO:tasks.workunit.client.0.vm05.stdout:5/331: symlink d5/df/d12/d21/d71/l72 0
2026-03-10T08:55:31.917 INFO:tasks.workunit.client.0.vm05.stdout:5/332: chown d5/df/d12/d21/l63 199145 1
2026-03-10T08:55:31.917 INFO:tasks.workunit.client.0.vm05.stdout:5/333: read d5/f9 [8766264,82835] 0
2026-03-10T08:55:31.935 INFO:tasks.workunit.client.1.vm08.stdout:0/546: sync
2026-03-10T08:55:31.943 INFO:tasks.workunit.client.1.vm08.stdout:0/547: creat d6/dd/d13/d61/fb1 x:0 0 0
2026-03-10T08:55:31.944 INFO:tasks.workunit.client.1.vm08.stdout:0/548: fdatasync d6/dd/d13/d17/d50/fac 0
2026-03-10T08:55:31.944 INFO:tasks.workunit.client.1.vm08.stdout:0/549: readlink d6/dd/d13/d17/d1f/d2d/l9a 0
2026-03-10T08:55:32.006 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:32 vm08.local ceph-mon[57559]: pgmap v155: 65 pgs: 65 active+clean; 2.0 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 44 MiB/s rd, 124 MiB/s wr, 247 op/s
2026-03-10T08:55:32.042 INFO:tasks.workunit.client.1.vm08.stdout:6/616: write f1 [1516131,130944] 0
2026-03-10T08:55:32.043 INFO:tasks.workunit.client.1.vm08.stdout:4/619: write d5/d23/d36/d76/fc7 [67624,5270] 0
2026-03-10T08:55:32.044 INFO:tasks.workunit.client.1.vm08.stdout:4/620: fsync d5/f77 0
2026-03-10T08:55:32.049 INFO:tasks.workunit.client.1.vm08.stdout:1/638: dwrite d1/f1f [0,4194304] 0
2026-03-10T08:55:32.052 INFO:tasks.workunit.client.1.vm08.stdout:6/617: write d9/dc/d11/d23/f6f [3412592,20512] 0
2026-03-10T08:55:32.057 INFO:tasks.workunit.client.1.vm08.stdout:1/639: stat d1/da/de/d24/d3d/d40/d56/d6b 0
2026-03-10T08:55:32.057 INFO:tasks.workunit.client.1.vm08.stdout:8/672: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fc2 [0,4194304] 0
2026-03-10T08:55:32.064 INFO:tasks.workunit.client.1.vm08.stdout:4/621: truncate d5/d23/d36/d99/db2/d5a/d69/f8c 643242 0
2026-03-10T08:55:32.069 INFO:tasks.workunit.client.1.vm08.stdout:6/618: dwrite d9/dc/d11/d23/f40 [0,4194304] 0
2026-03-10T08:55:32.071 INFO:tasks.workunit.client.1.vm08.stdout:8/673: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfe 0
2026-03-10T08:55:32.071 INFO:tasks.workunit.client.0.vm05.stdout:0/386: truncate f6 2747890 0
2026-03-10T08:55:32.072 INFO:tasks.workunit.client.1.vm08.stdout:6/619: chown d9/dc/d11/d23/c37 5843 1
2026-03-10T08:55:32.073 INFO:tasks.workunit.client.0.vm05.stdout:1/491: rename dd/d21/d37/d7c/caf to dd/d10/d19/cb2 0
2026-03-10T08:55:32.074 INFO:tasks.workunit.client.0.vm05.stdout:1/492: write dd/d10/d18/d20/f89 [1654736,72123] 0
2026-03-10T08:55:32.075 INFO:tasks.workunit.client.0.vm05.stdout:1/493: chown dd/d21/d37/l3d 80399 1
2026-03-10T08:55:32.079 INFO:tasks.workunit.client.0.vm05.stdout:1/494: read - dd/d10/d18/d2d/d5c/f7e zero size
2026-03-10T08:55:32.085 INFO:tasks.workunit.client.1.vm08.stdout:1/640: truncate d1/da/d20/d3f/d49/f9a 5093213 0
2026-03-10T08:55:32.085 INFO:tasks.workunit.client.0.vm05.stdout:0/387: creat df/d18/d19/d47/f68 x:0 0 0
2026-03-10T08:55:32.085 INFO:tasks.workunit.client.0.vm05.stdout:0/388: stat df/d18/d19/d39/d4d/d50/f66 0
2026-03-10T08:55:32.088 INFO:tasks.workunit.client.0.vm05.stdout:0/389: chown c8 98220977 1
2026-03-10T08:55:32.102 INFO:tasks.workunit.client.0.vm05.stdout:1/495: dread fb [0,4194304] 0
2026-03-10T08:55:32.103 INFO:tasks.workunit.client.0.vm05.stdout:1/496: rename dd to dd/d10/d18/db3 22
2026-03-10T08:55:32.105 INFO:tasks.workunit.client.0.vm05.stdout:6/457: dwrite d4/fc [0,4194304] 0
2026-03-10T08:55:32.109 INFO:tasks.workunit.client.0.vm05.stdout:6/458: dread d4/d2c/d84/f41 [0,4194304] 0
2026-03-10T08:55:32.110 INFO:tasks.workunit.client.0.vm05.stdout:1/497: mkdir dd/db4 0
2026-03-10T08:55:32.110 INFO:tasks.workunit.client.0.vm05.stdout:1/498: readlink dd/d10/l1b 0
2026-03-10T08:55:32.123 INFO:tasks.workunit.client.0.vm05.stdout:1/499: creat dd/d10/fb5 x:0 0 0
2026-03-10T08:55:32.125 INFO:tasks.workunit.client.0.vm05.stdout:1/500: fdatasync dd/d10/d19/f95 0
2026-03-10T08:55:32.128 INFO:tasks.workunit.client.1.vm08.stdout:6/620: dread d9/d10/d1e/d32/f4d [0,4194304] 0
2026-03-10T08:55:32.138 INFO:tasks.workunit.client.0.vm05.stdout:0/390: dread df/d18/f2a [0,4194304] 0
2026-03-10T08:55:32.139 INFO:tasks.workunit.client.0.vm05.stdout:0/391: symlink df/d59/l69 0
2026-03-10T08:55:32.141 INFO:tasks.workunit.client.0.vm05.stdout:0/392: getdents df/d1f 0
2026-03-10T08:55:32.151 INFO:tasks.workunit.client.0.vm05.stdout:8/378: write d2/dd/f1a [59316,8632] 0
2026-03-10T08:55:32.155 INFO:tasks.workunit.client.1.vm08.stdout:6/621: dread d9/dc/d11/d23/d2c/d81/f62 [0,4194304] 0
2026-03-10T08:55:32.155 INFO:tasks.workunit.client.1.vm08.stdout:1/641: dread d1/da/de/f19 [0,4194304] 0
2026-03-10T08:55:32.156 INFO:tasks.workunit.client.1.vm08.stdout:2/640: rename d1/d9b to d1/da/d10/d42/d93/d1e/dce 0
2026-03-10T08:55:32.159 INFO:tasks.workunit.client.1.vm08.stdout:2/641: mkdir d1/da/d10/d1b/dcf 0
2026-03-10T08:55:32.160 INFO:tasks.workunit.client.1.vm08.stdout:3/572: dwrite d4/d15/d8/d1d/f2d [0,4194304] 0
2026-03-10T08:55:32.160 INFO:tasks.workunit.client.1.vm08.stdout:1/642: fdatasync d1/da/f1e 0
2026-03-10T08:55:32.165 INFO:tasks.workunit.client.1.vm08.stdout:2/642: readlink d1/da/d10/d1b/d6a/la9 0
2026-03-10T08:55:32.176 INFO:tasks.workunit.client.1.vm08.stdout:1/643: write d1/da/de/d24/d3d/d40/f42 [2951217,49185] 0
2026-03-10T08:55:32.176 INFO:tasks.workunit.client.1.vm08.stdout:7/645: dread d0/d14/d43/fc2 [0,4194304] 0
2026-03-10T08:55:32.176 INFO:tasks.workunit.client.1.vm08.stdout:3/573: creat d4/d15/d8/d2c/d6d/fc3 x:0 0 0
2026-03-10T08:55:32.176 INFO:tasks.workunit.client.1.vm08.stdout:1/644: fsync d1/da/de/d24/d26/d86/fc2 0
2026-03-10T08:55:32.186 INFO:tasks.workunit.client.0.vm05.stdout:8/379: sync
2026-03-10T08:55:32.187 INFO:tasks.workunit.client.1.vm08.stdout:1/645: link d1/da/de/d24/d3d/d40/d8e/dd2/f8b d1/da/d4b/fe3 0
2026-03-10T08:55:32.187 INFO:tasks.workunit.client.1.vm08.stdout:7/646: dwrite d0/d11/d4a/d95/fa7 [0,4194304] 0
2026-03-10T08:55:32.187 INFO:tasks.workunit.client.1.vm08.stdout:1/646: link d1/da/de/d24/d35/d43/fb2 d1/da/de/fe4 0
2026-03-10T08:55:32.189 INFO:tasks.workunit.client.0.vm05.stdout:8/380: dread d2/dd/d2c/f30 [0,4194304] 0
2026-03-10T08:55:32.192 INFO:tasks.workunit.client.1.vm08.stdout:7/647: read d0/d14/d2f/f81 [734969,28799] 0
2026-03-10T08:55:32.193 INFO:tasks.workunit.client.0.vm05.stdout:8/381: symlink d2/d45/l88 0
2026-03-10T08:55:32.194 INFO:tasks.workunit.client.0.vm05.stdout:8/382: chown d2/db/d28/f2d 36339316 1
2026-03-10T08:55:32.195 INFO:tasks.workunit.client.1.vm08.stdout:1/647: truncate d1/da/d20/d3f/d49/fb6 466481 0
2026-03-10T08:55:32.198 INFO:tasks.workunit.client.0.vm05.stdout:8/383: creat d2/dd/d2c/d2e/d31/f89 x:0 0 0
2026-03-10T08:55:32.200 INFO:tasks.workunit.client.1.vm08.stdout:7/648: dread d0/d11/d1f/d29/d3b/f86 [0,4194304] 0
2026-03-10T08:55:32.201 INFO:tasks.workunit.client.1.vm08.stdout:7/649: stat d0/d11/d1f/fb7 0
2026-03-10T08:55:32.204 INFO:tasks.workunit.client.1.vm08.stdout:1/648: mknod d1/da/de/d24/d3d/d40/d56/ce5 0
2026-03-10T08:55:32.205 INFO:tasks.workunit.client.1.vm08.stdout:2/643: dread d1/da/d10/d2d/fb7 [0,4194304] 0
2026-03-10T08:55:32.208 INFO:tasks.workunit.client.1.vm08.stdout:1/649: getdents d1/da/de/dcf 0
2026-03-10T08:55:32.208 INFO:tasks.workunit.client.1.vm08.stdout:1/650: dread - d1/da/de/d24/d26/fda zero size
2026-03-10T08:55:32.210 INFO:tasks.workunit.client.1.vm08.stdout:1/651: fdatasync d1/da/d20/d3f/d49/d9c/fd1 0
2026-03-10T08:55:32.220 INFO:tasks.workunit.client.1.vm08.stdout:1/652: chown d1/da/de/d24/d3d/d40/d84 3103385 1
2026-03-10T08:55:32.220 INFO:tasks.workunit.client.1.vm08.stdout:2/644: dread d1/fd [0,4194304] 0
2026-03-10T08:55:32.220 INFO:tasks.workunit.client.1.vm08.stdout:1/653: link d1/da/d18/fb1 d1/fe6 0
2026-03-10T08:55:32.220 INFO:tasks.workunit.client.1.vm08.stdout:2/645: write d1/da/d10/d2d/fa2 [1631660,63687] 0
2026-03-10T08:55:32.220 INFO:tasks.workunit.client.1.vm08.stdout:1/654: symlink d1/da/d18/d3a/le7 0
2026-03-10T08:55:32.222 INFO:tasks.workunit.client.1.vm08.stdout:2/646: getdents d1/da/d10/d42/d93/d23 0
2026-03-10T08:55:32.223 INFO:tasks.workunit.client.1.vm08.stdout:2/647: chown d1/da/d10/d42/d93/d23/f70 343444789 1
2026-03-10T08:55:32.224 INFO:tasks.workunit.client.1.vm08.stdout:2/648: mkdir d1/da/d10/d42/dd0 0
2026-03-10T08:55:32.235 INFO:tasks.workunit.client.1.vm08.stdout:6/622: dread d9/dc/d11/f47 [0,4194304] 0
2026-03-10T08:55:32.240 INFO:tasks.workunit.client.1.vm08.stdout:6/623: write d9/dc/d84/d80/fc1 [937865,95792] 0
2026-03-10T08:55:32.240 INFO:tasks.workunit.client.1.vm08.stdout:6/624: readlink d9/d50/l9b 0
2026-03-10T08:55:32.240 INFO:tasks.workunit.client.1.vm08.stdout:6/625: dread - d9/dc/d84/fae zero size
2026-03-10T08:55:32.240 INFO:tasks.workunit.client.1.vm08.stdout:6/626: dread - d9/dc/d84/fae zero size
2026-03-10T08:55:32.240 INFO:tasks.workunit.client.1.vm08.stdout:6/627: stat d9/d10/d1e/d7b 0
2026-03-10T08:55:32.241 INFO:tasks.workunit.client.1.vm08.stdout:6/628: creat d9/dc/d11/d23/d2c/d41/fd6 x:0 0 0
2026-03-10T08:55:32.246 INFO:tasks.workunit.client.1.vm08.stdout:6/629: dwrite d9/dc/d84/d80/f94 [4194304,4194304] 0
2026-03-10T08:55:32.249 INFO:tasks.workunit.client.1.vm08.stdout:6/630: truncate d9/dc/d84/d80/fc1 1970965 0
2026-03-10T08:55:32.250 INFO:tasks.workunit.client.1.vm08.stdout:6/631: rename d9/dc/d11/d23/d2c/d7a/dce/d69/da2 to d9/dc/d11/d23/d2c/d7a/dce/d69/da2/dd7 22
2026-03-10T08:55:32.255 INFO:tasks.workunit.client.1.vm08.stdout:6/632: getdents d9/dc/d11/d23/d2c/d81/d63/dcf 0
2026-03-10T08:55:32.257 INFO:tasks.workunit.client.1.vm08.stdout:6/633: creat d9/dc/d11/d23/d2c/d7a/dce/d69/da2/fd8 x:0 0 0
2026-03-10T08:55:32.258 INFO:tasks.workunit.client.1.vm08.stdout:6/634: symlink d9/d50/ld9 0
2026-03-10T08:55:32.259 INFO:tasks.workunit.client.1.vm08.stdout:6/635: truncate d9/dc/d11/d23/d2c/d81/f85 302859 0
2026-03-10T08:55:32.262 INFO:tasks.workunit.client.1.vm08.stdout:6/636: write d9/dc/d11/d23/d2c/d7a/fd1 [982374,129037] 0
2026-03-10T08:55:32.262 INFO:tasks.workunit.client.1.vm08.stdout:6/637: write d9/dc/d11/d23/d2c/fca [216454,91426] 0
2026-03-10T08:55:32.265 INFO:tasks.workunit.client.1.vm08.stdout:6/638: creat d9/dc/d11/d23/d2c/d81/d63/fda x:0 0 0
2026-03-10T08:55:32.265 INFO:tasks.workunit.client.1.vm08.stdout:6/639: stat d9/dc/d11/d23/d2c/d41/c22 0
2026-03-10T08:55:32.272 INFO:tasks.workunit.client.0.vm05.stdout:7/331: dwrite d18/d1b/d1f/d25/d2e/f49 [0,4194304] 0
2026-03-10T08:55:32.272 INFO:tasks.workunit.client.0.vm05.stdout:4/438: dwrite d0/f1e [0,4194304] 0
2026-03-10T08:55:32.274 INFO:tasks.workunit.client.0.vm05.stdout:4/439: chown d0/d2c/d6a 3692733 1
2026-03-10T08:55:32.277 INFO:tasks.workunit.client.0.vm05.stdout:4/440: mknod d0/d1d/d30/d49/d4f/c8f 0
2026-03-10T08:55:32.284 INFO:tasks.workunit.client.1.vm08.stdout:3/574: sync
2026-03-10T08:55:32.287 INFO:tasks.workunit.client.0.vm05.stdout:4/441: creat d0/d2e/d71/f90 x:0 0 0
2026-03-10T08:55:32.290 INFO:tasks.workunit.client.0.vm05.stdout:4/442: dwrite d0/fc [0,4194304] 0
2026-03-10T08:55:32.295 INFO:tasks.workunit.client.1.vm08.stdout:3/575: mknod d4/d15/d8/d1d/da8/cc4 0
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/443: symlink d0/d1d/d30/d49/d4f/d5b/l91 0
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/444: symlink d0/d1d/l92 0
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/445: unlink d0/d1d/f3c 0
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/446: dread - d0/d1d/d30/d49/f7a zero size
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/447: write d0/d2e/d42/d45/d4a/f26 [3094836,95902] 0
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/448: chown d0/d2e/d42/d45/f62 31 1
2026-03-10T08:55:32.304 INFO:tasks.workunit.client.0.vm05.stdout:4/449: dwrite d0/d1d/d30/d49/f7a [0,4194304] 0
2026-03-10T08:55:32.306 INFO:tasks.workunit.client.1.vm08.stdout:3/576: mknod d4/d15/d8/d2c/d9b/d79/cc5 0
2026-03-10T08:55:32.307 INFO:tasks.workunit.client.1.vm08.stdout:9/559: dwrite d2/dd/d15/d1e/d21/f75 [0,4194304] 0
2026-03-10T08:55:32.307 INFO:tasks.workunit.client.0.vm05.stdout:4/450: creat d0/d1d/d30/f93 x:0 0 0
2026-03-10T08:55:32.309 INFO:tasks.workunit.client.0.vm05.stdout:4/451: creat d0/d1d/d30/d49/d58/f94 x:0 0 0
2026-03-10T08:55:32.310 INFO:tasks.workunit.client.0.vm05.stdout:4/452: fdatasync d0/d1d/d30/d32/f3e 0
2026-03-10T08:55:32.314 INFO:tasks.workunit.client.1.vm08.stdout:3/577: rmdir d4/da9 0
2026-03-10T08:55:32.317 INFO:tasks.workunit.client.0.vm05.stdout:4/453: unlink d0/l19 0
2026-03-10T08:55:32.319 INFO:tasks.workunit.client.0.vm05.stdout:4/454: creat d0/d78/f95 x:0 0 0
2026-03-10T08:55:32.322 INFO:tasks.workunit.client.0.vm05.stdout:3/444: write d9/d2b/d2f/f3f [874807,59394] 0
2026-03-10T08:55:32.324 INFO:tasks.workunit.client.0.vm05.stdout:2/359: write d0/f8 [3912136,9824] 0
2026-03-10T08:55:32.325 INFO:tasks.workunit.client.0.vm05.stdout:2/360: chown d0/fa 90364 1
2026-03-10T08:55:32.327 INFO:tasks.workunit.client.1.vm08.stdout:9/560: dwrite d2/d41/d4c/f80 [0,4194304] 0
2026-03-10T08:55:32.329 INFO:tasks.workunit.client.0.vm05.stdout:2/361: rename d0/d9/f17 to d0/d9/d27/f64 0
2026-03-10T08:55:32.330 INFO:tasks.workunit.client.0.vm05.stdout:2/362: write d0/f40 [8439834,41114] 0
2026-03-10T08:55:32.333 INFO:tasks.workunit.client.0.vm05.stdout:3/445: symlink d9/d2b/d3a/d43/d4f/d50/d5f/d7b/l80 0
2026-03-10T08:55:32.338 INFO:tasks.workunit.client.0.vm05.stdout:3/446: dwrite d9/d2b/d3a/f44 [0,4194304] 0
2026-03-10T08:55:32.341 INFO:tasks.workunit.client.1.vm08.stdout:5/561: dwrite d0/d46/f81 [0,4194304] 0
2026-03-10T08:55:32.362 INFO:tasks.workunit.client.0.vm05.stdout:9/304: dwrite d6/f16 [0,4194304] 0
2026-03-10T08:55:32.369 INFO:tasks.workunit.client.1.vm08.stdout:9/561: dread d2/dd/d15/d1e/d25/d32/d5c/f7f [0,4194304] 0
2026-03-10T08:55:32.370 INFO:tasks.workunit.client.0.vm05.stdout:5/334: dwrite d5/f33 [0,4194304] 0
2026-03-10T08:55:32.371 INFO:tasks.workunit.client.0.vm05.stdout:3/447: mknod d9/d2b/d2f/d57/c81 0
2026-03-10T08:55:32.371 INFO:tasks.workunit.client.0.vm05.stdout:3/448: stat d9/d2b/d3a/d43/d7a/f7f 0
2026-03-10T08:55:32.372 INFO:tasks.workunit.client.0.vm05.stdout:3/449: chown d9/d2b/d2f/f33 16069 1
2026-03-10T08:55:32.379 INFO:tasks.workunit.client.1.vm08.stdout:0/550: write d6/dd/f35 [1264918,85522] 0
2026-03-10T08:55:32.381 INFO:tasks.workunit.client.0.vm05.stdout:9/305: readlink d6/d15/d35/l57 0
2026-03-10T08:55:32.384 INFO:tasks.workunit.client.0.vm05.stdout:9/306: dread d6/f3f [0,4194304] 0
2026-03-10T08:55:32.385 INFO:tasks.workunit.client.0.vm05.stdout:5/335: truncate d5/fc 2691580 0
2026-03-10T08:55:32.393 INFO:tasks.workunit.client.0.vm05.stdout:2/363: creat d0/d9/f65 x:0 0 0
2026-03-10T08:55:32.394 INFO:tasks.workunit.client.0.vm05.stdout:2/364: dread - d0/d55/f60 zero size
2026-03-10T08:55:32.395 INFO:tasks.workunit.client.0.vm05.stdout:9/307: symlink d6/d19/d21/l63 0
2026-03-10T08:55:32.398 INFO:tasks.workunit.client.0.vm05.stdout:5/336: creat d5/df/d37/f73 x:0 0 0
2026-03-10T08:55:32.400 INFO:tasks.workunit.client.0.vm05.stdout:2/365: creat d0/d9/d27/f66 x:0 0 0
2026-03-10T08:55:32.404 INFO:tasks.workunit.client.1.vm08.stdout:8/674: write d1/d10/d9/dd/d25/d27/d44/d21/f4a [1461228,114006] 0
2026-03-10T08:55:32.406 INFO:tasks.workunit.client.1.vm08.stdout:0/551: dread d6/dd/d13/d17/d1f/d20/d2f/d24/f68 [0,4194304] 0
2026-03-10T08:55:32.409 INFO:tasks.workunit.client.0.vm05.stdout:9/308: chown d6/l11 17686091 1
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:4/622: dwrite d5/d23/d36/f57 [0,4194304] 0
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:9/562: dread d2/dd/d15/d1e/d21/f3a [0,4194304] 0
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:4/623: read d5/d5f/fcc [672477,33678] 0
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:9/563: mkdir d2/d54/d8e/db7 0
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:9/564: mknod d2/d54/d8e/da6/cb8 0
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:4/624: mknod d5/d23/ce0 0
2026-03-10T08:55:32.438 INFO:tasks.workunit.client.1.vm08.stdout:4/625: creat d5/d5f/fe1 x:0 0 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:5/337: mkdir d5/df/d12/d24/d2c/d41/d74 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:5/338: symlink d5/d3a/d43/l75 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:3/450: getdents d9/d2b/d3a/d43/d6e 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:3/451: write d9/d2b/f3b [1002778,101127] 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:6/459: truncate d4/d2c/d84/f6b 210244 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:6/460: write d4/d2d/d51/f7d [160489,69139] 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:9/309: symlink d6/d15/l64 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:5/339: rename d5/df/d12/d24/d2c/d65 to d5/df/d12/d66/d76 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:6/461: symlink d4/d2c/d84/d4a/l9a 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:6/462: stat d4/f11 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:6/463: stat d4/fc 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:9/310: unlink d6/d12/l13 0
2026-03-10T08:55:32.439 INFO:tasks.workunit.client.0.vm05.stdout:1/501: truncate dd/d10/d19/d4d/f74 628633 0
2026-03-10T08:55:32.442 INFO:tasks.workunit.client.0.vm05.stdout:6/464: symlink d4/d7/d10/d15/d1b/l9b 0
2026-03-10T08:55:32.443 INFO:tasks.workunit.client.0.vm05.stdout:6/465: fsync d4/f11 0
2026-03-10T08:55:32.445 INFO:tasks.workunit.client.0.vm05.stdout:9/311: rename d6/d15/f24 to d6/d12/d3a/d48/f65 0
2026-03-10T08:55:32.446 INFO:tasks.workunit.client.0.vm05.stdout:0/393: write df/d18/d2b/f3b [1021636,102895] 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.1.vm08.stdout:4/626: dwrite d5/d23/d36/fce [0,4194304] 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:9/312: dread d6/f3f [0,4194304] 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:5/340: creat d5/df/d12/d39/f77 x:0 0 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:3/452: link d9/l18 d9/d2b/d2f/d57/l82 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:6/466: mknod d4/d2c/c9c 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:8/384: dwrite d2/dd/d2c/f2f [0,4194304] 0
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:8/385: chown d2/dd/d74/c7e 1397828 1
2026-03-10T08:55:32.463 INFO:tasks.workunit.client.0.vm05.stdout:8/386: stat d2/dd/d2c/d2e/c39 0
2026-03-10T08:55:32.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:32 vm05.local ceph-mon[49713]: pgmap v155: 65 pgs: 65 active+clean; 2.0 GiB data, 7.2 GiB used, 113 GiB / 120 GiB avail; 44 MiB/s rd, 124 MiB/s wr, 247 op/s
2026-03-10T08:55:32.466 INFO:tasks.workunit.client.0.vm05.stdout:8/387: read d2/dd/d2c/d2e/d31/d4c/d63/f6c [2838932,4138] 0
2026-03-10T08:55:32.467 INFO:tasks.workunit.client.1.vm08.stdout:4/627: read d5/de/f6d [608032,22095] 0
2026-03-10T08:55:32.470 INFO:tasks.workunit.client.0.vm05.stdout:8/388: dwrite d2/dd/d2c/d2e/f3b [4194304,4194304] 0
2026-03-10T08:55:32.470 INFO:tasks.workunit.client.1.vm08.stdout:4/628: dread - d5/d23/fcb zero size
2026-03-10T08:55:32.473 INFO:tasks.workunit.client.0.vm05.stdout:1/502: symlink dd/d21/d37/lb6 0
2026-03-10T08:55:32.477 INFO:tasks.workunit.client.1.vm08.stdout:7/650: write d0/d14/f68 [1745711,21371] 0
2026-03-10T08:55:32.477 INFO:tasks.workunit.client.0.vm05.stdout:3/453: creat d9/d2b/d3a/d43/d4f/d50/d5f/d7b/f83 x:0 0 0
2026-03-10T08:55:32.484 INFO:tasks.workunit.client.1.vm08.stdout:1/655: write d1/da/de/d24/d3d/d40/d8e/fc1 [404937,11795] 0
2026-03-10T08:55:32.484 INFO:tasks.workunit.client.1.vm08.stdout:4/629: readlink d5/d23/d49/d83/l81 0
2026-03-10T08:55:32.485 INFO:tasks.workunit.client.1.vm08.stdout:4/630: fdatasync d5/d23/d36/f7d 0
2026-03-10T08:55:32.493 INFO:tasks.workunit.client.1.vm08.stdout:7/651: dread - d0/d11/d1f/d29/d3b/d80/f88 zero size
2026-03-10T08:55:32.494 INFO:tasks.workunit.client.1.vm08.stdout:2/649: dwrite d1/da/d10/d42/d93/f8f [0,4194304] 0
2026-03-10T08:55:32.498 INFO:tasks.workunit.client.0.vm05.stdout:3/454: mknod d9/d4d/c84 0
2026-03-10T08:55:32.499 INFO:tasks.workunit.client.0.vm05.stdout:7/332: write f9 [1170929,34935] 0
2026-03-10T08:55:32.500 INFO:tasks.workunit.client.1.vm08.stdout:6/640: write d9/f77 [825796,22715] 0
2026-03-10T08:55:32.500 INFO:tasks.workunit.client.0.vm05.stdout:7/333: chown c14 21121 1
2026-03-10T08:55:32.511 INFO:tasks.workunit.client.1.vm08.stdout:1/656: mkdir d1/da/de/d24/d3d/d40/d5b/de8 0
2026-03-10T08:55:32.514 INFO:tasks.workunit.client.0.vm05.stdout:3/455: dread d9/f4a [0,4194304] 0
2026-03-10T08:55:32.516 INFO:tasks.workunit.client.1.vm08.stdout:1/657: dwrite d1/f65 [0,4194304] 0
2026-03-10T08:55:32.519 INFO:tasks.workunit.client.1.vm08.stdout:7/652: truncate d0/d14/f98 982311 0
2026-03-10T08:55:32.519 INFO:tasks.workunit.client.1.vm08.stdout:2/650: rmdir d1/da/d10/d42/d93 39
2026-03-10T08:55:32.521 INFO:tasks.workunit.client.1.vm08.stdout:7/653: dread - d0/d11/d1f/d29/d3b/d80/f88 zero size
2026-03-10T08:55:32.521 INFO:tasks.workunit.client.1.vm08.stdout:2/651: write d1/d5b/da7/fb5 [535144,60813] 0
2026-03-10T08:55:32.523 INFO:tasks.workunit.client.0.vm05.stdout:3/456: rmdir d9/d2b/d3a/d43/d4f/d55 39
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.0.vm05.stdout:3/457: getdents d9/d2b 0
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.1.vm08.stdout:4/631: symlink d5/d23/d36/d99/db2/d5a/ddb/le2 0
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.1.vm08.stdout:6/641: fsync d9/dc/d11/d23/d2c/f49 0
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.1.vm08.stdout:6/642: chown d9/dc/d11/d23/d2c/d7a/fd1 240195165 1
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.1.vm08.stdout:2/652: dwrite d1/da/d10/d42/d93/d23/f99 [0,4194304] 0
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.1.vm08.stdout:6/643: mknod d9/dc/d11/d23/d2c/d81/d63/dcf/cdb 0
2026-03-10T08:55:32.558 INFO:tasks.workunit.client.1.vm08.stdout:2/653: getdents d1/da/d10/d42/d93/d1e/d7b 0
2026-03-10T08:55:32.562 INFO:tasks.workunit.client.1.vm08.stdout:2/654: dread d1/da/d10/d42/f79 [0,4194304] 0
2026-03-10T08:55:32.568 INFO:tasks.workunit.client.1.vm08.stdout:2/655: dwrite d1/da/d10/d42/d93/d23/f99 [0,4194304] 0
2026-03-10T08:55:32.570 INFO:tasks.workunit.client.1.vm08.stdout:2/656: dwrite d1/db1/fc1 [0,4194304] 0
2026-03-10T08:55:32.583 INFO:tasks.workunit.client.1.vm08.stdout:2/657: creat d1/da/d10/d42/dd0/fd1 x:0 0 0
2026-03-10T08:55:32.612 INFO:tasks.workunit.client.1.vm08.stdout:2/658: creat d1/fd2 x:0 0 0
2026-03-10T08:55:32.612 INFO:tasks.workunit.client.1.vm08.stdout:2/659: dwrite d1/da/d78/f95 [0,4194304] 0
2026-03-10T08:55:32.612 INFO:tasks.workunit.client.1.vm08.stdout:2/660: mkdir d1/da/d10/dd3 0
2026-03-10T08:55:32.670 INFO:tasks.workunit.client.0.vm05.stdout:7/334: sync
2026-03-10T08:55:32.671 INFO:tasks.workunit.client.0.vm05.stdout:7/335: fsync d18/f4a 0
2026-03-10T08:55:32.672 INFO:tasks.workunit.client.0.vm05.stdout:7/336: readlink d18/d1b/d1f/d25/d2e/d42/d53/l5e 0
2026-03-10T08:55:32.712 INFO:tasks.workunit.client.1.vm08.stdout:4/632: sync
2026-03-10T08:55:32.713 INFO:tasks.workunit.client.1.vm08.stdout:1/658: dread d1/da/d20/d3f/d49/f9a [4194304,4194304] 0
2026-03-10T08:55:32.715 INFO:tasks.workunit.client.1.vm08.stdout:4/633: mkdir d5/d23/d36/d99/db2/d5d/de3 0
2026-03-10T08:55:32.715 INFO:tasks.workunit.client.1.vm08.stdout:1/659: creat d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fe9 x:0 0 0
2026-03-10T08:55:32.716 INFO:tasks.workunit.client.1.vm08.stdout:4/634: mknod d5/d23/d36/d99/db2/ce4 0
2026-03-10T08:55:32.716 INFO:tasks.workunit.client.1.vm08.stdout:1/660: chown d1/da/d18/d3a/l5f 2842 1
2026-03-10T08:55:32.719 INFO:tasks.workunit.client.1.vm08.stdout:4/635: rename d5/d23/f68 to d5/d23/d49/d8f/da4/fe5 0
2026-03-10T08:55:32.722 INFO:tasks.workunit.client.1.vm08.stdout:4/636: write d5/d23/d36/d99/db2/d5a/d69/fb3 [286984,38255] 0
2026-03-10T08:55:32.727 INFO:tasks.workunit.client.1.vm08.stdout:2/661: sync
2026-03-10T08:55:32.727 INFO:tasks.workunit.client.1.vm08.stdout:4/637: symlink d5/d23/d36/d99/db2/d5d/de3/le6 0
2026-03-10T08:55:32.731 INFO:tasks.workunit.client.1.vm08.stdout:4/638: creat d5/d23/d36/d99/dc6/dc8/fe7 x:0 0 0
2026-03-10T08:55:32.732 INFO:tasks.workunit.client.1.vm08.stdout:2/662: mknod d1/da/d10/d42/d93/d23/d9e/cd4 0
2026-03-10T08:55:32.734 INFO:tasks.workunit.client.1.vm08.stdout:4/639: mknod d5/d23/d36/d76/ce8 0
2026-03-10T08:55:32.735 INFO:tasks.workunit.client.1.vm08.stdout:2/663: mkdir d1/dd5 0
2026-03-10T08:55:32.736 INFO:tasks.workunit.client.1.vm08.stdout:4/640: rmdir d5/d23/d36/d99/db2/d5d/de3 39
2026-03-10T08:55:32.737 INFO:tasks.workunit.client.1.vm08.stdout:4/641: fdatasync d5/de/f54 0
2026-03-10T08:55:32.739 INFO:tasks.workunit.client.1.vm08.stdout:4/642: readlink d5/d23/d49/d8f/l91 0
2026-03-10T08:55:32.739 INFO:tasks.workunit.client.1.vm08.stdout:1/661: dread d1/da/de/fe4 [0,4194304] 0
2026-03-10T08:55:32.742 INFO:tasks.workunit.client.1.vm08.stdout:2/664: dwrite d1/d5b/da7/fb5 [0,4194304] 0
2026-03-10T08:55:32.815 INFO:tasks.workunit.client.1.vm08.stdout:2/665: sync
2026-03-10T08:55:32.819 INFO:tasks.workunit.client.1.vm08.stdout:2/666: getdents d1/da/d10/d2d/db6 0
2026-03-10T08:55:32.820 INFO:tasks.workunit.client.1.vm08.stdout:2/667: write d1/da/d10/d42/d93/d1e/fb2 [537367,38394] 0
2026-03-10T08:55:32.820 INFO:tasks.workunit.client.1.vm08.stdout:2/668: dread - d1/da/d10/d2d/f67 zero size
2026-03-10T08:55:32.823 INFO:tasks.workunit.client.1.vm08.stdout:2/669: dread d1/da/d10/d2d/f4c [0,4194304] 0
2026-03-10T08:55:32.825 INFO:tasks.workunit.client.1.vm08.stdout:2/670: fsync d1/d5b/f80 0
2026-03-10T08:55:32.825 INFO:tasks.workunit.client.1.vm08.stdout:2/671: readlink d1/d5b/la0 0
2026-03-10T08:55:32.829 INFO:tasks.workunit.client.1.vm08.stdout:2/672: creat d1/da/d10/d42/d93/d1e/dce/fd6 x:0 0 0
2026-03-10T08:55:32.831 INFO:tasks.workunit.client.1.vm08.stdout:2/673: truncate d1/da/d10/d42/d93/d1e/d7b/fb9 234961 0
2026-03-10T08:55:32.832 INFO:tasks.workunit.client.1.vm08.stdout:2/674: creat d1/da/d10/d1b/d6a/fd7 x:0 0 0
2026-03-10T08:55:32.833 INFO:tasks.workunit.client.1.vm08.stdout:2/675: symlink d1/d5b/dc5/ld8 0
2026-03-10T08:55:32.834 INFO:tasks.workunit.client.1.vm08.stdout:2/676: read - d1/fd2 zero size
2026-03-10T08:55:32.839 INFO:tasks.workunit.client.1.vm08.stdout:2/677: creat d1/da/d10/dd3/fd9 x:0 0 0
2026-03-10T08:55:32.840 INFO:tasks.workunit.client.1.vm08.stdout:2/678: dread - d1/da/d10/d42/d93/d1e/dce/fc8 zero size
2026-03-10T08:55:32.844 INFO:tasks.workunit.client.1.vm08.stdout:2/679: read - d1/da/d10/d2d/fb4 zero size
2026-03-10T08:55:32.849 INFO:tasks.workunit.client.1.vm08.stdout:2/680: getdents d1/d5b/d66 0
2026-03-10T08:55:32.853 INFO:tasks.workunit.client.1.vm08.stdout:2/681: link d1/da/d10/d1b/fac d1/da/d10/d42/fda 0
2026-03-10T08:55:32.857 INFO:tasks.workunit.client.1.vm08.stdout:2/682: creat d1/da/d10/d42/d93/daa/fdb x:0 0 0
2026-03-10T08:55:32.857 INFO:tasks.workunit.client.1.vm08.stdout:4/643: dread d5/d23/d36/d99/db2/d5a/f87 [0,4194304] 0
2026-03-10T08:55:32.858 INFO:tasks.workunit.client.1.vm08.stdout:4/644: readlink d5/d23/d49/d8f/l91 0
2026-03-10T08:55:32.859 INFO:tasks.workunit.client.1.vm08.stdout:2/683: mkdir d1/da/d10/d42/d93/d23/d9e/ddc 0
2026-03-10T08:55:32.868 INFO:tasks.workunit.client.1.vm08.stdout:2/684: dwrite d1/da/d10/d42/d93/d1e/dce/fc8 [0,4194304] 0
2026-03-10T08:55:32.868 INFO:tasks.workunit.client.1.vm08.stdout:2/685: chown d1/da/d10/d42/d93/d23/fae 1676729 1
2026-03-10T08:55:32.876 INFO:tasks.workunit.client.1.vm08.stdout:3/578: write d4/d15/d8/d2c/f6a [6333087,26687] 0
2026-03-10T08:55:32.877 INFO:tasks.workunit.client.1.vm08.stdout:2/686: symlink d1/d97/ldd 0
2026-03-10T08:55:32.879 INFO:tasks.workunit.client.1.vm08.stdout:3/579: symlink d4/d15/d8/d2c/d6d/lc6 0
2026-03-10T08:55:32.879 INFO:tasks.workunit.client.1.vm08.stdout:2/687: rename d1/da/d10 to d1/da/d10/d42/d93/d1e/dce/d52/db3/dde 22
2026-03-10T08:55:32.883 INFO:tasks.workunit.client.1.vm08.stdout:2/688: rename d1/lc9 to d1/da/d10/dca/ldf 0
2026-03-10T08:55:32.886 INFO:tasks.workunit.client.1.vm08.stdout:2/689: write d1/da/d10/d42/d93/d23/fae [1232913,85306] 0
2026-03-10T08:55:32.888 INFO:tasks.workunit.client.1.vm08.stdout:2/690: creat d1/da/d10/d1b/d6a/fe0 x:0 0 0
2026-03-10T08:55:32.890 INFO:tasks.workunit.client.1.vm08.stdout:2/691: getdents d1/da/d10/d42/d93/d23 0
2026-03-10T08:55:32.892 INFO:tasks.workunit.client.0.vm05.stdout:4/455: write d0/fb [8641745,109867] 0
2026-03-10T08:55:32.895 INFO:tasks.workunit.client.1.vm08.stdout:2/692: chown d1/da/d10/d1b/c53 227 1
2026-03-10T08:55:32.897 INFO:tasks.workunit.client.0.vm05.stdout:4/456: symlink d0/l96 0
2026-03-10T08:55:32.898 INFO:tasks.workunit.client.1.vm08.stdout:2/693: unlink d1/d43/f7f 0
2026-03-10T08:55:32.902 INFO:tasks.workunit.client.1.vm08.stdout:5/562: write d0/d11/d27/f64 [170022,123271] 0
2026-03-10T08:55:32.904 INFO:tasks.workunit.client.1.vm08.stdout:5/563: chown d0/fa4 17419849 1
2026-03-10T08:55:32.904 INFO:tasks.workunit.client.1.vm08.stdout:5/564: stat d0/d11/c17 0
2026-03-10T08:55:32.908 INFO:tasks.workunit.client.1.vm08.stdout:2/694: link d1/da/d10/d42/d93/d22/l33 d1/da/d10/d1b/dcf/le1 0
2026-03-10T08:55:32.909 INFO:tasks.workunit.client.1.vm08.stdout:5/565: rename d0/c9 to d0/d11/d18/cab 0
2026-03-10T08:55:32.910 INFO:tasks.workunit.client.0.vm05.stdout:0/394: write f6 [3623191,112644] 0
2026-03-10T08:55:32.911 INFO:tasks.workunit.client.1.vm08.stdout:0/552: write d6/dd/d13/d17/d1f/d20/f21 [689469,92435] 0
2026-03-10T08:55:32.911 INFO:tasks.workunit.client.0.vm05.stdout:2/366: write d0/d9/d1e/d20/f32 [1388321,17998] 0
2026-03-10T08:55:32.912 INFO:tasks.workunit.client.1.vm08.stdout:2/695: mkdir d1/da/d10/d42/d93/de2 0
2026-03-10T08:55:32.915 INFO:tasks.workunit.client.0.vm05.stdout:2/367: rename d0/d9/f65 to d0/d9/d27/f67 0
2026-03-10T08:55:32.923 INFO:tasks.workunit.client.1.vm08.stdout:5/566: write d0/d11/d18/d52/f57 [3998814,64973] 0
2026-03-10T08:55:32.923 INFO:tasks.workunit.client.1.vm08.stdout:8/675: dwrite d1/d10/d9/dd/d18/d3c/fd8 [0,4194304] 0
2026-03-10T08:55:32.923 INFO:tasks.workunit.client.1.vm08.stdout:0/553: creat d6/dd/d13/d17/d1f/d2d/fb2 x:0 0 0
2026-03-10T08:55:32.926 INFO:tasks.workunit.client.1.vm08.stdout:8/676: unlink d1/d10/d9/d8a/c9b 0
2026-03-10T08:55:32.926 INFO:tasks.workunit.client.1.vm08.stdout:3/580: sync
2026-03-10T08:55:32.926 INFO:tasks.workunit.client.1.vm08.stdout:5/567: getdents d0/d11/d3e 0
2026-03-10T08:55:32.926 INFO:tasks.workunit.client.1.vm08.stdout:0/554: read - d6/dd/d13/d17/d1f/d2d/d39/f91 zero size
2026-03-10T08:55:32.930 INFO:tasks.workunit.client.1.vm08.stdout:0/555: dread - d6/dd/d13/d61/d6f/faf zero size
2026-03-10T08:55:32.932 INFO:tasks.workunit.client.1.vm08.stdout:0/556: chown d6/dd/d13/d17/d1f/d2d 19049 1
2026-03-10T08:55:32.934 INFO:tasks.workunit.client.1.vm08.stdout:0/557: truncate d6/dd/d13/d17/d1f/da3/fb0 726773 0
2026-03-10T08:55:32.936 INFO:tasks.workunit.client.1.vm08.stdout:3/581: creat d4/d15/d8/d2c/d55/fc7 x:0 0 0
2026-03-10T08:55:32.937 INFO:tasks.workunit.client.1.vm08.stdout:5/568: symlink d0/d11/d27/d68/d7c/d4b/d4e/da5/lac 0
2026-03-10T08:55:32.938 INFO:tasks.workunit.client.1.vm08.stdout:5/569: write d0/d11/d27/d68/d7c/d4b/fa2 [910089,51340] 0
2026-03-10T08:55:32.944 INFO:tasks.workunit.client.1.vm08.stdout:0/558: rename d6/dd/c2b to d6/dd/d13/d17/cb3 0
2026-03-10T08:55:32.945 INFO:tasks.workunit.client.1.vm08.stdout:5/570: creat d0/d11/d3e/d45/fad x:0 0 0
2026-03-10T08:55:32.949 INFO:tasks.workunit.client.1.vm08.stdout:0/559: creat d6/dd/d13/d17/fb4 x:0 0 0
2026-03-10T08:55:32.955 INFO:tasks.workunit.client.1.vm08.stdout:0/560: creat d6/dd/d13/d17/d1f/d20/d2f/d24/fb5 x:0 0 0
2026-03-10T08:55:32.955 INFO:tasks.workunit.client.1.vm08.stdout:0/561: chown d6/dd/d13/d17/d1f/d20/d2f 476190 1
2026-03-10T08:55:32.955 INFO:tasks.workunit.client.1.vm08.stdout:0/562: write d6/dd/d13/d17/f1d [1382388,72844] 0
2026-03-10T08:55:32.955 INFO:tasks.workunit.client.1.vm08.stdout:0/563: write d6/dd/d13/d17/d1f/da3/fa7 [1011955,16867] 0
2026-03-10T08:55:32.958 INFO:tasks.workunit.client.1.vm08.stdout:9/565: truncate d2/dd/d15/d1e/d21/f75 695660 0
2026-03-10T08:55:32.965 INFO:tasks.workunit.client.0.vm05.stdout:9/313: dwrite d6/d15/d35/f38 [0,4194304] 0
2026-03-10T08:55:32.965 INFO:tasks.workunit.client.0.vm05.stdout:5/341: dwrite d5/d3a/f4a [0,4194304] 0
2026-03-10T08:55:32.967 INFO:tasks.workunit.client.0.vm05.stdout:8/389: dwrite d2/dd/d2c/f4d [0,4194304] 0
2026-03-10T08:55:32.968 INFO:tasks.workunit.client.0.vm05.stdout:8/390: fdatasync d2/dd/d2c/d2e/f64 0
2026-03-10T08:55:32.968 INFO:tasks.workunit.client.0.vm05.stdout:6/467: dwrite d4/d2c/f7a [0,4194304] 0
2026-03-10T08:55:32.996 INFO:tasks.workunit.client.0.vm05.stdout:1/503: dwrite dd/d10/fb0 [0,4194304] 0
2026-03-10T08:55:33.005 INFO:tasks.workunit.client.1.vm08.stdout:0/564: getdents d6/dd/d13/d17/d1f/d2d/d38/d98 0
2026-03-10T08:55:33.006 INFO:tasks.workunit.client.0.vm05.stdout:5/342: creat d5/df/d12/d39/f78 x:0 0 0
2026-03-10T08:55:33.007 INFO:tasks.workunit.client.0.vm05.stdout:5/343: stat d5/df/d12/d24/l27 0
2026-03-10T08:55:33.008 INFO:tasks.workunit.client.0.vm05.stdout:9/314: mknod d6/d15/d35/c66 0
2026-03-10T08:55:33.010 INFO:tasks.workunit.client.1.vm08.stdout:9/566: mknod d2/d54/d8e/db7/cb9 0
2026-03-10T08:55:33.010 INFO:tasks.workunit.client.1.vm08.stdout:7/654: write d0/d11/d1f/d2c/f30 [5930531,83354] 0
2026-03-10T08:55:33.018 INFO:tasks.workunit.client.1.vm08.stdout:0/565: fsync d6/dd/d13/d17/d1f/d2d/d39/f87 0
2026-03-10T08:55:33.018 INFO:tasks.workunit.client.1.vm08.stdout:7/655: dread - d0/d11/d4a/d5e/dc3/fce zero size
2026-03-10T08:55:33.018 INFO:tasks.workunit.client.0.vm05.stdout:3/458: truncate f2 1575770 0
2026-03-10T08:55:33.018 INFO:tasks.workunit.client.0.vm05.stdout:6/468: creat d4/d92/f9d x:0 0 0
2026-03-10T08:55:33.018 INFO:tasks.workunit.client.0.vm05.stdout:6/469: dread - d4/d2d/d5f/f6d zero size
2026-03-10T08:55:33.018 INFO:tasks.workunit.client.0.vm05.stdout:6/470: read d4/f30 [1476998,106734] 0
2026-03-10T08:55:33.019 INFO:tasks.workunit.client.0.vm05.stdout:3/459: dread d9/f29 [0,4194304] 0
2026-03-10T08:55:33.019 INFO:tasks.workunit.client.0.vm05.stdout:7/337: write d18/d1b/d1f/d25/d2e/d2f/f33 [1035668,1941] 0
2026-03-10T08:55:33.021 INFO:tasks.workunit.client.1.vm08.stdout:6/644: dwrite d9/dc/d11/d23/d2c/f5c [0,4194304] 0
2026-03-10T08:55:33.023 INFO:tasks.workunit.client.0.vm05.stdout:9/315: creat d6/d15/d3c/d4b/f67 x:0 0 0
2026-03-10T08:55:33.025 INFO:tasks.workunit.client.1.vm08.stdout:9/567: read - d2/dd/d61/f67 zero size
2026-03-10T08:55:33.027 INFO:tasks.workunit.client.0.vm05.stdout:9/316: dwrite d6/d19/d2c/f54 [0,4194304] 0
2026-03-10T08:55:33.036 INFO:tasks.workunit.client.1.vm08.stdout:4/645: rmdir d5/d23/d36/d99/db2/d5d/de3 39
2026-03-10T08:55:33.037 INFO:tasks.workunit.client.0.vm05.stdout:3/460: dwrite d9/d2b/d2f/f33 [0,4194304] 0
2026-03-10T08:55:33.038 INFO:tasks.workunit.client.0.vm05.stdout:3/461: stat d9/f4a 0
2026-03-10T08:55:33.045 INFO:tasks.workunit.client.1.vm08.stdout:6/645: chown d9/dc/d11/d23/d2c/d81/c71 3 1
2026-03-10T08:55:33.049 INFO:tasks.workunit.client.1.vm08.stdout:1/662: dwrite d1/da/d18/d3b/d62/f76 [0,4194304] 0
2026-03-10T08:55:33.056 INFO:tasks.workunit.client.1.vm08.stdout:6/646: read d9/fa [623350,105331] 0
2026-03-10T08:55:33.056 INFO:tasks.workunit.client.1.vm08.stdout:6/647: truncate d9/fc5 1361318 0
2026-03-10T08:55:33.056 INFO:tasks.workunit.client.1.vm08.stdout:9/568: creat d2/d54/d8e/fba x:0 0 0
2026-03-10T08:55:33.056 INFO:tasks.workunit.client.1.vm08.stdout:4/646: truncate d5/d23/d36/d99/db2/f71 1765602 0
2026-03-10T08:55:33.065 INFO:tasks.workunit.client.0.vm05.stdout:9/317: fdatasync d6/d19/d2a/f4d 0
2026-03-10T08:55:33.066 INFO:tasks.workunit.client.1.vm08.stdout:1/663: rmdir d1/da/d20/d91 39
2026-03-10T08:55:33.066 INFO:tasks.workunit.client.0.vm05.stdout:8/391: link d2/dd/d2c/d2e/d31/f89 d2/dd/d2c/d2e/d31/d4f/d7b/f8a 0
2026-03-10T08:55:33.068 INFO:tasks.workunit.client.0.vm05.stdout:1/504: rmdir dd/db4 0
2026-03-10T08:55:33.069 INFO:tasks.workunit.client.0.vm05.stdout:1/505: readlink dd/d10/d18/d2d/l43 0
2026-03-10T08:55:33.071 INFO:tasks.workunit.client.0.vm05.stdout:7/338: symlink d18/l60 0
2026-03-10T08:55:33.072 INFO:tasks.workunit.client.1.vm08.stdout:9/569: creat d2/dd/d61/fbb x:0 0 0
2026-03-10T08:55:33.073 INFO:tasks.workunit.client.0.vm05.stdout:5/344: creat d5/df/d12/d24/d2c/f79 x:0 0 0
2026-03-10T08:55:33.075 INFO:tasks.workunit.client.1.vm08.stdout:4/647: creat d5/d23/d36/d99/db2/d5a/ddb/fe9 x:0 0 0
2026-03-10T08:55:33.076 INFO:tasks.workunit.client.0.vm05.stdout:9/318: unlink d6/d27/f60 0
2026-03-10T08:55:33.078 INFO:tasks.workunit.client.0.vm05.stdout:8/392: symlink d2/db/d47/l8b 0
2026-03-10T08:55:33.078 INFO:tasks.workunit.client.0.vm05.stdout:8/393: chown d2/db/d28 113613 1
2026-03-10T08:55:33.079 INFO:tasks.workunit.client.0.vm05.stdout:1/506: dread - dd/d10/f8f zero size
2026-03-10T08:55:33.081 INFO:tasks.workunit.client.0.vm05.stdout:5/345: symlink d5/d48/l7a 0
2026-03-10T08:55:33.081 INFO:tasks.workunit.client.0.vm05.stdout:5/346: dread - d5/d3a/d43/f6d zero size
2026-03-10T08:55:33.083 INFO:tasks.workunit.client.1.vm08.stdout:9/570: unlink d2/d41/d74/f9a 0
2026-03-10T08:55:33.084 INFO:tasks.workunit.client.1.vm08.stdout:4/648: mkdir d5/de/dea 0
2026-03-10T08:55:33.084 INFO:tasks.workunit.client.1.vm08.stdout:4/649: chown d5/de/f1b 1964173905 1
2026-03-10T08:55:33.089 INFO:tasks.workunit.client.0.vm05.stdout:0/395: write df/d59/f57 [544236,92172] 0
2026-03-10T08:55:33.092
INFO:tasks.workunit.client.0.vm05.stdout:2/368: dwrite d0/d9/d1e/d20/d21/f31 [4194304,4194304] 0 2026-03-10T08:55:33.093 INFO:tasks.workunit.client.1.vm08.stdout:9/571: mkdir d2/d54/dbc 0 2026-03-10T08:55:33.094 INFO:tasks.workunit.client.1.vm08.stdout:9/572: fdatasync d2/dd/d15/d1e/f91 0 2026-03-10T08:55:33.094 INFO:tasks.workunit.client.1.vm08.stdout:1/664: creat d1/da/d18/fea x:0 0 0 2026-03-10T08:55:33.099 INFO:tasks.workunit.client.0.vm05.stdout:5/347: creat d5/df/d12/d66/f7b x:0 0 0 2026-03-10T08:55:33.100 INFO:tasks.workunit.client.0.vm05.stdout:9/319: symlink d6/d15/l68 0 2026-03-10T08:55:33.102 INFO:tasks.workunit.client.0.vm05.stdout:4/457: creat d0/d2e/d42/d45/d4a/d36/d37/f97 x:0 0 0 2026-03-10T08:55:33.104 INFO:tasks.workunit.client.0.vm05.stdout:0/396: mkdir df/d18/d2b/d27/d32/d4e/d6a 0 2026-03-10T08:55:33.106 INFO:tasks.workunit.client.0.vm05.stdout:9/320: dwrite d6/d19/d2a/f53 [4194304,4194304] 0 2026-03-10T08:55:33.107 INFO:tasks.workunit.client.0.vm05.stdout:9/321: dread - d6/d15/d3c/d4b/f67 zero size 2026-03-10T08:55:33.117 INFO:tasks.workunit.client.1.vm08.stdout:0/566: dread d6/f9 [0,4194304] 0 2026-03-10T08:55:33.120 INFO:tasks.workunit.client.0.vm05.stdout:5/348: rename d5/df/d12/d21/c3e to d5/df/d37/d68/c7c 0 2026-03-10T08:55:33.123 INFO:tasks.workunit.client.0.vm05.stdout:9/322: symlink d6/d15/d37/l69 0 2026-03-10T08:55:33.124 INFO:tasks.workunit.client.0.vm05.stdout:8/394: rmdir d2/d87 0 2026-03-10T08:55:33.127 INFO:tasks.workunit.client.0.vm05.stdout:9/323: truncate d6/d27/f44 1880486 0 2026-03-10T08:55:33.133 INFO:tasks.workunit.client.0.vm05.stdout:9/324: symlink d6/d27/l6a 0 2026-03-10T08:55:33.133 INFO:tasks.workunit.client.0.vm05.stdout:9/325: read d6/d15/d35/f38 [333819,61787] 0 2026-03-10T08:55:33.133 INFO:tasks.workunit.client.0.vm05.stdout:9/326: dwrite d6/d19/d21/f32 [0,4194304] 0 2026-03-10T08:55:33.143 INFO:tasks.workunit.client.0.vm05.stdout:9/327: rmdir d6/d12/d3a/d48 39 2026-03-10T08:55:33.144 
INFO:tasks.workunit.client.1.vm08.stdout:9/573: mknod d2/dd/d15/d1e/d25/d98/d9d/cbd 0 2026-03-10T08:55:33.145 INFO:tasks.workunit.client.1.vm08.stdout:2/696: truncate d1/da/d10/d42/d93/d1e/dce/fc8 3991973 0 2026-03-10T08:55:33.146 INFO:tasks.workunit.client.1.vm08.stdout:0/567: fdatasync d6/f2c 0 2026-03-10T08:55:33.149 INFO:tasks.workunit.client.1.vm08.stdout:9/574: mknod d2/d54/cbe 0 2026-03-10T08:55:33.150 INFO:tasks.workunit.client.1.vm08.stdout:0/568: symlink d6/dd/d13/d17/d50/lb6 0 2026-03-10T08:55:33.151 INFO:tasks.workunit.client.0.vm05.stdout:9/328: dwrite d6/d19/f5c [0,4194304] 0 2026-03-10T08:55:33.151 INFO:tasks.workunit.client.1.vm08.stdout:8/677: dwrite d1/d2c/f47 [0,4194304] 0 2026-03-10T08:55:33.153 INFO:tasks.workunit.client.0.vm05.stdout:9/329: dread - d6/d15/d3c/d4b/f5b zero size 2026-03-10T08:55:33.154 INFO:tasks.workunit.client.0.vm05.stdout:9/330: readlink d6/d15/d37/l69 0 2026-03-10T08:55:33.154 INFO:tasks.workunit.client.1.vm08.stdout:9/575: mknod d2/dd/d15/d1e/d39/d4e/cbf 0 2026-03-10T08:55:33.155 INFO:tasks.workunit.client.1.vm08.stdout:2/697: creat d1/dd5/fe3 x:0 0 0 2026-03-10T08:55:33.156 INFO:tasks.workunit.client.1.vm08.stdout:0/569: read - d6/dd/d13/d17/d1f/d2d/d39/f87 zero size 2026-03-10T08:55:33.156 INFO:tasks.workunit.client.1.vm08.stdout:9/576: fsync d2/dd/d15/d1e/d24/f9e 0 2026-03-10T08:55:33.157 INFO:tasks.workunit.client.0.vm05.stdout:9/331: creat d6/d15/d3c/f6b x:0 0 0 2026-03-10T08:55:33.159 INFO:tasks.workunit.client.0.vm05.stdout:9/332: creat d6/d19/d2c/d58/f6c x:0 0 0 2026-03-10T08:55:33.160 INFO:tasks.workunit.client.0.vm05.stdout:9/333: symlink d6/d12/d43/l6d 0 2026-03-10T08:55:33.165 INFO:tasks.workunit.client.0.vm05.stdout:3/462: sync 2026-03-10T08:55:33.166 INFO:tasks.workunit.client.0.vm05.stdout:3/463: write d9/d2b/d3a/f44 [3803804,120893] 0 2026-03-10T08:55:33.169 INFO:tasks.workunit.client.0.vm05.stdout:4/458: sync 2026-03-10T08:55:33.169 INFO:tasks.workunit.client.0.vm05.stdout:7/339: sync 
2026-03-10T08:55:33.169 INFO:tasks.workunit.client.0.vm05.stdout:7/340: readlink l16 0 2026-03-10T08:55:33.178 INFO:tasks.workunit.client.1.vm08.stdout:9/577: creat d2/dd/d15/d1e/d25/d32/d79/d85/fc0 x:0 0 0 2026-03-10T08:55:33.178 INFO:tasks.workunit.client.0.vm05.stdout:3/464: creat d9/d4d/d51/d64/f85 x:0 0 0 2026-03-10T08:55:33.179 INFO:tasks.workunit.client.0.vm05.stdout:3/465: chown d9/d2b/d3a/d43/d4f/d50/f7c 0 1 2026-03-10T08:55:33.182 INFO:tasks.workunit.client.1.vm08.stdout:2/698: link d1/da/d10/d1b/c7d d1/da/d10/d42/d93/d1e/dce/ce4 0 2026-03-10T08:55:33.182 INFO:tasks.workunit.client.0.vm05.stdout:3/466: dwrite d9/d2b/f3b [0,4194304] 0 2026-03-10T08:55:33.182 INFO:tasks.workunit.client.1.vm08.stdout:2/699: chown d1/da/d10/fa5 182 1 2026-03-10T08:55:33.183 INFO:tasks.workunit.client.0.vm05.stdout:3/467: write d9/d2b/d2f/f3f [630667,24268] 0 2026-03-10T08:55:33.185 INFO:tasks.workunit.client.1.vm08.stdout:9/578: readlink d2/dd/d15/d1e/d21/lb4 0 2026-03-10T08:55:33.185 INFO:tasks.workunit.client.1.vm08.stdout:2/700: creat d1/da/d10/d42/d93/d23/d9e/fe5 x:0 0 0 2026-03-10T08:55:33.186 INFO:tasks.workunit.client.0.vm05.stdout:3/468: fdatasync d9/d2b/f34 0 2026-03-10T08:55:33.226 INFO:tasks.workunit.client.0.vm05.stdout:3/469: unlink d9/d2b/d53/l6a 0 2026-03-10T08:55:33.234 INFO:tasks.workunit.client.0.vm05.stdout:3/470: dread d9/d2b/d53/f60 [0,4194304] 0 2026-03-10T08:55:33.234 INFO:tasks.workunit.client.0.vm05.stdout:3/471: stat d9/d2b/f3b 0 2026-03-10T08:55:33.234 INFO:tasks.workunit.client.0.vm05.stdout:3/472: dwrite d9/f4a [0,4194304] 0 2026-03-10T08:55:33.234 INFO:tasks.workunit.client.0.vm05.stdout:3/473: getdents d9/d4d/d51 0 2026-03-10T08:55:33.234 INFO:tasks.workunit.client.0.vm05.stdout:3/474: mkdir d9/d2b/d3a/d43/d71/d86 0 2026-03-10T08:55:33.234 INFO:tasks.workunit.client.0.vm05.stdout:3/475: creat d9/d2b/d53/d61/f87 x:0 0 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/579: symlink d2/dd/d15/d1e/d24/lc1 0 2026-03-10T08:55:33.235 
INFO:tasks.workunit.client.1.vm08.stdout:9/580: mkdir d2/dd/d15/d1e/d25/d32/d5c/dc2 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/581: dread - d2/dd/d15/d1e/d25/d32/d5c/f70 zero size 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/582: symlink d2/d54/lc3 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/583: dwrite d2/d41/d4c/d66/fb0 [0,4194304] 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/584: fsync d2/dd/d15/d1e/d24/f30 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/585: write d2/dd/d15/d1e/d39/d4e/f55 [4774907,3062] 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/586: write d2/dd/d15/d1e/d21/f2d [1546616,112411] 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/587: rename d2/d54/dbc to d2/dd/d15/d1e/d25/d32/dc4 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/588: creat d2/dd/d15/d1e/d21/fc5 x:0 0 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/589: dwrite d2/dd/d15/d1e/d21/fc5 [0,4194304] 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/590: symlink d2/d41/d4c/d66/d82/lc6 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/591: chown d2/d54/d8e/da6/cb8 1164 1 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/592: creat d2/dd/d15/d1e/d21/fc7 x:0 0 0 2026-03-10T08:55:33.235 INFO:tasks.workunit.client.1.vm08.stdout:9/593: dwrite d2/dd/d15/d4f/fa5 [0,4194304] 0 2026-03-10T08:55:33.236 INFO:tasks.workunit.client.1.vm08.stdout:3/582: write d4/d15/d8/fad [577388,30885] 0 2026-03-10T08:55:33.237 INFO:tasks.workunit.client.1.vm08.stdout:5/571: write d0/d11/f25 [1422239,13368] 0 2026-03-10T08:55:33.239 INFO:tasks.workunit.client.1.vm08.stdout:9/594: mkdir d2/d41/d74/dc8 0 2026-03-10T08:55:33.240 INFO:tasks.workunit.client.1.vm08.stdout:3/583: symlink d4/d15/d8/d1d/d4f/lc8 0 2026-03-10T08:55:33.241 
INFO:tasks.workunit.client.1.vm08.stdout:5/572: rmdir d0/d11 39 2026-03-10T08:55:33.241 INFO:tasks.workunit.client.1.vm08.stdout:3/584: creat d4/d15/d8/d1d/da8/fc9 x:0 0 0 2026-03-10T08:55:33.242 INFO:tasks.workunit.client.1.vm08.stdout:3/585: read - d4/d15/d8/d1d/da8/fc9 zero size 2026-03-10T08:55:33.243 INFO:tasks.workunit.client.1.vm08.stdout:9/595: rename d2/dd/l51 to d2/d54/d8e/da6/lc9 0 2026-03-10T08:55:33.243 INFO:tasks.workunit.client.1.vm08.stdout:9/596: chown d2/c20 1735 1 2026-03-10T08:55:33.244 INFO:tasks.workunit.client.1.vm08.stdout:5/573: stat d0/d11/d3e/la6 0 2026-03-10T08:55:33.246 INFO:tasks.workunit.client.1.vm08.stdout:5/574: write d0/d11/d27/d68/d7c/d4b/d4e/d84/fa9 [996156,87300] 0 2026-03-10T08:55:33.246 INFO:tasks.workunit.client.1.vm08.stdout:5/575: chown d0/d46 464363550 1 2026-03-10T08:55:33.247 INFO:tasks.workunit.client.1.vm08.stdout:5/576: symlink d0/d11/d27/d68/d7c/lae 0 2026-03-10T08:55:33.249 INFO:tasks.workunit.client.1.vm08.stdout:8/678: sync 2026-03-10T08:55:33.250 INFO:tasks.workunit.client.1.vm08.stdout:8/679: mkdir d1/d10/d9/dd/d18/dff 0 2026-03-10T08:55:33.297 INFO:tasks.workunit.client.1.vm08.stdout:7/656: dwrite d0/d14/d43/d62/fb5 [0,4194304] 0 2026-03-10T08:55:33.306 INFO:tasks.workunit.client.1.vm08.stdout:7/657: mkdir d0/d11/d1f/d29/d3b/d80/dd3 0 2026-03-10T08:55:33.308 INFO:tasks.workunit.client.1.vm08.stdout:7/658: link d0/d14/d2f/c42 d0/d14/d43/d9d/cd4 0 2026-03-10T08:55:33.310 INFO:tasks.workunit.client.1.vm08.stdout:7/659: rename d0/d14/d43/d9d/cc4 to d0/d11/d1f/d29/d36/d75/cd5 0 2026-03-10T08:55:33.311 INFO:tasks.workunit.client.1.vm08.stdout:7/660: symlink d0/d11/d4a/ld6 0 2026-03-10T08:55:33.313 INFO:tasks.workunit.client.1.vm08.stdout:7/661: rename d0/d11/db2/fc1 to d0/d11/d1f/d29/fd7 0 2026-03-10T08:55:33.314 INFO:tasks.workunit.client.1.vm08.stdout:7/662: write d0/d11/d4a/da3/fa9 [4411705,84508] 0 2026-03-10T08:55:33.316 INFO:tasks.workunit.client.1.vm08.stdout:7/663: rmdir d0/d14/d43 39 2026-03-10T08:55:33.317 
INFO:tasks.workunit.client.1.vm08.stdout:7/664: symlink d0/d11/d1f/d29/d36/d75/ld8 0 2026-03-10T08:55:33.318 INFO:tasks.workunit.client.1.vm08.stdout:7/665: stat d0/d14/d43/d62/f9a 0 2026-03-10T08:55:33.320 INFO:tasks.workunit.client.0.vm05.stdout:6/471: write d4/d2c/d84/d4a/f63 [1821053,98289] 0 2026-03-10T08:55:33.321 INFO:tasks.workunit.client.0.vm05.stdout:6/472: truncate d4/d2d/d5f/f88 135518 0 2026-03-10T08:55:33.322 INFO:tasks.workunit.client.0.vm05.stdout:6/473: symlink d4/d92/l9e 0 2026-03-10T08:55:33.324 INFO:tasks.workunit.client.0.vm05.stdout:6/474: rename d4/d2d/f79 to d4/d2d/d5f/f9f 0 2026-03-10T08:55:33.324 INFO:tasks.workunit.client.0.vm05.stdout:6/475: write d4/d7/f14 [3828273,127611] 0 2026-03-10T08:55:33.326 INFO:tasks.workunit.client.0.vm05.stdout:6/476: read d4/f61 [2502213,106741] 0 2026-03-10T08:55:33.327 INFO:tasks.workunit.client.0.vm05.stdout:6/477: truncate d4/d7/f5d 697523 0 2026-03-10T08:55:33.328 INFO:tasks.workunit.client.1.vm08.stdout:7/666: truncate d0/d11/d4a/f87 8643163 0 2026-03-10T08:55:33.328 INFO:tasks.workunit.client.0.vm05.stdout:6/478: fsync d4/d2c/d84/f49 0 2026-03-10T08:55:33.329 INFO:tasks.workunit.client.0.vm05.stdout:6/479: chown d4/c44 308703325 1 2026-03-10T08:55:33.330 INFO:tasks.workunit.client.0.vm05.stdout:6/480: mknod d4/d7/d10/d1a/ca0 0 2026-03-10T08:55:33.331 INFO:tasks.workunit.client.0.vm05.stdout:6/481: fdatasync d4/d7/d10/d15/d1b/d22/f56 0 2026-03-10T08:55:33.331 INFO:tasks.workunit.client.0.vm05.stdout:6/482: write d4/f30 [1367695,106696] 0 2026-03-10T08:55:33.334 INFO:tasks.workunit.client.0.vm05.stdout:6/483: rename d4/d7/d10/d1a/d1f/f66 to d4/d7/d10/d15/d20/fa1 0 2026-03-10T08:55:33.334 INFO:tasks.workunit.client.0.vm05.stdout:6/484: fdatasync d4/f6c 0 2026-03-10T08:55:33.335 INFO:tasks.workunit.client.0.vm05.stdout:6/485: creat d4/d2d/fa2 x:0 0 0 2026-03-10T08:55:33.339 INFO:tasks.workunit.client.0.vm05.stdout:6/486: dwrite d4/d7/f5d [0,4194304] 0 2026-03-10T08:55:33.342 
INFO:tasks.workunit.client.1.vm08.stdout:7/667: dwrite d0/d11/d1f/d29/d3d/d40/f38 [8388608,4194304] 0 2026-03-10T08:55:33.353 INFO:tasks.workunit.client.1.vm08.stdout:7/668: symlink d0/d11/db2/d8e/ld9 0 2026-03-10T08:55:33.354 INFO:tasks.workunit.client.1.vm08.stdout:7/669: chown d0/d11/d1f/d29/d3b/fc7 1510212767 1 2026-03-10T08:55:33.354 INFO:tasks.workunit.client.1.vm08.stdout:7/670: chown d0/d14/d43/d9d 58534 1 2026-03-10T08:55:33.355 INFO:tasks.workunit.client.1.vm08.stdout:7/671: chown d0/d11/c26 444874 1 2026-03-10T08:55:33.356 INFO:tasks.workunit.client.1.vm08.stdout:7/672: read d0/d11/d1f/d29/d3d/d40/ff [2058110,89342] 0 2026-03-10T08:55:33.359 INFO:tasks.workunit.client.1.vm08.stdout:7/673: truncate d0/d11/d1f/d29/d36/d75/fb9 914168 0 2026-03-10T08:55:33.363 INFO:tasks.workunit.client.1.vm08.stdout:7/674: dread d0/d14/d43/d62/f9a [0,4194304] 0 2026-03-10T08:55:33.367 INFO:tasks.workunit.client.1.vm08.stdout:7/675: creat d0/d11/d1f/d29/d3d/fda x:0 0 0 2026-03-10T08:55:33.381 INFO:tasks.workunit.client.1.vm08.stdout:5/577: dread d0/d11/f1e [0,4194304] 0 2026-03-10T08:55:33.383 INFO:tasks.workunit.client.1.vm08.stdout:5/578: creat d0/d11/d18/faf x:0 0 0 2026-03-10T08:55:33.386 INFO:tasks.workunit.client.1.vm08.stdout:5/579: fdatasync d0/d11/d27/d68/d7c/f6a 0 2026-03-10T08:55:33.387 INFO:tasks.workunit.client.1.vm08.stdout:5/580: fsync d0/fb 0 2026-03-10T08:55:33.393 INFO:tasks.workunit.client.1.vm08.stdout:5/581: dwrite d0/d46/f81 [4194304,4194304] 0 2026-03-10T08:55:33.402 INFO:tasks.workunit.client.1.vm08.stdout:5/582: mknod d0/d11/d27/d68/d7c/d4b/cb0 0 2026-03-10T08:55:33.437 INFO:tasks.workunit.client.1.vm08.stdout:6/648: truncate d9/dc/d11/d23/d2c/f5c 3953314 0 2026-03-10T08:55:33.438 INFO:tasks.workunit.client.1.vm08.stdout:6/649: write d9/dc/d11/d23/d2c/d7a/fd3 [10902,123279] 0 2026-03-10T08:55:33.439 INFO:tasks.workunit.client.0.vm05.stdout:1/507: write dd/d21/d3f/f83 [919089,46409] 0 2026-03-10T08:55:33.442 
INFO:tasks.workunit.client.0.vm05.stdout:1/508: chown dd/d13/c5f 74369142 1 2026-03-10T08:55:33.453 INFO:tasks.workunit.client.1.vm08.stdout:4/650: dwrite d5/f85 [0,4194304] 0 2026-03-10T08:55:33.456 INFO:tasks.workunit.client.1.vm08.stdout:4/651: readlink d5/d23/d36/l74 0 2026-03-10T08:55:33.457 INFO:tasks.workunit.client.1.vm08.stdout:4/652: write d5/d23/d36/f7d [405061,81308] 0 2026-03-10T08:55:33.479 INFO:tasks.workunit.client.0.vm05.stdout:1/509: mkdir dd/d21/d37/d7c/dab/db7 0 2026-03-10T08:55:33.481 INFO:tasks.workunit.client.1.vm08.stdout:4/653: readlink d5/d23/d49/d8f/lc0 0 2026-03-10T08:55:33.483 INFO:tasks.workunit.client.0.vm05.stdout:2/369: write d0/f2 [795155,16348] 0 2026-03-10T08:55:33.488 INFO:tasks.workunit.client.0.vm05.stdout:5/349: write d5/df/d12/d24/d2c/f46 [2378645,20603] 0 2026-03-10T08:55:33.492 INFO:tasks.workunit.client.1.vm08.stdout:4/654: rename d5/d23/d36/d99/db2/d5a/d69/fa3 to d5/de/d96/feb 0 2026-03-10T08:55:33.493 INFO:tasks.workunit.client.0.vm05.stdout:8/395: stat d2/dd/d2c/d2e/d31/d4c 0 2026-03-10T08:55:33.495 INFO:tasks.workunit.client.0.vm05.stdout:5/350: link d5/df/d12/d66/f7b d5/df/d12/d24/d2c/d41/d74/f7d 0 2026-03-10T08:55:33.506 INFO:tasks.workunit.client.0.vm05.stdout:8/396: mknod d2/dd/d2c/d2e/d31/d4f/d7b/c8c 0 2026-03-10T08:55:33.506 INFO:tasks.workunit.client.0.vm05.stdout:8/397: fsync d2/db/d47/f51 0 2026-03-10T08:55:33.506 INFO:tasks.workunit.client.0.vm05.stdout:8/398: mkdir d2/db/d1f/d67/d8d 0 2026-03-10T08:55:33.506 INFO:tasks.workunit.client.0.vm05.stdout:8/399: truncate d2/dd/d2c/d2e/d31/d4c/f85 643148 0 2026-03-10T08:55:33.506 INFO:tasks.workunit.client.0.vm05.stdout:8/400: dread - d2/dd/d2c/d2e/d31/d3e/f6b zero size 2026-03-10T08:55:33.509 INFO:tasks.workunit.client.0.vm05.stdout:8/401: dwrite d2/fa [4194304,4194304] 0 2026-03-10T08:55:33.515 INFO:tasks.workunit.client.0.vm05.stdout:8/402: mknod d2/dd/d74/d78/c8e 0 2026-03-10T08:55:33.519 INFO:tasks.workunit.client.0.vm05.stdout:8/403: getdents 
d2/dd/d2c/d2e/d31 0 2026-03-10T08:55:33.519 INFO:tasks.workunit.client.0.vm05.stdout:8/404: write d2/dd/d2c/f4d [4878185,71846] 0 2026-03-10T08:55:33.522 INFO:tasks.workunit.client.0.vm05.stdout:8/405: creat d2/db/d1f/d67/d8d/f8f x:0 0 0 2026-03-10T08:55:33.523 INFO:tasks.workunit.client.0.vm05.stdout:8/406: chown d2/dd/f1a 406 1 2026-03-10T08:55:33.524 INFO:tasks.workunit.client.0.vm05.stdout:8/407: mknod d2/dd/d2c/d2e/d31/d4f/d80/c90 0 2026-03-10T08:55:33.529 INFO:tasks.workunit.client.0.vm05.stdout:8/408: dwrite d2/dd/d2c/d2e/d31/d3e/f73 [4194304,4194304] 0 2026-03-10T08:55:33.548 INFO:tasks.workunit.client.0.vm05.stdout:8/409: dread - d2/dd/d2c/d2e/f6a zero size 2026-03-10T08:55:33.548 INFO:tasks.workunit.client.0.vm05.stdout:0/397: dwrite df/d18/f29 [0,4194304] 0 2026-03-10T08:55:33.548 INFO:tasks.workunit.client.0.vm05.stdout:0/398: stat df/d18/d19/d39 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:0/399: mknod df/d18/d19/d39/d4d/c6b 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:8/410: truncate d2/d45/f43 310288 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:8/411: creat d2/dd/d2c/d2e/d31/d4c/d63/f91 x:0 0 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:8/412: stat d2/db/d1f/f44 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:0/400: creat df/d18/d19/d5b/f6c x:0 0 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:0/401: fdatasync df/d18/d2b/d27/f60 0 2026-03-10T08:55:33.549 INFO:tasks.workunit.client.0.vm05.stdout:8/413: dread d2/db/f1b [0,4194304] 0 2026-03-10T08:55:33.550 INFO:tasks.workunit.client.0.vm05.stdout:8/414: dread - d2/dd/d2c/d2e/d31/f89 zero size 2026-03-10T08:55:33.550 INFO:tasks.workunit.client.0.vm05.stdout:9/334: dread d6/d27/f44 [0,4194304] 0 2026-03-10T08:55:33.555 INFO:tasks.workunit.client.0.vm05.stdout:0/402: dread df/f12 [0,4194304] 0 2026-03-10T08:55:33.556 INFO:tasks.workunit.client.1.vm08.stdout:1/665: write 
d1/da/de/fe4 [2214855,48621] 0 2026-03-10T08:55:33.559 INFO:tasks.workunit.client.0.vm05.stdout:8/415: creat d2/dd/d2c/d2e/d31/d3e/d5d/f92 x:0 0 0 2026-03-10T08:55:33.560 INFO:tasks.workunit.client.0.vm05.stdout:0/403: truncate df/d59/f3f 298636 0 2026-03-10T08:55:33.562 INFO:tasks.workunit.client.0.vm05.stdout:8/416: rmdir d2/dd/d2c/d2e/d31/d4c 39 2026-03-10T08:55:33.563 INFO:tasks.workunit.client.0.vm05.stdout:8/417: read d2/db/f19 [4042569,108404] 0 2026-03-10T08:55:33.566 INFO:tasks.workunit.client.0.vm05.stdout:8/418: mkdir d2/dd/d2c/d2e/d93 0 2026-03-10T08:55:33.570 INFO:tasks.workunit.client.0.vm05.stdout:8/419: write d2/dd/d2c/d2e/d31/d3e/f73 [3356930,51390] 0 2026-03-10T08:55:33.571 INFO:tasks.workunit.client.1.vm08.stdout:0/570: dwrite d6/dd/d13/d17/f66 [0,4194304] 0 2026-03-10T08:55:33.572 INFO:tasks.workunit.client.0.vm05.stdout:4/459: write d0/d1d/d30/f3a [615775,37450] 0 2026-03-10T08:55:33.572 INFO:tasks.workunit.client.0.vm05.stdout:4/460: fsync d0/d1d/d30/f29 0 2026-03-10T08:55:33.574 INFO:tasks.workunit.client.0.vm05.stdout:2/370: sync 2026-03-10T08:55:33.582 INFO:tasks.workunit.client.0.vm05.stdout:8/420: creat d2/db/d1f/d67/f94 x:0 0 0 2026-03-10T08:55:33.583 INFO:tasks.workunit.client.0.vm05.stdout:8/421: fdatasync d2/db/d47/f51 0 2026-03-10T08:55:33.585 INFO:tasks.workunit.client.0.vm05.stdout:7/341: dwrite d18/d1b/d1f/d25/f3a [0,4194304] 0 2026-03-10T08:55:33.590 INFO:tasks.workunit.client.0.vm05.stdout:4/461: dread - d0/d2e/d42/d45/d4a/d36/f88 zero size 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.1.vm08.stdout:0/571: symlink d6/dd/d13/d17/d1f/d20/d2f/d57/lb7 0 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.1.vm08.stdout:0/572: write d6/dd/d13/d17/d1f/da3/fb0 [1091333,63877] 0 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.1.vm08.stdout:1/666: dread d1/da/f22 [0,4194304] 0 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.1.vm08.stdout:2/701: getdents d1/da/d10/d42/d93/d23/d9e 0 2026-03-10T08:55:33.597 
INFO:tasks.workunit.client.0.vm05.stdout:2/371: creat d0/d9/d1e/d20/d21/d45/f68 x:0 0 0 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.0.vm05.stdout:2/372: stat d0/d9/d27/f66 0 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.0.vm05.stdout:2/373: chown d0/d9/d1e/d20/f32 346036650 1 2026-03-10T08:55:33.597 INFO:tasks.workunit.client.0.vm05.stdout:8/422: fdatasync d2/db/f19 0 2026-03-10T08:55:33.601 INFO:tasks.workunit.client.0.vm05.stdout:3/476: dwrite d9/d4d/d51/f67 [0,4194304] 0 2026-03-10T08:55:33.612 INFO:tasks.workunit.client.1.vm08.stdout:3/586: write d4/d15/d8/d1d/f21 [1821258,73246] 0 2026-03-10T08:55:33.612 INFO:tasks.workunit.client.1.vm08.stdout:0/573: dwrite f5 [0,4194304] 0 2026-03-10T08:55:33.612 INFO:tasks.workunit.client.0.vm05.stdout:4/462: truncate d0/d1d/f24 2325935 0 2026-03-10T08:55:33.619 INFO:tasks.workunit.client.1.vm08.stdout:2/702: rmdir d1/da/d10/d42 39 2026-03-10T08:55:33.621 INFO:tasks.workunit.client.1.vm08.stdout:9/597: dwrite d2/d41/d74/f6a [0,4194304] 0 2026-03-10T08:55:33.625 INFO:tasks.workunit.client.1.vm08.stdout:8/680: truncate d1/d10/d9/dd/d25/d27/f3a 492184 0 2026-03-10T08:55:33.631 INFO:tasks.workunit.client.1.vm08.stdout:2/703: dwrite d1/d5b/da7/fb5 [0,4194304] 0 2026-03-10T08:55:33.638 INFO:tasks.workunit.client.0.vm05.stdout:7/342: truncate d18/d1b/d1f/d25/d2e/d2f/f59 1362279 0 2026-03-10T08:55:33.641 INFO:tasks.workunit.client.1.vm08.stdout:9/598: rename d2/dd/d15/d1e/d25/d32/d79/l7a to d2/d54/lca 0 2026-03-10T08:55:33.647 INFO:tasks.workunit.client.0.vm05.stdout:7/343: creat d18/d1b/d1f/d25/d2e/d42/d53/f61 x:0 0 0 2026-03-10T08:55:33.649 INFO:tasks.workunit.client.0.vm05.stdout:6/487: truncate d4/d7/f54 1317269 0 2026-03-10T08:55:33.661 INFO:tasks.workunit.client.1.vm08.stdout:9/599: creat d2/dd/d15/d1e/d25/d32/d5c/dc2/fcb x:0 0 0 2026-03-10T08:55:33.665 INFO:tasks.workunit.client.1.vm08.stdout:7/676: dwrite d0/d11/f39 [0,4194304] 0 2026-03-10T08:55:33.672 INFO:tasks.workunit.client.1.vm08.stdout:0/574: getdents 
d6/dd/d13/d8f 0 2026-03-10T08:55:33.675 INFO:tasks.workunit.client.0.vm05.stdout:7/344: link cc d18/d1b/d1f/c62 0 2026-03-10T08:55:33.679 INFO:tasks.workunit.client.1.vm08.stdout:9/600: symlink d2/dd/d15/d1e/d24/lcc 0 2026-03-10T08:55:33.687 INFO:tasks.workunit.client.1.vm08.stdout:1/667: fsync d1/da/de/fe4 0 2026-03-10T08:55:33.694 INFO:tasks.workunit.client.0.vm05.stdout:7/345: mknod d18/d38/c63 0 2026-03-10T08:55:33.694 INFO:tasks.workunit.client.1.vm08.stdout:5/583: truncate d0/d11/f25 485266 0 2026-03-10T08:55:33.694 INFO:tasks.workunit.client.1.vm08.stdout:5/584: write d0/d11/d18/d52/f7d [3144699,52651] 0 2026-03-10T08:55:33.700 INFO:tasks.workunit.client.1.vm08.stdout:7/677: truncate d0/d11/d1f/d29/d36/d75/fb9 301556 0 2026-03-10T08:55:33.701 INFO:tasks.workunit.client.1.vm08.stdout:7/678: write d0/d11/d1f/d29/fcf [649390,103365] 0 2026-03-10T08:55:33.701 INFO:tasks.workunit.client.1.vm08.stdout:9/601: truncate d2/d41/d4c/f7c 685113 0 2026-03-10T08:55:33.702 INFO:tasks.workunit.client.1.vm08.stdout:7/679: mknod d0/d11/db2/cdb 0 2026-03-10T08:55:33.707 INFO:tasks.workunit.client.1.vm08.stdout:9/602: truncate d2/dd/d15/d1e/d24/f30 2492542 0 2026-03-10T08:55:33.707 INFO:tasks.workunit.client.1.vm08.stdout:1/668: getdents d1/da/d4b 0 2026-03-10T08:55:33.719 INFO:tasks.workunit.client.1.vm08.stdout:9/603: read - d2/dd/d15/d1e/d39/d4e/d87/f93 zero size 2026-03-10T08:55:33.724 INFO:tasks.workunit.client.1.vm08.stdout:6/650: dwrite d9/d10/d1e/d32/f27 [4194304,4194304] 0 2026-03-10T08:55:33.750 INFO:tasks.workunit.client.1.vm08.stdout:1/669: creat d1/da/de/d24/d3d/d40/d8e/dd2/d7f/feb x:0 0 0 2026-03-10T08:55:33.750 INFO:tasks.workunit.client.1.vm08.stdout:0/575: dwrite d6/dd/d13/d17/f29 [0,4194304] 0 2026-03-10T08:55:33.750 INFO:tasks.workunit.client.1.vm08.stdout:9/604: rename d2/dd/d15/d4f/l4d to d2/d41/d53/lcd 0 2026-03-10T08:55:33.750 INFO:tasks.workunit.client.1.vm08.stdout:9/605: chown d2/dd/d15/d1e/d39/d4e/f71 98 1 2026-03-10T08:55:33.750 
INFO:tasks.workunit.client.1.vm08.stdout:6/651: write d9/d50/fb8 [1332210,78968] 0 2026-03-10T08:55:33.750 INFO:tasks.workunit.client.1.vm08.stdout:0/576: mknod d6/dd/d13/d17/d1f/d20/d2f/d24/cb8 0 2026-03-10T08:55:33.750 INFO:tasks.workunit.client.1.vm08.stdout:9/606: mknod d2/dd/d15/d1e/d25/d98/d9d/db3/cce 0 2026-03-10T08:55:33.755 INFO:tasks.workunit.client.1.vm08.stdout:6/652: dread d9/fa [0,4194304] 0 2026-03-10T08:55:33.759 INFO:tasks.workunit.client.1.vm08.stdout:6/653: creat d9/dc/d11/fdc x:0 0 0 2026-03-10T08:55:33.759 INFO:tasks.workunit.client.1.vm08.stdout:6/654: chown d9/d50/f75 1694 1 2026-03-10T08:55:33.764 INFO:tasks.workunit.client.1.vm08.stdout:7/680: dread d0/d11/d1f/f90 [0,4194304] 0 2026-03-10T08:55:33.771 INFO:tasks.workunit.client.1.vm08.stdout:6/655: dread d9/dc/d11/d23/d2c/f3d [0,4194304] 0 2026-03-10T08:55:33.771 INFO:tasks.workunit.client.1.vm08.stdout:7/681: dread - d0/d11/d1f/d29/fba zero size 2026-03-10T08:55:33.771 INFO:tasks.workunit.client.1.vm08.stdout:6/656: mknod d9/dc/d11/d23/d2c/d81/d63/cdd 0 2026-03-10T08:55:33.771 INFO:tasks.workunit.client.1.vm08.stdout:6/657: rename d9/d10/d1e/d7b/fc3 to d9/d10/d1e/d32/fde 0 2026-03-10T08:55:33.771 INFO:tasks.workunit.client.1.vm08.stdout:6/658: chown d9/c87 278900422 1 2026-03-10T08:55:33.778 INFO:tasks.workunit.client.1.vm08.stdout:6/659: creat d9/dc/d11/d23/fdf x:0 0 0 2026-03-10T08:55:33.878 INFO:tasks.workunit.client.1.vm08.stdout:5/585: sync 2026-03-10T08:55:33.879 INFO:tasks.workunit.client.1.vm08.stdout:5/586: fsync d0/d11/d3e/d45/fad 0 2026-03-10T08:55:33.882 INFO:tasks.workunit.client.1.vm08.stdout:5/587: rename d0/d11/d18/c33 to d0/d11/d27/d68/d7c/d4b/d87/cb1 0 2026-03-10T08:55:33.884 INFO:tasks.workunit.client.1.vm08.stdout:5/588: mknod d0/cb2 0 2026-03-10T08:55:33.898 INFO:tasks.workunit.client.0.vm05.stdout:4/463: symlink d0/d1d/d30/d49/d58/l98 0 2026-03-10T08:55:33.899 INFO:tasks.workunit.client.0.vm05.stdout:4/464: readlink d0/d2e/l44 0 2026-03-10T08:55:33.899 
INFO:tasks.workunit.client.0.vm05.stdout:4/465: fdatasync d0/fb 0 2026-03-10T08:55:33.900 INFO:tasks.workunit.client.0.vm05.stdout:4/466: mknod d0/d1d/d30/d49/d58/d66/d79/c99 0 2026-03-10T08:55:33.901 INFO:tasks.workunit.client.0.vm05.stdout:4/467: stat d0/d2e/d42/d45/d4a/d36/l7d 0 2026-03-10T08:55:33.903 INFO:tasks.workunit.client.0.vm05.stdout:1/510: write dd/d10/d19/f95 [4948155,63622] 0 2026-03-10T08:55:33.906 INFO:tasks.workunit.client.0.vm05.stdout:2/374: mknod d0/c69 0 2026-03-10T08:55:33.906 INFO:tasks.workunit.client.0.vm05.stdout:2/375: dread - d0/d55/f60 zero size 2026-03-10T08:55:33.981 INFO:tasks.workunit.client.0.vm05.stdout:5/351: dwrite d5/df/d12/f44 [0,4194304] 0 2026-03-10T08:55:33.982 INFO:tasks.workunit.client.1.vm08.stdout:4/655: dwrite d5/f19 [0,4194304] 0 2026-03-10T08:55:33.990 INFO:tasks.workunit.client.0.vm05.stdout:5/352: dwrite d5/df/d12/f1b [4194304,4194304] 0 2026-03-10T08:55:34.010 INFO:tasks.workunit.client.1.vm08.stdout:8/681: mknod d1/d10/d9/d4d/c100 0 2026-03-10T08:55:34.012 INFO:tasks.workunit.client.1.vm08.stdout:8/682: read - d1/d10/d9/dd/d25/d27/d44/d21/dce/ffd zero size 2026-03-10T08:55:34.025 INFO:tasks.workunit.client.1.vm08.stdout:9/607: creat d2/dd/d15/d1e/d39/d4e/fcf x:0 0 0 2026-03-10T08:55:34.033 INFO:tasks.workunit.client.1.vm08.stdout:3/587: write d4/d6f/d85/f87 [3211503,97304] 0 2026-03-10T08:55:34.034 INFO:tasks.workunit.client.1.vm08.stdout:3/588: write d4/d15/d8/d2c/d55/d93/fa5 [3217109,53571] 0 2026-03-10T08:55:34.042 INFO:tasks.workunit.client.0.vm05.stdout:6/488: symlink d4/d7/d10/d15/d20/la3 0 2026-03-10T08:55:34.065 INFO:tasks.workunit.client.0.vm05.stdout:7/346: mknod d18/d1b/d1f/d25/d2e/d42/d53/c64 0 2026-03-10T08:55:34.087 INFO:tasks.workunit.client.1.vm08.stdout:0/577: write d6/dd/d13/d17/d1f/d2d/fa0 [500434,15991] 0 2026-03-10T08:55:34.093 INFO:tasks.workunit.client.1.vm08.stdout:1/670: dwrite d1/da/f25 [0,4194304] 0 2026-03-10T08:55:34.098 INFO:tasks.workunit.client.1.vm08.stdout:6/660: write 
d9/dc/d84/f5e [451213,72272] 0 2026-03-10T08:55:34.100 INFO:tasks.workunit.client.1.vm08.stdout:6/661: readlink d9/d13/l46 0 2026-03-10T08:55:34.101 INFO:tasks.workunit.client.1.vm08.stdout:6/662: chown d9/dc/d11/d23 133198073 1 2026-03-10T08:55:34.104 INFO:tasks.workunit.client.0.vm05.stdout:8/423: creat d2/dd/d2c/d2e/d31/d3e/f95 x:0 0 0 2026-03-10T08:55:34.105 INFO:tasks.workunit.client.1.vm08.stdout:7/682: dwrite d0/d14/d2f/f81 [4194304,4194304] 0 2026-03-10T08:55:34.107 INFO:tasks.workunit.client.1.vm08.stdout:1/671: rename d1/da/de/d24/d3d/c9d to d1/da/de/d24/d3d/d40/d8e/dd2/cec 0 2026-03-10T08:55:34.118 INFO:tasks.workunit.client.1.vm08.stdout:7/683: read d0/d14/d43/f58 [239048,83229] 0 2026-03-10T08:55:34.119 INFO:tasks.workunit.client.0.vm05.stdout:1/511: rename dd/d21/f7f to dd/d21/fb8 0 2026-03-10T08:55:34.121 INFO:tasks.workunit.client.0.vm05.stdout:1/512: truncate dd/d10/d18/f8a 549126 0 2026-03-10T08:55:34.122 INFO:tasks.workunit.client.1.vm08.stdout:4/656: link d5/d23/c34 d5/d23/d36/d99/cec 0 2026-03-10T08:55:34.135 INFO:tasks.workunit.client.0.vm05.stdout:1/513: symlink dd/d10/d19/d9b/lb9 0 2026-03-10T08:55:34.135 INFO:tasks.workunit.client.0.vm05.stdout:1/514: mknod dd/d10/d19/d9b/cba 0 2026-03-10T08:55:34.135 INFO:tasks.workunit.client.0.vm05.stdout:1/515: dread - dd/d10/d19/d27/f9c zero size 2026-03-10T08:55:34.135 INFO:tasks.workunit.client.0.vm05.stdout:1/516: creat dd/d10/d18/d2d/d51/d58/d71/d73/fbb x:0 0 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.0.vm05.stdout:1/517: fdatasync dd/d10/f22 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.0.vm05.stdout:1/518: symlink dd/d10/d18/d2d/lbc 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.0.vm05.stdout:1/519: unlink dd/d10/d19/d27/f31 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/657: write d5/d23/d36/f57 [2300459,2512] 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/658: stat d5/d23/d36/d99/db2/d5a 0 2026-03-10T08:55:34.136 
INFO:tasks.workunit.client.1.vm08.stdout:7/684: mkdir d0/d11/d1f/d29/d3d/d40/ddc 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/659: fsync d5/d23/d49/fdc 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:1/672: creat d1/fed x:0 0 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/660: chown d5/de/cd9 8 1 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/661: write d5/d23/d36/f57 [1447749,125876] 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:5/589: dwrite d0/d11/d27/f83 [0,4194304] 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/662: write d5/d23/d36/fce [2087365,120957] 0 2026-03-10T08:55:34.136 INFO:tasks.workunit.client.1.vm08.stdout:4/663: readlink d5/d23/d36/d99/db2/d5d/dae/ddf/lc9 0 2026-03-10T08:55:34.143 INFO:tasks.workunit.client.1.vm08.stdout:4/664: rename d5/d23/d36/d99/lb6 to d5/d5f/led 0 2026-03-10T08:55:34.149 INFO:tasks.workunit.client.1.vm08.stdout:7/685: creat d0/d11/d1f/d29/d3b/fdd x:0 0 0 2026-03-10T08:55:34.149 INFO:tasks.workunit.client.1.vm08.stdout:1/673: mknod d1/cee 0 2026-03-10T08:55:34.149 INFO:tasks.workunit.client.1.vm08.stdout:1/674: write d1/da/f25 [3774712,114163] 0 2026-03-10T08:55:34.149 INFO:tasks.workunit.client.1.vm08.stdout:4/665: creat d5/d23/d36/d99/dc6/fee x:0 0 0 2026-03-10T08:55:34.150 INFO:tasks.workunit.client.1.vm08.stdout:7/686: mknod d0/d11/d1f/d29/d36/d75/cde 0 2026-03-10T08:55:34.151 INFO:tasks.workunit.client.1.vm08.stdout:1/675: chown d1/da/de/d24/d35/d6d/l95 9731038 1 2026-03-10T08:55:34.152 INFO:tasks.workunit.client.1.vm08.stdout:4/666: mkdir d5/de/def 0 2026-03-10T08:55:34.155 INFO:tasks.workunit.client.1.vm08.stdout:7/687: dread d0/d11/d1f/d29/d3d/d89/f8b [4194304,4194304] 0 2026-03-10T08:55:34.156 INFO:tasks.workunit.client.1.vm08.stdout:4/667: creat d5/d23/d36/d76/ff0 x:0 0 0 2026-03-10T08:55:34.164 INFO:tasks.workunit.client.1.vm08.stdout:4/668: mkdir d5/d23/d36/d99/dc6/df1 0 
2026-03-10T08:55:34.165 INFO:tasks.workunit.client.1.vm08.stdout:0/578: dread d6/f11 [0,4194304] 0 2026-03-10T08:55:34.166 INFO:tasks.workunit.client.1.vm08.stdout:4/669: fsync d5/f7e 0 2026-03-10T08:55:34.167 INFO:tasks.workunit.client.1.vm08.stdout:4/670: stat d5/d23/d36/d76/ff0 0 2026-03-10T08:55:34.167 INFO:tasks.workunit.client.1.vm08.stdout:0/579: readlink d6/dd/d13/d17/d1f/d20/d2f/d24/l9b 0 2026-03-10T08:55:34.170 INFO:tasks.workunit.client.1.vm08.stdout:4/671: read d5/de/f6d [3443768,58715] 0 2026-03-10T08:55:34.171 INFO:tasks.workunit.client.1.vm08.stdout:0/580: readlink d6/dd/d13/d17/d1f/d20/d2f/d26/l3c 0 2026-03-10T08:55:34.175 INFO:tasks.workunit.client.1.vm08.stdout:4/672: fdatasync d5/d23/d36/d99/db2/d5a/d69/fb3 0 2026-03-10T08:55:34.175 INFO:tasks.workunit.client.1.vm08.stdout:4/673: truncate d5/d23/d36/fd2 324264 0 2026-03-10T08:55:34.178 INFO:tasks.workunit.client.1.vm08.stdout:0/581: creat d6/dd/d13/d17/d1f/d2d/d85/d95/fb9 x:0 0 0 2026-03-10T08:55:34.180 INFO:tasks.workunit.client.1.vm08.stdout:4/674: mkdir d5/de/def/df2 0 2026-03-10T08:55:34.181 INFO:tasks.workunit.client.1.vm08.stdout:0/582: chown d6/dd/d13/d17/d1f/d20/d2f/d26/l3c 76061462 1 2026-03-10T08:55:34.183 INFO:tasks.workunit.client.1.vm08.stdout:0/583: fsync d6/dd/d13/d17/d50/fac 0 2026-03-10T08:55:34.188 INFO:tasks.workunit.client.0.vm05.stdout:2/376: dwrite d0/d9/f12 [0,4194304] 0 2026-03-10T08:55:34.190 INFO:tasks.workunit.client.0.vm05.stdout:2/377: write d0/d9/d1e/d20/f47 [8831321,95600] 0 2026-03-10T08:55:34.191 INFO:tasks.workunit.client.1.vm08.stdout:4/675: link d5/de/l52 d5/d23/d36/d99/lf3 0 2026-03-10T08:55:34.194 INFO:tasks.workunit.client.1.vm08.stdout:0/584: rmdir d6/dd/d13/d17/d1f/d2d/d39 39 2026-03-10T08:55:34.194 INFO:tasks.workunit.client.1.vm08.stdout:4/676: creat d5/de/def/ff4 x:0 0 0 2026-03-10T08:55:34.194 INFO:tasks.workunit.client.1.vm08.stdout:0/585: creat d6/dd/d13/d61/fba x:0 0 0 2026-03-10T08:55:34.195 INFO:tasks.workunit.client.1.vm08.stdout:4/677: mkdir 
d5/df5 0 2026-03-10T08:55:34.195 INFO:tasks.workunit.client.1.vm08.stdout:0/586: read d6/dd/d13/d17/d1f/da3/fa7 [3845480,36065] 0 2026-03-10T08:55:34.195 INFO:tasks.workunit.client.1.vm08.stdout:0/587: fsync d6/f5f 0 2026-03-10T08:55:34.195 INFO:tasks.workunit.client.1.vm08.stdout:4/678: write d5/d23/d49/fdc [110208,74219] 0 2026-03-10T08:55:34.201 INFO:tasks.workunit.client.1.vm08.stdout:0/588: chown d6/dd/d13/d17/d1f/d20/d2f/d57/l63 965776 1 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:4/679: chown d5/d23/d36/d99/db2/d5d/f61 192261687 1 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:0/589: creat d6/dd/d13/d8f/fbb x:0 0 0 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:4/680: truncate d5/f77 3906032 0 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:0/590: symlink d6/dd/d13/d17/d1f/d20/d2f/d24/lbc 0 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:4/681: creat d5/de/d96/ff6 x:0 0 0 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:4/682: chown d5/d23/d36/d99/db2/d5d/f60 103 1 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:4/683: creat d5/d23/d36/d99/db2/ff7 x:0 0 0 2026-03-10T08:55:34.214 INFO:tasks.workunit.client.1.vm08.stdout:4/684: mkdir d5/d23/d36/d99/db2/d5d/de3/df8 0 2026-03-10T08:55:34.215 INFO:tasks.workunit.client.1.vm08.stdout:4/685: creat d5/d23/d49/ff9 x:0 0 0 2026-03-10T08:55:34.215 INFO:tasks.workunit.client.1.vm08.stdout:4/686: mknod d5/d23/d36/d99/db2/d5d/de3/cfa 0 2026-03-10T08:55:34.217 INFO:tasks.workunit.client.1.vm08.stdout:4/687: truncate d5/d23/d49/d8f/da4/fe5 19873 0 2026-03-10T08:55:34.231 INFO:tasks.workunit.client.0.vm05.stdout:2/378: dread d0/f36 [0,4194304] 0 2026-03-10T08:55:34.234 INFO:tasks.workunit.client.0.vm05.stdout:5/353: dwrite d5/df/d12/d21/f1f [0,4194304] 0 2026-03-10T08:55:34.237 INFO:tasks.workunit.client.0.vm05.stdout:5/354: dread d5/d48/f69 [0,4194304] 0 2026-03-10T08:55:34.243 
INFO:tasks.workunit.client.0.vm05.stdout:9/335: rename d6/d19/c36 to d6/d12/c6e 0 2026-03-10T08:55:34.249 INFO:tasks.workunit.client.1.vm08.stdout:2/704: rmdir d1/da/d10/d42/d93/d1e/d7b 39 2026-03-10T08:55:34.249 INFO:tasks.workunit.client.1.vm08.stdout:9/608: write d2/dd/d15/d1e/d24/f34 [4859867,43743] 0 2026-03-10T08:55:34.249 INFO:tasks.workunit.client.0.vm05.stdout:9/336: write d6/d19/d21/f31 [788025,62166] 0 2026-03-10T08:55:34.249 INFO:tasks.workunit.client.0.vm05.stdout:9/337: write d6/d19/d2a/f53 [1213012,60797] 0 2026-03-10T08:55:34.249 INFO:tasks.workunit.client.0.vm05.stdout:5/355: read - d5/df/d37/f47 zero size 2026-03-10T08:55:34.249 INFO:tasks.workunit.client.0.vm05.stdout:3/477: unlink d9/c1d 0 2026-03-10T08:55:34.254 INFO:tasks.workunit.client.0.vm05.stdout:9/338: write d6/d15/d35/f38 [1237013,32373] 0 2026-03-10T08:55:34.259 INFO:tasks.workunit.client.1.vm08.stdout:9/609: rename d2/d41/d74 to d2/d54/d8e/da6/dd0 0 2026-03-10T08:55:34.260 INFO:tasks.workunit.client.0.vm05.stdout:9/339: fdatasync d6/d19/d21/f32 0 2026-03-10T08:55:34.260 INFO:tasks.workunit.client.0.vm05.stdout:5/356: creat d5/d48/f7e x:0 0 0 2026-03-10T08:55:34.261 INFO:tasks.workunit.client.0.vm05.stdout:3/478: creat d9/d4d/f88 x:0 0 0 2026-03-10T08:55:34.262 INFO:tasks.workunit.client.1.vm08.stdout:3/589: dwrite d4/d15/d8/d1d/f6e [0,4194304] 0 2026-03-10T08:55:34.265 INFO:tasks.workunit.client.0.vm05.stdout:3/479: dwrite d9/d2b/d2f/f4b [4194304,4194304] 0 2026-03-10T08:55:34.283 INFO:tasks.workunit.client.1.vm08.stdout:3/590: mkdir d4/d6f/dca 0 2026-03-10T08:55:34.283 INFO:tasks.workunit.client.0.vm05.stdout:9/340: mknod d6/d12/c6f 0 2026-03-10T08:55:34.284 INFO:tasks.workunit.client.1.vm08.stdout:3/591: read - d4/d15/d8/fbb zero size 2026-03-10T08:55:34.284 INFO:tasks.workunit.client.1.vm08.stdout:9/610: mknod d2/dd/d15/d1e/d25/d32/cd1 0 2026-03-10T08:55:34.284 INFO:tasks.workunit.client.0.vm05.stdout:5/357: symlink d5/df/d37/d68/l7f 0 2026-03-10T08:55:34.285 
INFO:tasks.workunit.client.0.vm05.stdout:5/358: write d5/df/d12/f1b [5715510,49550] 0 2026-03-10T08:55:34.286 INFO:tasks.workunit.client.0.vm05.stdout:5/359: fsync d5/df/d12/d39/f78 0 2026-03-10T08:55:34.287 INFO:tasks.workunit.client.0.vm05.stdout:3/480: mkdir d9/d4d/d51/d64/d89 0 2026-03-10T08:55:34.288 INFO:tasks.workunit.client.0.vm05.stdout:3/481: read - d9/d2b/d3a/f68 zero size 2026-03-10T08:55:34.290 INFO:tasks.workunit.client.1.vm08.stdout:9/611: mkdir d2/d41/d4c/dd2 0 2026-03-10T08:55:34.291 INFO:tasks.workunit.client.1.vm08.stdout:3/592: unlink d4/d15/d8/d1d/da8/cc4 0 2026-03-10T08:55:34.293 INFO:tasks.workunit.client.0.vm05.stdout:0/404: rename df/d18/d19/l22 to df/l6d 0 2026-03-10T08:55:34.295 INFO:tasks.workunit.client.1.vm08.stdout:3/593: dwrite d4/d15/d8/d1d/d4f/fa2 [0,4194304] 0 2026-03-10T08:55:34.295 INFO:tasks.workunit.client.0.vm05.stdout:4/468: rename d0/d1d/d30/d32/l8d to d0/d1d/l9a 0 2026-03-10T08:55:34.298 INFO:tasks.workunit.client.0.vm05.stdout:6/489: rename d4/d2c/d84/f41 to d4/d7/d10/d15/d1b/d22/fa4 0 2026-03-10T08:55:34.301 INFO:tasks.workunit.client.0.vm05.stdout:0/405: mkdir df/d18/d2b/d65/d6e 0 2026-03-10T08:55:34.305 INFO:tasks.workunit.client.0.vm05.stdout:4/469: mknod d0/c9b 0 2026-03-10T08:55:34.305 INFO:tasks.workunit.client.0.vm05.stdout:4/470: write d0/f1 [3710630,30626] 0 2026-03-10T08:55:34.306 INFO:tasks.workunit.client.0.vm05.stdout:4/471: write d0/d1d/d30/d49/d4f/f51 [331775,30475] 0 2026-03-10T08:55:34.311 INFO:tasks.workunit.client.1.vm08.stdout:3/594: unlink d4/d15/d8/d2c/d9b/c3e 0 2026-03-10T08:55:34.312 INFO:tasks.workunit.client.0.vm05.stdout:8/424: rename d2/dd/f72 to d2/db/d28/f96 0 2026-03-10T08:55:34.314 INFO:tasks.workunit.client.0.vm05.stdout:0/406: creat df/d18/d19/d39/f6f x:0 0 0 2026-03-10T08:55:34.317 INFO:tasks.workunit.client.0.vm05.stdout:0/407: dwrite f5 [0,4194304] 0 2026-03-10T08:55:34.319 INFO:tasks.workunit.client.1.vm08.stdout:3/595: rename d4/d15/d8/d2c/f5a to d4/d6f/fcb 0 2026-03-10T08:55:34.325 
INFO:tasks.workunit.client.0.vm05.stdout:6/490: dread d4/d7/d10/d15/f17 [0,4194304] 0 2026-03-10T08:55:34.328 INFO:tasks.workunit.client.0.vm05.stdout:6/491: dwrite d4/d2d/fa2 [0,4194304] 0 2026-03-10T08:55:34.333 INFO:tasks.workunit.client.1.vm08.stdout:3/596: dwrite d4/d15/d8/fad [0,4194304] 0 2026-03-10T08:55:34.335 INFO:tasks.workunit.client.0.vm05.stdout:8/425: mknod d2/dd/d2c/d2e/d31/d4f/d7b/c97 0 2026-03-10T08:55:34.339 INFO:tasks.workunit.client.1.vm08.stdout:8/683: dread d1/d10/d9/dd/d25/d27/d44/f22 [0,4194304] 0 2026-03-10T08:55:34.342 INFO:tasks.workunit.client.0.vm05.stdout:9/341: dread f3 [4194304,4194304] 0 2026-03-10T08:55:34.342 INFO:tasks.workunit.client.0.vm05.stdout:9/342: chown d6/d12/d43/f52 889176998 1 2026-03-10T08:55:34.346 INFO:tasks.workunit.client.0.vm05.stdout:2/379: rename d0/d55/l5d to d0/d9/d1e/l6a 0 2026-03-10T08:55:34.348 INFO:tasks.workunit.client.1.vm08.stdout:7/688: rmdir d0/d11/d1f/d29/d3d 39 2026-03-10T08:55:34.348 INFO:tasks.workunit.client.1.vm08.stdout:7/689: readlink d0/d11/d1f/lae 0 2026-03-10T08:55:34.349 INFO:tasks.workunit.client.1.vm08.stdout:3/597: creat d4/d6f/dca/fcc x:0 0 0 2026-03-10T08:55:34.350 INFO:tasks.workunit.client.1.vm08.stdout:7/690: truncate d0/d11/d1f/d29/fcc 997798 0 2026-03-10T08:55:34.350 INFO:tasks.workunit.client.0.vm05.stdout:2/380: rmdir d0/d9/d27 39 2026-03-10T08:55:34.354 INFO:tasks.workunit.client.0.vm05.stdout:9/343: symlink d6/d12/d3a/l70 0 2026-03-10T08:55:34.367 INFO:tasks.workunit.client.0.vm05.stdout:0/408: getdents df/d18/d2b/d27 0 2026-03-10T08:55:34.367 INFO:tasks.workunit.client.0.vm05.stdout:7/347: write d18/f4a [3951241,121033] 0 2026-03-10T08:55:34.367 INFO:tasks.workunit.client.0.vm05.stdout:7/348: dwrite d18/d1b/d1f/d25/d2e/f49 [0,4194304] 0 2026-03-10T08:55:34.375 INFO:tasks.workunit.client.1.vm08.stdout:7/691: symlink d0/d14/d43/d62/ldf 0 2026-03-10T08:55:34.379 INFO:tasks.workunit.client.0.vm05.stdout:7/349: dwrite d18/d1b/f50 [0,4194304] 0 2026-03-10T08:55:34.380 
INFO:tasks.workunit.client.1.vm08.stdout:6/663: dwrite d9/d13/d4e/fa8 [0,4194304] 0 2026-03-10T08:55:34.380 INFO:tasks.workunit.client.1.vm08.stdout:8/684: link d1/d10/d9/dd/d25/d27/d44/d97/lbb d1/d10/d9/dd/d25/d27/d44/d21/dce/l101 0 2026-03-10T08:55:34.384 INFO:tasks.workunit.client.1.vm08.stdout:8/685: chown d1/d4f/d60/fc4 0 1 2026-03-10T08:55:34.385 INFO:tasks.workunit.client.1.vm08.stdout:7/692: write d0/d11/d1f/d29/d36/d75/fc8 [346218,31972] 0 2026-03-10T08:55:34.388 INFO:tasks.workunit.client.1.vm08.stdout:6/664: unlink d9/d10/f53 0 2026-03-10T08:55:34.389 INFO:tasks.workunit.client.1.vm08.stdout:7/693: fsync d0/d11/d1f/d29/fcf 0 2026-03-10T08:55:34.395 INFO:tasks.workunit.client.0.vm05.stdout:7/350: link d18/c4d d18/d1b/d1f/d25/d2e/d32/c65 0 2026-03-10T08:55:34.406 INFO:tasks.workunit.client.1.vm08.stdout:7/694: fsync d0/d11/d1f/d2c/f6c 0 2026-03-10T08:55:34.406 INFO:tasks.workunit.client.0.vm05.stdout:7/351: chown f3 6222305 1 2026-03-10T08:55:34.406 INFO:tasks.workunit.client.1.vm08.stdout:3/598: dread d4/d15/d8/d2c/f42 [4194304,4194304] 0 2026-03-10T08:55:34.409 INFO:tasks.workunit.client.1.vm08.stdout:3/599: rename d4/lbe to d4/d15/d8/d2c/d89/lcd 0 2026-03-10T08:55:34.410 INFO:tasks.workunit.client.1.vm08.stdout:3/600: unlink d4/d15/cc1 0 2026-03-10T08:55:34.425 INFO:tasks.workunit.client.1.vm08.stdout:3/601: rename d4/d15/d8/d2c/d9b/fc2 to d4/d15/d8/d71/fce 0 2026-03-10T08:55:34.425 INFO:tasks.workunit.client.1.vm08.stdout:3/602: symlink d4/d15/d8/d1d/lcf 0 2026-03-10T08:55:34.464 INFO:tasks.workunit.client.0.vm05.stdout:0/409: sync 2026-03-10T08:55:34.464 INFO:tasks.workunit.client.0.vm05.stdout:2/381: sync 2026-03-10T08:55:34.464 INFO:tasks.workunit.client.0.vm05.stdout:2/382: stat d0/f4 0 2026-03-10T08:55:34.468 INFO:tasks.workunit.client.0.vm05.stdout:0/410: dwrite df/d18/f2a [0,4194304] 0 2026-03-10T08:55:34.482 INFO:tasks.workunit.client.0.vm05.stdout:0/411: chown df/d18/d19/d39/d4d/d50/f66 261 1 2026-03-10T08:55:34.482 
INFO:tasks.workunit.client.0.vm05.stdout:0/412: symlink df/d18/d2b/d51/l70 0 2026-03-10T08:55:34.482 INFO:tasks.workunit.client.0.vm05.stdout:0/413: dwrite df/d18/d19/d47/f68 [0,4194304] 0 2026-03-10T08:55:34.482 INFO:tasks.workunit.client.0.vm05.stdout:0/414: creat df/d18/d2b/d51/f71 x:0 0 0 2026-03-10T08:55:34.507 INFO:tasks.workunit.client.0.vm05.stdout:6/492: dread d4/d7/d10/d1a/f1e [0,4194304] 0 2026-03-10T08:55:34.510 INFO:tasks.workunit.client.0.vm05.stdout:6/493: truncate d4/d2c/d84/f3a 1154839 0 2026-03-10T08:55:34.510 INFO:tasks.workunit.client.0.vm05.stdout:6/494: mkdir d4/d2d/d51/d87/da5 0 2026-03-10T08:55:34.577 INFO:tasks.workunit.client.0.vm05.stdout:1/520: write dd/d10/d19/f35 [8770027,69449] 0 2026-03-10T08:55:34.578 INFO:tasks.workunit.client.1.vm08.stdout:1/676: fsync d1/fed 0 2026-03-10T08:55:34.581 INFO:tasks.workunit.client.0.vm05.stdout:1/521: dwrite dd/d21/d3f/f83 [4194304,4194304] 0 2026-03-10T08:55:34.582 INFO:tasks.workunit.client.0.vm05.stdout:1/522: write dd/d10/d18/d20/d69/fb1 [920203,121602] 0 2026-03-10T08:55:34.582 INFO:tasks.workunit.client.0.vm05.stdout:1/523: write dd/f1c [596961,75442] 0 2026-03-10T08:55:34.587 INFO:tasks.workunit.client.0.vm05.stdout:1/524: write dd/d10/f22 [1344549,77359] 0 2026-03-10T08:55:34.587 INFO:tasks.workunit.client.0.vm05.stdout:1/525: readlink dd/d10/l1b 0 2026-03-10T08:55:34.588 INFO:tasks.workunit.client.1.vm08.stdout:1/677: dread - d1/da/d18/d3a/d77/fdf zero size 2026-03-10T08:55:34.589 INFO:tasks.workunit.client.0.vm05.stdout:1/526: symlink dd/d21/d37/d45/lbd 0 2026-03-10T08:55:34.595 INFO:tasks.workunit.client.1.vm08.stdout:1/678: chown d1/da/de/d5c/fb5 6505 1 2026-03-10T08:55:34.606 INFO:tasks.workunit.client.1.vm08.stdout:5/590: dwrite d0/f92 [0,4194304] 0 2026-03-10T08:55:34.610 INFO:tasks.workunit.client.1.vm08.stdout:0/591: write d6/dd/d13/d17/d1f/f67 [979329,9148] 0 2026-03-10T08:55:34.610 INFO:tasks.workunit.client.1.vm08.stdout:5/591: fsync d0/d11/d18/faf 0 2026-03-10T08:55:34.617 
INFO:tasks.workunit.client.1.vm08.stdout:5/592: write d0/d11/d27/d68/d7c/d4b/d4e/d84/fa9 [2045618,126397] 0 2026-03-10T08:55:34.619 INFO:tasks.workunit.client.1.vm08.stdout:5/593: chown d0/d11/d18/d52/f91 945037790 1 2026-03-10T08:55:34.620 INFO:tasks.workunit.client.0.vm05.stdout:9/344: getdents d6/d19 0 2026-03-10T08:55:34.621 INFO:tasks.workunit.client.1.vm08.stdout:5/594: truncate d0/d11/d3e/f4d 4416120 0 2026-03-10T08:55:34.623 INFO:tasks.workunit.client.1.vm08.stdout:4/688: write d5/d23/d49/d8f/fbc [387392,61895] 0 2026-03-10T08:55:34.623 INFO:tasks.workunit.client.1.vm08.stdout:2/705: write d1/da/d10/d42/d93/d22/f8a [1711052,120073] 0 2026-03-10T08:55:34.629 INFO:tasks.workunit.client.0.vm05.stdout:9/345: mknod d6/d12/d3a/c71 0 2026-03-10T08:55:34.632 INFO:tasks.workunit.client.0.vm05.stdout:9/346: mknod d6/d12/c72 0 2026-03-10T08:55:34.633 INFO:tasks.workunit.client.0.vm05.stdout:3/482: rmdir d9/d4d 39 2026-03-10T08:55:34.634 INFO:tasks.workunit.client.1.vm08.stdout:5/595: creat d0/d11/d27/fb3 x:0 0 0 2026-03-10T08:55:34.634 INFO:tasks.workunit.client.0.vm05.stdout:3/483: dread - d9/d2b/d3a/d43/d6e/f7d zero size 2026-03-10T08:55:34.638 INFO:tasks.workunit.client.0.vm05.stdout:9/347: creat d6/d19/f73 x:0 0 0 2026-03-10T08:55:34.639 INFO:tasks.workunit.client.0.vm05.stdout:9/348: fsync d6/d15/d3c/d4b/f67 0 2026-03-10T08:55:34.641 INFO:tasks.workunit.client.0.vm05.stdout:3/484: readlink d9/d2b/d2f/d57/l82 0 2026-03-10T08:55:34.641 INFO:tasks.workunit.client.1.vm08.stdout:9/612: write d2/dd/d15/f17 [3859764,88757] 0 2026-03-10T08:55:34.642 INFO:tasks.workunit.client.1.vm08.stdout:2/706: mknod d1/ce6 0 2026-03-10T08:55:34.644 INFO:tasks.workunit.client.1.vm08.stdout:2/707: write d1/d5b/f80 [147508,78060] 0 2026-03-10T08:55:34.650 INFO:tasks.workunit.client.0.vm05.stdout:9/349: dwrite d6/d19/f29 [0,4194304] 0 2026-03-10T08:55:34.650 INFO:tasks.workunit.client.0.vm05.stdout:9/350: read - d6/d15/d3c/f6b zero size 2026-03-10T08:55:34.650 
INFO:tasks.workunit.client.0.vm05.stdout:9/351: fdatasync d6/f4e 0 2026-03-10T08:55:34.650 INFO:tasks.workunit.client.0.vm05.stdout:5/360: truncate d5/df/d12/d24/d2c/d41/f4d 3151796 0 2026-03-10T08:55:34.655 INFO:tasks.workunit.client.0.vm05.stdout:5/361: dread d5/df/d12/d21/f1f [0,4194304] 0 2026-03-10T08:55:34.657 INFO:tasks.workunit.client.0.vm05.stdout:9/352: dwrite d6/d15/d3c/d4b/f5b [0,4194304] 0 2026-03-10T08:55:34.664 INFO:tasks.workunit.client.1.vm08.stdout:0/592: link d6/dd/d13/d17/d1f/d2d/d39/f3b d6/dd/d13/d61/fbd 0 2026-03-10T08:55:34.664 INFO:tasks.workunit.client.1.vm08.stdout:0/593: chown d6/dd/d13/d17 356991 1 2026-03-10T08:55:34.665 INFO:tasks.workunit.client.1.vm08.stdout:9/613: fdatasync d2/dd/d15/d1e/d25/dae/f8f 0 2026-03-10T08:55:34.666 INFO:tasks.workunit.client.0.vm05.stdout:5/362: mknod d5/df/d37/c80 0 2026-03-10T08:55:34.667 INFO:tasks.workunit.client.0.vm05.stdout:9/353: write d6/d12/d43/f47 [1302659,30126] 0 2026-03-10T08:55:34.673 INFO:tasks.workunit.client.0.vm05.stdout:9/354: dwrite d6/d19/d2a/d4a/f56 [0,4194304] 0 2026-03-10T08:55:34.674 INFO:tasks.workunit.client.0.vm05.stdout:3/485: creat d9/d2b/d3a/d43/d4f/f8a x:0 0 0 2026-03-10T08:55:34.677 INFO:tasks.workunit.client.0.vm05.stdout:3/486: dread d9/f28 [0,4194304] 0 2026-03-10T08:55:34.679 INFO:tasks.workunit.client.0.vm05.stdout:9/355: creat d6/d12/f74 x:0 0 0 2026-03-10T08:55:34.699 INFO:tasks.workunit.client.1.vm08.stdout:4/689: link d5/d23/l46 d5/d23/d36/d99/db2/d5d/de3/lfb 0 2026-03-10T08:55:34.699 INFO:tasks.workunit.client.1.vm08.stdout:4/690: chown d5/l89 73 1 2026-03-10T08:55:34.699 INFO:tasks.workunit.client.1.vm08.stdout:4/691: dwrite d5/d23/d36/d99/db2/ff7 [0,4194304] 0 2026-03-10T08:55:34.699 INFO:tasks.workunit.client.0.vm05.stdout:5/363: link d5/d48/l7a d5/d48/l81 0 2026-03-10T08:55:34.699 INFO:tasks.workunit.client.0.vm05.stdout:3/487: symlink d9/d2b/d2f/l8b 0 2026-03-10T08:55:34.700 INFO:tasks.workunit.client.0.vm05.stdout:9/356: dread - d6/d19/d2c/d58/f6c zero size 
2026-03-10T08:55:34.700 INFO:tasks.workunit.client.0.vm05.stdout:3/488: dwrite d9/d2b/d3a/f44 [4194304,4194304] 0 2026-03-10T08:55:34.700 INFO:tasks.workunit.client.0.vm05.stdout:3/489: dwrite d9/d2b/d2f/f3f [0,4194304] 0 2026-03-10T08:55:34.701 INFO:tasks.workunit.client.1.vm08.stdout:4/692: rename d5/d23/d36/d76/ce8 to d5/de/d96/cfc 0 2026-03-10T08:55:34.703 INFO:tasks.workunit.client.1.vm08.stdout:4/693: creat d5/d23/d36/d99/ffd x:0 0 0 2026-03-10T08:55:34.736 INFO:tasks.workunit.client.1.vm08.stdout:0/594: dread d6/dd/d13/d17/d1f/d20/d2f/d24/f6e [0,4194304] 0 2026-03-10T08:55:34.739 INFO:tasks.workunit.client.1.vm08.stdout:0/595: creat d6/dd/d13/d17/d1f/d2d/d39/fbe x:0 0 0 2026-03-10T08:55:34.743 INFO:tasks.workunit.client.1.vm08.stdout:0/596: dread d6/dd/f3f [0,4194304] 0 2026-03-10T08:55:34.746 INFO:tasks.workunit.client.1.vm08.stdout:0/597: creat d6/dd/d13/d17/fbf x:0 0 0 2026-03-10T08:55:34.747 INFO:tasks.workunit.client.0.vm05.stdout:8/426: dread d2/db/d1f/f53 [0,4194304] 0 2026-03-10T08:55:34.749 INFO:tasks.workunit.client.1.vm08.stdout:0/598: dwrite d6/dd/d13/d17/d1f/d2d/d39/fad [0,4194304] 0 2026-03-10T08:55:34.755 INFO:tasks.workunit.client.1.vm08.stdout:0/599: chown d6/dd/d13/d17/d50/fac 150 1 2026-03-10T08:55:34.758 INFO:tasks.workunit.client.1.vm08.stdout:0/600: rename d6/f11 to d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 0 2026-03-10T08:55:34.759 INFO:tasks.workunit.client.1.vm08.stdout:0/601: mknod d6/dd/d13/d17/cc1 0 2026-03-10T08:55:34.761 INFO:tasks.workunit.client.1.vm08.stdout:0/602: mkdir d6/dd/d13/d17/d1f/d20/d2f/d24/dc2 0 2026-03-10T08:55:34.774 INFO:tasks.workunit.client.0.vm05.stdout:9/357: sync 2026-03-10T08:55:34.777 INFO:tasks.workunit.client.0.vm05.stdout:9/358: link d6/d15/d3c/c5d d6/d19/d21/c75 0 2026-03-10T08:55:34.931 INFO:tasks.workunit.client.0.vm05.stdout:4/472: rename d0/d1d/d30 to d0/d2e/d42/d45/d4a/d36/d37/d9c 0 2026-03-10T08:55:34.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:34 vm05.local ceph-mon[49713]: pgmap v156: 
65 pgs: 65 active+clean; 2.1 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail; 47 MiB/s rd, 132 MiB/s wr, 304 op/s 2026-03-10T08:55:34.968 INFO:tasks.workunit.client.1.vm08.stdout:2/708: sync 2026-03-10T08:55:34.969 INFO:tasks.workunit.client.1.vm08.stdout:9/614: sync 2026-03-10T08:55:34.969 INFO:tasks.workunit.client.1.vm08.stdout:4/694: sync 2026-03-10T08:55:34.969 INFO:tasks.workunit.client.1.vm08.stdout:5/596: sync 2026-03-10T08:55:34.969 INFO:tasks.workunit.client.1.vm08.stdout:5/597: chown d0/fe 3657 1 2026-03-10T08:55:34.969 INFO:tasks.workunit.client.1.vm08.stdout:2/709: dread - d1/da/d10/d42/d93/d1e/dce/fd6 zero size 2026-03-10T08:55:34.970 INFO:tasks.workunit.client.0.vm05.stdout:7/352: rename d18/d1b/d1f to d18/d66 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.0.vm05.stdout:6/495: rename d4/d7/d10 to d4/d7/d10/d8f/da6 22 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.0.vm05.stdout:7/353: creat d18/d38/d43/d5c/f67 x:0 0 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.0.vm05.stdout:8/427: rename d2/dd/d2c/d2e/d31/d4c/f7f to d2/db/d1f/d67/f98 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.1.vm08.stdout:9/615: creat d2/dd/d15/d4f/fd3 x:0 0 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.1.vm08.stdout:2/710: mkdir d1/d43/d5c/de7 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.1.vm08.stdout:6/665: write d9/d10/d1e/f58 [2808291,67729] 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.1.vm08.stdout:5/598: dwrite d0/d11/d27/d68/d7c/d4b/d4e/f89 [0,4194304] 0 2026-03-10T08:55:34.978 INFO:tasks.workunit.client.1.vm08.stdout:6/666: dread - d9/dc/d84/d80/fd5 zero size 2026-03-10T08:55:34.980 INFO:tasks.workunit.client.1.vm08.stdout:4/695: creat d5/d23/d36/d99/db2/d5d/ffe x:0 0 0 2026-03-10T08:55:34.980 INFO:tasks.workunit.client.0.vm05.stdout:6/496: mknod d4/d2c/d84/d4a/ca7 0 2026-03-10T08:55:34.981 INFO:tasks.workunit.client.0.vm05.stdout:6/497: chown d4/d7/f14 1 1 2026-03-10T08:55:34.982 
INFO:tasks.workunit.client.1.vm08.stdout:9/616: symlink d2/dd/d15/d1e/d25/d32/d5c/dc2/ld4 0 2026-03-10T08:55:34.984 INFO:tasks.workunit.client.0.vm05.stdout:9/359: rename d6/d19/f73 to d6/d15/d3c/d4b/f76 0 2026-03-10T08:55:34.984 INFO:tasks.workunit.client.1.vm08.stdout:8/686: dwrite d1/f26 [0,4194304] 0 2026-03-10T08:55:34.985 INFO:tasks.workunit.client.0.vm05.stdout:6/498: dread d4/d2c/d84/f3c [0,4194304] 0 2026-03-10T08:55:34.985 INFO:tasks.workunit.client.0.vm05.stdout:9/360: write d6/f4e [246589,122420] 0 2026-03-10T08:55:34.989 INFO:tasks.workunit.client.0.vm05.stdout:7/354: sync 2026-03-10T08:55:34.989 INFO:tasks.workunit.client.0.vm05.stdout:8/428: sync 2026-03-10T08:55:34.997 INFO:tasks.workunit.client.1.vm08.stdout:4/696: fdatasync d5/f9d 0 2026-03-10T08:55:34.998 INFO:tasks.workunit.client.0.vm05.stdout:6/499: fsync d4/fc 0 2026-03-10T08:55:34.998 INFO:tasks.workunit.client.0.vm05.stdout:6/500: readlink d4/d2d/l7e 0 2026-03-10T08:55:35.005 INFO:tasks.workunit.client.1.vm08.stdout:7/695: dwrite d0/d14/f12 [0,4194304] 0 2026-03-10T08:55:35.015 INFO:tasks.workunit.client.0.vm05.stdout:8/429: mkdir d2/db/d28/d99 0 2026-03-10T08:55:35.015 INFO:tasks.workunit.client.0.vm05.stdout:2/383: write d0/fb [436467,110589] 0 2026-03-10T08:55:35.018 INFO:tasks.workunit.client.0.vm05.stdout:9/361: dread d6/fb [0,4194304] 0 2026-03-10T08:55:35.018 INFO:tasks.workunit.client.0.vm05.stdout:9/362: chown f4 112 1 2026-03-10T08:55:35.019 INFO:tasks.workunit.client.0.vm05.stdout:7/355: symlink d18/d66/d25/d2e/d32/l68 0 2026-03-10T08:55:35.019 INFO:tasks.workunit.client.0.vm05.stdout:7/356: chown l13 3340 1 2026-03-10T08:55:35.021 INFO:tasks.workunit.client.1.vm08.stdout:2/711: creat d1/da/d10/d42/d93/d23/d9e/ddc/fe8 x:0 0 0 2026-03-10T08:55:35.022 INFO:tasks.workunit.client.0.vm05.stdout:0/415: write df/d18/d2b/d27/f2e [7029044,56711] 0 2026-03-10T08:55:35.025 INFO:tasks.workunit.client.1.vm08.stdout:9/617: symlink d2/dd/d15/d1e/d21/ld5 0 2026-03-10T08:55:35.028 
INFO:tasks.workunit.client.1.vm08.stdout:3/603: dwrite d4/d15/d8/f1e [0,4194304] 0 2026-03-10T08:55:35.031 INFO:tasks.workunit.client.1.vm08.stdout:8/687: creat d1/da8/f102 x:0 0 0 2026-03-10T08:55:35.032 INFO:tasks.workunit.client.0.vm05.stdout:1/527: dwrite dd/f5e [4194304,4194304] 0 2026-03-10T08:55:35.032 INFO:tasks.workunit.client.1.vm08.stdout:1/679: write d1/da/de/d24/d3d/d40/d56/f73 [772582,16361] 0 2026-03-10T08:55:35.038 INFO:tasks.workunit.client.1.vm08.stdout:5/599: getdents d0/d11/d27/d68/d7c/d8e 0 2026-03-10T08:55:35.041 INFO:tasks.workunit.client.0.vm05.stdout:2/384: chown d0/d9/d1e/d20/c2e 1 1 2026-03-10T08:55:35.048 INFO:tasks.workunit.client.1.vm08.stdout:1/680: dread d1/da/d20/d3f/d49/d63/fc9 [0,4194304] 0 2026-03-10T08:55:35.051 INFO:tasks.workunit.client.0.vm05.stdout:0/416: sync 2026-03-10T08:55:35.052 INFO:tasks.workunit.client.0.vm05.stdout:0/417: dread - df/d18/d2b/d27/f60 zero size 2026-03-10T08:55:35.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:34 vm08.local ceph-mon[57559]: pgmap v156: 65 pgs: 65 active+clean; 2.1 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail; 47 MiB/s rd, 132 MiB/s wr, 304 op/s 2026-03-10T08:55:35.059 INFO:tasks.workunit.client.1.vm08.stdout:4/697: dread d5/d23/d36/d99/db2/d5d/f61 [0,4194304] 0 2026-03-10T08:55:35.065 INFO:tasks.workunit.client.1.vm08.stdout:1/681: dread d1/fd7 [0,4194304] 0 2026-03-10T08:55:35.065 INFO:tasks.workunit.client.1.vm08.stdout:1/682: chown d1/da/d18/d3b/faf 100 1 2026-03-10T08:55:35.065 INFO:tasks.workunit.client.0.vm05.stdout:4/473: rename d0/d55 to d0/d2e/d9d 0 2026-03-10T08:55:35.066 INFO:tasks.workunit.client.0.vm05.stdout:4/474: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/f85 [0,4194304] 0 2026-03-10T08:55:35.066 INFO:tasks.workunit.client.0.vm05.stdout:4/475: write d0/d2e/d42/d45/d4a/d36/d37/d9c/f93 [113078,678] 0 2026-03-10T08:55:35.075 INFO:tasks.workunit.client.0.vm05.stdout:5/364: dwrite d5/df/f2f [0,4194304] 0 2026-03-10T08:55:35.081 
INFO:tasks.workunit.client.0.vm05.stdout:3/490: truncate d9/d2b/f3b 1986864 0 2026-03-10T08:55:35.085 INFO:tasks.workunit.client.0.vm05.stdout:1/528: write dd/d21/d37/d7c/faa [716367,83496] 0 2026-03-10T08:55:35.092 INFO:tasks.workunit.client.1.vm08.stdout:2/712: symlink d1/da/d10/d1b/dcf/le9 0 2026-03-10T08:55:35.093 INFO:tasks.workunit.client.0.vm05.stdout:2/385: creat d0/d9/d1e/d20/d21/d45/d4b/f6b x:0 0 0 2026-03-10T08:55:35.098 INFO:tasks.workunit.client.0.vm05.stdout:0/418: dread - df/f4a zero size 2026-03-10T08:55:35.101 INFO:tasks.workunit.client.0.vm05.stdout:7/357: dread d18/d66/d25/d2e/d32/f3d [0,4194304] 0 2026-03-10T08:55:35.102 INFO:tasks.workunit.client.1.vm08.stdout:1/683: unlink d1/da/de/d24/d3d/d40/d8e/dd2/d7f/feb 0 2026-03-10T08:55:35.104 INFO:tasks.workunit.client.1.vm08.stdout:3/604: fsync d4/d15/d8/d2c/d9b/d79/f59 0 2026-03-10T08:55:35.104 INFO:tasks.workunit.client.0.vm05.stdout:4/476: creat d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/f9e x:0 0 0 2026-03-10T08:55:35.104 INFO:tasks.workunit.client.1.vm08.stdout:3/605: chown d4/d15/d8/d1d/l46 696474414 1 2026-03-10T08:55:35.105 INFO:tasks.workunit.client.0.vm05.stdout:6/501: creat d4/fa8 x:0 0 0 2026-03-10T08:55:35.108 INFO:tasks.workunit.client.0.vm05.stdout:5/365: unlink d5/df/d12/f13 0 2026-03-10T08:55:35.111 INFO:tasks.workunit.client.1.vm08.stdout:1/684: dread d1/da/d18/d3b/d62/f76 [0,4194304] 0 2026-03-10T08:55:35.112 INFO:tasks.workunit.client.0.vm05.stdout:3/491: dwrite d9/d4d/f88 [0,4194304] 0 2026-03-10T08:55:35.116 INFO:tasks.workunit.client.0.vm05.stdout:8/430: creat d2/db/f9a x:0 0 0 2026-03-10T08:55:35.130 INFO:tasks.workunit.client.0.vm05.stdout:7/358: rmdir d18/d38/d43 39 2026-03-10T08:55:35.130 INFO:tasks.workunit.client.0.vm05.stdout:4/477: unlink d0/d2e/d42/d45/f62 0 2026-03-10T08:55:35.130 INFO:tasks.workunit.client.0.vm05.stdout:6/502: dwrite d4/d2c/d84/f3c [0,4194304] 0 2026-03-10T08:55:35.130 INFO:tasks.workunit.client.0.vm05.stdout:1/529: dread dd/d21/f3e [0,4194304] 0 
2026-03-10T08:55:35.133 INFO:tasks.workunit.client.0.vm05.stdout:3/492: rmdir d9/d2b/d53 39
2026-03-10T08:55:35.136 INFO:tasks.workunit.client.1.vm08.stdout:7/696: dread d0/d11/d1f/d29/d3d/d89/f96 [0,4194304] 0
2026-03-10T08:55:35.137 INFO:tasks.workunit.client.0.vm05.stdout:9/363: link d6/d15/l23 d6/d19/l77 0
2026-03-10T08:55:35.139 INFO:tasks.workunit.client.1.vm08.stdout:6/667: write d9/d13/f88 [1754228,16965] 0
2026-03-10T08:55:35.142 INFO:tasks.workunit.client.0.vm05.stdout:7/359: fsync d18/d66/d25/d2e/f48 0
2026-03-10T08:55:35.142 INFO:tasks.workunit.client.0.vm05.stdout:4/478: mknod d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/c9f 0
2026-03-10T08:55:35.143 INFO:tasks.workunit.client.0.vm05.stdout:6/503: mkdir d4/d2d/d51/d62/da9 0
2026-03-10T08:55:35.145 INFO:tasks.workunit.client.0.vm05.stdout:1/530: creat dd/d10/d19/d4d/d88/fbe x:0 0 0
2026-03-10T08:55:35.146 INFO:tasks.workunit.client.0.vm05.stdout:5/366: mknod d5/c82 0
2026-03-10T08:55:35.146 INFO:tasks.workunit.client.0.vm05.stdout:5/367: stat d5/df/d12/c22 0
2026-03-10T08:55:35.147 INFO:tasks.workunit.client.0.vm05.stdout:5/368: dread - d5/df/d12/d24/f6f zero size
2026-03-10T08:55:35.150 INFO:tasks.workunit.client.0.vm05.stdout:8/431: creat d2/dd/d2c/d2e/d93/f9b x:0 0 0
2026-03-10T08:55:35.151 INFO:tasks.workunit.client.0.vm05.stdout:8/432: chown d2/dd/d2c/d2e/f3b 22592 1
2026-03-10T08:55:35.151 INFO:tasks.workunit.client.0.vm05.stdout:8/433: chown d2/d45/f61 1410331716 1
2026-03-10T08:55:35.155 INFO:tasks.workunit.client.0.vm05.stdout:9/364: creat d6/d19/d2c/f78 x:0 0 0
2026-03-10T08:55:35.155 INFO:tasks.workunit.client.0.vm05.stdout:9/365: write d6/d15/d3c/f6b [778456,74170] 0
2026-03-10T08:55:35.158 INFO:tasks.workunit.client.0.vm05.stdout:7/360: stat d18/d66/f2d 0
2026-03-10T08:55:35.158 INFO:tasks.workunit.client.0.vm05.stdout:7/361: read - d18/d66/d25/f56 zero size
2026-03-10T08:55:35.159 INFO:tasks.workunit.client.0.vm05.stdout:7/362: write d18/d66/d25/d2e/d42/f5a [230472,124256] 0
2026-03-10T08:55:35.160 INFO:tasks.workunit.client.0.vm05.stdout:7/363: readlink l1 0
2026-03-10T08:55:35.164 INFO:tasks.workunit.client.0.vm05.stdout:6/504: rename d4/d2c/d84/d4a/ca7 to d4/d7/d10/d1a/d8c/caa 0
2026-03-10T08:55:35.167 INFO:tasks.workunit.client.0.vm05.stdout:3/493: dread - d9/d2b/d53/d61/f87 zero size
2026-03-10T08:55:35.168 INFO:tasks.workunit.client.0.vm05.stdout:8/434: dread - d2/dd/d2c/d2e/d31/d3e/f95 zero size
2026-03-10T08:55:35.168 INFO:tasks.workunit.client.0.vm05.stdout:1/531: sync
2026-03-10T08:55:35.171 INFO:tasks.workunit.client.1.vm08.stdout:0/603: fdatasync d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 0
2026-03-10T08:55:35.172 INFO:tasks.workunit.client.0.vm05.stdout:2/386: getdents d0/d9/d1e 0
2026-03-10T08:55:35.174 INFO:tasks.workunit.client.0.vm05.stdout:9/366: symlink d6/d19/d2a/d4a/l79 0
2026-03-10T08:55:35.175 INFO:tasks.workunit.client.1.vm08.stdout:8/688: write d1/d10/d9/dd/f8f [2155655,13687] 0
2026-03-10T08:55:35.179 INFO:tasks.workunit.client.1.vm08.stdout:3/606: dread - d4/d15/d8/fa0 zero size
2026-03-10T08:55:35.183 INFO:tasks.workunit.client.0.vm05.stdout:5/369: rename d5/df/d37/d68/f6b to d5/d48/d64/f83 0
2026-03-10T08:55:35.183 INFO:tasks.workunit.client.0.vm05.stdout:8/435: creat d2/dd/d2c/d2e/d31/d4f/f9c x:0 0 0
2026-03-10T08:55:35.183 INFO:tasks.workunit.client.0.vm05.stdout:1/532: dread dd/d10/d18/d2d/d51/d58/fa0 [0,4194304] 0
2026-03-10T08:55:35.183 INFO:tasks.workunit.client.0.vm05.stdout:1/533: chown dd/f1c 210245 1
2026-03-10T08:55:35.185 INFO:tasks.workunit.client.1.vm08.stdout:2/713: creat d1/da/d10/d42/d93/d1e/d7b/fea x:0 0 0
2026-03-10T08:55:35.186 INFO:tasks.workunit.client.0.vm05.stdout:2/387: mkdir d0/d9/d1e/d20/d21/d45/d6c 0
2026-03-10T08:55:35.193 INFO:tasks.workunit.client.1.vm08.stdout:6/668: mkdir d9/dc/de0 0
2026-03-10T08:55:35.194 INFO:tasks.workunit.client.1.vm08.stdout:7/697: rename d0/d14/d43/d9d/ccd to d0/d11/d1f/d29/d3d/dd1/ce0 0
2026-03-10T08:55:35.195 INFO:tasks.workunit.client.0.vm05.stdout:5/370: mkdir d5/df/d12/d24/d84 0
2026-03-10T08:55:35.195 INFO:tasks.workunit.client.1.vm08.stdout:6/669: fdatasync d9/d13/f88 0
2026-03-10T08:55:35.195 INFO:tasks.workunit.client.1.vm08.stdout:0/604: dwrite d6/dd/d13/d17/d1f/d20/d2f/d26/f73 [0,4194304] 0
2026-03-10T08:55:35.195 INFO:tasks.workunit.client.1.vm08.stdout:6/670: stat d9/dc/d84/f5e 0
2026-03-10T08:55:35.200 INFO:tasks.workunit.client.1.vm08.stdout:2/714: creat d1/da/d10/d42/d93/d23/feb x:0 0 0
2026-03-10T08:55:35.202 INFO:tasks.workunit.client.1.vm08.stdout:6/671: truncate d9/dc/d11/d23/d2c/fca 338343 0
2026-03-10T08:55:35.207 INFO:tasks.workunit.client.0.vm05.stdout:5/371: mkdir d5/df/d37/d68/d85 0
2026-03-10T08:55:35.207 INFO:tasks.workunit.client.0.vm05.stdout:5/372: chown d5/l56 18908606 1
2026-03-10T08:55:35.209 INFO:tasks.workunit.client.0.vm05.stdout:8/436: mkdir d2/dd/d2c/d2e/d31/d3e/d5d/d9d 0
2026-03-10T08:55:35.210 INFO:tasks.workunit.client.0.vm05.stdout:8/437: write d2/dd/d2c/d2e/f6a [765727,86691] 0
2026-03-10T08:55:35.212 INFO:tasks.workunit.client.1.vm08.stdout:5/600: write d0/d11/d3e/f73 [833384,92983] 0
2026-03-10T08:55:35.213 INFO:tasks.workunit.client.1.vm08.stdout:9/618: write d2/dd/d15/f1b [744902,30627] 0
2026-03-10T08:55:35.213 INFO:tasks.workunit.client.1.vm08.stdout:7/698: dwrite d0/d11/d1f/d29/d3b/fc7 [0,4194304] 0
2026-03-10T08:55:35.214 INFO:tasks.workunit.client.1.vm08.stdout:4/698: write d5/d5f/fcc [761883,41455] 0
2026-03-10T08:55:35.229 INFO:tasks.workunit.client.0.vm05.stdout:1/534: link dd/d21/d37/d7c/faa dd/d55/fbf 0
2026-03-10T08:55:35.230 INFO:tasks.workunit.client.0.vm05.stdout:2/388: mkdir d0/d9/d27/d6d 0
2026-03-10T08:55:35.230 INFO:tasks.workunit.client.1.vm08.stdout:1/685: link d1/da/d20/d91/c87 d1/da/d20/d91/cef 0
2026-03-10T08:55:35.230 INFO:tasks.workunit.client.1.vm08.stdout:0/605: creat d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/fc3 x:0 0 0
2026-03-10T08:55:35.230 INFO:tasks.workunit.client.0.vm05.stdout:6/505: getdents d4/d2d/d51/d87 0
2026-03-10T08:55:35.232 INFO:tasks.workunit.client.0.vm05.stdout:5/373: stat d5/d48/l7a 0
2026-03-10T08:55:35.233 INFO:tasks.workunit.client.0.vm05.stdout:8/438: mkdir d2/dd/d2c/d2e/d31/d4f/d7b/d9e 0
2026-03-10T08:55:35.234 INFO:tasks.workunit.client.1.vm08.stdout:1/686: write d1/da/de/d24/d3d/d40/d56/f73 [366715,86265] 0
2026-03-10T08:55:35.238 INFO:tasks.workunit.client.0.vm05.stdout:2/389: unlink d0/d9/f3b 0
2026-03-10T08:55:35.238 INFO:tasks.workunit.client.1.vm08.stdout:5/601: dread - d0/d11/d27/d68/d7c/d4b/fa0 zero size
2026-03-10T08:55:35.241 INFO:tasks.workunit.client.0.vm05.stdout:6/506: creat d4/d7/fab x:0 0 0
2026-03-10T08:55:35.244 INFO:tasks.workunit.client.0.vm05.stdout:6/507: dwrite d4/d2d/f2f [0,4194304] 0
2026-03-10T08:55:35.255 INFO:tasks.workunit.client.0.vm05.stdout:8/439: creat d2/dd/d2c/d2e/d31/d4f/d80/f9f x:0 0 0
2026-03-10T08:55:35.255 INFO:tasks.workunit.client.1.vm08.stdout:1/687: dread d1/da/d18/d3a/da7/fba [0,4194304] 0
2026-03-10T08:55:35.256 INFO:tasks.workunit.client.0.vm05.stdout:8/440: write d2/dd/d2c/f4d [1729404,60713] 0
2026-03-10T08:55:35.265 INFO:tasks.workunit.client.1.vm08.stdout:5/602: symlink d0/d46/lb4 0
2026-03-10T08:55:35.267 INFO:tasks.workunit.client.1.vm08.stdout:4/699: creat d5/d23/d36/d99/db2/d5d/de3/df8/fff x:0 0 0
2026-03-10T08:55:35.268 INFO:tasks.workunit.client.1.vm08.stdout:0/606: symlink d6/dd/d13/d17/d1f/d2d/lc4 0
2026-03-10T08:55:35.270 INFO:tasks.workunit.client.1.vm08.stdout:4/700: dread - d5/de/d96/fbb zero size
2026-03-10T08:55:35.271 INFO:tasks.workunit.client.1.vm08.stdout:0/607: rename d6/dd/d13/d17/d1f/d2d/d38/c70 to d6/dd/d13/d32/cc5 0
2026-03-10T08:55:35.272 INFO:tasks.workunit.client.1.vm08.stdout:0/608: chown d6/f2c 64149568 1
2026-03-10T08:55:35.273 INFO:tasks.workunit.client.0.vm05.stdout:1/535: sync
2026-03-10T08:55:35.278 INFO:tasks.workunit.client.1.vm08.stdout:1/688: creat d1/da/de/d24/d3d/ff0 x:0 0 0
2026-03-10T08:55:35.278 INFO:tasks.workunit.client.1.vm08.stdout:1/689: truncate d1/da/d18/fea 1006240 0
2026-03-10T08:55:35.279 INFO:tasks.workunit.client.1.vm08.stdout:4/701: creat d5/d23/d36/d76/f100 x:0 0 0
2026-03-10T08:55:35.279 INFO:tasks.workunit.client.1.vm08.stdout:1/690: read - d1/fe6 zero size
2026-03-10T08:55:35.279 INFO:tasks.workunit.client.0.vm05.stdout:0/419: truncate df/d59/f57 52454 0
2026-03-10T08:55:35.279 INFO:tasks.workunit.client.0.vm05.stdout:0/420: chown df/d59 10839036 1
2026-03-10T08:55:35.282 INFO:tasks.workunit.client.0.vm05.stdout:0/421: creat df/d18/d19/d5b/f72 x:0 0 0
2026-03-10T08:55:35.282 INFO:tasks.workunit.client.0.vm05.stdout:6/508: sync
2026-03-10T08:55:35.282 INFO:tasks.workunit.client.0.vm05.stdout:8/441: sync
2026-03-10T08:55:35.282 INFO:tasks.workunit.client.0.vm05.stdout:0/422: readlink df/d18/d19/d39/d4d/d50/l54 0
2026-03-10T08:55:35.286 INFO:tasks.workunit.client.1.vm08.stdout:1/691: creat d1/da/d4b/d4e/ff1 x:0 0 0
2026-03-10T08:55:35.286 INFO:tasks.workunit.client.1.vm08.stdout:0/609: creat d6/dd/d13/d17/fc6 x:0 0 0
2026-03-10T08:55:35.287 INFO:tasks.workunit.client.0.vm05.stdout:8/442: mknod d2/db/d47/ca0 0
2026-03-10T08:55:35.288 INFO:tasks.workunit.client.0.vm05.stdout:6/509: mknod d4/d7/d10/d15/d1b/d22/cac 0
2026-03-10T08:55:35.288 INFO:tasks.workunit.client.0.vm05.stdout:6/510: chown d4/d2d 9668411 1
2026-03-10T08:55:35.289 INFO:tasks.workunit.client.1.vm08.stdout:1/692: read d1/da/d18/d3a/f57 [3135324,113162] 0
2026-03-10T08:55:35.289 INFO:tasks.workunit.client.0.vm05.stdout:0/423: mknod df/d18/d19/d39/d4d/d50/c73 0
2026-03-10T08:55:35.292 INFO:tasks.workunit.client.0.vm05.stdout:6/511: mknod d4/d2d/d7f/cad 0
2026-03-10T08:55:35.298 INFO:tasks.workunit.client.0.vm05.stdout:3/494: dread d9/ff [0,4194304] 0
2026-03-10T08:55:35.298 INFO:tasks.workunit.client.1.vm08.stdout:0/610: fdatasync d6/dd/d13/d17/d1f/d20/f6a 0
2026-03-10T08:55:35.298 INFO:tasks.workunit.client.1.vm08.stdout:1/693: rename d1/da/d20/d3f/d49/f96 to d1/da/de/d24/d3d/d40/d56/d7a/ff2 0
2026-03-10T08:55:35.298 INFO:tasks.workunit.client.1.vm08.stdout:4/702: getdents d5 0
2026-03-10T08:55:35.298 INFO:tasks.workunit.client.1.vm08.stdout:0/611: mkdir d6/dd/d13/d61/dc7 0
2026-03-10T08:55:35.302 INFO:tasks.workunit.client.1.vm08.stdout:1/694: truncate d1/fc 1139603 0
2026-03-10T08:55:35.305 INFO:tasks.workunit.client.0.vm05.stdout:3/495: readlink d9/d2b/l42 0
2026-03-10T08:55:35.306 INFO:tasks.workunit.client.1.vm08.stdout:4/703: unlink d5/d23/d49/d8f/fbc 0
2026-03-10T08:55:35.308 INFO:tasks.workunit.client.0.vm05.stdout:3/496: unlink d9/d2b/d3a/d43/d6e/f7d 0
2026-03-10T08:55:35.311 INFO:tasks.workunit.client.0.vm05.stdout:3/497: fdatasync d9/d2b/d3a/d43/d4f/d55/f6b 0
2026-03-10T08:55:35.312 INFO:tasks.workunit.client.0.vm05.stdout:4/479: dwrite d0/d2e/f4e [0,4194304] 0
2026-03-10T08:55:35.312 INFO:tasks.workunit.client.1.vm08.stdout:1/695: mknod d1/da/d20/d3f/cf3 0
2026-03-10T08:55:35.312 INFO:tasks.workunit.client.1.vm08.stdout:4/704: creat d5/d23/d49/f101 x:0 0 0
2026-03-10T08:55:35.314 INFO:tasks.workunit.client.0.vm05.stdout:3/498: write d9/d2b/f40 [3559284,90744] 0
2026-03-10T08:55:35.318 INFO:tasks.workunit.client.1.vm08.stdout:0/612: getdents d6/dd/d13/d17/d50 0
2026-03-10T08:55:35.319 INFO:tasks.workunit.client.1.vm08.stdout:0/613: chown d6/dd/d13/d17/f82 182169 1
2026-03-10T08:55:35.323 INFO:tasks.workunit.client.0.vm05.stdout:4/480: unlink d0/l5 0
2026-03-10T08:55:35.326 INFO:tasks.workunit.client.0.vm05.stdout:3/499: creat d9/d2b/d3a/d43/d4f/d55/f8c x:0 0 0
2026-03-10T08:55:35.326 INFO:tasks.workunit.client.0.vm05.stdout:3/500: write d9/d2b/f40 [3537098,78700] 0
2026-03-10T08:55:35.332 INFO:tasks.workunit.client.0.vm05.stdout:9/367: dwrite d6/f7 [4194304,4194304] 0
2026-03-10T08:55:35.335 INFO:tasks.workunit.client.0.vm05.stdout:3/501: symlink d9/d2b/d3a/d43/d71/l8d 0
2026-03-10T08:55:35.337 INFO:tasks.workunit.client.1.vm08.stdout:8/689: write d1/d10/d9/dd/d18/d34/f57 [2068622,79038] 0
2026-03-10T08:55:35.338 INFO:tasks.workunit.client.0.vm05.stdout:7/364: dwrite d18/f1d [0,4194304] 0
2026-03-10T08:55:35.346 INFO:tasks.workunit.client.1.vm08.stdout:0/614: mkdir d6/dd/d13/d61/dc7/dc8 0
2026-03-10T08:55:35.353 INFO:tasks.workunit.client.1.vm08.stdout:4/705: symlink d5/d23/d36/d99/db2/d5d/l102 0
2026-03-10T08:55:35.354 INFO:tasks.workunit.client.0.vm05.stdout:5/374: rename d5/df/d12 to d5/d86 0
2026-03-10T08:55:35.356 INFO:tasks.workunit.client.0.vm05.stdout:9/368: mknod d6/d12/d3a/d48/c7a 0
2026-03-10T08:55:35.357 INFO:tasks.workunit.client.0.vm05.stdout:9/369: write d6/d15/d35/f38 [4735512,124198] 0
2026-03-10T08:55:35.367 INFO:tasks.workunit.client.0.vm05.stdout:7/365: unlink d18/d66/d25/d2e/d42/d53/f61 0
2026-03-10T08:55:35.369 INFO:tasks.workunit.client.0.vm05.stdout:3/502: creat d9/d4d/d51/d64/d89/f8e x:0 0 0
2026-03-10T08:55:35.371 INFO:tasks.workunit.client.1.vm08.stdout:3/607: dwrite d4/d15/d8/d2c/d9b/d79/d20/f84 [4194304,4194304] 0
2026-03-10T08:55:35.374 INFO:tasks.workunit.client.0.vm05.stdout:1/536: getdents dd/d55 0
2026-03-10T08:55:35.375 INFO:tasks.workunit.client.0.vm05.stdout:2/390: rename d0/d9/d27 to d0/d9/d1e/d20/d21/d45/d6c/d6e 0
2026-03-10T08:55:35.377 INFO:tasks.workunit.client.1.vm08.stdout:0/615: dread d6/dd/d13/d17/d1f/d20/f46 [0,4194304] 0
2026-03-10T08:55:35.378 INFO:tasks.workunit.client.1.vm08.stdout:2/715: dwrite d1/d5b/d66/f62 [0,4194304] 0
2026-03-10T08:55:35.381 INFO:tasks.workunit.client.0.vm05.stdout:1/537: unlink dd/d21/l81 0
2026-03-10T08:55:35.385 INFO:tasks.workunit.client.0.vm05.stdout:1/538: dwrite dd/f5e [0,4194304] 0
2026-03-10T08:55:35.392 INFO:tasks.workunit.client.1.vm08.stdout:9/619: dwrite d2/dd/d15/d1e/d25/f5f [0,4194304] 0
2026-03-10T08:55:35.397 INFO:tasks.workunit.client.0.vm05.stdout:6/512: rename d4/d7/d10/d15/c46 to d4/d2c/cae 0
2026-03-10T08:55:35.397 INFO:tasks.workunit.client.0.vm05.stdout:6/513: stat d4/d7/d10/d15/d1b/d22/f5c 0
2026-03-10T08:55:35.399 INFO:tasks.workunit.client.1.vm08.stdout:6/672: dwrite d9/d10/d1e/d32/f64 [0,4194304] 0
2026-03-10T08:55:35.402 INFO:tasks.workunit.client.0.vm05.stdout:2/391: chown d0/d9/c62 264 1
2026-03-10T08:55:35.404 INFO:tasks.workunit.client.1.vm08.stdout:2/716: symlink d1/da/d10/d1b/dcf/lec 0
2026-03-10T08:55:35.409 INFO:tasks.workunit.client.0.vm05.stdout:5/375: link d5/d86/d24/d2c/d41/d74/f7d d5/d86/d24/d2c/d41/f87 0
2026-03-10T08:55:35.410 INFO:tasks.workunit.client.1.vm08.stdout:6/673: chown d9/dc/d11/d23/f40 1 1
2026-03-10T08:55:35.410 INFO:tasks.workunit.client.1.vm08.stdout:7/699: dwrite d0/d11/d4a/f53 [0,4194304] 0
2026-03-10T08:55:35.413 INFO:tasks.workunit.client.0.vm05.stdout:6/514: mknod d4/d7/d10/d1a/d89/caf 0
2026-03-10T08:55:35.414 INFO:tasks.workunit.client.0.vm05.stdout:6/515: write d4/d92/f96 [820201,107091] 0
2026-03-10T08:55:35.415 INFO:tasks.workunit.client.1.vm08.stdout:6/674: write d9/d10/d1e/d7e/fc2 [1079211,63739] 0
2026-03-10T08:55:35.415 INFO:tasks.workunit.client.0.vm05.stdout:7/366: creat d18/d1b/f69 x:0 0 0
2026-03-10T08:55:35.416 INFO:tasks.workunit.client.0.vm05.stdout:7/367: write d18/d1b/f50 [1532872,73011] 0
2026-03-10T08:55:35.417 INFO:tasks.workunit.client.0.vm05.stdout:7/368: write d18/d66/d25/d2e/d42/f5a [77830,42397] 0
2026-03-10T08:55:35.426 INFO:tasks.workunit.client.0.vm05.stdout:9/370: rename d6/d19/l77 to d6/d12/d43/l7b 0
2026-03-10T08:55:35.429 INFO:tasks.workunit.client.0.vm05.stdout:6/516: read d4/d2c/f55 [3387594,115279] 0
2026-03-10T08:55:35.432 INFO:tasks.workunit.client.1.vm08.stdout:6/675: fdatasync d9/d50/f78 0
2026-03-10T08:55:35.434 INFO:tasks.workunit.client.1.vm08.stdout:7/700: mkdir d0/d11/d1f/d29/d3b/d80/dd3/de1 0
2026-03-10T08:55:35.435 INFO:tasks.workunit.client.0.vm05.stdout:7/369: read - d18/d66/d25/d2e/d42/f52 zero size
2026-03-10T08:55:35.439 INFO:tasks.workunit.client.1.vm08.stdout:6/676: unlink d9/d50/f78 0
2026-03-10T08:55:35.444 INFO:tasks.workunit.client.0.vm05.stdout:1/539: rename f7 to dd/d21/d37/d7c/dab/db7/fc0 0
2026-03-10T08:55:35.448 INFO:tasks.workunit.client.0.vm05.stdout:6/517: mkdir d4/d92/db0 0
2026-03-10T08:55:35.450 INFO:tasks.workunit.client.1.vm08.stdout:2/717: getdents d1/db1 0
2026-03-10T08:55:35.451 INFO:tasks.workunit.client.1.vm08.stdout:2/718: write d1/da/d10/d42/d93/daa/fdb [348619,1524] 0
2026-03-10T08:55:35.452 INFO:tasks.workunit.client.0.vm05.stdout:7/370: read d18/f31 [861944,15907] 0
2026-03-10T08:55:35.452 INFO:tasks.workunit.client.0.vm05.stdout:7/371: dread - d18/d66/d25/f56 zero size
2026-03-10T08:55:35.456 INFO:tasks.workunit.client.0.vm05.stdout:7/372: dwrite d18/d1b/f69 [0,4194304] 0
2026-03-10T08:55:35.457 INFO:tasks.workunit.client.0.vm05.stdout:9/371: symlink d6/l7c 0
2026-03-10T08:55:35.457 INFO:tasks.workunit.client.0.vm05.stdout:9/372: write d6/f30 [3941778,74009] 0
2026-03-10T08:55:35.461 INFO:tasks.workunit.client.0.vm05.stdout:1/540: mknod dd/d10/d19/d4d/d88/cc1 0
2026-03-10T08:55:35.464 INFO:tasks.workunit.client.1.vm08.stdout:6/677: getdents d9/dc/de0 0
2026-03-10T08:55:35.464 INFO:tasks.workunit.client.0.vm05.stdout:6/518: mkdir d4/d7/d10/d1a/db1 0
2026-03-10T08:55:35.464 INFO:tasks.workunit.client.0.vm05.stdout:7/373: symlink d18/d66/d25/l6a 0
2026-03-10T08:55:35.467 INFO:tasks.workunit.client.0.vm05.stdout:9/373: creat d6/d19/d21/f7d x:0 0 0
2026-03-10T08:55:35.468 INFO:tasks.workunit.client.0.vm05.stdout:6/519: creat d4/d2c/d84/fb2 x:0 0 0
2026-03-10T08:55:35.468 INFO:tasks.workunit.client.0.vm05.stdout:6/520: stat d4/d2d/d51 0
2026-03-10T08:55:35.468 INFO:tasks.workunit.client.0.vm05.stdout:6/521: write d4/d2d/fa2 [5225836,71325] 0
2026-03-10T08:55:35.469 INFO:tasks.workunit.client.0.vm05.stdout:7/374: rename d18/d66/d25/f3a to d18/d66/d25/d2e/d42/d53/f6b 0
2026-03-10T08:55:35.471 INFO:tasks.workunit.client.1.vm08.stdout:6/678: rename d9/dc/d11/d23/d2c/d41/c22 to d9/dc/de0/ce1 0
2026-03-10T08:55:35.472 INFO:tasks.workunit.client.0.vm05.stdout:6/522: rename d4/d2c/d84/f49 to d4/d7/d10/d1a/db1/fb3 0
2026-03-10T08:55:35.473 INFO:tasks.workunit.client.0.vm05.stdout:6/523: write d4/fa8 [239436,102396] 0
2026-03-10T08:55:35.473 INFO:tasks.workunit.client.1.vm08.stdout:6/679: chown d9/d10/d1e 155524630 1
2026-03-10T08:55:35.474 INFO:tasks.workunit.client.0.vm05.stdout:6/524: dread - d4/d7/d10/d15/d1b/d22/f56 zero size
2026-03-10T08:55:35.475 INFO:tasks.workunit.client.1.vm08.stdout:6/680: symlink d9/dc/d11/d23/d2c/le2 0
2026-03-10T08:55:35.476 INFO:tasks.workunit.client.0.vm05.stdout:7/375: creat d18/d66/f6c x:0 0 0
2026-03-10T08:55:35.476 INFO:tasks.workunit.client.0.vm05.stdout:6/525: dread d4/d7/d10/d15/f17 [0,4194304] 0
2026-03-10T08:55:35.477 INFO:tasks.workunit.client.0.vm05.stdout:7/376: fsync d18/d38/f5d 0
2026-03-10T08:55:35.480 INFO:tasks.workunit.client.1.vm08.stdout:6/681: write d9/d10/d1e/f58 [2847917,90791] 0
2026-03-10T08:55:35.483 INFO:tasks.workunit.client.0.vm05.stdout:6/526: dwrite d4/d2c/d84/d4a/f63 [4194304,4194304] 0
2026-03-10T08:55:35.483 INFO:tasks.workunit.client.1.vm08.stdout:5/603: dwrite d0/fb [0,4194304] 0
2026-03-10T08:55:35.497 INFO:tasks.workunit.client.0.vm05.stdout:6/527: fdatasync d4/d7/d10/d1a/d1f/f4b 0
2026-03-10T08:55:35.498 INFO:tasks.workunit.client.1.vm08.stdout:6/682: symlink d9/dc/d11/d23/d2c/le3 0
2026-03-10T08:55:35.499 INFO:tasks.workunit.client.1.vm08.stdout:5/604: mkdir d0/d11/d27/d68/d7c/d4b/d87/db5 0
2026-03-10T08:55:35.501 INFO:tasks.workunit.client.1.vm08.stdout:5/605: dread - d0/d11/d27/d68/d7c/d4b/fa0 zero size
2026-03-10T08:55:35.507 INFO:tasks.workunit.client.1.vm08.stdout:5/606: mknod d0/d11/d3e/cb6 0
2026-03-10T08:55:35.517 INFO:tasks.workunit.client.0.vm05.stdout:7/377: dread d18/d66/f3f [0,4194304] 0
2026-03-10T08:55:35.517 INFO:tasks.workunit.client.1.vm08.stdout:5/607: fdatasync d0/d11/d27/d50/fa1 0
2026-03-10T08:55:35.517 INFO:tasks.workunit.client.0.vm05.stdout:7/378: dwrite d18/d1b/f30 [0,4194304] 0
2026-03-10T08:55:35.522 INFO:tasks.workunit.client.0.vm05.stdout:7/379: write d18/d38/d43/d5c/f5f [383624,54804] 0
2026-03-10T08:55:35.524 INFO:tasks.workunit.client.1.vm08.stdout:4/706: dread d5/d23/d36/d99/db2/d5a/d69/f97 [0,4194304] 0
2026-03-10T08:55:35.524 INFO:tasks.workunit.client.1.vm08.stdout:4/707: chown d5/d5f/fe1 320906181 1
2026-03-10T08:55:35.529 INFO:tasks.workunit.client.0.vm05.stdout:7/380: getdents d18/d66/d25/d2e/d32 0
2026-03-10T08:55:35.531 INFO:tasks.workunit.client.1.vm08.stdout:4/708: mkdir d5/d23/d36/d76/d103 0
2026-03-10T08:55:35.533 INFO:tasks.workunit.client.1.vm08.stdout:4/709: creat d5/d23/d36/d99/dc6/df1/f104 x:0 0 0
2026-03-10T08:55:35.536 INFO:tasks.workunit.client.1.vm08.stdout:4/710: mkdir d5/d23/d36/d99/db2/d5d/dae/ddf/d105 0
2026-03-10T08:55:35.547 INFO:tasks.workunit.client.1.vm08.stdout:9/620: dread d2/dd/d15/d1e/d39/f57 [0,4194304] 0
2026-03-10T08:55:35.559 INFO:tasks.workunit.client.1.vm08.stdout:9/621: dread d2/dd/d15/d1e/d25/d32/f45 [0,4194304] 0
2026-03-10T08:55:35.561 INFO:tasks.workunit.client.1.vm08.stdout:9/622: dwrite d2/dd/d15/d1e/d39/d4e/f55 [0,4194304] 0
2026-03-10T08:55:35.563 INFO:tasks.workunit.client.1.vm08.stdout:9/623: chown d2/dd/d15/d1e/d25/d32/d5c/dc2 483 1
2026-03-10T08:55:35.564 INFO:tasks.workunit.client.1.vm08.stdout:9/624: fdatasync d2/dd/d15/d1e/d24/f9e 0
2026-03-10T08:55:35.568 INFO:tasks.workunit.client.1.vm08.stdout:9/625: link d2/dd/d15/d1e/d21/f50 d2/d41/d4c/dd2/fd6 0
2026-03-10T08:55:35.568 INFO:tasks.workunit.client.0.vm05.stdout:7/381: sync
2026-03-10T08:55:35.570 INFO:tasks.workunit.client.1.vm08.stdout:9/626: dread - d2/dd/d61/f9c zero size
2026-03-10T08:55:35.572 INFO:tasks.workunit.client.0.vm05.stdout:6/528: dread d4/d7/f34 [4194304,4194304] 0
2026-03-10T08:55:35.572 INFO:tasks.workunit.client.1.vm08.stdout:9/627: stat d2/d54/d8e 0
2026-03-10T08:55:35.573 INFO:tasks.workunit.client.0.vm05.stdout:6/529: fdatasync d4/d7/d10/d15/d1b/d22/fa4 0
2026-03-10T08:55:35.573 INFO:tasks.workunit.client.0.vm05.stdout:6/530: chown d4/d7/l19 4 1
2026-03-10T08:55:35.576 INFO:tasks.workunit.client.1.vm08.stdout:9/628: truncate d2/dd/d15/d1e/d25/dae/f8f 712574 0
2026-03-10T08:55:35.579 INFO:tasks.workunit.client.1.vm08.stdout:9/629: fsync d2/dd/d15/d1e/d39/f57 0
2026-03-10T08:55:35.579 INFO:tasks.workunit.client.0.vm05.stdout:6/531: creat d4/fb4 x:0 0 0
2026-03-10T08:55:35.583 INFO:tasks.workunit.client.0.vm05.stdout:6/532: dwrite d4/d7/d10/d1a/d89/f93 [0,4194304] 0
2026-03-10T08:55:35.587 INFO:tasks.workunit.client.1.vm08.stdout:9/630: chown d2/dd/d15/d1e/d24/l27 6469 1
2026-03-10T08:55:35.588 INFO:tasks.workunit.client.0.vm05.stdout:8/443: dwrite d2/d45/f43 [0,4194304] 0
2026-03-10T08:55:35.593 INFO:tasks.workunit.client.0.vm05.stdout:0/424: truncate df/d1f/f21 1737901 0
2026-03-10T08:55:35.595 INFO:tasks.workunit.client.1.vm08.stdout:9/631: truncate d2/dd/d15/d1e/d25/d32/d79/d85/fc0 350555 0
2026-03-10T08:55:35.595 INFO:tasks.workunit.client.0.vm05.stdout:0/425: stat df/d18/d19/d39/d4d/d50/c5e 0
2026-03-10T08:55:35.595 INFO:tasks.workunit.client.0.vm05.stdout:0/426: read df/d18/d2b/d27/d32/f5d [2397740,71917] 0
2026-03-10T08:55:35.600 INFO:tasks.workunit.client.1.vm08.stdout:9/632: truncate d2/dd/d15/d1e/d25/d32/d5c/dc2/fcb 483280 0
2026-03-10T08:55:35.600 INFO:tasks.workunit.client.0.vm05.stdout:0/427: dwrite df/d18/f29 [4194304,4194304] 0
2026-03-10T08:55:35.612 INFO:tasks.workunit.client.1.vm08.stdout:9/633: rename d2/d41/d53/f81 to d2/dd/d15/d1e/d94/fd7 0
2026-03-10T08:55:35.612 INFO:tasks.workunit.client.0.vm05.stdout:8/444: mknod d2/ca1 0
2026-03-10T08:55:35.614 INFO:tasks.workunit.client.0.vm05.stdout:0/428: rename df/d18/d2b/d51 to df/d18/d19/d39/d74 0
2026-03-10T08:55:35.630 INFO:tasks.workunit.client.1.vm08.stdout:9/634: creat d2/dd/d15/d1e/d39/fd8 x:0 0 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.1.vm08.stdout:9/635: readlink d2/dd/d15/d1e/d25/d32/d79/d85/lb1 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.1.vm08.stdout:1/696: dwrite d1/da/f1e [0,4194304] 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:6/533: getdents d4/d2d/d51/d87 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:0/429: creat df/d1f/d48/f75 x:0 0 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:0/430: write df/d1f/d48/f75 [390864,27690] 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:8/445: symlink d2/dd/d74/d78/la2 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:4/481: write d0/d2c/f2f [1055932,122006] 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:8/446: stat d2/dd/d2c/d2e/d31/d4f/d7b 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:0/431: creat df/d18/d19/d5b/f76 x:0 0 0
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:8/447: chown d2/db/c10 634 1
2026-03-10T08:55:35.631 INFO:tasks.workunit.client.0.vm05.stdout:8/448: truncate d2/db/d47/f51 626590 0
2026-03-10T08:55:35.640 INFO:tasks.workunit.client.1.vm08.stdout:9/636: dread d2/f77 [0,4194304] 0
2026-03-10T08:55:35.640 INFO:tasks.workunit.client.1.vm08.stdout:9/637: dread - d2/dd/faf zero size
2026-03-10T08:55:35.645 INFO:tasks.workunit.client.0.vm05.stdout:8/449: mkdir d2/dd/d2c/d2e/d31/d4f/da3 0
2026-03-10T08:55:35.647 INFO:tasks.workunit.client.0.vm05.stdout:6/534: link d4/d7/d10/d15/d20/l3d d4/d8d/lb5 0
2026-03-10T08:55:35.647 INFO:tasks.workunit.client.1.vm08.stdout:3/608: dwrite d4/d15/d8/d2c/f32 [0,4194304] 0
2026-03-10T08:55:35.649 INFO:tasks.workunit.client.0.vm05.stdout:8/450: dwrite d2/db/d1f/f44 [0,4194304] 0
2026-03-10T08:55:35.649 INFO:tasks.workunit.client.0.vm05.stdout:0/432: symlink df/l77 0
2026-03-10T08:55:35.651 INFO:tasks.workunit.client.1.vm08.stdout:9/638: mkdir d2/dd/d15/dd9 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:0/433: dwrite df/d18/d19/d39/d74/f71 [0,4194304] 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:6/535: mkdir d4/d2c/d84/db6 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:0/434: creat df/d18/d19/d5b/f78 x:0 0 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:8/451: mkdir d2/db/da4 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:8/452: mkdir d2/dd/d2c/da5 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:0/435: creat df/f79 x:0 0 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:8/453: creat d2/db/d28/fa6 x:0 0 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:8/690: dwrite d1/d10/d9/dd/f41 [0,4194304] 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:0/616: dwrite d6/fe [0,4194304] 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:3/609: stat d4/d15/d8/d1d/da8 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:9/639: dwrite d2/dd/d15/d1e/d25/f5f [0,4194304] 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:9/640: fdatasync d2/dd/d15/d1e/d24/f3f 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:8/691: rmdir d1/d4f/d60/dbf 39
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:9/641: rename d2/d41/d4c/f8a to d2/dd/d15/d1e/d39/d69/fda 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:8/692: creat d1/d10/d9/d4d/db2/f103 x:0 0 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:9/642: mknod d2/cdb 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.1.vm08.stdout:8/693: truncate d1/d10/f2a 4162020 0
2026-03-10T08:55:35.680 INFO:tasks.workunit.client.0.vm05.stdout:8/454: write d2/dd/d2c/d2e/f64 [1019976,66339] 0
2026-03-10T08:55:35.681 INFO:tasks.workunit.client.0.vm05.stdout:0/436: unlink df/d18/d19/d39/d74/l70 0
2026-03-10T08:55:35.681 INFO:tasks.workunit.client.1.vm08.stdout:8/694: write d1/d4f/d60/fc4 [4282078,108460] 0
2026-03-10T08:55:35.682 INFO:tasks.workunit.client.1.vm08.stdout:9/643: unlink d2/dd/d15/d1e/d25/d98/d9d/ca2 0
2026-03-10T08:55:35.684 INFO:tasks.workunit.client.0.vm05.stdout:8/455: fdatasync d2/dd/d2c/f30 0
2026-03-10T08:55:35.685 INFO:tasks.workunit.client.0.vm05.stdout:0/437: dwrite df/d18/d19/d5b/f76 [0,4194304] 0
2026-03-10T08:55:35.690 INFO:tasks.workunit.client.0.vm05.stdout:8/456: dwrite d2/dd/d2c/d2e/d31/d4f/d80/f9f [0,4194304] 0
2026-03-10T08:55:35.697 INFO:tasks.workunit.client.0.vm05.stdout:0/438: rename df/d18/d19/d47/f68 to df/d18/d2b/f7a 0
2026-03-10T08:55:35.697 INFO:tasks.workunit.client.1.vm08.stdout:8/695: rename d1/d10/d9/dd/d25/d27/d44/laf to d1/dd9/l104 0
2026-03-10T08:55:35.698 INFO:tasks.workunit.client.0.vm05.stdout:8/457: unlink d2/db/d1f/d67/f98 0
2026-03-10T08:55:35.698 INFO:tasks.workunit.client.0.vm05.stdout:8/458: chown d2/dd/l27 66057057 1
2026-03-10T08:55:35.700 INFO:tasks.workunit.client.0.vm05.stdout:8/459: symlink d2/db/d1f/d67/la7 0
2026-03-10T08:55:35.700 INFO:tasks.workunit.client.1.vm08.stdout:8/696: read d1/d10/d9/dd/d18/f80 [2757547,48640] 0
2026-03-10T08:55:35.701 INFO:tasks.workunit.client.0.vm05.stdout:0/439: mkdir df/d18/d19/d39/d74/d67/d7b 0
2026-03-10T08:55:35.702 INFO:tasks.workunit.client.0.vm05.stdout:8/460: mknod d2/dd/d2c/d2e/d93/ca8 0
2026-03-10T08:55:35.704 INFO:tasks.workunit.client.0.vm05.stdout:0/440: symlink df/d1f/d48/l7c 0
2026-03-10T08:55:35.704 INFO:tasks.workunit.client.0.vm05.stdout:0/441: readlink df/d18/d19/d39/d4d/d50/l5f 0
2026-03-10T08:55:35.706 INFO:tasks.workunit.client.0.vm05.stdout:8/461: rename d2/dd/d2c/d2e/d31/c69 to d2/dd/d2c/d2e/d31/d3e/d5d/d9d/ca9 0
2026-03-10T08:55:35.707 INFO:tasks.workunit.client.1.vm08.stdout:8/697: symlink d1/d10/d9/dd/d18/d3c/l105 0
2026-03-10T08:55:35.708 INFO:tasks.workunit.client.0.vm05.stdout:8/462: chown d2/d45/f43 15139 1
2026-03-10T08:55:35.709 INFO:tasks.workunit.client.1.vm08.stdout:8/698: write d1/d10/d9/d8a/fb7 [1603186,64801] 0
2026-03-10T08:55:35.710 INFO:tasks.workunit.client.0.vm05.stdout:8/463: truncate d2/dd/d2c/d2e/d31/d4c/d63/f91 617673 0
2026-03-10T08:55:35.713 INFO:tasks.workunit.client.0.vm05.stdout:8/464: dwrite d2/dd/f1a [0,4194304] 0
2026-03-10T08:55:35.718 INFO:tasks.workunit.client.0.vm05.stdout:8/465: dwrite d2/db/f9a [0,4194304] 0
2026-03-10T08:55:35.719 INFO:tasks.workunit.client.0.vm05.stdout:8/466: stat d2/dd/d2c/d2e/f6a 0
2026-03-10T08:55:35.742 INFO:tasks.workunit.client.1.vm08.stdout:8/699: dread d1/d10/d9/f73 [0,4194304] 0
2026-03-10T08:55:35.745 INFO:tasks.workunit.client.0.vm05.stdout:0/442: dread df/d59/f45 [0,4194304] 0
2026-03-10T08:55:35.747 INFO:tasks.workunit.client.0.vm05.stdout:0/443: symlink df/d1f/l7d 0
2026-03-10T08:55:35.749 INFO:tasks.workunit.client.0.vm05.stdout:0/444: link df/f1d df/d18/d19/d39/d4d/d50/f7e 0
2026-03-10T08:55:35.750 INFO:tasks.workunit.client.0.vm05.stdout:0/445: symlink df/d18/d2b/d27/d32/l7f 0
2026-03-10T08:55:35.750 INFO:tasks.workunit.client.0.vm05.stdout:0/446: readlink df/d18/d2b/d27/d32/l58 0
2026-03-10T08:55:35.751 INFO:tasks.workunit.client.0.vm05.stdout:0/447: chown df/f12 168410308 1
2026-03-10T08:55:35.752 INFO:tasks.workunit.client.0.vm05.stdout:0/448: creat df/d18/d19/d39/d74/d67/f80 x:0 0 0
2026-03-10T08:55:35.753 INFO:tasks.workunit.client.0.vm05.stdout:0/449: truncate df/d18/d2b/f3b 1601383 0
2026-03-10T08:55:35.754 INFO:tasks.workunit.client.0.vm05.stdout:0/450: getdents df/d59 0
2026-03-10T08:55:35.754 INFO:tasks.workunit.client.0.vm05.stdout:0/451: stat df/d59/f45 0
2026-03-10T08:55:35.755 INFO:tasks.workunit.client.0.vm05.stdout:0/452: write df/d18/d2b/d27/d32/f5d [2176965,117020] 0
2026-03-10T08:55:35.782 INFO:tasks.workunit.client.1.vm08.stdout:9/644: dread d2/d41/d53/f6d [0,4194304] 0
2026-03-10T08:55:35.783 INFO:tasks.workunit.client.1.vm08.stdout:9/645: mknod d2/dd/d15/d4f/cdc 0
2026-03-10T08:55:35.785 INFO:tasks.workunit.client.1.vm08.stdout:9/646: link d2/dd/d15/d1e/d39/d4e/fcf d2/d41/fdd 0
2026-03-10T08:55:35.789 INFO:tasks.workunit.client.0.vm05.stdout:3/503: write d9/f19 [429594,81247] 0
2026-03-10T08:55:35.791 INFO:tasks.workunit.client.0.vm05.stdout:3/504: rmdir d9/d2b/d3a/d43/d71 39
2026-03-10T08:55:35.791 INFO:tasks.workunit.client.0.vm05.stdout:3/505: truncate d9/d2b/d3a/f44 8752233 0
2026-03-10T08:55:35.792 INFO:tasks.workunit.client.0.vm05.stdout:3/506: fsync d9/d2b/d2f/f33 0
2026-03-10T08:55:35.792 INFO:tasks.workunit.client.0.vm05.stdout:3/507: readlink d9/le 0
2026-03-10T08:55:35.793 INFO:tasks.workunit.client.1.vm08.stdout:9/647: symlink d2/dd/d15/d1e/d25/d98/d9d/lde 0
2026-03-10T08:55:35.794 INFO:tasks.workunit.client.0.vm05.stdout:3/508: unlink d9/d4d/d51/d6f/c73 0
2026-03-10T08:55:35.794 INFO:tasks.workunit.client.1.vm08.stdout:9/648: write d2/d41/d4c/f80 [1400509,62971] 0
2026-03-10T08:55:35.797 INFO:tasks.workunit.client.0.vm05.stdout:3/509: rename d9/d2b/d3a/d43/d4f to d9/d8f 0
2026-03-10T08:55:35.798 INFO:tasks.workunit.client.0.vm05.stdout:3/510: creat d9/d2b/d2f/d57/f90 x:0 0 0
2026-03-10T08:55:35.800 INFO:tasks.workunit.client.1.vm08.stdout:9/649: link d2/d41/d53/f6d d2/dd/d15/d1e/d25/d32/d79/d85/fdf 0
2026-03-10T08:55:35.801 INFO:tasks.workunit.client.0.vm05.stdout:3/511: creat d9/d2b/d3a/d43/d71/f91 x:0 0 0
2026-03-10T08:55:35.802 INFO:tasks.workunit.client.1.vm08.stdout:9/650: mkdir d2/dd/d15/de0 0
2026-03-10T08:55:35.876 INFO:tasks.workunit.client.0.vm05.stdout:5/376: dwrite d5/df/d37/f47 [0,4194304] 0
2026-03-10T08:55:35.884 INFO:tasks.workunit.client.0.vm05.stdout:5/377: creat d5/d86/d24/d2c/f88 x:0 0 0
2026-03-10T08:55:35.888 INFO:tasks.workunit.client.0.vm05.stdout:5/378: dwrite d5/d86/f1b [4194304,4194304] 0
2026-03-10T08:55:35.890 INFO:tasks.workunit.client.0.vm05.stdout:5/379: mkdir d5/d86/d21/d89 0
2026-03-10T08:55:35.902 INFO:tasks.workunit.client.0.vm05.stdout:5/380: rename d5/d3a/d43/l75 to d5/df/d37/d68/l8a 0
2026-03-10T08:55:35.902 INFO:tasks.workunit.client.0.vm05.stdout:5/381: unlink d5/d3a/l4f 0
2026-03-10T08:55:35.902 INFO:tasks.workunit.client.0.vm05.stdout:5/382: rename l0 to d5/d86/d24/d2c/d41/d74/l8b 0
2026-03-10T08:55:35.902 INFO:tasks.workunit.client.0.vm05.stdout:5/383: symlink d5/df/d37/d68/d85/l8c 0
2026-03-10T08:55:35.904 INFO:tasks.workunit.client.0.vm05.stdout:5/384: getdents d5/d86/d21/d89 0
2026-03-10T08:55:35.907 INFO:tasks.workunit.client.0.vm05.stdout:2/392: read d0/f40 [672100,75955] 0
2026-03-10T08:55:35.916 INFO:tasks.workunit.client.0.vm05.stdout:5/385: dread - d5/d86/d24/f51 zero size
2026-03-10T08:55:35.923 INFO:tasks.workunit.client.0.vm05.stdout:5/386: symlink d5/d3a/d43/l8d 0
2026-03-10T08:55:35.923 INFO:tasks.workunit.client.0.vm05.stdout:5/387: fsync d5/d86/d24/d2c/f46 0
2026-03-10T08:55:35.924 INFO:tasks.workunit.client.0.vm05.stdout:5/388: chown d5/d48/l70 70 1
2026-03-10T08:55:35.926 INFO:tasks.workunit.client.0.vm05.stdout:5/389: dwrite d5/d48/f7e [0,4194304] 0
2026-03-10T08:55:35.927 INFO:tasks.workunit.client.0.vm05.stdout:5/390: truncate d5/d86/d39/f78 461638 0
2026-03-10T08:55:35.933 INFO:tasks.workunit.client.0.vm05.stdout:5/391: dwrite d5/d86/d39/f77 [0,4194304] 0
2026-03-10T08:55:35.935 INFO:tasks.workunit.client.0.vm05.stdout:2/393: dread d0/fb [0,4194304] 0
2026-03-10T08:55:35.937 INFO:tasks.workunit.client.0.vm05.stdout:2/394: stat d0/d9/d1e/f59 0
2026-03-10T08:55:35.937 INFO:tasks.workunit.client.0.vm05.stdout:2/395: fsync d0/f2 0
2026-03-10T08:55:35.938 INFO:tasks.workunit.client.0.vm05.stdout:2/396: chown d0/fa 1848058 1
2026-03-10T08:55:35.941 INFO:tasks.workunit.client.0.vm05.stdout:2/397: getdents d0/d55 0
2026-03-10T08:55:35.952 INFO:tasks.workunit.client.0.vm05.stdout:5/392: dread d5/d86/d24/d2c/f46 [4194304,4194304] 0
2026-03-10T08:55:35.956 INFO:tasks.workunit.client.0.vm05.stdout:5/393: dread d5/d86/d39/f77 [0,4194304] 0
2026-03-10T08:55:35.959 INFO:tasks.workunit.client.1.vm08.stdout:7/701: truncate d0/d11/d1f/d29/d3d/f76 1485811 0
2026-03-10T08:55:35.959 INFO:tasks.workunit.client.0.vm05.stdout:5/394: mknod d5/d86/d24/c8e 0
2026-03-10T08:55:35.959 INFO:tasks.workunit.client.1.vm08.stdout:7/702: fsync d0/d11/d4a/d5e/dc3/fce 0
2026-03-10T08:55:35.960 INFO:tasks.workunit.client.1.vm08.stdout:7/703: chown d0/d14/f72 21587 1
2026-03-10T08:55:35.961 INFO:tasks.workunit.client.1.vm08.stdout:7/704: dread - d0/d11/d1f/fb7 zero size
2026-03-10T08:55:35.961 INFO:tasks.workunit.client.0.vm05.stdout:5/395: creat d5/d86/f8f x:0 0 0
2026-03-10T08:55:35.962 INFO:tasks.workunit.client.0.vm05.stdout:5/396: creat d5/d86/d21/d89/f90 x:0 0 0
2026-03-10T08:55:35.963 INFO:tasks.workunit.client.0.vm05.stdout:5/397: symlink d5/d86/d39/l91 0
2026-03-10T08:55:35.968 INFO:tasks.workunit.client.1.vm08.stdout:7/705: link d0/d14/l17 d0/d11/d1f/d29/d36/daf/le2 0
2026-03-10T08:55:35.974 INFO:tasks.workunit.client.1.vm08.stdout:7/706: write d0/d11/d1f/d29/d3b/f86 [5078063,67459] 0
2026-03-10T08:55:35.974 INFO:tasks.workunit.client.1.vm08.stdout:7/707: rename d0/d11/db2/f67 to d0/d14/d2f/fe3 0
2026-03-10T08:55:35.974 INFO:tasks.workunit.client.0.vm05.stdout:5/398: truncate d5/d86/d24/d2c/d41/f87 966124 0
2026-03-10T08:55:35.974 INFO:tasks.workunit.client.0.vm05.stdout:5/399: dread d5/d86/d39/f78 [0,4194304] 0
2026-03-10T08:55:35.974 INFO:tasks.workunit.client.0.vm05.stdout:5/400: chown d5/d86/f44 326 1
2026-03-10T08:55:35.975 INFO:tasks.workunit.client.1.vm08.stdout:7/708: truncate d0/d11/d1f/d29/d3b/d80/f88 765483 0
2026-03-10T08:55:35.978 INFO:tasks.workunit.client.1.vm08.stdout:7/709: creat d0/d11/d1f/d29/d3d/d89/fe4 x:0 0 0
2026-03-10T08:55:35.979 INFO:tasks.workunit.client.1.vm08.stdout:7/710: rmdir d0/d11/d1f/d29/d3d/dd1 39
2026-03-10T08:55:35.981 INFO:tasks.workunit.client.1.vm08.stdout:7/711: creat d0/d11/d1f/d29/d36/d75/fe5 x:0 0 0
2026-03-10T08:55:36.020 INFO:tasks.workunit.client.0.vm05.stdout:0/453: dread df/d18/d2b/d27/d32/d4e/f56 [0,4194304] 0
2026-03-10T08:55:36.023 INFO:tasks.workunit.client.0.vm05.stdout:0/454: dwrite df/d18/d19/d5b/f72 [0,4194304] 0
2026-03-10T08:55:36.024 INFO:tasks.workunit.client.1.vm08.stdout:2/719: dwrite d1/fd [0,4194304] 0
2026-03-10T08:55:36.026 INFO:tasks.workunit.client.0.vm05.stdout:0/455: symlink df/d18/d19/d39/d74/d67/d7b/l81 0
2026-03-10T08:55:36.027 INFO:tasks.workunit.client.0.vm05.stdout:0/456: symlink df/l82 0
2026-03-10T08:55:36.028 INFO:tasks.workunit.client.0.vm05.stdout:0/457: write df/d18/f29 [4856197,8256] 0
2026-03-10T08:55:36.032 INFO:tasks.workunit.client.0.vm05.stdout:0/458: link df/d18/d19/d5b/f76 df/d18/d19/d39/d74/f83 0
2026-03-10T08:55:36.033 INFO:tasks.workunit.client.0.vm05.stdout:0/459: dread - df/d18/d2b/d27/f60 zero size
2026-03-10T08:55:36.058 INFO:tasks.workunit.client.0.vm05.stdout:1/541: dwrite dd/d10/d18/d20/f6c [0,4194304] 0
2026-03-10T08:55:36.064 INFO:tasks.workunit.client.0.vm05.stdout:1/542: readlink l8 0
2026-03-10T08:55:36.067 INFO:tasks.workunit.client.0.vm05.stdout:1/543: rename dd/d10/d19/f35 to dd/d21/d37/fc2 0
2026-03-10T08:55:36.100 INFO:tasks.workunit.client.0.vm05.stdout:9/374: write d6/d27/f2b [3523586,10455] 0
2026-03-10T08:55:36.106 INFO:tasks.workunit.client.0.vm05.stdout:9/375: getdents d6/d19/d21 0
2026-03-10T08:55:36.112 INFO:tasks.workunit.client.0.vm05.stdout:9/376: link d6/d19/c1d d6/d12/c7e 0
2026-03-10T08:55:36.120 INFO:tasks.workunit.client.1.vm08.stdout:6/683: dwrite d9/dc/d84/fae [0,4194304] 0
2026-03-10T08:55:36.125 INFO:tasks.workunit.client.1.vm08.stdout:6/684: symlink d9/dc/d11/d23/d2c/d7a/dce/d69/le4 0
2026-03-10T08:55:36.132 INFO:tasks.workunit.client.1.vm08.stdout:6/685: mknod d9/dc/d11/d23/d2c/d7a/dce/d69/da2/ce5 0
2026-03-10T08:55:36.139 INFO:tasks.workunit.client.1.vm08.stdout:6/686: fdatasync f5 0
2026-03-10T08:55:36.164 INFO:tasks.workunit.client.1.vm08.stdout:5/608: write d0/d11/d27/f2a [4314044,95615] 0
2026-03-10T08:55:36.165 INFO:tasks.workunit.client.1.vm08.stdout:5/609: readlink d0/d11/d27/d68/d7c/d4b/d4e/da5/lac 0
2026-03-10T08:55:36.172 INFO:tasks.workunit.client.1.vm08.stdout:4/711: write d5/fb4 [900647,103971] 0
2026-03-10T08:55:36.172 INFO:tasks.workunit.client.0.vm05.stdout:7/382: write d18/d1b/f2c [3287899,24900] 0
2026-03-10T08:55:36.178 INFO:tasks.workunit.client.1.vm08.stdout:4/712: dread d5/f19 [0,4194304] 0
2026-03-10T08:55:36.180 INFO:tasks.workunit.client.1.vm08.stdout:4/713: chown d5/d23/d36/d99/db2/f71 52771759 1
2026-03-10T08:55:36.183 INFO:tasks.workunit.client.1.vm08.stdout:1/697: write d1/da/d18/d3a/da7/fba [3866075,50358] 0
2026-03-10T08:55:36.189 INFO:tasks.workunit.client.1.vm08.stdout:4/714: mknod d5/de/dea/c106 0
2026-03-10T08:55:36.190 INFO:tasks.workunit.client.0.vm05.stdout:4/482: dwrite d0/d1d/f50 [0,4194304] 0
2026-03-10T08:55:36.192 INFO:tasks.workunit.client.0.vm05.stdout:4/483: write d0/d2e/d42/d45/d4a/d36/d37/d9c/f29 [1647912,18000] 0
2026-03-10T08:55:36.202 INFO:tasks.workunit.client.1.vm08.stdout:1/698: rename d1/da/d4b to d1/da/d20/d91/d83/df4 0
2026-03-10T08:55:36.211 INFO:tasks.workunit.client.1.vm08.stdout:4/715: symlink d5/d23/d36/d99/db2/d5a/l107 0
2026-03-10T08:55:36.211 INFO:tasks.workunit.client.1.vm08.stdout:1/699: dread d1/da/de/d24/d3d/d40/d56/f73 [0,4194304] 0
2026-03-10T08:55:36.211 INFO:tasks.workunit.client.1.vm08.stdout:4/716: creat d5/d23/d36/d99/dc6/f108 x:0 0 0
2026-03-10T08:55:36.215 INFO:tasks.workunit.client.1.vm08.stdout:4/717: dread d5/d5f/fcc [0,4194304] 0
2026-03-10T08:55:36.219 INFO:tasks.workunit.client.1.vm08.stdout:4/718: mkdir d5/d23/d109 0
2026-03-10T08:55:36.223 INFO:tasks.workunit.client.1.vm08.stdout:4/719: link d5/d23/d36/d99/db2/d5d/f60 d5/d23/d49/d8f/da4/f10a 0
2026-03-10T08:55:36.226 INFO:tasks.workunit.client.1.vm08.stdout:4/720: fsync d5/d23/d36/d99/db2/d5d/f61 0
2026-03-10T08:55:36.228 INFO:tasks.workunit.client.1.vm08.stdout:4/721: symlink d5/de/dea/l10b 0
2026-03-10T08:55:36.279 INFO:tasks.workunit.client.0.vm05.stdout:5/401: read d5/d86/d24/d2c/d41/f4d [2744880,24593] 0
2026-03-10T08:55:36.620 INFO:tasks.workunit.client.0.vm05.stdout:5/402: sync
2026-03-10T08:55:36.622 INFO:tasks.workunit.client.0.vm05.stdout:5/403: truncate d5/d48/f69 796951 0
2026-03-10T08:55:36.624 INFO:tasks.workunit.client.0.vm05.stdout:5/404: link d5/df/d37/d68/c7c d5/d86/d24/c92 0
2026-03-10T08:55:36.626
INFO:tasks.workunit.client.0.vm05.stdout:5/405: link d5/d86/f20 d5/d48/f93 0 2026-03-10T08:55:36.628 INFO:tasks.workunit.client.0.vm05.stdout:5/406: dread d5/fd [4194304,4194304] 0 2026-03-10T08:55:36.652 INFO:tasks.workunit.client.1.vm08.stdout:2/720: sync 2026-03-10T08:55:36.671 INFO:tasks.workunit.client.0.vm05.stdout:5/407: read d5/d3a/f4a [3771460,63875] 0 2026-03-10T08:55:36.709 INFO:tasks.workunit.client.1.vm08.stdout:3/610: write d4/d15/d8/d1d/f98 [149532,124281] 0 2026-03-10T08:55:36.714 INFO:tasks.workunit.client.1.vm08.stdout:3/611: truncate d4/d6f/dca/fcc 160541 0 2026-03-10T08:55:36.715 INFO:tasks.workunit.client.1.vm08.stdout:0/617: dwrite d6/dd/d13/d17/f82 [0,4194304] 0 2026-03-10T08:55:36.719 INFO:tasks.workunit.client.1.vm08.stdout:3/612: rmdir d4/d15/d8 39 2026-03-10T08:55:36.729 INFO:tasks.workunit.client.1.vm08.stdout:3/613: symlink d4/d15/d8/ld0 0 2026-03-10T08:55:36.729 INFO:tasks.workunit.client.1.vm08.stdout:0/618: link d6/dd/d13/d17/d1f/d20/d2f/d57/lb7 d6/dd/d13/lc9 0 2026-03-10T08:55:36.731 INFO:tasks.workunit.client.1.vm08.stdout:0/619: dread - d6/dd/d13/d17/d1f/d2d/d39/f8a zero size 2026-03-10T08:55:36.735 INFO:tasks.workunit.client.1.vm08.stdout:0/620: symlink d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/lca 0 2026-03-10T08:55:36.739 INFO:tasks.workunit.client.1.vm08.stdout:3/614: dread d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:36.744 INFO:tasks.workunit.client.1.vm08.stdout:3/615: creat d4/d15/d8/d2c/d6d/fd1 x:0 0 0 2026-03-10T08:55:36.744 INFO:tasks.workunit.client.1.vm08.stdout:3/616: chown d4/d15/d8/d2c/d9b/d79/d20 0 1 2026-03-10T08:55:36.746 INFO:tasks.workunit.client.1.vm08.stdout:3/617: creat d4/d15/d8/d2c/fd2 x:0 0 0 2026-03-10T08:55:36.750 INFO:tasks.workunit.client.0.vm05.stdout:6/536: dwrite d4/d2c/d84/f3c [4194304,4194304] 0 2026-03-10T08:55:36.761 INFO:tasks.workunit.client.0.vm05.stdout:8/467: truncate d2/dd/d2c/f30 2492525 0 2026-03-10T08:55:36.777 INFO:tasks.workunit.client.1.vm08.stdout:0/621: dread d6/dd/d13/d17/f66 [0,4194304] 0 
2026-03-10T08:55:36.779 INFO:tasks.workunit.client.1.vm08.stdout:8/700: write d1/d10/d9/dd/d13/f24 [971059,106671] 0 2026-03-10T08:55:36.780 INFO:tasks.workunit.client.0.vm05.stdout:8/468: fsync d2/db/d28/f96 0 2026-03-10T08:55:36.781 INFO:tasks.workunit.client.1.vm08.stdout:8/701: read - d1/d10/d9/d4d/db2/f103 zero size 2026-03-10T08:55:36.781 INFO:tasks.workunit.client.0.vm05.stdout:8/469: write d2/dd/d2c/d2e/d93/f9b [138041,80808] 0 2026-03-10T08:55:36.790 INFO:tasks.workunit.client.1.vm08.stdout:0/622: creat d6/dd/d13/fcb x:0 0 0 2026-03-10T08:55:36.796 INFO:tasks.workunit.client.1.vm08.stdout:9/651: write d2/dd/d15/d1e/d21/f3a [2722858,122543] 0 2026-03-10T08:55:36.796 INFO:tasks.workunit.client.1.vm08.stdout:8/702: dread d1/d10/d9/dd/d13/f92 [0,4194304] 0 2026-03-10T08:55:36.797 INFO:tasks.workunit.client.0.vm05.stdout:3/512: dwrite d9/d2b/f34 [0,4194304] 0 2026-03-10T08:55:36.804 INFO:tasks.workunit.client.1.vm08.stdout:0/623: dwrite d6/dd/d13/d17/d1f/d2d/fa0 [0,4194304] 0 2026-03-10T08:55:36.809 INFO:tasks.workunit.client.0.vm05.stdout:3/513: read d9/d2b/f2c [157823,83690] 0 2026-03-10T08:55:36.818 INFO:tasks.workunit.client.0.vm05.stdout:3/514: creat d9/d2b/d3a/d6c/f92 x:0 0 0 2026-03-10T08:55:36.825 INFO:tasks.workunit.client.1.vm08.stdout:9/652: mkdir d2/d54/d8e/da6/dd0/dc8/de1 0 2026-03-10T08:55:36.829 INFO:tasks.workunit.client.1.vm08.stdout:0/624: rename d6/dd/d13/d17/d1f/d20/d2f/d57/lb7 to d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/lcc 0 2026-03-10T08:55:36.829 INFO:tasks.workunit.client.1.vm08.stdout:8/703: getdents d1/d10/d9/dd/d25/d27/d44/d21 0 2026-03-10T08:55:36.831 INFO:tasks.workunit.client.1.vm08.stdout:9/653: dread - d2/d54/d8e/da6/dd0/f59 zero size 2026-03-10T08:55:36.844 INFO:tasks.workunit.client.0.vm05.stdout:2/398: dwrite d0/d9/d1e/d20/d21/d45/d4b/f58 [0,4194304] 0 2026-03-10T08:55:36.850 INFO:tasks.workunit.client.1.vm08.stdout:0/625: dwrite d6/dd/d13/d17/d1f/d2d/d39/fad [0,4194304] 0 2026-03-10T08:55:36.850 
INFO:tasks.workunit.client.0.vm05.stdout:2/399: chown d0/f56 0 1 2026-03-10T08:55:36.850 INFO:tasks.workunit.client.0.vm05.stdout:2/400: mknod d0/d9/d1e/d20/d21/c6f 0 2026-03-10T08:55:36.850 INFO:tasks.workunit.client.0.vm05.stdout:2/401: mkdir d0/d9/d1e/d20/d21/d45/d4b/d70 0 2026-03-10T08:55:36.851 INFO:tasks.workunit.client.0.vm05.stdout:2/402: read d0/d9/d1e/d20/f32 [180664,6851] 0 2026-03-10T08:55:36.853 INFO:tasks.workunit.client.0.vm05.stdout:2/403: fdatasync d0/d9/d1e/f59 0 2026-03-10T08:55:36.854 INFO:tasks.workunit.client.0.vm05.stdout:2/404: readlink d0/d9/l42 0 2026-03-10T08:55:36.855 INFO:tasks.workunit.client.0.vm05.stdout:2/405: chown d0/d9/d1e/d20/d21/f41 228824 1 2026-03-10T08:55:36.857 INFO:tasks.workunit.client.0.vm05.stdout:6/537: dread d4/d2c/d84/d4a/f76 [0,4194304] 0 2026-03-10T08:55:36.878 INFO:tasks.workunit.client.1.vm08.stdout:9/654: dread d2/fb [0,4194304] 0 2026-03-10T08:55:36.880 INFO:tasks.workunit.client.1.vm08.stdout:0/626: creat d6/dd/d13/d17/d1f/d20/d2f/d57/fcd x:0 0 0 2026-03-10T08:55:36.886 INFO:tasks.workunit.client.1.vm08.stdout:9/655: rename d2/dd/d15/d1e/d25/d98/d9d/db3 to d2/d41/d4c/de2 0 2026-03-10T08:55:36.890 INFO:tasks.workunit.client.1.vm08.stdout:0/627: mknod d6/dd/d13/cce 0 2026-03-10T08:55:36.893 INFO:tasks.workunit.client.0.vm05.stdout:1/544: write dd/d21/d37/f85 [802477,22211] 0 2026-03-10T08:55:36.895 INFO:tasks.workunit.client.1.vm08.stdout:7/712: dwrite d0/d11/d4a/fa5 [0,4194304] 0 2026-03-10T08:55:36.917 INFO:tasks.workunit.client.1.vm08.stdout:7/713: fdatasync d0/d11/d4a/d95/fa7 0 2026-03-10T08:55:36.917 INFO:tasks.workunit.client.0.vm05.stdout:9/377: write d6/d12/d3a/d48/f65 [1473098,43927] 0 2026-03-10T08:55:36.918 INFO:tasks.workunit.client.0.vm05.stdout:2/406: rmdir d0/d9/d1e/d20/d21/d45/d6c 39 2026-03-10T08:55:36.918 INFO:tasks.workunit.client.0.vm05.stdout:6/538: fdatasync d4/d2c/f86 0 2026-03-10T08:55:36.918 INFO:tasks.workunit.client.0.vm05.stdout:9/378: creat d6/f7f x:0 0 0 2026-03-10T08:55:36.918 
INFO:tasks.workunit.client.0.vm05.stdout:2/407: rename d0/f40 to d0/d9/d1e/d20/f71 0 2026-03-10T08:55:36.918 INFO:tasks.workunit.client.0.vm05.stdout:2/408: chown d0/f2 2935789 1 2026-03-10T08:55:36.918 INFO:tasks.workunit.client.0.vm05.stdout:9/379: symlink d6/d15/d35/l80 0 2026-03-10T08:55:36.918 INFO:tasks.workunit.client.0.vm05.stdout:9/380: truncate d6/d19/d2c/f61 437282 0 2026-03-10T08:55:36.919 INFO:tasks.workunit.client.0.vm05.stdout:2/409: symlink d0/d9/d1e/d20/d21/l72 0 2026-03-10T08:55:36.921 INFO:tasks.workunit.client.0.vm05.stdout:9/381: dwrite d6/d12/f1c [0,4194304] 0 2026-03-10T08:55:36.925 INFO:tasks.workunit.client.0.vm05.stdout:2/410: fsync d0/d9/d1e/f34 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/382: rename d6/d15/d35/l80 to d6/d12/d3a/d48/l81 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:2/411: rmdir d0/d9/d1e/d20 39 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/383: mkdir d6/d15/d3c/d4b/d82 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/384: dwrite d6/d27/f2b [0,4194304] 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/385: fsync d6/d12/f34 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/386: readlink d6/d15/l64 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/387: stat d6/d15/d35/c66 0 2026-03-10T08:55:36.943 INFO:tasks.workunit.client.0.vm05.stdout:9/388: symlink d6/d15/l83 0 2026-03-10T08:55:36.944 INFO:tasks.workunit.client.0.vm05.stdout:9/389: mkdir d6/d19/d2c/d84 0 2026-03-10T08:55:36.977 INFO:tasks.workunit.client.0.vm05.stdout:9/390: read d6/d12/d43/f47 [2304281,117359] 0 2026-03-10T08:55:36.979 INFO:tasks.workunit.client.0.vm05.stdout:9/391: symlink d6/d12/l85 0 2026-03-10T08:55:36.981 INFO:tasks.workunit.client.0.vm05.stdout:1/545: sync 2026-03-10T08:55:36.988 INFO:tasks.workunit.client.0.vm05.stdout:9/392: creat d6/d15/f86 x:0 0 0 2026-03-10T08:55:36.988 
INFO:tasks.workunit.client.0.vm05.stdout:9/393: dread - d6/d19/d2c/d58/f6c zero size 2026-03-10T08:55:36.991 INFO:tasks.workunit.client.1.vm08.stdout:5/610: write d0/d11/d27/d68/d7c/d4b/d4e/f56 [160431,37411] 0 2026-03-10T08:55:36.991 INFO:tasks.workunit.client.0.vm05.stdout:0/460: write df/d1f/f21 [1066270,76841] 0 2026-03-10T08:55:36.994 INFO:tasks.workunit.client.1.vm08.stdout:6/687: dwrite d9/fa [0,4194304] 0 2026-03-10T08:55:36.995 INFO:tasks.workunit.client.0.vm05.stdout:0/461: dwrite df/f79 [0,4194304] 0 2026-03-10T08:55:36.997 INFO:tasks.workunit.client.0.vm05.stdout:0/462: chown df/d18/d19 5 1 2026-03-10T08:55:36.997 INFO:tasks.workunit.client.0.vm05.stdout:0/463: stat df/d18/f29 0 2026-03-10T08:55:36.998 INFO:tasks.workunit.client.0.vm05.stdout:0/464: write df/d18/d19/d5b/f78 [255884,99061] 0 2026-03-10T08:55:37.002 INFO:tasks.workunit.client.1.vm08.stdout:6/688: dwrite d9/dc/d11/d23/f6f [0,4194304] 0 2026-03-10T08:55:37.003 INFO:tasks.workunit.client.0.vm05.stdout:7/383: truncate d18/d66/d25/d2e/f49 355178 0 2026-03-10T08:55:37.022 INFO:tasks.workunit.client.0.vm05.stdout:4/484: dwrite d0/f9 [4194304,4194304] 0 2026-03-10T08:55:37.022 INFO:tasks.workunit.client.1.vm08.stdout:6/689: dread - d9/dc/d11/d23/d2c/d41/fd6 zero size 2026-03-10T08:55:37.022 INFO:tasks.workunit.client.1.vm08.stdout:6/690: chown d9 4367 1 2026-03-10T08:55:37.022 INFO:tasks.workunit.client.1.vm08.stdout:6/691: read d9/dc/d84/f5e [570225,58463] 0 2026-03-10T08:55:37.022 INFO:tasks.workunit.client.1.vm08.stdout:1/700: write d1/da/de/d24/d35/d6d/d82/da2/fcd [293073,119843] 0 2026-03-10T08:55:37.026 INFO:tasks.workunit.client.1.vm08.stdout:4/722: rmdir d5/d23/d49/d8f/da4 39 2026-03-10T08:55:37.028 INFO:tasks.workunit.client.1.vm08.stdout:4/723: write d5/d23/d36/d99/db2/d5a/d69/fb3 [1279737,75317] 0 2026-03-10T08:55:37.029 INFO:tasks.workunit.client.1.vm08.stdout:8/704: read d1/d10/d9/dd/d13/f24 [915595,50078] 0 2026-03-10T08:55:37.029 INFO:tasks.workunit.client.0.vm05.stdout:0/465: 
mkdir df/d18/d19/d47/d84 0 2026-03-10T08:55:37.039 INFO:tasks.workunit.client.1.vm08.stdout:2/721: write d1/da/d10/d42/d93/f55 [2311891,16338] 0 2026-03-10T08:55:37.041 INFO:tasks.workunit.client.0.vm05.stdout:5/408: write d5/d86/f2a [694248,43284] 0 2026-03-10T08:55:37.046 INFO:tasks.workunit.client.0.vm05.stdout:5/409: dwrite d5/d86/f1b [4194304,4194304] 0 2026-03-10T08:55:37.048 INFO:tasks.workunit.client.1.vm08.stdout:1/701: creat d1/da/de/d24/d35/d6d/d82/da2/ff5 x:0 0 0 2026-03-10T08:55:37.049 INFO:tasks.workunit.client.1.vm08.stdout:8/705: creat d1/d10/d9/dd/d9a/da6/f106 x:0 0 0 2026-03-10T08:55:37.052 INFO:tasks.workunit.client.1.vm08.stdout:4/724: dwrite d5/d23/d36/f51 [0,4194304] 0 2026-03-10T08:55:37.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:36 vm08.local ceph-mon[57559]: pgmap v157: 65 pgs: 65 active+clean; 2.2 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 47 MiB/s rd, 128 MiB/s wr, 279 op/s 2026-03-10T08:55:37.062 INFO:tasks.workunit.client.1.vm08.stdout:5/611: link d0/d11/d3e/la6 d0/d11/d18/d52/lb7 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.0.vm05.stdout:5/410: write d5/d3a/d43/f6d [790223,5293] 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.0.vm05.stdout:4/485: getdents d0/d2e/d42 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.0.vm05.stdout:4/486: stat d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/d5b/c81 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:5/612: dread - d0/d11/d27/fb3 zero size 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:5/613: readlink d0/l3c 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:1/702: creat d1/da/d20/d91/ff6 x:0 0 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:3/618: dwrite d4/d15/d8/d2c/f3d [0,4194304] 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:8/706: fsync d1/d10/d9/dd/d25/d27/d44/d21/dce/ffd 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:1/703: creat 
d1/da/de/d24/d3d/d40/d92/ff7 x:0 0 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:8/707: symlink d1/d10/d9/d8a/l107 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:1/704: dread - d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fe9 zero size 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:4/725: rename d5/fba to d5/d23/d49/d8f/f10c 0 2026-03-10T08:55:37.076 INFO:tasks.workunit.client.1.vm08.stdout:4/726: chown d5/d23/cb7 154408 1 2026-03-10T08:55:37.078 INFO:tasks.workunit.client.1.vm08.stdout:3/619: truncate d4/d15/d8/d2c/d9b/d79/f3c 3535424 0 2026-03-10T08:55:37.079 INFO:tasks.workunit.client.1.vm08.stdout:8/708: rmdir d1/d10/d9/dd/d18/d34/dd0 39 2026-03-10T08:55:37.080 INFO:tasks.workunit.client.1.vm08.stdout:8/709: dread - d1/d10/d9/d4d/db2/f103 zero size 2026-03-10T08:55:37.081 INFO:tasks.workunit.client.1.vm08.stdout:5/614: rename d0/d11/d3e/d45/c8d to d0/d11/d27/cb8 0 2026-03-10T08:55:37.083 INFO:tasks.workunit.client.1.vm08.stdout:3/620: mkdir d4/d6f/d85/dd3 0 2026-03-10T08:55:37.084 INFO:tasks.workunit.client.1.vm08.stdout:9/656: sync 2026-03-10T08:55:37.085 INFO:tasks.workunit.client.1.vm08.stdout:9/657: readlink d2/dd/d15/d1e/d24/lcc 0 2026-03-10T08:55:37.086 INFO:tasks.workunit.client.1.vm08.stdout:8/710: creat d1/da8/f108 x:0 0 0 2026-03-10T08:55:37.086 INFO:tasks.workunit.client.1.vm08.stdout:5/615: fsync d0/d46/f51 0 2026-03-10T08:55:37.097 INFO:tasks.workunit.client.1.vm08.stdout:9/658: chown d2/d54/cb2 4 1 2026-03-10T08:55:37.098 INFO:tasks.workunit.client.1.vm08.stdout:4/727: creat d5/d23/f10d x:0 0 0 2026-03-10T08:55:37.098 INFO:tasks.workunit.client.1.vm08.stdout:3/621: mknod d4/d6f/d85/dd3/cd4 0 2026-03-10T08:55:37.098 INFO:tasks.workunit.client.1.vm08.stdout:8/711: fdatasync d1/d10/d9/dd/d18/d34/dd0/fe1 0 2026-03-10T08:55:37.099 INFO:tasks.workunit.client.1.vm08.stdout:8/712: chown d1/d10/d9/dd/d9a/da6/cc0 10337786 1 2026-03-10T08:55:37.110 INFO:tasks.workunit.client.1.vm08.stdout:3/622: unlink 
d4/d15/d8/d2c/d9b/d79/d20/l29 0 2026-03-10T08:55:37.111 INFO:tasks.workunit.client.1.vm08.stdout:4/728: rename d5/d23/d36/d99/db2/d5a/c8b to d5/d23/d36/d99/c10e 0 2026-03-10T08:55:37.113 INFO:tasks.workunit.client.1.vm08.stdout:3/623: mknod d4/d6f/cd5 0 2026-03-10T08:55:37.114 INFO:tasks.workunit.client.1.vm08.stdout:4/729: creat d5/f10f x:0 0 0 2026-03-10T08:55:37.114 INFO:tasks.workunit.client.1.vm08.stdout:8/713: symlink d1/d10/d9/dd/d18/dff/l109 0 2026-03-10T08:55:37.116 INFO:tasks.workunit.client.1.vm08.stdout:8/714: fsync d1/d10/d9/dd/d18/d3c/f4e 0 2026-03-10T08:55:37.120 INFO:tasks.workunit.client.1.vm08.stdout:8/715: dread - d1/d10/d9/dd/d25/d27/d44/d21/d5f/fd4 zero size 2026-03-10T08:55:37.122 INFO:tasks.workunit.client.0.vm05.stdout:8/470: truncate d2/db/d1f/f44 229146 0 2026-03-10T08:55:37.122 INFO:tasks.workunit.client.1.vm08.stdout:4/730: symlink d5/d23/d36/l110 0 2026-03-10T08:55:37.125 INFO:tasks.workunit.client.1.vm08.stdout:4/731: symlink d5/d23/d49/l111 0 2026-03-10T08:55:37.130 INFO:tasks.workunit.client.0.vm05.stdout:5/411: dread d5/d86/d24/d2c/d41/f4c [0,4194304] 0 2026-03-10T08:55:37.132 INFO:tasks.workunit.client.0.vm05.stdout:3/515: write d9/d2b/d53/d61/f69 [551304,88303] 0 2026-03-10T08:55:37.135 INFO:tasks.workunit.client.0.vm05.stdout:9/394: dread d6/d19/d21/f32 [0,4194304] 0 2026-03-10T08:55:37.136 INFO:tasks.workunit.client.0.vm05.stdout:9/395: write d6/d19/d2c/f54 [4092012,7873] 0 2026-03-10T08:55:37.142 INFO:tasks.workunit.client.1.vm08.stdout:0/628: truncate d6/dd/d13/d17/d1f/d2d/fa0 817344 0 2026-03-10T08:55:37.150 INFO:tasks.workunit.client.0.vm05.stdout:5/412: creat d5/d86/d66/f94 x:0 0 0 2026-03-10T08:55:37.150 INFO:tasks.workunit.client.0.vm05.stdout:3/516: creat d9/d2b/d53/f93 x:0 0 0 2026-03-10T08:55:37.150 INFO:tasks.workunit.client.1.vm08.stdout:4/732: truncate d5/d23/d36/d76/fa7 4599110 0 2026-03-10T08:55:37.150 INFO:tasks.workunit.client.1.vm08.stdout:7/714: dwrite d0/d11/d1f/f90 [4194304,4194304] 0 2026-03-10T08:55:37.156 
INFO:tasks.workunit.client.0.vm05.stdout:3/517: dread d9/f29 [0,4194304] 0 2026-03-10T08:55:37.161 INFO:tasks.workunit.client.1.vm08.stdout:0/629: mknod d6/dd/d13/d17/d1f/d2d/d85/d93/ccf 0 2026-03-10T08:55:37.165 INFO:tasks.workunit.client.0.vm05.stdout:6/539: write d4/d2c/f86 [478196,94786] 0 2026-03-10T08:55:37.171 INFO:tasks.workunit.client.1.vm08.stdout:4/733: fsync d5/d23/d36/d99/db2/d5d/fc5 0 2026-03-10T08:55:37.172 INFO:tasks.workunit.client.0.vm05.stdout:9/396: creat d6/d19/d2a/f87 x:0 0 0 2026-03-10T08:55:37.175 INFO:tasks.workunit.client.0.vm05.stdout:8/471: creat d2/dd/d2c/d2e/d31/d4f/da3/faa x:0 0 0 2026-03-10T08:55:37.176 INFO:tasks.workunit.client.0.vm05.stdout:6/540: dread d4/f6c [0,4194304] 0 2026-03-10T08:55:37.176 INFO:tasks.workunit.client.0.vm05.stdout:6/541: chown d4/d7/d10/d15/d1b/d22/f56 195874687 1 2026-03-10T08:55:37.177 INFO:tasks.workunit.client.0.vm05.stdout:5/413: mkdir d5/d48/d64/d95 0 2026-03-10T08:55:37.178 INFO:tasks.workunit.client.0.vm05.stdout:2/412: write d0/f56 [614472,3841] 0 2026-03-10T08:55:37.183 INFO:tasks.workunit.client.1.vm08.stdout:0/630: unlink d6/dd/d13/d17/cc1 0 2026-03-10T08:55:37.184 INFO:tasks.workunit.client.1.vm08.stdout:0/631: write d6/dd/d13/d17/d50/f71 [2569576,39618] 0 2026-03-10T08:55:37.184 INFO:tasks.workunit.client.1.vm08.stdout:8/716: dread d1/d10/f3b [0,4194304] 0 2026-03-10T08:55:37.187 INFO:tasks.workunit.client.1.vm08.stdout:7/715: mknod d0/d11/d1f/d29/d3b/ce6 0 2026-03-10T08:55:37.195 INFO:tasks.workunit.client.0.vm05.stdout:1/546: dwrite dd/d10/d18/d2d/d51/f6e [0,4194304] 0 2026-03-10T08:55:37.195 INFO:tasks.workunit.client.0.vm05.stdout:1/547: stat dd/d10/d18/d2d/d5c 0 2026-03-10T08:55:37.199 INFO:tasks.workunit.client.0.vm05.stdout:0/466: rename df/d18 to df/d1f/d85 0 2026-03-10T08:55:37.204 INFO:tasks.workunit.client.0.vm05.stdout:3/518: mknod d9/d2b/c94 0 2026-03-10T08:55:37.204 INFO:tasks.workunit.client.0.vm05.stdout:7/384: dwrite d18/d38/f55 [0,4194304] 0 2026-03-10T08:55:37.205 
INFO:tasks.workunit.client.1.vm08.stdout:7/716: mkdir d0/d14/d43/de7 0 2026-03-10T08:55:37.206 INFO:tasks.workunit.client.0.vm05.stdout:8/472: creat d2/dd/d2c/d2e/d31/d4f/d7b/d9e/fab x:0 0 0 2026-03-10T08:55:37.206 INFO:tasks.workunit.client.1.vm08.stdout:7/717: chown d0/d11/d1f/d29/fba 82 1 2026-03-10T08:55:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:36 vm05.local ceph-mon[49713]: pgmap v157: 65 pgs: 65 active+clean; 2.2 GiB data, 7.9 GiB used, 112 GiB / 120 GiB avail; 47 MiB/s rd, 128 MiB/s wr, 279 op/s 2026-03-10T08:55:37.215 INFO:tasks.workunit.client.1.vm08.stdout:7/718: rename d0/d11/d1f/d29/d3d/d40/cbf to d0/d11/d1f/d29/d3b/ce8 0 2026-03-10T08:55:37.215 INFO:tasks.workunit.client.0.vm05.stdout:0/467: creat df/d1f/d85/d19/d39/f86 x:0 0 0 2026-03-10T08:55:37.217 INFO:tasks.workunit.client.0.vm05.stdout:3/519: unlink d9/d2b/d3a/d43/c48 0 2026-03-10T08:55:37.219 INFO:tasks.workunit.client.0.vm05.stdout:1/548: read dd/d21/f48 [4329853,92189] 0 2026-03-10T08:55:37.222 INFO:tasks.workunit.client.0.vm05.stdout:3/520: dwrite d9/d2b/d2f/d57/f90 [0,4194304] 0 2026-03-10T08:55:37.222 INFO:tasks.workunit.client.0.vm05.stdout:7/385: mkdir d18/d66/d25/d2e/d2f/d6d 0 2026-03-10T08:55:37.224 INFO:tasks.workunit.client.0.vm05.stdout:8/473: truncate d2/dd/f3f 458063 0 2026-03-10T08:55:37.237 INFO:tasks.workunit.client.0.vm05.stdout:2/413: truncate d0/d9/d1e/f34 3360182 0 2026-03-10T08:55:37.238 INFO:tasks.workunit.client.1.vm08.stdout:7/719: dread d0/d11/d1f/d29/fcc [0,4194304] 0 2026-03-10T08:55:37.239 INFO:tasks.workunit.client.1.vm08.stdout:6/692: write d9/d10/f67 [907410,65093] 0 2026-03-10T08:55:37.242 INFO:tasks.workunit.client.0.vm05.stdout:0/468: fsync df/d1f/d85/f53 0 2026-03-10T08:55:37.243 INFO:tasks.workunit.client.0.vm05.stdout:9/397: getdents d6/d27 0 2026-03-10T08:55:37.244 INFO:tasks.workunit.client.0.vm05.stdout:9/398: write d6/d12/d3a/f5e [442670,114485] 0 2026-03-10T08:55:37.245 INFO:tasks.workunit.client.1.vm08.stdout:2/722: dwrite 
d1/da/d10/d42/d93/d23/d9e/fa1 [0,4194304] 0 2026-03-10T08:55:37.249 INFO:tasks.workunit.client.0.vm05.stdout:1/549: fsync dd/d10/d19/f2e 0 2026-03-10T08:55:37.257 INFO:tasks.workunit.client.0.vm05.stdout:7/386: mkdir d18/d38/d43/d6e 0 2026-03-10T08:55:37.258 INFO:tasks.workunit.client.1.vm08.stdout:6/693: read d9/d13/f35 [2992063,13160] 0 2026-03-10T08:55:37.258 INFO:tasks.workunit.client.0.vm05.stdout:3/521: creat d9/d4d/f95 x:0 0 0 2026-03-10T08:55:37.258 INFO:tasks.workunit.client.0.vm05.stdout:7/387: fdatasync d18/d66/d25/f56 0 2026-03-10T08:55:37.258 INFO:tasks.workunit.client.0.vm05.stdout:3/522: fsync d9/d2b/d3a/f44 0 2026-03-10T08:55:37.259 INFO:tasks.workunit.client.0.vm05.stdout:3/523: dread - d9/d2b/d53/d61/f87 zero size 2026-03-10T08:55:37.260 INFO:tasks.workunit.client.0.vm05.stdout:0/469: mkdir df/d1f/d85/d2b/d27/d32/d4e/d87 0 2026-03-10T08:55:37.261 INFO:tasks.workunit.client.0.vm05.stdout:9/399: truncate d6/d19/d2a/f4d 753232 0 2026-03-10T08:55:37.261 INFO:tasks.workunit.client.0.vm05.stdout:1/550: mkdir dd/d10/d19/d9b/dc3 0 2026-03-10T08:55:37.261 INFO:tasks.workunit.client.0.vm05.stdout:9/400: stat d6/d15/d37 0 2026-03-10T08:55:37.262 INFO:tasks.workunit.client.0.vm05.stdout:9/401: write d6/f16 [2538512,124247] 0 2026-03-10T08:55:37.268 INFO:tasks.workunit.client.1.vm08.stdout:6/694: chown d9/d10/d1e/l82 6688 1 2026-03-10T08:55:37.268 INFO:tasks.workunit.client.0.vm05.stdout:3/524: truncate d9/d8f/d50/f72 828881 0 2026-03-10T08:55:37.269 INFO:tasks.workunit.client.1.vm08.stdout:6/695: dwrite d9/dc/d84/fae [0,4194304] 0 2026-03-10T08:55:37.272 INFO:tasks.workunit.client.0.vm05.stdout:8/474: symlink d2/dd/d2c/lac 0 2026-03-10T08:55:37.273 INFO:tasks.workunit.client.1.vm08.stdout:6/696: dread d9/dc/d84/fae [0,4194304] 0 2026-03-10T08:55:37.282 INFO:tasks.workunit.client.0.vm05.stdout:0/470: read df/d1f/d85/d19/d39/f61 [1486930,65066] 0 2026-03-10T08:55:37.285 INFO:tasks.workunit.client.0.vm05.stdout:9/402: creat d6/d19/d2a/d4a/f88 x:0 0 0 
2026-03-10T08:55:37.289 INFO:tasks.workunit.client.0.vm05.stdout:4/487: dwrite d0/d2e/d42/d45/f5f [0,4194304] 0 2026-03-10T08:55:37.292 INFO:tasks.workunit.client.0.vm05.stdout:3/525: mkdir d9/d2b/d2f/d96 0 2026-03-10T08:55:37.294 INFO:tasks.workunit.client.0.vm05.stdout:0/471: symlink df/d1f/d85/d19/d39/d74/d67/l88 0 2026-03-10T08:55:37.295 INFO:tasks.workunit.client.0.vm05.stdout:9/403: mknod d6/d19/d2a/d4a/c89 0 2026-03-10T08:55:37.296 INFO:tasks.workunit.client.0.vm05.stdout:7/388: creat d18/d66/d25/d2e/f6f x:0 0 0 2026-03-10T08:55:37.303 INFO:tasks.workunit.client.0.vm05.stdout:0/472: dread df/d1f/d85/f2a [0,4194304] 0 2026-03-10T08:55:37.309 INFO:tasks.workunit.client.0.vm05.stdout:4/488: mknod d0/d2c/ca0 0 2026-03-10T08:55:37.309 INFO:tasks.workunit.client.0.vm05.stdout:9/404: readlink d6/ld 0 2026-03-10T08:55:37.309 INFO:tasks.workunit.client.0.vm05.stdout:0/473: fdatasync df/f17 0 2026-03-10T08:55:37.309 INFO:tasks.workunit.client.0.vm05.stdout:0/474: chown df/d1f/d85/f53 7 1 2026-03-10T08:55:37.309 INFO:tasks.workunit.client.0.vm05.stdout:0/475: truncate df/d1f/d85/d19/d5b/f6c 921818 0 2026-03-10T08:55:37.313 INFO:tasks.workunit.client.0.vm05.stdout:4/489: symlink d0/d2e/d9d/la1 0 2026-03-10T08:55:37.313 INFO:tasks.workunit.client.1.vm08.stdout:1/705: write d1/da/d18/fb1 [404674,63] 0 2026-03-10T08:55:37.316 INFO:tasks.workunit.client.1.vm08.stdout:1/706: dread - d1/da/de/d24/d35/d6d/fa8 zero size 2026-03-10T08:55:37.322 INFO:tasks.workunit.client.1.vm08.stdout:5/616: dwrite d0/d11/d27/d50/fa1 [0,4194304] 0 2026-03-10T08:55:37.322 INFO:tasks.workunit.client.0.vm05.stdout:9/405: rename d6/fe to d6/d19/d21/f8a 0 2026-03-10T08:55:37.322 INFO:tasks.workunit.client.0.vm05.stdout:0/476: mknod df/d1f/d48/c89 0 2026-03-10T08:55:37.322 INFO:tasks.workunit.client.0.vm05.stdout:0/477: read df/d59/f45 [501039,22635] 0 2026-03-10T08:55:37.322 INFO:tasks.workunit.client.0.vm05.stdout:0/478: stat df/f79 0 2026-03-10T08:55:37.322 
INFO:tasks.workunit.client.0.vm05.stdout:0/479: stat df/d1f/c41 0 2026-03-10T08:55:37.322 INFO:tasks.workunit.client.0.vm05.stdout:2/414: link d0/d9/d1e/l6a d0/d9/d1e/d20/d21/d45/l73 0 2026-03-10T08:55:37.323 INFO:tasks.workunit.client.0.vm05.stdout:7/389: link d18/d66/d25/d2e/d2f/f33 d18/d66/f70 0 2026-03-10T08:55:37.335 INFO:tasks.workunit.client.0.vm05.stdout:9/406: symlink d6/d19/d2c/l8b 0 2026-03-10T08:55:37.335 INFO:tasks.workunit.client.0.vm05.stdout:9/407: write d6/d12/d3a/f62 [242775,20956] 0 2026-03-10T08:55:37.336 INFO:tasks.workunit.client.1.vm08.stdout:5/617: mkdir d0/d11/d18/d52/db9 0 2026-03-10T08:55:37.339 INFO:tasks.workunit.client.1.vm08.stdout:1/707: fsync d1/da/de/d24/d3d/d40/d8e/dd2/f8b 0 2026-03-10T08:55:37.340 INFO:tasks.workunit.client.0.vm05.stdout:2/415: write d0/d9/d1e/d20/d21/d45/d6c/d6e/f66 [782827,15221] 0 2026-03-10T08:55:37.344 INFO:tasks.workunit.client.0.vm05.stdout:9/408: write d6/f3f [1975323,21760] 0 2026-03-10T08:55:37.348 INFO:tasks.workunit.client.1.vm08.stdout:1/708: mkdir d1/dde/df8 0 2026-03-10T08:55:37.348 INFO:tasks.workunit.client.0.vm05.stdout:2/416: mknod d0/d9/d1e/d20/d21/d45/d4b/c74 0 2026-03-10T08:55:37.348 INFO:tasks.workunit.client.0.vm05.stdout:9/409: mkdir d6/d19/d2a/d4a/d8c 0 2026-03-10T08:55:37.359 INFO:tasks.workunit.client.1.vm08.stdout:5/618: dread d0/d1b/f2f [0,4194304] 0 2026-03-10T08:55:37.360 INFO:tasks.workunit.client.0.vm05.stdout:2/417: dread d0/d9/d1e/d20/f47 [0,4194304] 0 2026-03-10T08:55:37.368 INFO:tasks.workunit.client.0.vm05.stdout:7/390: dread d18/d66/f2d [0,4194304] 0 2026-03-10T08:55:37.370 INFO:tasks.workunit.client.1.vm08.stdout:5/619: rename d0/d11/d27/d68/d7c/d4b/d4e/da5/lac to d0/d11/d3e/d45/lba 0 2026-03-10T08:55:37.371 INFO:tasks.workunit.client.0.vm05.stdout:7/391: dwrite d18/f1d [0,4194304] 0 2026-03-10T08:55:37.372 INFO:tasks.workunit.client.0.vm05.stdout:7/392: readlink d18/l1e 0 2026-03-10T08:55:37.374 INFO:tasks.workunit.client.0.vm05.stdout:7/393: creat d18/d66/d25/d2e/d42/f71 
x:0 0 0 2026-03-10T08:55:37.385 INFO:tasks.workunit.client.0.vm05.stdout:7/394: creat d18/d66/d25/d2e/d32/f72 x:0 0 0 2026-03-10T08:55:37.395 INFO:tasks.workunit.client.0.vm05.stdout:7/395: symlink d18/d66/d25/d2e/l73 0 2026-03-10T08:55:37.395 INFO:tasks.workunit.client.0.vm05.stdout:7/396: mkdir d18/d66/d25/d2e/d42/d74 0 2026-03-10T08:55:37.396 INFO:tasks.workunit.client.1.vm08.stdout:1/709: dread d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fb9 [0,4194304] 0 2026-03-10T08:55:37.396 INFO:tasks.workunit.client.1.vm08.stdout:1/710: symlink d1/da/d18/d3b/lf9 0 2026-03-10T08:55:37.396 INFO:tasks.workunit.client.1.vm08.stdout:1/711: write d1/da/de/d24/d35/d6d/d82/da2/fcd [187555,31286] 0 2026-03-10T08:55:37.399 INFO:tasks.workunit.client.1.vm08.stdout:9/659: dwrite d2/f4 [0,4194304] 0 2026-03-10T08:55:37.400 INFO:tasks.workunit.client.0.vm05.stdout:7/397: mknod d18/d66/d25/d2e/c75 0 2026-03-10T08:55:37.400 INFO:tasks.workunit.client.1.vm08.stdout:1/712: fsync d1/da/d20/d9e/fc0 0 2026-03-10T08:55:37.400 INFO:tasks.workunit.client.0.vm05.stdout:7/398: write d18/d38/d43/d5c/f67 [374725,38598] 0 2026-03-10T08:55:37.401 INFO:tasks.workunit.client.1.vm08.stdout:9/660: stat d2/dd/d15/d1e/d39/lb5 0 2026-03-10T08:55:37.403 INFO:tasks.workunit.client.1.vm08.stdout:1/713: symlink d1/da/d20/d3f/d49/lfa 0 2026-03-10T08:55:37.411 INFO:tasks.workunit.client.1.vm08.stdout:1/714: chown d1/da/d20/d3f/d49/lfa 17574 1 2026-03-10T08:55:37.411 INFO:tasks.workunit.client.0.vm05.stdout:7/399: getdents d18/d38 0 2026-03-10T08:55:37.411 INFO:tasks.workunit.client.0.vm05.stdout:7/400: creat d18/d38/d43/d6e/f76 x:0 0 0 2026-03-10T08:55:37.411 INFO:tasks.workunit.client.0.vm05.stdout:7/401: mknod d18/d66/c77 0 2026-03-10T08:55:37.412 INFO:tasks.workunit.client.1.vm08.stdout:1/715: link d1/da/de/d24/d35/d6d/d82/da2/ff5 d1/da/de/d24/d35/d43/ffb 0 2026-03-10T08:55:37.412 INFO:tasks.workunit.client.1.vm08.stdout:9/661: link d2/dd/d15/d1e/d39/d69/fda d2/dd/d15/d1e/d25/d98/d9d/fe3 0 2026-03-10T08:55:37.412 
INFO:tasks.workunit.client.0.vm05.stdout:7/402: unlink d18/d66/d25/d2e/d42/d53/f6b 0 2026-03-10T08:55:37.413 INFO:tasks.workunit.client.1.vm08.stdout:1/716: stat d1/da/d18/c52 0 2026-03-10T08:55:37.413 INFO:tasks.workunit.client.1.vm08.stdout:9/662: write d2/dd/d61/fbb [929759,10289] 0 2026-03-10T08:55:37.414 INFO:tasks.workunit.client.0.vm05.stdout:7/403: truncate d18/d66/d25/d2e/d32/f72 255672 0 2026-03-10T08:55:37.414 INFO:tasks.workunit.client.1.vm08.stdout:1/717: truncate d1/fe6 1297758 0 2026-03-10T08:55:37.416 INFO:tasks.workunit.client.0.vm05.stdout:7/404: unlink d18/d1b/c3b 0 2026-03-10T08:55:37.417 INFO:tasks.workunit.client.0.vm05.stdout:7/405: mkdir d18/d66/d78 0 2026-03-10T08:55:37.420 INFO:tasks.workunit.client.0.vm05.stdout:7/406: dwrite d18/d38/d43/d5c/f5f [0,4194304] 0 2026-03-10T08:55:37.422 INFO:tasks.workunit.client.0.vm05.stdout:9/410: sync 2026-03-10T08:55:37.424 INFO:tasks.workunit.client.0.vm05.stdout:9/411: mkdir d6/d19/d2a/d8d 0 2026-03-10T08:55:37.424 INFO:tasks.workunit.client.0.vm05.stdout:9/412: stat d6/d12/f34 0 2026-03-10T08:55:37.430 INFO:tasks.workunit.client.0.vm05.stdout:7/407: mkdir d18/d66/d79 0 2026-03-10T08:55:37.434 INFO:tasks.workunit.client.0.vm05.stdout:9/413: rename d6/d19/l1f to d6/d12/d43/l8e 0 2026-03-10T08:55:37.435 INFO:tasks.workunit.client.0.vm05.stdout:7/408: symlink d18/d1b/l7a 0 2026-03-10T08:55:37.436 INFO:tasks.workunit.client.0.vm05.stdout:7/409: fdatasync d18/d1b/f30 0 2026-03-10T08:55:37.438 INFO:tasks.workunit.client.0.vm05.stdout:9/414: dread d6/d12/d3a/d48/f65 [0,4194304] 0 2026-03-10T08:55:37.440 INFO:tasks.workunit.client.0.vm05.stdout:7/410: creat d18/d66/d25/d2e/d42/d74/f7b x:0 0 0 2026-03-10T08:55:37.447 INFO:tasks.workunit.client.1.vm08.stdout:3/624: dwrite d4/d15/d8/d2c/d55/f61 [0,4194304] 0 2026-03-10T08:55:37.447 INFO:tasks.workunit.client.1.vm08.stdout:3/625: unlink d4/d15/d8/l36 0 2026-03-10T08:55:37.448 INFO:tasks.workunit.client.0.vm05.stdout:9/415: rename d6/d15/d3c/l50 to d6/d19/l8f 0 
2026-03-10T08:55:37.452 INFO:tasks.workunit.client.1.vm08.stdout:3/626: rename d4/d15/d8/d2c/d9b/d79/f59 to d4/d15/d8/d71/fd6 0 2026-03-10T08:55:37.453 INFO:tasks.workunit.client.0.vm05.stdout:9/416: mkdir d6/d15/d3c/d4b/d90 0 2026-03-10T08:55:37.455 INFO:tasks.workunit.client.0.vm05.stdout:7/411: getdents d18 0 2026-03-10T08:55:37.456 INFO:tasks.workunit.client.0.vm05.stdout:9/417: dwrite d6/d15/d3c/f6b [0,4194304] 0 2026-03-10T08:55:37.460 INFO:tasks.workunit.client.1.vm08.stdout:1/718: sync 2026-03-10T08:55:37.461 INFO:tasks.workunit.client.0.vm05.stdout:9/418: read d6/d15/d35/f38 [1318982,5529] 0 2026-03-10T08:55:37.467 INFO:tasks.workunit.client.1.vm08.stdout:3/627: creat d4/d15/d8/d2c/d9b/d79/d8f/fd7 x:0 0 0 2026-03-10T08:55:37.479 INFO:tasks.workunit.client.1.vm08.stdout:1/719: rmdir d1/da/d18/d3a/d77 39 2026-03-10T08:55:37.479 INFO:tasks.workunit.client.1.vm08.stdout:1/720: chown d1/da/d18/f48 1026597 1 2026-03-10T08:55:37.479 INFO:tasks.workunit.client.1.vm08.stdout:1/721: symlink d1/da/de/d24/d3d/d40/lfc 0 2026-03-10T08:55:37.479 INFO:tasks.workunit.client.1.vm08.stdout:1/722: write d1/da/d20/d3f/d49/d63/fc9 [994471,108712] 0 2026-03-10T08:55:37.491 INFO:tasks.workunit.client.1.vm08.stdout:3/628: dread d4/f18 [0,4194304] 0 2026-03-10T08:55:37.493 INFO:tasks.workunit.client.1.vm08.stdout:3/629: rename d4/d15/d8/d2c/d55/l66 to d4/ld8 0 2026-03-10T08:55:37.499 INFO:tasks.workunit.client.1.vm08.stdout:3/630: dwrite d4/d15/d8/d2c/d9b/f86 [0,4194304] 0 2026-03-10T08:55:37.499 INFO:tasks.workunit.client.1.vm08.stdout:3/631: readlink d4/l3b 0 2026-03-10T08:55:37.502 INFO:tasks.workunit.client.1.vm08.stdout:3/632: truncate d4/d15/d8/f83 847047 0 2026-03-10T08:55:37.506 INFO:tasks.workunit.client.1.vm08.stdout:3/633: mknod d4/d15/d8/d2c/d55/cd9 0 2026-03-10T08:55:37.524 INFO:tasks.workunit.client.0.vm05.stdout:5/414: write d5/d86/f59 [278007,36460] 0 2026-03-10T08:55:37.524 INFO:tasks.workunit.client.1.vm08.stdout:4/734: write d5/d23/d36/d76/fa5 [446899,69986] 0 
2026-03-10T08:55:37.527 INFO:tasks.workunit.client.1.vm08.stdout:8/717: write d1/d10/d9/dd/d25/d27/d44/d21/d5f/fbd [901613,21024] 0 2026-03-10T08:55:37.528 INFO:tasks.workunit.client.0.vm05.stdout:6/542: write d4/d2d/d5f/f9f [919598,24004] 0 2026-03-10T08:55:37.534 INFO:tasks.workunit.client.1.vm08.stdout:0/632: dwrite d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c [0,4194304] 0 2026-03-10T08:55:37.544 INFO:tasks.workunit.client.0.vm05.stdout:6/543: chown d4/d2c/d84/d4a/l8e 171587612 1 2026-03-10T08:55:37.546 INFO:tasks.workunit.client.0.vm05.stdout:5/415: rename d5/d86/d24/d2c/d41/d74/l8b to d5/d48/d64/l96 0 2026-03-10T08:55:37.548 INFO:tasks.workunit.client.0.vm05.stdout:6/544: fdatasync d4/d7/d10/d15/f2a 0 2026-03-10T08:55:37.548 INFO:tasks.workunit.client.0.vm05.stdout:6/545: chown d4/d2d/l7e 9766 1 2026-03-10T08:55:37.550 INFO:tasks.workunit.client.0.vm05.stdout:5/416: rename d5/d86/d21/c67 to d5/d86/d24/d2c/d41/c97 0 2026-03-10T08:55:37.551 INFO:tasks.workunit.client.1.vm08.stdout:0/633: dread d6/dd/d13/d17/d1f/f67 [0,4194304] 0 2026-03-10T08:55:37.552 INFO:tasks.workunit.client.1.vm08.stdout:8/718: read d1/d2c/f30 [253383,58942] 0 2026-03-10T08:55:37.562 INFO:tasks.workunit.client.0.vm05.stdout:5/417: dread d5/f9 [4194304,4194304] 0 2026-03-10T08:55:37.568 INFO:tasks.workunit.client.1.vm08.stdout:7/720: write d0/d11/f6a [2165930,88018] 0 2026-03-10T08:55:37.570 INFO:tasks.workunit.client.0.vm05.stdout:5/418: symlink d5/d86/d24/d84/l98 0 2026-03-10T08:55:37.572 INFO:tasks.workunit.client.1.vm08.stdout:7/721: symlink d0/d11/le9 0 2026-03-10T08:55:37.575 INFO:tasks.workunit.client.0.vm05.stdout:5/419: mknod d5/d48/d64/d95/c99 0 2026-03-10T08:55:37.582 INFO:tasks.workunit.client.1.vm08.stdout:2/723: write d1/da/d10/d42/fda [1197790,15530] 0 2026-03-10T08:55:37.582 INFO:tasks.workunit.client.1.vm08.stdout:7/722: unlink d0/d11/db2/l8a 0 2026-03-10T08:55:37.583 INFO:tasks.workunit.client.1.vm08.stdout:7/723: fsync d0/d14/f12 0 2026-03-10T08:55:37.583 
INFO:tasks.workunit.client.1.vm08.stdout:0/634: dread d6/dd/d13/d17/d1f/d20/d2f/d57/f65 [0,4194304] 0 2026-03-10T08:55:37.583 INFO:tasks.workunit.client.0.vm05.stdout:5/420: rename d5/d86/f44 to d5/d86/d21/d71/f9a 0 2026-03-10T08:55:37.590 INFO:tasks.workunit.client.0.vm05.stdout:5/421: mknod d5/d86/d24/c9b 0 2026-03-10T08:55:37.593 INFO:tasks.workunit.client.1.vm08.stdout:2/724: read d1/da/f50 [2009859,58442] 0 2026-03-10T08:55:37.593 INFO:tasks.workunit.client.1.vm08.stdout:6/697: write d9/d10/f25 [17039,23811] 0 2026-03-10T08:55:37.596 INFO:tasks.workunit.client.0.vm05.stdout:1/551: dwrite dd/d10/d18/f36 [4194304,4194304] 0 2026-03-10T08:55:37.596 INFO:tasks.workunit.client.1.vm08.stdout:8/719: dread d1/d10/d9/f5b [0,4194304] 0 2026-03-10T08:55:37.599 INFO:tasks.workunit.client.1.vm08.stdout:2/725: read d1/da/d10/d2d/fa2 [1284299,15282] 0 2026-03-10T08:55:37.601 INFO:tasks.workunit.client.1.vm08.stdout:0/635: mknod d6/dd/d13/d17/d1f/d2d/d38/cd0 0 2026-03-10T08:55:37.607 INFO:tasks.workunit.client.0.vm05.stdout:8/475: dwrite d2/dd/f26 [0,4194304] 0 2026-03-10T08:55:37.608 INFO:tasks.workunit.client.0.vm05.stdout:8/476: chown d2/db/f22 2819072 1 2026-03-10T08:55:37.611 INFO:tasks.workunit.client.1.vm08.stdout:7/724: getdents d0/d11/d1f/d29/d3d/d40/ddc 0 2026-03-10T08:55:37.616 INFO:tasks.workunit.client.1.vm08.stdout:6/698: dread d9/dc/d84/f5e [0,4194304] 0 2026-03-10T08:55:37.617 INFO:tasks.workunit.client.1.vm08.stdout:6/699: fsync d9/d10/d1e/d32/f27 0 2026-03-10T08:55:37.617 INFO:tasks.workunit.client.1.vm08.stdout:6/700: readlink d9/dc/d11/d23/d2c/le2 0 2026-03-10T08:55:37.627 INFO:tasks.workunit.client.1.vm08.stdout:8/720: rmdir d1/d10/d9/dd/d25/d27/d44/d21 39 2026-03-10T08:55:37.628 INFO:tasks.workunit.client.1.vm08.stdout:2/726: rename d1/da/d10/d42/d93/d1e/f84 to d1/da/d78/fed 0 2026-03-10T08:55:37.629 INFO:tasks.workunit.client.1.vm08.stdout:2/727: fsync d1/d5b/f80 0 2026-03-10T08:55:37.629 INFO:tasks.workunit.client.0.vm05.stdout:3/526: write 
d9/d8f/d50/f7c [625670,58403] 0 2026-03-10T08:55:37.630 INFO:tasks.workunit.client.0.vm05.stdout:3/527: chown d9/d4d/f88 53657 1 2026-03-10T08:55:37.633 INFO:tasks.workunit.client.0.vm05.stdout:0/480: dwrite df/f17 [0,4194304] 0 2026-03-10T08:55:37.648 INFO:tasks.workunit.client.1.vm08.stdout:7/725: creat d0/d11/d4a/d95/fea x:0 0 0 2026-03-10T08:55:37.656 INFO:tasks.workunit.client.0.vm05.stdout:4/490: write d0/d2e/d42/d45/d4a/f86 [1153352,131060] 0 2026-03-10T08:55:37.658 INFO:tasks.workunit.client.1.vm08.stdout:6/701: symlink d9/dc/d11/d23/d2c/d81/d63/dcf/le6 0 2026-03-10T08:55:37.660 INFO:tasks.workunit.client.0.vm05.stdout:1/552: link dd/d10/d18/d2d/f93 dd/d10/d19/d4d/fc4 0 2026-03-10T08:55:37.663 INFO:tasks.workunit.client.0.vm05.stdout:0/481: mkdir df/d1f/d85/d19/d47/d84/d8a 0 2026-03-10T08:55:37.665 INFO:tasks.workunit.client.0.vm05.stdout:0/482: dread df/f17 [0,4194304] 0 2026-03-10T08:55:37.666 INFO:tasks.workunit.client.0.vm05.stdout:0/483: fsync df/d1f/d85/d2b/d27/f2e 0 2026-03-10T08:55:37.667 INFO:tasks.workunit.client.0.vm05.stdout:3/528: sync 2026-03-10T08:55:37.668 INFO:tasks.workunit.client.0.vm05.stdout:4/491: rename d0/d2e/d42/d45/d4a/d36/d37/d9c/f3a to d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/fa2 0 2026-03-10T08:55:37.680 INFO:tasks.workunit.client.0.vm05.stdout:1/553: creat dd/d10/d18/d2d/d5c/fc5 x:0 0 0 2026-03-10T08:55:37.680 INFO:tasks.workunit.client.0.vm05.stdout:1/554: dread - dd/d10/fb5 zero size 2026-03-10T08:55:37.687 INFO:tasks.workunit.client.1.vm08.stdout:0/636: dread d6/dd/d13/d17/d50/f71 [4194304,4194304] 0 2026-03-10T08:55:37.687 INFO:tasks.workunit.client.0.vm05.stdout:2/418: truncate d0/d9/d1e/d20/d21/f35 343365 0 2026-03-10T08:55:37.688 INFO:tasks.workunit.client.1.vm08.stdout:5/620: write d0/d46/f5f [3767363,75316] 0 2026-03-10T08:55:37.689 INFO:tasks.workunit.client.1.vm08.stdout:6/702: mkdir d9/d10/de7 0 2026-03-10T08:55:37.693 INFO:tasks.workunit.client.0.vm05.stdout:0/484: creat df/d1f/d85/d19/d39/d4d/d50/f8b x:0 0 0 
2026-03-10T08:55:37.696 INFO:tasks.workunit.client.1.vm08.stdout:2/728: mknod d1/da/d10/d42/cee 0 2026-03-10T08:55:37.696 INFO:tasks.workunit.client.0.vm05.stdout:3/529: rmdir d9/d4d 39 2026-03-10T08:55:37.697 INFO:tasks.workunit.client.1.vm08.stdout:5/621: dread d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:37.704 INFO:tasks.workunit.client.1.vm08.stdout:0/637: rename d6/dd/d13/d17/d1f/d2d/f45 to d6/dd/d13/d61/dc7/fd1 0 2026-03-10T08:55:37.704 INFO:tasks.workunit.client.0.vm05.stdout:1/555: mknod dd/d21/d37/d7c/dab/cc6 0 2026-03-10T08:55:37.712 INFO:tasks.workunit.client.0.vm05.stdout:3/530: dread d9/d2b/d2f/f33 [0,4194304] 0 2026-03-10T08:55:37.717 INFO:tasks.workunit.client.1.vm08.stdout:2/729: mkdir d1/da/d10/d42/d93/d1e/dce/d52/db3/def 0 2026-03-10T08:55:37.718 INFO:tasks.workunit.client.0.vm05.stdout:0/485: creat df/d1f/d85/d2b/d27/d32/d4e/f8c x:0 0 0 2026-03-10T08:55:37.721 INFO:tasks.workunit.client.1.vm08.stdout:5/622: rmdir d0 39 2026-03-10T08:55:37.721 INFO:tasks.workunit.client.0.vm05.stdout:4/492: symlink d0/d2e/d71/la3 0 2026-03-10T08:55:37.722 INFO:tasks.workunit.client.0.vm05.stdout:4/493: truncate d0/d2e/d71/f90 916491 0 2026-03-10T08:55:37.723 INFO:tasks.workunit.client.1.vm08.stdout:9/663: truncate d2/dd/d15/d1e/d21/fc5 2213532 0 2026-03-10T08:55:37.724 INFO:tasks.workunit.client.1.vm08.stdout:0/638: creat d6/dd/d13/d17/d1f/d2d/d85/d95/fd2 x:0 0 0 2026-03-10T08:55:37.729 INFO:tasks.workunit.client.1.vm08.stdout:6/703: fsync d9/d10/d1e/f9a 0 2026-03-10T08:55:37.741 INFO:tasks.workunit.client.1.vm08.stdout:8/721: getdents d1/d10/d9/dd/d18 0 2026-03-10T08:55:37.743 INFO:tasks.workunit.client.1.vm08.stdout:2/730: truncate d1/da/d10/d1b/fc6 1905600 0 2026-03-10T08:55:37.748 INFO:tasks.workunit.client.1.vm08.stdout:6/704: dread d9/fc5 [0,4194304] 0 2026-03-10T08:55:37.748 INFO:tasks.workunit.client.0.vm05.stdout:7/412: dwrite d18/d66/d25/d2e/d2f/f59 [0,4194304] 0 2026-03-10T08:55:37.751 INFO:tasks.workunit.client.1.vm08.stdout:9/664: mkdir 
d2/dd/d15/d1e/d39/d69/de4 0 2026-03-10T08:55:37.754 INFO:tasks.workunit.client.0.vm05.stdout:9/419: write d6/d19/d21/f2f [1153700,15836] 0 2026-03-10T08:55:37.756 INFO:tasks.workunit.client.1.vm08.stdout:0/639: symlink d6/dd/d13/d17/d1f/d20/d2f/d24/ld3 0 2026-03-10T08:55:37.758 INFO:tasks.workunit.client.1.vm08.stdout:2/731: dread d1/d5b/d66/f5e [0,4194304] 0 2026-03-10T08:55:37.763 INFO:tasks.workunit.client.1.vm08.stdout:1/723: truncate d1/da/de/d24/d3d/d40/f42 3952004 0 2026-03-10T08:55:37.763 INFO:tasks.workunit.client.1.vm08.stdout:1/724: chown d1/da/d18/f1d 17390 1 2026-03-10T08:55:37.765 INFO:tasks.workunit.client.0.vm05.stdout:2/419: unlink d0/d9/d1e/d20/d24/c48 0 2026-03-10T08:55:37.766 INFO:tasks.workunit.client.0.vm05.stdout:2/420: write d0/f8 [1696971,40955] 0 2026-03-10T08:55:37.768 INFO:tasks.workunit.client.0.vm05.stdout:0/486: creat df/d1f/d85/d2b/d27/d32/d4e/d87/f8d x:0 0 0 2026-03-10T08:55:37.770 INFO:tasks.workunit.client.1.vm08.stdout:4/735: write d5/d23/d36/d99/db2/d5d/f61 [1998984,82030] 0 2026-03-10T08:55:37.771 INFO:tasks.workunit.client.1.vm08.stdout:3/634: dwrite d4/d15/f4b [4194304,4194304] 0 2026-03-10T08:55:37.773 INFO:tasks.workunit.client.0.vm05.stdout:9/420: dread d6/d15/d3c/d4b/f5b [0,4194304] 0 2026-03-10T08:55:37.774 INFO:tasks.workunit.client.1.vm08.stdout:9/665: mkdir d2/dd/d15/d1e/d25/d32/d5c/de5 0 2026-03-10T08:55:37.775 INFO:tasks.workunit.client.0.vm05.stdout:7/413: symlink d18/d38/d43/d6e/l7c 0 2026-03-10T08:55:37.775 INFO:tasks.workunit.client.0.vm05.stdout:7/414: dread - d18/d66/d25/f56 zero size 2026-03-10T08:55:37.776 INFO:tasks.workunit.client.0.vm05.stdout:7/415: stat d18/f1d 0 2026-03-10T08:55:37.786 INFO:tasks.workunit.client.0.vm05.stdout:6/546: write d4/d7/f54 [2122903,37861] 0 2026-03-10T08:55:37.793 INFO:tasks.workunit.client.0.vm05.stdout:3/531: dread d9/d4d/f5e [0,4194304] 0 2026-03-10T08:55:37.799 INFO:tasks.workunit.client.0.vm05.stdout:2/421: mkdir d0/d9/d1e/d20/d21/d45/d4b/d75 0 2026-03-10T08:55:37.800 
INFO:tasks.workunit.client.0.vm05.stdout:3/532: dread d9/ff [4194304,4194304] 0 2026-03-10T08:55:37.812 INFO:tasks.workunit.client.0.vm05.stdout:6/547: dread d4/d2c/d84/d4a/f63 [4194304,4194304] 0 2026-03-10T08:55:37.813 INFO:tasks.workunit.client.0.vm05.stdout:6/548: truncate d4/d2d/d5f/f6d 825577 0 2026-03-10T08:55:37.821 INFO:tasks.workunit.client.0.vm05.stdout:9/421: rename d6/f30 to d6/d12/d43/f91 0 2026-03-10T08:55:37.822 INFO:tasks.workunit.client.1.vm08.stdout:6/705: mknod d9/dc/d11/d23/d2c/dc0/ce8 0 2026-03-10T08:55:37.824 INFO:tasks.workunit.client.1.vm08.stdout:6/706: dread d9/dc/d84/f5e [0,4194304] 0 2026-03-10T08:55:37.824 INFO:tasks.workunit.client.1.vm08.stdout:5/623: link d0/d11/d27/d68/d7c/d4b/d4e/f89 d0/d11/d27/d68/d7c/d4b/d4e/d84/fbb 0 2026-03-10T08:55:37.833 INFO:tasks.workunit.client.0.vm05.stdout:8/477: dwrite d2/dd/d2c/d2e/d31/d4f/d7b/f8a [0,4194304] 0 2026-03-10T08:55:37.836 INFO:tasks.workunit.client.0.vm05.stdout:5/422: truncate d5/d86/d21/d71/f9a 1474627 0 2026-03-10T08:55:37.845 INFO:tasks.workunit.client.0.vm05.stdout:4/494: getdents d0/d1d 0 2026-03-10T08:55:37.845 INFO:tasks.workunit.client.1.vm08.stdout:1/725: fdatasync d1/da/de/d24/d3d/d40/f42 0 2026-03-10T08:55:37.845 INFO:tasks.workunit.client.1.vm08.stdout:6/707: mkdir d9/d50/de9 0 2026-03-10T08:55:37.846 INFO:tasks.workunit.client.0.vm05.stdout:4/495: truncate d0/d2e/d42/d45/f5f 4556586 0 2026-03-10T08:55:37.848 INFO:tasks.workunit.client.1.vm08.stdout:1/726: write d1/da/de/d24/d3d/d40/d92/ff7 [811150,130105] 0 2026-03-10T08:55:37.851 INFO:tasks.workunit.client.0.vm05.stdout:6/549: rmdir d4/d2d/d51/d62 39 2026-03-10T08:55:37.859 INFO:tasks.workunit.client.0.vm05.stdout:6/550: write d4/d7/f54 [85119,40151] 0 2026-03-10T08:55:37.859 INFO:tasks.workunit.client.1.vm08.stdout:7/726: dwrite d0/d11/d1f/d29/d3b/f4c [0,4194304] 0 2026-03-10T08:55:37.859 INFO:tasks.workunit.client.0.vm05.stdout:9/422: write d6/d19/f5c [3221768,9840] 0 2026-03-10T08:55:37.862 
INFO:tasks.workunit.client.0.vm05.stdout:7/416: symlink d18/d66/d25/d2e/d2f/d6d/l7d 0 2026-03-10T08:55:37.866 INFO:tasks.workunit.client.0.vm05.stdout:2/422: symlink d0/d9/d1e/d20/d21/d45/d6c/d6e/d6d/l76 0 2026-03-10T08:55:37.870 INFO:tasks.workunit.client.1.vm08.stdout:9/666: symlink d2/dd/d15/d1e/d25/le6 0 2026-03-10T08:55:37.875 INFO:tasks.workunit.client.1.vm08.stdout:5/624: mknod d0/cbc 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.1.vm08.stdout:3/635: rename d4/d15/d8/d2c/d55/f60 to d4/d15/fda 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.1.vm08.stdout:7/727: creat d0/d11/d1f/d29/d3b/d80/feb x:0 0 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.1.vm08.stdout:6/708: mkdir d9/d50/de9/dea 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.0.vm05.stdout:1/556: write fb [2352500,102536] 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.0.vm05.stdout:5/423: truncate d5/d86/d24/d2c/d41/f4d 2433738 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.0.vm05.stdout:0/487: creat df/d1f/d85/d19/f8e x:0 0 0 2026-03-10T08:55:37.883 INFO:tasks.workunit.client.0.vm05.stdout:0/488: write df/d1f/d85/d19/d5b/f72 [1107551,7454] 0 2026-03-10T08:55:37.884 INFO:tasks.workunit.client.0.vm05.stdout:4/496: fdatasync d0/d1d/f22 0 2026-03-10T08:55:37.888 INFO:tasks.workunit.client.1.vm08.stdout:5/625: truncate d0/d11/f60 1952897 0 2026-03-10T08:55:37.889 INFO:tasks.workunit.client.1.vm08.stdout:2/732: link d1/da/d10/d2d/c35 d1/da/d10/cf0 0 2026-03-10T08:55:37.889 INFO:tasks.workunit.client.1.vm08.stdout:7/728: creat d0/d14/d43/d62/fec x:0 0 0 2026-03-10T08:55:37.890 INFO:tasks.workunit.client.1.vm08.stdout:8/722: truncate d1/d10/d9/dd/f8f 65131 0 2026-03-10T08:55:37.890 INFO:tasks.workunit.client.0.vm05.stdout:9/423: sync 2026-03-10T08:55:37.895 INFO:tasks.workunit.client.1.vm08.stdout:0/640: dwrite d6/dd/d13/d17/d1f/d20/f43 [0,4194304] 0 2026-03-10T08:55:37.905 INFO:tasks.workunit.client.1.vm08.stdout:2/733: symlink d1/d43/d5c/lf1 0 2026-03-10T08:55:37.905 
INFO:tasks.workunit.client.1.vm08.stdout:9/667: truncate d2/dd/d15/d1e/d25/d32/d79/d85/fdf 3080484 0 2026-03-10T08:55:37.907 INFO:tasks.workunit.client.1.vm08.stdout:4/736: dwrite d5/d23/d36/f92 [0,4194304] 0 2026-03-10T08:55:37.908 INFO:tasks.workunit.client.1.vm08.stdout:0/641: stat d6/dd/d13/d61/c74 0 2026-03-10T08:55:37.913 INFO:tasks.workunit.client.1.vm08.stdout:7/729: link d0/d11/d4a/d95/fea d0/d11/d4a/d5e/fed 0 2026-03-10T08:55:37.919 INFO:tasks.workunit.client.1.vm08.stdout:4/737: truncate d5/f6b 470317 0 2026-03-10T08:55:37.919 INFO:tasks.workunit.client.0.vm05.stdout:7/417: creat d18/d66/d25/d2e/d42/d53/f7e x:0 0 0 2026-03-10T08:55:37.919 INFO:tasks.workunit.client.0.vm05.stdout:2/423: write d0/fa [6847951,10330] 0 2026-03-10T08:55:37.920 INFO:tasks.workunit.client.1.vm08.stdout:6/709: getdents d9/d50/d95 0 2026-03-10T08:55:37.922 INFO:tasks.workunit.client.0.vm05.stdout:8/478: link d2/dd/d2c/d2e/f6a d2/db/d1f/d67/d8d/fad 0 2026-03-10T08:55:37.925 INFO:tasks.workunit.client.0.vm05.stdout:1/557: symlink dd/d13/lc7 0 2026-03-10T08:55:37.929 INFO:tasks.workunit.client.1.vm08.stdout:7/730: creat d0/d11/d1f/d29/d3d/d89/fee x:0 0 0 2026-03-10T08:55:37.932 INFO:tasks.workunit.client.1.vm08.stdout:4/738: mknod d5/d23/d49/c112 0 2026-03-10T08:55:37.932 INFO:tasks.workunit.client.1.vm08.stdout:3/636: sync 2026-03-10T08:55:37.936 INFO:tasks.workunit.client.0.vm05.stdout:0/489: creat df/d1f/d85/d19/d47/f8f x:0 0 0 2026-03-10T08:55:37.939 INFO:tasks.workunit.client.1.vm08.stdout:6/710: mknod d9/dc/d84/d80/ceb 0 2026-03-10T08:55:37.940 INFO:tasks.workunit.client.1.vm08.stdout:9/668: creat d2/fe7 x:0 0 0 2026-03-10T08:55:37.940 INFO:tasks.workunit.client.0.vm05.stdout:6/551: creat d4/d92/db0/fb7 x:0 0 0 2026-03-10T08:55:37.943 INFO:tasks.workunit.client.1.vm08.stdout:7/731: creat d0/d11/d1f/d29/d36/d75/fef x:0 0 0 2026-03-10T08:55:37.944 INFO:tasks.workunit.client.0.vm05.stdout:2/424: creat d0/d9/d1e/d20/d21/f77 x:0 0 0 2026-03-10T08:55:37.945 
INFO:tasks.workunit.client.1.vm08.stdout:7/732: chown d0/d11/d4a 0 1 2026-03-10T08:55:37.945 INFO:tasks.workunit.client.0.vm05.stdout:8/479: stat d2/c1c 0 2026-03-10T08:55:37.945 INFO:tasks.workunit.client.0.vm05.stdout:1/558: chown dd/d10/d18/d2d/d51/d58/d71/c4b 3 1 2026-03-10T08:55:37.946 INFO:tasks.workunit.client.0.vm05.stdout:1/559: chown dd/d10/d18/d2d/d5c/f8e 0 1 2026-03-10T08:55:37.946 INFO:tasks.workunit.client.1.vm08.stdout:6/711: truncate d9/dc/d11/f55 715865 0 2026-03-10T08:55:37.947 INFO:tasks.workunit.client.1.vm08.stdout:9/669: symlink d2/dd/d61/le8 0 2026-03-10T08:55:37.949 INFO:tasks.workunit.client.0.vm05.stdout:1/560: dwrite dd/d21/d37/f72 [0,4194304] 0 2026-03-10T08:55:37.950 INFO:tasks.workunit.client.0.vm05.stdout:1/561: chown dd/le 3652 1 2026-03-10T08:55:37.956 INFO:tasks.workunit.client.1.vm08.stdout:7/733: dread - d0/d14/d43/fa4 zero size 2026-03-10T08:55:37.957 INFO:tasks.workunit.client.0.vm05.stdout:8/480: creat d2/db/d28/fae x:0 0 0 2026-03-10T08:55:37.958 INFO:tasks.workunit.client.0.vm05.stdout:8/481: write d2/dd/f26 [159454,45502] 0 2026-03-10T08:55:37.960 INFO:tasks.workunit.client.1.vm08.stdout:3/637: dread d4/d15/d8/d2c/f3d [8388608,4194304] 0 2026-03-10T08:55:37.960 INFO:tasks.workunit.client.1.vm08.stdout:9/670: dwrite d2/dd/d15/d1e/d24/f3f [0,4194304] 0 2026-03-10T08:55:37.961 INFO:tasks.workunit.client.0.vm05.stdout:5/424: link d5/d48/d64/f83 d5/f9c 0 2026-03-10T08:55:37.962 INFO:tasks.workunit.client.1.vm08.stdout:2/734: dread d1/da/d10/d1b/f14 [0,4194304] 0 2026-03-10T08:55:37.965 INFO:tasks.workunit.client.0.vm05.stdout:1/562: creat dd/d10/d19/d27/fc8 x:0 0 0 2026-03-10T08:55:37.967 INFO:tasks.workunit.client.0.vm05.stdout:9/424: link f4 d6/d15/d3c/d4b/d82/f92 0 2026-03-10T08:55:37.968 INFO:tasks.workunit.client.1.vm08.stdout:9/671: fdatasync d2/f9f 0 2026-03-10T08:55:37.968 INFO:tasks.workunit.client.0.vm05.stdout:9/425: fsync d6/d15/d3c/d4b/f67 0 2026-03-10T08:55:37.969 INFO:tasks.workunit.client.1.vm08.stdout:9/672: 
chown d2/dd/d15/d1e/d25/d32/f60 9147 1 2026-03-10T08:55:37.971 INFO:tasks.workunit.client.0.vm05.stdout:8/482: mkdir d2/dd/d2c/d2e/d31/d4c/d63/daf 0 2026-03-10T08:55:37.971 INFO:tasks.workunit.client.1.vm08.stdout:7/734: dwrite d0/d14/d43/d62/fb5 [4194304,4194304] 0 2026-03-10T08:55:37.976 INFO:tasks.workunit.client.0.vm05.stdout:8/483: dwrite d2/dd/d2c/d2e/d31/d3e/d5d/f92 [0,4194304] 0 2026-03-10T08:55:37.982 INFO:tasks.workunit.client.1.vm08.stdout:3/638: unlink d4/d15/d8/d71/fd6 0 2026-03-10T08:55:37.984 INFO:tasks.workunit.client.1.vm08.stdout:7/735: rmdir d0/d14 39 2026-03-10T08:55:37.985 INFO:tasks.workunit.client.1.vm08.stdout:9/673: mknod d2/dd/d15/d1e/ce9 0 2026-03-10T08:55:37.991 INFO:tasks.workunit.client.0.vm05.stdout:8/484: unlink d2/d45/f43 0 2026-03-10T08:55:37.995 INFO:tasks.workunit.client.0.vm05.stdout:3/533: dwrite d9/f20 [0,4194304] 0 2026-03-10T08:55:38.004 INFO:tasks.workunit.client.1.vm08.stdout:3/639: creat d4/d6f/d85/dd3/fdb x:0 0 0 2026-03-10T08:55:38.014 INFO:tasks.workunit.client.1.vm08.stdout:3/640: rename d4/d15/d8/c7a to d4/d15/d8/d71/cdc 0 2026-03-10T08:55:38.014 INFO:tasks.workunit.client.0.vm05.stdout:0/490: getdents df/d1f/d85/d19/d39/d4d/d50 0 2026-03-10T08:55:38.014 INFO:tasks.workunit.client.0.vm05.stdout:9/426: mkdir d6/d15/d3c/d4b/d90/d93 0 2026-03-10T08:55:38.019 INFO:tasks.workunit.client.0.vm05.stdout:8/485: mknod d2/db/d47/cb0 0 2026-03-10T08:55:38.020 INFO:tasks.workunit.client.0.vm05.stdout:5/425: creat d5/d86/f9d x:0 0 0 2026-03-10T08:55:38.020 INFO:tasks.workunit.client.0.vm05.stdout:5/426: write d5/d86/f59 [163940,100206] 0 2026-03-10T08:55:38.026 INFO:tasks.workunit.client.0.vm05.stdout:9/427: unlink d6/d19/d21/c28 0 2026-03-10T08:55:38.030 INFO:tasks.workunit.client.0.vm05.stdout:5/427: dread d5/d86/f1b [0,4194304] 0 2026-03-10T08:55:38.032 INFO:tasks.workunit.client.0.vm05.stdout:5/428: dread d5/d86/d24/d2c/d41/f4c [0,4194304] 0 2026-03-10T08:55:38.045 INFO:tasks.workunit.client.1.vm08.stdout:2/735: dread 
d1/da/d10/d42/d93/f8f [0,4194304] 0 2026-03-10T08:55:38.045 INFO:tasks.workunit.client.1.vm08.stdout:2/736: dread - d1/da/d10/d42/d93/d1e/d7b/fea zero size 2026-03-10T08:55:38.047 INFO:tasks.workunit.client.1.vm08.stdout:2/737: chown d1/da/d10/cf0 229 1 2026-03-10T08:55:38.048 INFO:tasks.workunit.client.1.vm08.stdout:2/738: mknod d1/da/d10/d42/cf2 0 2026-03-10T08:55:38.057 INFO:tasks.workunit.client.0.vm05.stdout:3/534: sync 2026-03-10T08:55:38.057 INFO:tasks.workunit.client.0.vm05.stdout:9/428: sync 2026-03-10T08:55:38.058 INFO:tasks.workunit.client.0.vm05.stdout:9/429: readlink d6/d15/l68 0 2026-03-10T08:55:38.058 INFO:tasks.workunit.client.0.vm05.stdout:3/535: chown d9/d4d/d51 8 1 2026-03-10T08:55:38.058 INFO:tasks.workunit.client.0.vm05.stdout:9/430: dread - d6/d12/f74 zero size 2026-03-10T08:55:38.059 INFO:tasks.workunit.client.0.vm05.stdout:9/431: fdatasync d6/d19/d2a/f87 0 2026-03-10T08:55:38.064 INFO:tasks.workunit.client.0.vm05.stdout:9/432: sync 2026-03-10T08:55:38.067 INFO:tasks.workunit.client.1.vm08.stdout:1/727: dwrite d1/da/d20/d3f/d49/f9a [0,4194304] 0 2026-03-10T08:55:38.071 INFO:tasks.workunit.client.0.vm05.stdout:9/433: symlink d6/d15/d3c/d4b/l94 0 2026-03-10T08:55:38.073 INFO:tasks.workunit.client.1.vm08.stdout:5/626: write d0/d11/d27/d68/d7c/f6f [1590301,104866] 0 2026-03-10T08:55:38.073 INFO:tasks.workunit.client.1.vm08.stdout:1/728: creat d1/da/de/d24/d3d/d40/d8e/dd2/ffd x:0 0 0 2026-03-10T08:55:38.074 INFO:tasks.workunit.client.1.vm08.stdout:1/729: chown d1/da/de/d24/d3d/lbc 3 1 2026-03-10T08:55:38.076 INFO:tasks.workunit.client.0.vm05.stdout:9/434: dwrite d6/d19/f5c [0,4194304] 0 2026-03-10T08:55:38.077 INFO:tasks.workunit.client.1.vm08.stdout:1/730: readlink d1/da/de/d24/d3d/d40/d92/lb3 0 2026-03-10T08:55:38.085 INFO:tasks.workunit.client.0.vm05.stdout:9/435: mknod d6/d15/d35/c95 0 2026-03-10T08:55:38.085 INFO:tasks.workunit.client.0.vm05.stdout:9/436: fsync d6/f7f 0 2026-03-10T08:55:38.093 INFO:tasks.workunit.client.1.vm08.stdout:8/723: 
truncate d1/d10/d9/dd/d13/f24 530744 0 2026-03-10T08:55:38.094 INFO:tasks.workunit.client.0.vm05.stdout:9/437: unlink d6/d12/c7e 0 2026-03-10T08:55:38.100 INFO:tasks.workunit.client.0.vm05.stdout:9/438: creat d6/d15/f96 x:0 0 0 2026-03-10T08:55:38.103 INFO:tasks.workunit.client.1.vm08.stdout:0/642: dwrite d6/dd/d13/d17/d1f/d20/f3e [0,4194304] 0 2026-03-10T08:55:38.103 INFO:tasks.workunit.client.0.vm05.stdout:9/439: dwrite d6/d19/d21/f31 [4194304,4194304] 0 2026-03-10T08:55:38.121 INFO:tasks.workunit.client.0.vm05.stdout:9/440: sync 2026-03-10T08:55:38.124 INFO:tasks.workunit.client.0.vm05.stdout:9/441: mknod d6/d12/d3a/d48/c97 0 2026-03-10T08:55:38.135 INFO:tasks.workunit.client.0.vm05.stdout:7/418: dwrite d18/d66/d25/d2e/d32/f35 [0,4194304] 0 2026-03-10T08:55:38.147 INFO:tasks.workunit.client.0.vm05.stdout:7/419: dread d18/d66/d25/d2e/d42/f5a [0,4194304] 0 2026-03-10T08:55:38.147 INFO:tasks.workunit.client.1.vm08.stdout:6/712: write d9/d10/d1e/d7b/fbc [195342,10861] 0 2026-03-10T08:55:38.152 INFO:tasks.workunit.client.0.vm05.stdout:7/420: mkdir d18/d66/d25/d2e/d32/d7f 0 2026-03-10T08:55:38.154 INFO:tasks.workunit.client.1.vm08.stdout:0/643: rmdir d6/dd/d13/d17/d1f/d20 39 2026-03-10T08:55:38.155 INFO:tasks.workunit.client.0.vm05.stdout:7/421: rename d18/d66/d25/d2e/d32/l41 to d18/d38/d43/d5c/l80 0 2026-03-10T08:55:38.156 INFO:tasks.workunit.client.1.vm08.stdout:1/731: getdents d1/da/d18 0 2026-03-10T08:55:38.158 INFO:tasks.workunit.client.1.vm08.stdout:6/713: getdents d9/d10/de7 0 2026-03-10T08:55:38.159 INFO:tasks.workunit.client.0.vm05.stdout:7/422: rename d18/d66/d25/d2e/l37 to d18/d66/d25/d2e/l81 0 2026-03-10T08:55:38.162 INFO:tasks.workunit.client.1.vm08.stdout:0/644: creat d6/dd/d13/d17/d1f/d2d/fd4 x:0 0 0 2026-03-10T08:55:38.162 INFO:tasks.workunit.client.1.vm08.stdout:0/645: chown d6/dd/d13/d17/d1f/d2d/l9a 31456595 1 2026-03-10T08:55:38.166 INFO:tasks.workunit.client.1.vm08.stdout:0/646: read d6/dd/d13/d17/d1f/d2d/fa0 [552187,25103] 0 
2026-03-10T08:55:38.166 INFO:tasks.workunit.client.0.vm05.stdout:7/423: getdents d18/d66/d25/d2e/d42/d74 0 2026-03-10T08:55:38.166 INFO:tasks.workunit.client.0.vm05.stdout:7/424: creat d18/d38/f82 x:0 0 0 2026-03-10T08:55:38.166 INFO:tasks.workunit.client.0.vm05.stdout:7/425: fsync d18/d38/f5d 0 2026-03-10T08:55:38.166 INFO:tasks.workunit.client.0.vm05.stdout:7/426: symlink d18/d66/d25/d2e/d32/l83 0 2026-03-10T08:55:38.166 INFO:tasks.workunit.client.0.vm05.stdout:7/427: stat d18/d38/f55 0 2026-03-10T08:55:38.167 INFO:tasks.workunit.client.1.vm08.stdout:0/647: mkdir d6/dd/d13/d17/d1f/d20/d2f/d57/dd5 0 2026-03-10T08:55:38.170 INFO:tasks.workunit.client.0.vm05.stdout:7/428: dwrite d18/d38/d43/d5c/f67 [0,4194304] 0 2026-03-10T08:55:38.170 INFO:tasks.workunit.client.0.vm05.stdout:7/429: chown d18/d38/c5b 10382 1 2026-03-10T08:55:38.170 INFO:tasks.workunit.client.1.vm08.stdout:0/648: write d6/dd/d13/d17/d1f/d2d/d85/d93/f7e [1455498,55476] 0 2026-03-10T08:55:38.175 INFO:tasks.workunit.client.1.vm08.stdout:1/732: sync 2026-03-10T08:55:38.177 INFO:tasks.workunit.client.1.vm08.stdout:1/733: chown d1/da/de/d24/d3d/d40/d8e/fc1 94281183 1 2026-03-10T08:55:38.178 INFO:tasks.workunit.client.1.vm08.stdout:0/649: rename d6/dd/d13/d61/d6f/fa5 to d6/dd/d13/d17/d1f/d2d/d85/d95/fd6 0 2026-03-10T08:55:38.179 INFO:tasks.workunit.client.1.vm08.stdout:1/734: fdatasync d1/da/d18/d3b/d62/fd9 0 2026-03-10T08:55:38.182 INFO:tasks.workunit.client.1.vm08.stdout:1/735: getdents d1/da/d20/d91/d83/df4/d4e 0 2026-03-10T08:55:38.195 INFO:tasks.workunit.client.1.vm08.stdout:0/650: sync 2026-03-10T08:55:38.196 INFO:tasks.workunit.client.1.vm08.stdout:0/651: write d6/dd/d13/d61/fba [740288,125601] 0 2026-03-10T08:55:38.227 INFO:tasks.workunit.client.1.vm08.stdout:4/739: write d5/d23/d49/d8f/da4/f10a [1124713,55934] 0 2026-03-10T08:55:38.229 INFO:tasks.workunit.client.1.vm08.stdout:4/740: chown d5/d23/d36/d99/dc6/df1 517953234 1 2026-03-10T08:55:38.232 INFO:tasks.workunit.client.1.vm08.stdout:4/741: 
chown d5/d23/l46 0 1 2026-03-10T08:55:38.233 INFO:tasks.workunit.client.1.vm08.stdout:4/742: read - d5/d23/d36/d99/db2/d5d/dae/ddf/fbe zero size 2026-03-10T08:55:38.236 INFO:tasks.workunit.client.1.vm08.stdout:4/743: creat d5/d23/d36/f113 x:0 0 0 2026-03-10T08:55:38.236 INFO:tasks.workunit.client.0.vm05.stdout:4/497: dread d0/fb [4194304,4194304] 0 2026-03-10T08:55:38.237 INFO:tasks.workunit.client.0.vm05.stdout:4/498: chown d0/d2e/d71/c8b 64640322 1 2026-03-10T08:55:38.238 INFO:tasks.workunit.client.1.vm08.stdout:4/744: mkdir d5/de/d114 0 2026-03-10T08:55:38.240 INFO:tasks.workunit.client.0.vm05.stdout:4/499: getdents d0/d2e/d42/d45/d4a/d36/d37/d9c 0 2026-03-10T08:55:38.241 INFO:tasks.workunit.client.0.vm05.stdout:4/500: fsync d0/d2e/d42/d45/d4a/d36/d37/d9c/f29 0 2026-03-10T08:55:38.243 INFO:tasks.workunit.client.0.vm05.stdout:6/552: truncate d4/d2d/f2f 3090610 0 2026-03-10T08:55:38.246 INFO:tasks.workunit.client.0.vm05.stdout:1/563: write dd/d21/f3e [3491527,91962] 0 2026-03-10T08:55:38.247 INFO:tasks.workunit.client.0.vm05.stdout:1/564: truncate dd/d21/fb8 4853851 0 2026-03-10T08:55:38.250 INFO:tasks.workunit.client.1.vm08.stdout:9/674: write d2/dd/d15/d1e/d25/dae/f8f [11137,109769] 0 2026-03-10T08:55:38.250 INFO:tasks.workunit.client.0.vm05.stdout:1/565: dwrite dd/d10/f22 [8388608,4194304] 0 2026-03-10T08:55:38.251 INFO:tasks.workunit.client.1.vm08.stdout:7/736: write d0/d14/f98 [528202,42616] 0 2026-03-10T08:55:38.253 INFO:tasks.workunit.client.1.vm08.stdout:2/739: write d1/d43/f4b [527865,11377] 0 2026-03-10T08:55:38.254 INFO:tasks.workunit.client.0.vm05.stdout:8/486: write d2/dd/d2c/f86 [715238,4340] 0 2026-03-10T08:55:38.256 INFO:tasks.workunit.client.1.vm08.stdout:3/641: dwrite d4/d15/d8/d2c/d9b/d79/f3c [0,4194304] 0 2026-03-10T08:55:38.258 INFO:tasks.workunit.client.0.vm05.stdout:0/491: dwrite fe [0,4194304] 0 2026-03-10T08:55:38.260 INFO:tasks.workunit.client.0.vm05.stdout:3/536: dwrite d9/fa [0,4194304] 0 2026-03-10T08:55:38.260 
INFO:tasks.workunit.client.1.vm08.stdout:7/737: dwrite d0/d14/d43/d62/fb5 [4194304,4194304] 0 2026-03-10T08:55:38.261 INFO:tasks.workunit.client.0.vm05.stdout:3/537: write d9/d2b/d2f/f33 [3769834,83319] 0 2026-03-10T08:55:38.262 INFO:tasks.workunit.client.0.vm05.stdout:3/538: write d9/d8f/d55/f8c [539005,74206] 0 2026-03-10T08:55:38.263 INFO:tasks.workunit.client.1.vm08.stdout:3/642: write d4/d15/d8/d2c/d6d/fd1 [693889,42048] 0 2026-03-10T08:55:38.267 INFO:tasks.workunit.client.0.vm05.stdout:0/492: truncate df/d1f/d85/d19/d39/f86 242272 0 2026-03-10T08:55:38.282 INFO:tasks.workunit.client.0.vm05.stdout:2/425: dread d0/d9/d1e/d20/d21/d45/d4b/f58 [0,4194304] 0 2026-03-10T08:55:38.283 INFO:tasks.workunit.client.0.vm05.stdout:3/539: sync 2026-03-10T08:55:38.283 INFO:tasks.workunit.client.0.vm05.stdout:0/493: sync 2026-03-10T08:55:38.283 INFO:tasks.workunit.client.0.vm05.stdout:3/540: chown d9/d2b/d3a 124 1 2026-03-10T08:55:38.291 INFO:tasks.workunit.client.0.vm05.stdout:4/501: rename d0/d2e/d42/d45/l48 to d0/d2e/d9d/la4 0 2026-03-10T08:55:38.291 INFO:tasks.workunit.client.0.vm05.stdout:6/553: creat d4/d7/d10/d15/d1b/fb8 x:0 0 0 2026-03-10T08:55:38.294 INFO:tasks.workunit.client.1.vm08.stdout:9/675: truncate d2/f9f 345024 0 2026-03-10T08:55:38.299 INFO:tasks.workunit.client.1.vm08.stdout:2/740: dread - d1/da/d10/dd3/fd9 zero size 2026-03-10T08:55:38.308 INFO:tasks.workunit.client.0.vm05.stdout:1/566: mkdir dd/d21/d37/d7c/dc9 0 2026-03-10T08:55:38.308 INFO:tasks.workunit.client.0.vm05.stdout:8/487: chown d2/c7 8020 1 2026-03-10T08:55:38.308 INFO:tasks.workunit.client.1.vm08.stdout:2/741: write d1/dd5/fe3 [312097,62441] 0 2026-03-10T08:55:38.308 INFO:tasks.workunit.client.1.vm08.stdout:5/627: dwrite d0/d11/f25 [0,4194304] 0 2026-03-10T08:55:38.316 INFO:tasks.workunit.client.1.vm08.stdout:9/676: sync 2026-03-10T08:55:38.318 INFO:tasks.workunit.client.1.vm08.stdout:4/745: link d5/d23/d49/d8f/l91 d5/d23/d36/d99/db2/d5d/l115 0 2026-03-10T08:55:38.319 
INFO:tasks.workunit.client.0.vm05.stdout:0/494: mkdir df/d1f/d85/d19/d39/d74/d90 0 2026-03-10T08:55:38.326 INFO:tasks.workunit.client.0.vm05.stdout:3/541: rename d9/d4d/d51/d64/l66 to d9/d2b/d3a/d43/d71/l97 0 2026-03-10T08:55:38.327 INFO:tasks.workunit.client.0.vm05.stdout:2/426: read d0/d9/d1e/d20/d21/d45/d6c/d6e/f54 [4028268,2511] 0 2026-03-10T08:55:38.329 INFO:tasks.workunit.client.0.vm05.stdout:6/554: rmdir d4/d7/d10/d15 39 2026-03-10T08:55:38.329 INFO:tasks.workunit.client.0.vm05.stdout:6/555: chown d4/d2d/d5f/f81 693324718 1 2026-03-10T08:55:38.330 INFO:tasks.workunit.client.0.vm05.stdout:4/502: fsync d0/d2e/d42/d45/d4a/d36/d37/d9c/f28 0 2026-03-10T08:55:38.333 INFO:tasks.workunit.client.0.vm05.stdout:8/488: symlink d2/dd/d2c/d2e/d31/d3e/d5d/lb1 0 2026-03-10T08:55:38.337 INFO:tasks.workunit.client.0.vm05.stdout:8/489: dwrite d2/dd/d2c/d2e/d31/d4f/d80/f9f [0,4194304] 0 2026-03-10T08:55:38.338 INFO:tasks.workunit.client.0.vm05.stdout:8/490: write d2/dd/f1a [2530071,96869] 0 2026-03-10T08:55:38.338 INFO:tasks.workunit.client.0.vm05.stdout:8/491: write d2/dd/f26 [2733306,70793] 0 2026-03-10T08:55:38.340 INFO:tasks.workunit.client.0.vm05.stdout:9/442: write d6/d12/d43/f52 [431619,59226] 0 2026-03-10T08:55:38.341 INFO:tasks.workunit.client.0.vm05.stdout:9/443: chown d6/d19/d21/f7d 365 1 2026-03-10T08:55:38.344 INFO:tasks.workunit.client.0.vm05.stdout:3/542: read d9/d2b/d53/f60 [2483542,41947] 0 2026-03-10T08:55:38.346 INFO:tasks.workunit.client.0.vm05.stdout:1/567: truncate dd/d10/d18/d2d/d51/f6b 819145 0 2026-03-10T08:55:38.348 INFO:tasks.workunit.client.1.vm08.stdout:5/628: creat d0/d11/d27/d68/d7c/d4b/d4e/fbd x:0 0 0 2026-03-10T08:55:38.349 INFO:tasks.workunit.client.1.vm08.stdout:8/724: write d1/d10/d9/dd/d25/d27/d44/d21/d51/f56 [1800493,66628] 0 2026-03-10T08:55:38.351 INFO:tasks.workunit.client.0.vm05.stdout:6/556: fdatasync d4/f11 0 2026-03-10T08:55:38.351 INFO:tasks.workunit.client.1.vm08.stdout:4/746: symlink d5/d23/d36/d99/dc6/dc8/l116 0 
2026-03-10T08:55:38.353 INFO:tasks.workunit.client.1.vm08.stdout:5/629: dwrite d0/d11/d3e/d45/fad [0,4194304] 0 2026-03-10T08:55:38.359 INFO:tasks.workunit.client.0.vm05.stdout:8/492: creat d2/db/d1f/fb2 x:0 0 0 2026-03-10T08:55:38.359 INFO:tasks.workunit.client.0.vm05.stdout:8/493: write d2/dd/d2c/d2e/f64 [1539942,53207] 0 2026-03-10T08:55:38.360 INFO:tasks.workunit.client.1.vm08.stdout:2/742: creat d1/da/d10/d2d/db6/ff3 x:0 0 0 2026-03-10T08:55:38.360 INFO:tasks.workunit.client.1.vm08.stdout:4/747: readlink d5/d23/d36/d99/lf3 0 2026-03-10T08:55:38.360 INFO:tasks.workunit.client.0.vm05.stdout:9/444: mknod d6/d27/c98 0 2026-03-10T08:55:38.361 INFO:tasks.workunit.client.0.vm05.stdout:9/445: write d6/d19/d21/f7d [597099,17367] 0 2026-03-10T08:55:38.362 INFO:tasks.workunit.client.0.vm05.stdout:9/446: write d6/f3f [3977549,58079] 0 2026-03-10T08:55:38.366 INFO:tasks.workunit.client.0.vm05.stdout:9/447: dwrite d6/d19/f5c [0,4194304] 0 2026-03-10T08:55:38.368 INFO:tasks.workunit.client.1.vm08.stdout:5/630: mkdir d0/d11/d27/d68/d7c/d4b/dbe 0 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.0.vm05.stdout:3/543: unlink d9/d4d/d51/d64/d89/f8e 0 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.0.vm05.stdout:1/568: creat dd/d55/fca x:0 0 0 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.0.vm05.stdout:1/569: chown dd/d10/d19/d9b/cba 95 1 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.1.vm08.stdout:9/677: link d2/d54/cbe d2/d54/d8e/db7/cea 0 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.1.vm08.stdout:8/725: mknod d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/c10a 0 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.1.vm08.stdout:4/748: symlink d5/d23/d36/d99/dc6/l117 0 2026-03-10T08:55:38.375 INFO:tasks.workunit.client.1.vm08.stdout:2/743: link d1/da/d10/d1b/dcf/lec d1/da/d10/d42/dd0/lf4 0 2026-03-10T08:55:38.376 INFO:tasks.workunit.client.1.vm08.stdout:2/744: truncate d1/da/d10/d42/d93/d1e/f83 1075744 0 2026-03-10T08:55:38.377 
INFO:tasks.workunit.client.0.vm05.stdout:8/494: fdatasync d2/dd/d2c/f34 0 2026-03-10T08:55:38.377 INFO:tasks.workunit.client.1.vm08.stdout:4/749: fdatasync d5/d23/d36/d99/db2/d5d/fc5 0 2026-03-10T08:55:38.378 INFO:tasks.workunit.client.1.vm08.stdout:5/631: sync 2026-03-10T08:55:38.380 INFO:tasks.workunit.client.1.vm08.stdout:4/750: readlink d5/d23/d36/l4c 0 2026-03-10T08:55:38.383 INFO:tasks.workunit.client.1.vm08.stdout:4/751: dread d5/d23/d49/fdc [0,4194304] 0 2026-03-10T08:55:38.383 INFO:tasks.workunit.client.0.vm05.stdout:6/557: read d4/d7/d10/d15/f2a [997318,75168] 0 2026-03-10T08:55:38.383 INFO:tasks.workunit.client.1.vm08.stdout:4/752: dread - d5/d23/d49/ff9 zero size 2026-03-10T08:55:38.384 INFO:tasks.workunit.client.0.vm05.stdout:4/503: dread d0/d2e/d42/d45/d4a/f26 [0,4194304] 0 2026-03-10T08:55:38.397 INFO:tasks.workunit.client.0.vm05.stdout:3/544: mknod d9/d2b/d2f/d57/c98 0 2026-03-10T08:55:38.397 INFO:tasks.workunit.client.0.vm05.stdout:1/570: rename dd/d10/d19/d4d/c9a to dd/d21/d37/d7c/dc9/ccb 0 2026-03-10T08:55:38.397 INFO:tasks.workunit.client.0.vm05.stdout:3/545: stat d9/f27 0 2026-03-10T08:55:38.402 INFO:tasks.workunit.client.1.vm08.stdout:5/632: dwrite d0/d11/d27/d50/fa1 [0,4194304] 0 2026-03-10T08:55:38.410 INFO:tasks.workunit.client.0.vm05.stdout:9/448: dread f3 [0,4194304] 0 2026-03-10T08:55:38.417 INFO:tasks.workunit.client.1.vm08.stdout:2/745: dread d1/d43/f4b [0,4194304] 0 2026-03-10T08:55:38.418 INFO:tasks.workunit.client.1.vm08.stdout:4/753: mkdir d5/d23/d49/d8f/da4/d118 0 2026-03-10T08:55:38.419 INFO:tasks.workunit.client.1.vm08.stdout:4/754: chown d5/d23/d36/d99/db2/d5a 1152264148 1 2026-03-10T08:55:38.421 INFO:tasks.workunit.client.0.vm05.stdout:8/495: dread d2/dd/d2c/d2e/d31/f89 [0,4194304] 0 2026-03-10T08:55:38.422 INFO:tasks.workunit.client.0.vm05.stdout:8/496: fsync d2/dd/d2c/d2e/d31/d4c/f85 0 2026-03-10T08:55:38.422 INFO:tasks.workunit.client.1.vm08.stdout:8/726: creat d1/d10/d9/dd/d18/d34/f10b x:0 0 0 2026-03-10T08:55:38.424 
INFO:tasks.workunit.client.0.vm05.stdout:4/504: creat d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/fa5 x:0 0 0 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.0.vm05.stdout:1/571: mknod dd/d10/d18/d2d/d51/ccc 0 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.1.vm08.stdout:3/643: dread d4/f44 [0,4194304] 0 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.1.vm08.stdout:5/633: truncate d0/d11/d27/f64 785264 0 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.1.vm08.stdout:3/644: chown d4/d15/d8/d2c/d9b/d79/d20/l96 0 1 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.1.vm08.stdout:2/746: mkdir d1/da/d78/df5 0 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.1.vm08.stdout:5/634: symlink d0/d1b/d67/d7a/lbf 0 2026-03-10T08:55:38.436 INFO:tasks.workunit.client.1.vm08.stdout:5/635: stat d0/lc 0 2026-03-10T08:55:38.437 INFO:tasks.workunit.client.1.vm08.stdout:8/727: read d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fab [119140,6243] 0 2026-03-10T08:55:38.439 INFO:tasks.workunit.client.1.vm08.stdout:2/747: sync 2026-03-10T08:55:38.439 INFO:tasks.workunit.client.0.vm05.stdout:8/497: mkdir d2/dd/d2c/d2e/d31/d3e/d5d/d9d/db3 0 2026-03-10T08:55:38.440 INFO:tasks.workunit.client.0.vm05.stdout:4/505: dread - d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/f94 zero size 2026-03-10T08:55:38.440 INFO:tasks.workunit.client.0.vm05.stdout:8/498: fdatasync d2/dd/d2c/d2e/d31/d4f/f9c 0 2026-03-10T08:55:38.441 INFO:tasks.workunit.client.0.vm05.stdout:8/499: chown d2/dd 40108635 1 2026-03-10T08:55:38.443 INFO:tasks.workunit.client.0.vm05.stdout:1/572: rename dd/d21/fb8 to dd/d10/d18/d2d/d5c/dac/fcd 0 2026-03-10T08:55:38.446 INFO:tasks.workunit.client.0.vm05.stdout:1/573: dwrite dd/d21/f3e [0,4194304] 0 2026-03-10T08:55:38.485 INFO:tasks.workunit.client.1.vm08.stdout:2/748: mkdir d1/d5b/d66/df6 0 2026-03-10T08:55:38.485 INFO:tasks.workunit.client.1.vm08.stdout:8/728: truncate d1/d10/d9/d8a/f95 639308 0 2026-03-10T08:55:38.485 INFO:tasks.workunit.client.0.vm05.stdout:5/429: read d5/d86/d21/f30 
[401217,42867] 0 2026-03-10T08:55:38.486 INFO:tasks.workunit.client.0.vm05.stdout:1/574: rename dd/d21/f3e to dd/d21/d37/d45/fce 0 2026-03-10T08:55:38.487 INFO:tasks.workunit.client.0.vm05.stdout:1/575: stat dd/d10/d19/f95 0 2026-03-10T08:55:38.491 INFO:tasks.workunit.client.0.vm05.stdout:9/449: getdents d6/d19/d2c/d58 0 2026-03-10T08:55:38.493 INFO:tasks.workunit.client.0.vm05.stdout:4/506: rmdir d0/d2e/d71/d7c/d82 0 2026-03-10T08:55:38.494 INFO:tasks.workunit.client.0.vm05.stdout:9/450: fdatasync d6/d12/d43/f91 0 2026-03-10T08:55:38.494 INFO:tasks.workunit.client.1.vm08.stdout:2/749: sync 2026-03-10T08:55:38.500 INFO:tasks.workunit.client.0.vm05.stdout:4/507: readlink d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/d7b/l7f 0 2026-03-10T08:55:38.501 INFO:tasks.workunit.client.0.vm05.stdout:1/576: link dd/d10/d18/d2d/d51/l78 dd/d10/d18/d2d/d5c/dac/lcf 0 2026-03-10T08:55:38.501 INFO:tasks.workunit.client.0.vm05.stdout:4/508: rename d0/d2e to d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/da6 22 2026-03-10T08:55:38.507 INFO:tasks.workunit.client.1.vm08.stdout:2/750: dread d1/d43/f6d [0,4194304] 0 2026-03-10T08:55:38.508 INFO:tasks.workunit.client.0.vm05.stdout:1/577: dwrite dd/d21/d3f/f57 [0,4194304] 0 2026-03-10T08:55:38.509 INFO:tasks.workunit.client.1.vm08.stdout:2/751: readlink d1/d43/d5c/lf1 0 2026-03-10T08:55:38.512 INFO:tasks.workunit.client.1.vm08.stdout:2/752: fdatasync d1/da/fc3 0 2026-03-10T08:55:38.514 INFO:tasks.workunit.client.1.vm08.stdout:2/753: write d1/da/d10/d2d/db6/ff3 [596905,34019] 0 2026-03-10T08:55:38.520 INFO:tasks.workunit.client.0.vm05.stdout:1/578: dread dd/f16 [0,4194304] 0 2026-03-10T08:55:38.526 INFO:tasks.workunit.client.1.vm08.stdout:2/754: creat d1/d5b/da7/ff7 x:0 0 0 2026-03-10T08:55:38.527 INFO:tasks.workunit.client.0.vm05.stdout:1/579: mknod dd/d21/cd0 0 2026-03-10T08:55:38.529 INFO:tasks.workunit.client.1.vm08.stdout:2/755: mknod d1/da/d10/dca/cf8 0 2026-03-10T08:55:38.542 INFO:tasks.workunit.client.1.vm08.stdout:2/756: sync 
2026-03-10T08:55:38.545 INFO:tasks.workunit.client.0.vm05.stdout:9/451: dread d6/d15/d35/f38 [0,4194304] 0 2026-03-10T08:55:38.549 INFO:tasks.workunit.client.0.vm05.stdout:9/452: symlink d6/d19/d2a/l99 0 2026-03-10T08:55:38.551 INFO:tasks.workunit.client.0.vm05.stdout:9/453: link d6/d12/f14 d6/d15/d35/f9a 0 2026-03-10T08:55:38.552 INFO:tasks.workunit.client.0.vm05.stdout:9/454: read - d6/d15/d3c/d4b/f67 zero size 2026-03-10T08:55:38.555 INFO:tasks.workunit.client.0.vm05.stdout:9/455: dread d6/d15/d3c/d4b/f5b [0,4194304] 0 2026-03-10T08:55:38.566 INFO:tasks.workunit.client.0.vm05.stdout:9/456: dread d6/d19/d2a/d4a/f56 [0,4194304] 0 2026-03-10T08:55:38.575 INFO:tasks.workunit.client.1.vm08.stdout:2/757: dread d1/da/d10/d42/d93/f3b [0,4194304] 0 2026-03-10T08:55:38.577 INFO:tasks.workunit.client.1.vm08.stdout:2/758: chown d1/da/d10/d42/d93/fcb 25 1 2026-03-10T08:55:38.580 INFO:tasks.workunit.client.1.vm08.stdout:2/759: symlink d1/d5b/dc5/lf9 0 2026-03-10T08:55:38.583 INFO:tasks.workunit.client.0.vm05.stdout:4/509: dread d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/f7a [0,4194304] 0 2026-03-10T08:55:38.584 INFO:tasks.workunit.client.1.vm08.stdout:2/760: symlink d1/d43/d5c/de7/lfa 0 2026-03-10T08:55:38.584 INFO:tasks.workunit.client.0.vm05.stdout:4/510: truncate d0/d1d/f24 1555848 0 2026-03-10T08:55:38.587 INFO:tasks.workunit.client.0.vm05.stdout:4/511: dread - d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/d5b/f84 zero size 2026-03-10T08:55:38.587 INFO:tasks.workunit.client.1.vm08.stdout:2/761: stat d1/da/d10/d1b/c7d 0 2026-03-10T08:55:38.593 INFO:tasks.workunit.client.1.vm08.stdout:2/762: fdatasync d1/da/d78/f95 0 2026-03-10T08:55:38.594 INFO:tasks.workunit.client.0.vm05.stdout:4/512: creat d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/fa7 x:0 0 0 2026-03-10T08:55:38.594 INFO:tasks.workunit.client.1.vm08.stdout:6/714: write d9/dc/d11/d23/d2c/f8e [1693082,23933] 0 2026-03-10T08:55:38.604 INFO:tasks.workunit.client.0.vm05.stdout:7/430: dwrite d18/d66/d25/f47 [0,4194304] 0 
2026-03-10T08:55:38.604 INFO:tasks.workunit.client.1.vm08.stdout:6/715: getdents d9/dc/d11/d23/d2c/d81/d63/dcf 0 2026-03-10T08:55:38.615 INFO:tasks.workunit.client.1.vm08.stdout:6/716: readlink d9/dc/d11/d23/l4b 0 2026-03-10T08:55:38.623 INFO:tasks.workunit.client.1.vm08.stdout:0/652: write d6/dd/d13/d17/d1f/d2d/d39/f3b [1409425,21112] 0 2026-03-10T08:55:38.629 INFO:tasks.workunit.client.1.vm08.stdout:1/736: dwrite d1/da/d20/f21 [0,4194304] 0 2026-03-10T08:55:38.636 INFO:tasks.workunit.client.1.vm08.stdout:1/737: chown d1/da/d18/d3a/l5f 757253352 1 2026-03-10T08:55:38.643 INFO:tasks.workunit.client.1.vm08.stdout:1/738: readlink d1/l31 0 2026-03-10T08:55:38.704 INFO:tasks.workunit.client.1.vm08.stdout:7/738: dwrite d0/f25 [0,4194304] 0 2026-03-10T08:55:38.706 INFO:tasks.workunit.client.1.vm08.stdout:7/739: write d0/d14/f12 [3858481,11172] 0 2026-03-10T08:55:38.707 INFO:tasks.workunit.client.1.vm08.stdout:7/740: fsync d0/d11/d1f/d29/d36/d75/fe5 0 2026-03-10T08:55:38.714 INFO:tasks.workunit.client.1.vm08.stdout:7/741: sync 2026-03-10T08:55:38.714 INFO:tasks.workunit.client.1.vm08.stdout:7/742: chown d0/d11/d1f/d29/fba 2 1 2026-03-10T08:55:38.719 INFO:tasks.workunit.client.0.vm05.stdout:0/495: write df/d1f/d85/d2b/d27/d32/d4e/f56 [57543,33301] 0 2026-03-10T08:55:38.721 INFO:tasks.workunit.client.1.vm08.stdout:7/743: mkdir d0/d11/d1f/df0 0 2026-03-10T08:55:38.722 INFO:tasks.workunit.client.0.vm05.stdout:0/496: chown df/d1f/d85/d19/d39/d4d/d50/f7e 3 1 2026-03-10T08:55:38.728 INFO:tasks.workunit.client.0.vm05.stdout:0/497: dwrite df/d1f/d85/d2b/f7a [0,4194304] 0 2026-03-10T08:55:38.752 INFO:tasks.workunit.client.0.vm05.stdout:2/427: truncate d0/d9/d1e/d20/f47 8149647 0 2026-03-10T08:55:38.752 INFO:tasks.workunit.client.1.vm08.stdout:9/678: write d2/dd/d15/d1e/d24/f5e [84731,111662] 0 2026-03-10T08:55:38.752 INFO:tasks.workunit.client.1.vm08.stdout:9/679: chown d2/d54/d8e/da6/cb8 1015313741 1 2026-03-10T08:55:38.752 INFO:tasks.workunit.client.1.vm08.stdout:9/680: dread 
d2/dd/d61/fbb [0,4194304] 0 2026-03-10T08:55:38.753 INFO:tasks.workunit.client.0.vm05.stdout:6/558: dwrite d4/d7/d10/d15/d1b/f31 [0,4194304] 0 2026-03-10T08:55:38.761 INFO:tasks.workunit.client.0.vm05.stdout:3/546: dwrite d9/d2b/d3a/d43/d7a/f7f [0,4194304] 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.0.vm05.stdout:3/547: creat d9/d2b/d53/d61/f99 x:0 0 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.0.vm05.stdout:3/548: dread d9/fa [0,4194304] 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.0.vm05.stdout:3/549: dwrite d9/d2b/d3a/f68 [0,4194304] 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.0.vm05.stdout:3/550: getdents d9/d2b/d3a/d43/d7a 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.1.vm08.stdout:3/645: write d4/d15/d8/d1d/d4f/fb0 [764302,97380] 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.1.vm08.stdout:4/755: truncate d5/fb4 319625 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.1.vm08.stdout:5/636: write d0/f36 [268153,11905] 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.1.vm08.stdout:5/637: link d0/d11/d27/d68/d7c/f75 d0/d11/d18/fc0 0 2026-03-10T08:55:38.779 INFO:tasks.workunit.client.1.vm08.stdout:5/638: mkdir d0/d11/d27/d68/dc1 0 2026-03-10T08:55:39.017 INFO:tasks.workunit.client.1.vm08.stdout:8/729: write d1/d10/d9/dd/d18/d3c/fa9 [473994,68534] 0 2026-03-10T08:55:39.019 INFO:tasks.workunit.client.1.vm08.stdout:8/730: symlink d1/d10/d9/dd/d25/d27/d44/d21/d5f/l10c 0 2026-03-10T08:55:39.023 INFO:tasks.workunit.client.1.vm08.stdout:8/731: symlink d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/l10d 0 2026-03-10T08:55:39.024 INFO:tasks.workunit.client.1.vm08.stdout:8/732: creat d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10e x:0 0 0 2026-03-10T08:55:39.037 INFO:tasks.workunit.client.0.vm05.stdout:5/430: dwrite d5/f3b [0,4194304] 0 2026-03-10T08:55:39.043 INFO:tasks.workunit.client.0.vm05.stdout:1/580: write dd/d21/d37/d7c/faa [812426,38933] 0 2026-03-10T08:55:39.047 INFO:tasks.workunit.client.0.vm05.stdout:1/581: mkdir dd/d10/d18/dd1 
0 2026-03-10T08:55:39.048 INFO:tasks.workunit.client.0.vm05.stdout:1/582: mkdir dd/d13/dd2 0 2026-03-10T08:55:39.057 INFO:tasks.workunit.client.0.vm05.stdout:9/457: truncate d6/d19/d21/f2f 558385 0 2026-03-10T08:55:39.058 INFO:tasks.workunit.client.0.vm05.stdout:9/458: creat d6/d19/d2a/f9b x:0 0 0 2026-03-10T08:55:39.079 INFO:tasks.workunit.client.0.vm05.stdout:9/459: mkdir d6/d12/d3a/d9c 0 2026-03-10T08:55:39.079 INFO:tasks.workunit.client.0.vm05.stdout:9/460: truncate d6/d15/d3c/d4b/d82/f92 791436 0 2026-03-10T08:55:39.079 INFO:tasks.workunit.client.0.vm05.stdout:9/461: getdents d6/d12 0 2026-03-10T08:55:39.079 INFO:tasks.workunit.client.0.vm05.stdout:9/462: mknod d6/d19/d2a/d8d/c9d 0 2026-03-10T08:55:39.103 INFO:tasks.workunit.client.0.vm05.stdout:5/431: sync 2026-03-10T08:55:39.117 INFO:tasks.workunit.client.1.vm08.stdout:2/763: dwrite d1/da/d10/d42/d93/f3b [0,4194304] 0 2026-03-10T08:55:39.126 INFO:tasks.workunit.client.0.vm05.stdout:7/431: dwrite d18/f31 [0,4194304] 0 2026-03-10T08:55:39.126 INFO:tasks.workunit.client.0.vm05.stdout:7/432: stat d18/d66/d25/d2e/d42/d74/f7b 0 2026-03-10T08:55:39.133 INFO:tasks.workunit.client.0.vm05.stdout:5/432: creat d5/d86/d21/f9e x:0 0 0 2026-03-10T08:55:39.138 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: pgmap v158: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail; 54 MiB/s rd, 144 MiB/s wr, 349 op/s 2026-03-10T08:55:39.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: Upgrade: Updating mgr.vm08.rpongu 2026-03-10T08:55:39.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:39.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": 
["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:55:39.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:55:39.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:55:39.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:38 vm08.local ceph-mon[57559]: Deploying daemon mgr.vm08.rpongu on vm08 2026-03-10T08:55:39.140 INFO:tasks.workunit.client.1.vm08.stdout:2/764: mknod d1/da/d10/d42/d93/daa/cfb 0 2026-03-10T08:55:39.143 INFO:tasks.workunit.client.0.vm05.stdout:7/433: creat d18/d1b/f84 x:0 0 0 2026-03-10T08:55:39.145 INFO:tasks.workunit.client.0.vm05.stdout:7/434: creat d18/d66/d79/f85 x:0 0 0 2026-03-10T08:55:39.145 INFO:tasks.workunit.client.1.vm08.stdout:2/765: dwrite d1/d43/f4b [0,4194304] 0 2026-03-10T08:55:39.145 INFO:tasks.workunit.client.0.vm05.stdout:7/435: chown f9 2278327 1 2026-03-10T08:55:39.146 INFO:tasks.workunit.client.0.vm05.stdout:7/436: chown d18/d66/c77 86 1 2026-03-10T08:55:39.147 INFO:tasks.workunit.client.0.vm05.stdout:7/437: chown d18/d66/d25/d2e/d42/d53 447877145 1 2026-03-10T08:55:39.165 INFO:tasks.workunit.client.0.vm05.stdout:5/433: dread d5/d48/f93 [0,4194304] 0 2026-03-10T08:55:39.165 INFO:tasks.workunit.client.0.vm05.stdout:5/434: fdatasync d5/d48/f7e 0 2026-03-10T08:55:39.183 INFO:tasks.workunit.client.1.vm08.stdout:2/766: dread d1/da/d10/d42/fda [0,4194304] 0 2026-03-10T08:55:39.184 INFO:tasks.workunit.client.1.vm08.stdout:2/767: stat d1/da/d10/f7e 0 2026-03-10T08:55:39.185 INFO:tasks.workunit.client.1.vm08.stdout:2/768: dread - d1/da/d10/dd3/fd9 zero size 2026-03-10T08:55:39.187 INFO:tasks.workunit.client.1.vm08.stdout:2/769: creat d1/db1/ffc x:0 0 0 
2026-03-10T08:55:39.188 INFO:tasks.workunit.client.1.vm08.stdout:2/770: chown d1/f19 909631 1 2026-03-10T08:55:39.190 INFO:tasks.workunit.client.0.vm05.stdout:7/438: dread d18/d1b/f2c [0,4194304] 0 2026-03-10T08:55:39.191 INFO:tasks.workunit.client.0.vm05.stdout:7/439: fdatasync d18/d66/f2d 0 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: pgmap v158: 65 pgs: 65 active+clean; 2.5 GiB data, 8.7 GiB used, 111 GiB / 120 GiB avail; 54 MiB/s rd, 144 MiB/s wr, 349 op/s 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: Upgrade: Updating mgr.vm08.rpongu 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:55:39.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:38 vm05.local ceph-mon[49713]: Deploying daemon mgr.vm08.rpongu on vm08 2026-03-10T08:55:39.224 INFO:tasks.workunit.client.0.vm05.stdout:2/428: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:39.247 INFO:tasks.workunit.client.1.vm08.stdout:9/681: write d2/d41/d4c/dd2/fd6 [3403512,110785] 0 
2026-03-10T08:55:39.247 INFO:tasks.workunit.client.1.vm08.stdout:9/682: dread - d2/dd/d15/d1e/d25/d32/d5c/f70 zero size 2026-03-10T08:55:39.294 INFO:tasks.workunit.client.0.vm05.stdout:6/559: dwrite d4/d2c/d84/d4a/f76 [0,4194304] 0 2026-03-10T08:55:39.299 INFO:tasks.workunit.client.1.vm08.stdout:3/646: write d4/d15/fda [339411,17078] 0 2026-03-10T08:55:39.301 INFO:tasks.workunit.client.1.vm08.stdout:4/756: dwrite d5/d23/d36/d99/db2/f9c [0,4194304] 0 2026-03-10T08:55:39.305 INFO:tasks.workunit.client.1.vm08.stdout:4/757: creat d5/d23/d36/d99/db2/d5d/de3/df8/f119 x:0 0 0 2026-03-10T08:55:39.306 INFO:tasks.workunit.client.1.vm08.stdout:4/758: dread - d5/d23/d36/d76/ff0 zero size 2026-03-10T08:55:39.308 INFO:tasks.workunit.client.1.vm08.stdout:4/759: creat d5/de/d96/f11a x:0 0 0 2026-03-10T08:55:39.309 INFO:tasks.workunit.client.1.vm08.stdout:4/760: stat d5/d23/d36/d99/db2 0 2026-03-10T08:55:39.309 INFO:tasks.workunit.client.1.vm08.stdout:3/647: link d4/d15/d8/d1d/c77 d4/d15/d8/d2c/d55/cdd 0 2026-03-10T08:55:39.313 INFO:tasks.workunit.client.1.vm08.stdout:4/761: fdatasync d5/d23/d49/d8f/f10c 0 2026-03-10T08:55:39.350 INFO:tasks.workunit.client.0.vm05.stdout:2/429: rmdir d0/d9/d1e/d20/d24 39 2026-03-10T08:55:39.381 INFO:tasks.workunit.client.1.vm08.stdout:5/639: dwrite d0/d11/d27/f61 [0,4194304] 0 2026-03-10T08:55:39.386 INFO:tasks.workunit.client.1.vm08.stdout:5/640: chown d0/d11/d27/cb8 51 1 2026-03-10T08:55:39.388 INFO:tasks.workunit.client.1.vm08.stdout:5/641: truncate d0/d11/f60 168786 0 2026-03-10T08:55:39.388 INFO:tasks.workunit.client.1.vm08.stdout:5/642: chown d0/f6c 1 1 2026-03-10T08:55:39.389 INFO:tasks.workunit.client.1.vm08.stdout:5/643: readlink d0/lc 0 2026-03-10T08:55:39.391 INFO:tasks.workunit.client.1.vm08.stdout:5/644: dread d0/d11/d27/f61 [0,4194304] 0 2026-03-10T08:55:39.392 INFO:tasks.workunit.client.1.vm08.stdout:5/645: creat d0/d1b/fc2 x:0 0 0 2026-03-10T08:55:39.394 INFO:tasks.workunit.client.1.vm08.stdout:5/646: dread d0/d11/d3e/f48 [0,4194304] 
0 2026-03-10T08:55:39.410 INFO:tasks.workunit.client.1.vm08.stdout:5/647: dread d0/d11/d27/d68/d7c/d4b/fa2 [0,4194304] 0 2026-03-10T08:55:39.411 INFO:tasks.workunit.client.1.vm08.stdout:5/648: mknod d0/d46/cc3 0 2026-03-10T08:55:39.412 INFO:tasks.workunit.client.1.vm08.stdout:5/649: write d0/d11/d3e/f73 [1663841,55395] 0 2026-03-10T08:55:39.414 INFO:tasks.workunit.client.1.vm08.stdout:5/650: truncate d0/d11/d27/d68/d7c/d4b/d4e/d84/faa 737551 0 2026-03-10T08:55:39.416 INFO:tasks.workunit.client.1.vm08.stdout:5/651: link d0/d11/ca8 d0/d11/d3e/cc4 0 2026-03-10T08:55:39.430 INFO:tasks.workunit.client.0.vm05.stdout:3/551: link d9/d2b/d3a/d43/c58 d9/d2b/d53/c9a 0 2026-03-10T08:55:39.432 INFO:tasks.workunit.client.0.vm05.stdout:3/552: rename d9/f28 to d9/d2b/d3a/d43/d7a/f9b 0 2026-03-10T08:55:39.433 INFO:tasks.workunit.client.0.vm05.stdout:3/553: dread - d9/d2b/d2f/d57/f77 zero size 2026-03-10T08:55:39.434 INFO:tasks.workunit.client.0.vm05.stdout:3/554: read d9/d2b/d2f/d57/f90 [2048817,126371] 0 2026-03-10T08:55:39.436 INFO:tasks.workunit.client.0.vm05.stdout:3/555: unlink d9/f20 0 2026-03-10T08:55:39.437 INFO:tasks.workunit.client.0.vm05.stdout:3/556: chown d9/d2b/d3a/d43/d7a/f7f 0 1 2026-03-10T08:55:39.439 INFO:tasks.workunit.client.0.vm05.stdout:3/557: stat d9/d2b/d53/c62 0 2026-03-10T08:55:39.442 INFO:tasks.workunit.client.0.vm05.stdout:3/558: creat d9/d2b/d53/f9c x:0 0 0 2026-03-10T08:55:39.443 INFO:tasks.workunit.client.0.vm05.stdout:3/559: write d9/d2b/d3a/f44 [1114017,115719] 0 2026-03-10T08:55:39.444 INFO:tasks.workunit.client.0.vm05.stdout:3/560: symlink d9/d8f/d50/d5f/l9d 0 2026-03-10T08:55:39.460 INFO:tasks.workunit.client.1.vm08.stdout:6/717: rename d9/dc/d11/d23/c37 to d9/d50/d95/cec 0 2026-03-10T08:55:39.462 INFO:tasks.workunit.client.1.vm08.stdout:6/718: truncate d9/dc/d11/d23/f5f 2726489 0 2026-03-10T08:55:39.488 INFO:tasks.workunit.client.0.vm05.stdout:4/513: mkdir d0/d2e/da8 0 2026-03-10T08:55:39.490 INFO:tasks.workunit.client.0.vm05.stdout:4/514: write 
d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/f51 [321935,38879] 0 2026-03-10T08:55:39.492 INFO:tasks.workunit.client.0.vm05.stdout:4/515: dread - d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/f57 zero size 2026-03-10T08:55:39.493 INFO:tasks.workunit.client.0.vm05.stdout:4/516: symlink d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/la9 0 2026-03-10T08:55:39.495 INFO:tasks.workunit.client.0.vm05.stdout:4/517: dread - d0/d2e/d42/d45/d4a/d36/d37/f68 zero size 2026-03-10T08:55:39.499 INFO:tasks.workunit.client.0.vm05.stdout:4/518: creat d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/d7b/faa x:0 0 0 2026-03-10T08:55:39.500 INFO:tasks.workunit.client.0.vm05.stdout:4/519: read d0/d2e/d71/f90 [902153,45035] 0 2026-03-10T08:55:39.502 INFO:tasks.workunit.client.0.vm05.stdout:4/520: mknod d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/cab 0 2026-03-10T08:55:39.506 INFO:tasks.workunit.client.0.vm05.stdout:4/521: link d0/d2e/d42/d45/d4a/d36/d37/f68 d0/d2e/d42/d45/d4a/d36/d37/fac 0 2026-03-10T08:55:39.516 INFO:tasks.workunit.client.1.vm08.stdout:2/771: dwrite d1/da/d10/d2d/f4c [0,4194304] 0 2026-03-10T08:55:39.517 INFO:tasks.workunit.client.0.vm05.stdout:7/440: rmdir d18/d1b 39 2026-03-10T08:55:39.517 INFO:tasks.workunit.client.0.vm05.stdout:4/522: getdents d0/d2c 0 2026-03-10T08:55:39.517 INFO:tasks.workunit.client.0.vm05.stdout:5/435: write d5/d86/d21/f5a [154444,80465] 0 2026-03-10T08:55:39.523 INFO:tasks.workunit.client.0.vm05.stdout:0/498: creat df/d1f/d85/d2b/d27/f91 x:0 0 0 2026-03-10T08:55:39.528 INFO:tasks.workunit.client.1.vm08.stdout:2/772: chown d1/da/d10/d1b/fc6 13323491 1 2026-03-10T08:55:39.529 INFO:tasks.workunit.client.0.vm05.stdout:5/436: fsync d5/d86/d24/d2c/d41/f4d 0 2026-03-10T08:55:39.531 INFO:tasks.workunit.client.1.vm08.stdout:2/773: creat d1/dd5/ffd x:0 0 0 2026-03-10T08:55:39.532 INFO:tasks.workunit.client.1.vm08.stdout:2/774: chown d1/da/d10/d1b/c53 15345819 1 2026-03-10T08:55:39.535 INFO:tasks.workunit.client.1.vm08.stdout:2/775: link d1/da/d10/d42/d93/d23/d9e/cab 
d1/da/d10/d1b/d6a/cfe 0 2026-03-10T08:55:39.542 INFO:tasks.workunit.client.1.vm08.stdout:2/776: dread d1/da/d10/f7e [0,4194304] 0 2026-03-10T08:55:39.547 INFO:tasks.workunit.client.1.vm08.stdout:2/777: link d1/da/d10/d42/d93/d23/f31 d1/da/d10/dd3/fff 0 2026-03-10T08:55:39.549 INFO:tasks.workunit.client.1.vm08.stdout:2/778: creat d1/d5b/da7/f100 x:0 0 0 2026-03-10T08:55:39.550 INFO:tasks.workunit.client.1.vm08.stdout:2/779: stat d1/da/d10/d1b/d6a/lbd 0 2026-03-10T08:55:39.564 INFO:tasks.workunit.client.0.vm05.stdout:7/441: sync 2026-03-10T08:55:39.564 INFO:tasks.workunit.client.0.vm05.stdout:0/499: sync 2026-03-10T08:55:39.569 INFO:tasks.workunit.client.0.vm05.stdout:0/500: dwrite df/d1f/d85/d19/f8e [0,4194304] 0 2026-03-10T08:55:39.570 INFO:tasks.workunit.client.0.vm05.stdout:0/501: chown df/d1f/d85/d19/d47/d84 3269 1 2026-03-10T08:55:39.572 INFO:tasks.workunit.client.0.vm05.stdout:0/502: write df/f17 [418996,117629] 0 2026-03-10T08:55:39.576 INFO:tasks.workunit.client.0.vm05.stdout:7/442: dwrite d18/d38/f55 [0,4194304] 0 2026-03-10T08:55:39.587 INFO:tasks.workunit.client.0.vm05.stdout:1/583: mknod dd/d10/cd3 0 2026-03-10T08:55:39.589 INFO:tasks.workunit.client.1.vm08.stdout:9/683: write d2/dd/d15/d1e/d25/d32/d5c/f7f [1083223,14617] 0 2026-03-10T08:55:39.595 INFO:tasks.workunit.client.0.vm05.stdout:8/500: mkdir d2/dd/d2c/d2e/d31/db4 0 2026-03-10T08:55:39.603 INFO:tasks.workunit.client.1.vm08.stdout:9/684: write d2/dd/d61/fbb [837848,103260] 0 2026-03-10T08:55:39.603 INFO:tasks.workunit.client.0.vm05.stdout:6/560: truncate d4/d92/f96 87214 0 2026-03-10T08:55:39.603 INFO:tasks.workunit.client.0.vm05.stdout:8/501: chown d2/dd/d74/c7e 102058141 1 2026-03-10T08:55:39.603 INFO:tasks.workunit.client.0.vm05.stdout:0/503: link df/d59/l69 df/d1f/d85/d19/d55/l92 0 2026-03-10T08:55:39.603 INFO:tasks.workunit.client.0.vm05.stdout:7/443: creat d18/d66/d25/d2e/d42/d74/f86 x:0 0 0 2026-03-10T08:55:39.603 INFO:tasks.workunit.client.0.vm05.stdout:6/561: rmdir d4/d2c/d84 39 
2026-03-10T08:55:39.603 INFO:tasks.workunit.client.0.vm05.stdout:6/562: chown d4/d7/f34 2020132257 1 2026-03-10T08:55:39.606 INFO:tasks.workunit.client.0.vm05.stdout:2/430: symlink d0/d9/l78 0 2026-03-10T08:55:39.609 INFO:tasks.workunit.client.1.vm08.stdout:3/648: dwrite d4/d15/d8/d2c/d6d/fc3 [0,4194304] 0 2026-03-10T08:55:39.612 INFO:tasks.workunit.client.0.vm05.stdout:0/504: creat df/d1f/d85/d19/d47/d84/d8a/f93 x:0 0 0 2026-03-10T08:55:39.613 INFO:tasks.workunit.client.0.vm05.stdout:0/505: chown df/d1f/d85/d19/d39/d4d/d50/c5e 221030 1 2026-03-10T08:55:39.617 INFO:tasks.workunit.client.1.vm08.stdout:3/649: creat d4/d15/d8/d2c/d9b/d79/d8f/fde x:0 0 0 2026-03-10T08:55:39.618 INFO:tasks.workunit.client.0.vm05.stdout:7/444: symlink d18/l87 0 2026-03-10T08:55:39.624 INFO:tasks.workunit.client.0.vm05.stdout:6/563: dread d4/d7/d10/d1a/f25 [0,4194304] 0 2026-03-10T08:55:39.624 INFO:tasks.workunit.client.0.vm05.stdout:2/431: symlink d0/d9/d1e/d20/d21/d45/d4b/d70/l79 0 2026-03-10T08:55:39.624 INFO:tasks.workunit.client.0.vm05.stdout:7/445: mknod d18/d66/d25/d2e/d2f/d6d/c88 0 2026-03-10T08:55:39.625 INFO:tasks.workunit.client.0.vm05.stdout:0/506: sync 2026-03-10T08:55:39.633 INFO:tasks.workunit.client.0.vm05.stdout:2/432: fdatasync d0/d9/d1e/d20/d21/d45/d6c/d6e/f38 0 2026-03-10T08:55:39.635 INFO:tasks.workunit.client.0.vm05.stdout:6/564: dread d4/d2d/f2f [0,4194304] 0 2026-03-10T08:55:39.636 INFO:tasks.workunit.client.1.vm08.stdout:3/650: creat d4/d15/d8/d2c/d9b/d79/fdf x:0 0 0 2026-03-10T08:55:39.637 INFO:tasks.workunit.client.1.vm08.stdout:4/762: write d5/f7e [5199056,65619] 0 2026-03-10T08:55:39.638 INFO:tasks.workunit.client.1.vm08.stdout:0/653: rename d6/fa to d6/dd/d13/d17/d1f/fd7 0 2026-03-10T08:55:39.639 INFO:tasks.workunit.client.0.vm05.stdout:7/446: symlink d18/d38/d43/d6e/l89 0 2026-03-10T08:55:39.646 INFO:tasks.workunit.client.0.vm05.stdout:9/463: write f3 [462608,51575] 0 2026-03-10T08:55:39.648 INFO:tasks.workunit.client.0.vm05.stdout:3/561: write d9/d2b/f2d 
[281773,39603] 0 2026-03-10T08:55:39.651 INFO:tasks.workunit.client.1.vm08.stdout:0/654: mkdir d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/dd8 0 2026-03-10T08:55:39.651 INFO:tasks.workunit.client.0.vm05.stdout:2/433: mkdir d0/d9/d1e/d20/d21/d45/d6c/d6e/d7a 0 2026-03-10T08:55:39.653 INFO:tasks.workunit.client.0.vm05.stdout:6/565: truncate d4/d2d/d5f/f81 966092 0 2026-03-10T08:55:39.658 INFO:tasks.workunit.client.1.vm08.stdout:1/739: rename d1/da/d18/fea to d1/da/de/d24/d3d/d40/ffe 0 2026-03-10T08:55:39.659 INFO:tasks.workunit.client.1.vm08.stdout:0/655: unlink d6/l22 0 2026-03-10T08:55:39.659 INFO:tasks.workunit.client.0.vm05.stdout:3/562: rmdir d9/d8f/d50/d5f 39 2026-03-10T08:55:39.661 INFO:tasks.workunit.client.1.vm08.stdout:4/763: getdents d5/d23/d49 0 2026-03-10T08:55:39.661 INFO:tasks.workunit.client.1.vm08.stdout:4/764: dread - d5/de/d96/fbb zero size 2026-03-10T08:55:39.662 INFO:tasks.workunit.client.1.vm08.stdout:4/765: dread - d5/de/f54 zero size 2026-03-10T08:55:39.662 INFO:tasks.workunit.client.1.vm08.stdout:4/766: chown d5/df5 1 1 2026-03-10T08:55:39.662 INFO:tasks.workunit.client.1.vm08.stdout:4/767: chown d5/d23/d36/d99/db2/d5d/de3 0 1 2026-03-10T08:55:39.664 INFO:tasks.workunit.client.1.vm08.stdout:7/744: rename d0/d11/d1f/d29/d3b/fdd to d0/d11/d1f/d29/d3b/d80/dd3/de1/ff1 0 2026-03-10T08:55:39.665 INFO:tasks.workunit.client.0.vm05.stdout:7/447: mknod d18/d66/c8a 0 2026-03-10T08:55:39.667 INFO:tasks.workunit.client.0.vm05.stdout:0/507: link df/c4c df/d1f/d85/d19/d39/d4d/d50/c94 0 2026-03-10T08:55:39.668 INFO:tasks.workunit.client.1.vm08.stdout:6/719: write d9/dc/d11/d23/d2c/d7a/fd1 [1278023,54340] 0 2026-03-10T08:55:39.668 INFO:tasks.workunit.client.1.vm08.stdout:0/656: truncate d6/dd/d13/d17/fb4 198964 0 2026-03-10T08:55:39.671 INFO:tasks.workunit.client.0.vm05.stdout:9/464: symlink d6/d15/d3c/d4b/d90/d93/l9e 0 2026-03-10T08:55:39.672 INFO:tasks.workunit.client.0.vm05.stdout:3/563: stat d9/d8f/d50/d5f/l9d 0 2026-03-10T08:55:39.676 
INFO:tasks.workunit.client.1.vm08.stdout:8/733: rename d1/d10/d9/f5b to d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10f 0 2026-03-10T08:55:39.679 INFO:tasks.workunit.client.0.vm05.stdout:4/523: write d0/d2e/d42/d45/d4a/d36/d37/f97 [556206,35597] 0 2026-03-10T08:55:39.680 INFO:tasks.workunit.client.0.vm05.stdout:4/524: chown d0/d2e/d42/d45/d4a/d36 0 1 2026-03-10T08:55:39.682 INFO:tasks.workunit.client.0.vm05.stdout:5/437: write d5/df/f31 [758266,99548] 0 2026-03-10T08:55:39.689 INFO:tasks.workunit.client.0.vm05.stdout:6/566: dread d4/d7/ff [0,4194304] 0 2026-03-10T08:55:39.690 INFO:tasks.workunit.client.0.vm05.stdout:0/508: mkdir df/d1f/d95 0 2026-03-10T08:55:39.699 INFO:tasks.workunit.client.1.vm08.stdout:2/780: write d1/da/d10/d42/d93/d1e/f1f [1227373,24781] 0 2026-03-10T08:55:39.703 INFO:tasks.workunit.client.0.vm05.stdout:3/564: dwrite d9/f29 [0,4194304] 0 2026-03-10T08:55:39.704 INFO:tasks.workunit.client.0.vm05.stdout:3/565: write d9/d2b/d3a/f68 [4322018,65373] 0 2026-03-10T08:55:39.704 INFO:tasks.workunit.client.0.vm05.stdout:3/566: chown d9/d2b/d53/f93 300077128 1 2026-03-10T08:55:39.718 INFO:tasks.workunit.client.0.vm05.stdout:7/448: creat d18/d66/d78/f8b x:0 0 0 2026-03-10T08:55:39.722 INFO:tasks.workunit.client.1.vm08.stdout:7/745: truncate d0/d11/d1f/d29/fcf 283098 0 2026-03-10T08:55:39.723 INFO:tasks.workunit.client.1.vm08.stdout:7/746: readlink d0/d11/le9 0 2026-03-10T08:55:39.723 INFO:tasks.workunit.client.0.vm05.stdout:8/502: dwrite d2/db/f19 [4194304,4194304] 0 2026-03-10T08:55:39.732 INFO:tasks.workunit.client.0.vm05.stdout:1/584: write dd/d10/d18/d20/d52/d80/fa5 [999687,99777] 0 2026-03-10T08:55:39.735 INFO:tasks.workunit.client.1.vm08.stdout:2/781: truncate d1/da/f50 4439639 0 2026-03-10T08:55:39.735 INFO:tasks.workunit.client.0.vm05.stdout:7/449: creat d18/d38/d43/f8c x:0 0 0 2026-03-10T08:55:39.736 INFO:tasks.workunit.client.1.vm08.stdout:7/747: mknod d0/d51/cf2 0 2026-03-10T08:55:39.737 INFO:tasks.workunit.client.1.vm08.stdout:7/748: write 
d0/d11/d1f/fb7 [366245,62157] 0 2026-03-10T08:55:39.737 INFO:tasks.workunit.client.0.vm05.stdout:8/503: mknod d2/dd/d74/d78/cb5 0 2026-03-10T08:55:39.738 INFO:tasks.workunit.client.1.vm08.stdout:7/749: truncate d0/d11/d4a/d5e/fed 82389 0 2026-03-10T08:55:39.738 INFO:tasks.workunit.client.1.vm08.stdout:7/750: readlink d0/d11/db2/l83 0 2026-03-10T08:55:39.742 INFO:tasks.workunit.client.0.vm05.stdout:9/465: rename d6/d19/d2a/d4a/f56 to d6/f9f 0 2026-03-10T08:55:39.745 INFO:tasks.workunit.client.0.vm05.stdout:2/434: dwrite d0/d9/f1b [0,4194304] 0 2026-03-10T08:55:39.749 INFO:tasks.workunit.client.1.vm08.stdout:7/751: write d0/d11/d1f/d29/d3b/d80/dd3/de1/ff1 [171187,20581] 0 2026-03-10T08:55:39.752 INFO:tasks.workunit.client.0.vm05.stdout:7/450: dread - d18/d1b/f84 zero size 2026-03-10T08:55:39.755 INFO:tasks.workunit.client.0.vm05.stdout:7/451: dwrite d18/d1b/f50 [0,4194304] 0 2026-03-10T08:55:39.756 INFO:tasks.workunit.client.1.vm08.stdout:5/652: rename d0/d11/d27/d68/d7c/d4b/d4e/c9a to d0/d11/d18/cc5 0 2026-03-10T08:55:39.757 INFO:tasks.workunit.client.1.vm08.stdout:5/653: write d0/d11/d27/d50/fa1 [1838716,22803] 0 2026-03-10T08:55:39.758 INFO:tasks.workunit.client.1.vm08.stdout:1/740: readlink d1/da/d18/d3a/l5f 0 2026-03-10T08:55:39.759 INFO:tasks.workunit.client.1.vm08.stdout:5/654: chown d0/d11/d27/d68/d7c/d4b/d4e/d84 2179160 1 2026-03-10T08:55:39.760 INFO:tasks.workunit.client.1.vm08.stdout:6/720: write d9/d50/f75 [3172738,109913] 0 2026-03-10T08:55:39.761 INFO:tasks.workunit.client.1.vm08.stdout:6/721: chown d9/dc/l33 1038277 1 2026-03-10T08:55:39.762 INFO:tasks.workunit.client.1.vm08.stdout:7/752: fdatasync d0/d11/d1f/d29/d3d/d89/f8b 0 2026-03-10T08:55:39.762 INFO:tasks.workunit.client.1.vm08.stdout:7/753: chown d0/d11/d1f/d29/fba 7 1 2026-03-10T08:55:39.763 INFO:tasks.workunit.client.0.vm05.stdout:0/509: rename df/d1f/d85/d19/d39/d4d/d50 to df/d1f/d85/d2b/d65/d6e/d96 0 2026-03-10T08:55:39.767 INFO:tasks.workunit.client.1.vm08.stdout:1/741: mknod 
d1/da/d18/d3a/da7/cff 0 2026-03-10T08:55:39.767 INFO:tasks.workunit.client.1.vm08.stdout:5/655: truncate d0/d1b/f2f 77360 0 2026-03-10T08:55:39.767 INFO:tasks.workunit.client.0.vm05.stdout:7/452: creat d18/d66/d25/f8d x:0 0 0 2026-03-10T08:55:39.773 INFO:tasks.workunit.client.0.vm05.stdout:7/453: dwrite d18/d66/d25/d2e/d42/f71 [0,4194304] 0 2026-03-10T08:55:39.775 INFO:tasks.workunit.client.0.vm05.stdout:7/454: stat d18/d66/d25/d2e/d42/f71 0 2026-03-10T08:55:39.775 INFO:tasks.workunit.client.0.vm05.stdout:4/525: getdents d0/d2e/d71/d7c 0 2026-03-10T08:55:39.775 INFO:tasks.workunit.client.1.vm08.stdout:7/754: unlink d0/d11/f66 0 2026-03-10T08:55:39.775 INFO:tasks.workunit.client.1.vm08.stdout:7/755: chown d0/d11/d1f/d29 388 1 2026-03-10T08:55:39.776 INFO:tasks.workunit.client.0.vm05.stdout:7/455: truncate d18/d66/d79/f85 532919 0 2026-03-10T08:55:39.776 INFO:tasks.workunit.client.0.vm05.stdout:7/456: chown d18/d38/f82 991393 1 2026-03-10T08:55:39.777 INFO:tasks.workunit.client.0.vm05.stdout:3/567: rename d9/d8f/d50/d5f/d7b/f83 to d9/d4d/d51/d64/f9e 0 2026-03-10T08:55:39.777 INFO:tasks.workunit.client.0.vm05.stdout:7/457: readlink d18/d38/d43/d6e/l89 0 2026-03-10T08:55:39.778 INFO:tasks.workunit.client.1.vm08.stdout:1/742: truncate d1/da/de/f79 1653807 0 2026-03-10T08:55:39.780 INFO:tasks.workunit.client.0.vm05.stdout:4/526: symlink d0/d78/lad 0 2026-03-10T08:55:39.780 INFO:tasks.workunit.client.1.vm08.stdout:2/782: getdents d1/da/d10/d2d/db6 0 2026-03-10T08:55:39.781 INFO:tasks.workunit.client.1.vm08.stdout:9/685: rename d2/f13 to d2/dd/d15/d1e/d25/feb 0 2026-03-10T08:55:39.782 INFO:tasks.workunit.client.1.vm08.stdout:7/756: symlink d0/d14/d43/d9d/lf3 0 2026-03-10T08:55:39.782 INFO:tasks.workunit.client.0.vm05.stdout:3/568: creat d9/d2b/d3a/d43/d6e/f9f x:0 0 0 2026-03-10T08:55:39.782 INFO:tasks.workunit.client.1.vm08.stdout:2/783: dread d1/da/d10/d1b/fac [0,4194304] 0 2026-03-10T08:55:39.783 INFO:tasks.workunit.client.0.vm05.stdout:3/569: write d9/d8f/d55/f6b 
[2500261,63440] 0 2026-03-10T08:55:39.783 INFO:tasks.workunit.client.1.vm08.stdout:7/757: stat d0/d11/d1f/d29/d3d/l50 0 2026-03-10T08:55:39.784 INFO:tasks.workunit.client.1.vm08.stdout:6/722: rmdir d9/d10/de7 0 2026-03-10T08:55:39.784 INFO:tasks.workunit.client.1.vm08.stdout:3/651: rename d4/d15/f78 to d4/d15/d8/d2c/d55/fe0 0 2026-03-10T08:55:39.786 INFO:tasks.workunit.client.1.vm08.stdout:1/743: dwrite d1/da/de/d24/d3d/d40/f42 [0,4194304] 0 2026-03-10T08:55:39.786 INFO:tasks.workunit.client.1.vm08.stdout:5/656: creat d0/d11/d18/fc6 x:0 0 0 2026-03-10T08:55:39.788 INFO:tasks.workunit.client.0.vm05.stdout:3/570: rename d9/d4d/c63 to d9/d2b/ca0 0 2026-03-10T08:55:39.789 INFO:tasks.workunit.client.0.vm05.stdout:3/571: creat d9/d4d/d51/d64/d89/fa1 x:0 0 0 2026-03-10T08:55:39.790 INFO:tasks.workunit.client.1.vm08.stdout:4/768: rename d5/de to d5/d23/d36/d99/db2/d5a/d69/d11b 0 2026-03-10T08:55:39.790 INFO:tasks.workunit.client.1.vm08.stdout:3/652: symlink d4/d6f/le1 0 2026-03-10T08:55:39.790 INFO:tasks.workunit.client.1.vm08.stdout:5/657: fsync d0/d11/d27/d68/d7c/d4b/d4e/d84/f90 0 2026-03-10T08:55:39.791 INFO:tasks.workunit.client.0.vm05.stdout:3/572: stat d9/d2b/d53/c9a 0 2026-03-10T08:55:39.791 INFO:tasks.workunit.client.1.vm08.stdout:7/758: getdents d0/d14/d43/de7 0 2026-03-10T08:55:39.791 INFO:tasks.workunit.client.0.vm05.stdout:3/573: readlink d9/d2b/d2f/l8b 0 2026-03-10T08:55:39.791 INFO:tasks.workunit.client.0.vm05.stdout:0/510: sync 2026-03-10T08:55:39.792 INFO:tasks.workunit.client.0.vm05.stdout:4/527: sync 2026-03-10T08:55:39.792 INFO:tasks.workunit.client.0.vm05.stdout:0/511: fsync df/f79 0 2026-03-10T08:55:39.793 INFO:tasks.workunit.client.0.vm05.stdout:0/512: dread - df/d1f/d85/d19/d39/d74/d67/f80 zero size 2026-03-10T08:55:39.794 INFO:tasks.workunit.client.0.vm05.stdout:0/513: fdatasync df/d1f/d85/d2b/d27/d32/f5d 0 2026-03-10T08:55:39.795 INFO:tasks.workunit.client.1.vm08.stdout:1/744: unlink d1/fe6 0 2026-03-10T08:55:39.796 
INFO:tasks.workunit.client.1.vm08.stdout:1/745: dread - d1/da/de/d24/d3d/d40/d8e/dd2/ffd zero size 2026-03-10T08:55:39.799 INFO:tasks.workunit.client.1.vm08.stdout:2/784: link d1/d97/ldd d1/d43/d5c/de7/l101 0 2026-03-10T08:55:39.800 INFO:tasks.workunit.client.0.vm05.stdout:3/574: truncate d9/f27 4895493 0 2026-03-10T08:55:39.801 INFO:tasks.workunit.client.0.vm05.stdout:3/575: readlink d9/d2b/d2f/l8b 0 2026-03-10T08:55:39.802 INFO:tasks.workunit.client.0.vm05.stdout:4/528: truncate d0/d1d/f22 3682335 0 2026-03-10T08:55:39.802 INFO:tasks.workunit.client.1.vm08.stdout:3/653: mkdir d4/d15/d8/d2c/d9b/d79/d8f/de2 0 2026-03-10T08:55:39.804 INFO:tasks.workunit.client.0.vm05.stdout:0/514: rmdir df/d1f/d85/d2b/d27/d32/d4e 39 2026-03-10T08:55:39.806 INFO:tasks.workunit.client.0.vm05.stdout:3/576: symlink d9/d8f/d50/d5f/la2 0 2026-03-10T08:55:39.807 INFO:tasks.workunit.client.1.vm08.stdout:6/723: link d9/dc/d84/d80/fc1 d9/d50/de9/fed 0 2026-03-10T08:55:39.808 INFO:tasks.workunit.client.1.vm08.stdout:1/746: creat d1/da/d20/d91/d83/f100 x:0 0 0 2026-03-10T08:55:39.808 INFO:tasks.workunit.client.0.vm05.stdout:0/515: truncate df/d59/f3f 1198924 0 2026-03-10T08:55:39.808 INFO:tasks.workunit.client.1.vm08.stdout:2/785: rename d1/da/d10 to d1/da/d10/d42/d93/d1e/d102 22 2026-03-10T08:55:39.809 INFO:tasks.workunit.client.0.vm05.stdout:0/516: write df/d1f/d85/d2b/d27/d32/f5d [5019113,84508] 0 2026-03-10T08:55:39.810 INFO:tasks.workunit.client.0.vm05.stdout:3/577: mkdir d9/d2b/d3a/d43/da3 0 2026-03-10T08:55:39.811 INFO:tasks.workunit.client.0.vm05.stdout:0/517: symlink df/d1f/d85/d2b/l97 0 2026-03-10T08:55:39.812 INFO:tasks.workunit.client.0.vm05.stdout:3/578: mkdir d9/d2b/d3a/da4 0 2026-03-10T08:55:39.812 INFO:tasks.workunit.client.1.vm08.stdout:1/747: mknod d1/dde/c101 0 2026-03-10T08:55:39.813 INFO:tasks.workunit.client.1.vm08.stdout:3/654: read d4/d15/d8/d2c/f32 [779444,32065] 0 2026-03-10T08:55:39.815 INFO:tasks.workunit.client.0.vm05.stdout:0/518: symlink df/d1f/d85/d19/d39/d4d/l98 
0 2026-03-10T08:55:39.817 INFO:tasks.workunit.client.1.vm08.stdout:6/724: truncate d9/d10/d1e/d92/faf 471768 0 2026-03-10T08:55:39.818 INFO:tasks.workunit.client.1.vm08.stdout:4/769: link d5/c15 d5/d23/d36/d99/db2/d5a/d69/c11c 0 2026-03-10T08:55:39.818 INFO:tasks.workunit.client.1.vm08.stdout:6/725: stat d9/dc/d84/fae 0 2026-03-10T08:55:39.820 INFO:tasks.workunit.client.0.vm05.stdout:3/579: dwrite d9/d2b/d53/f9c [0,4194304] 0 2026-03-10T08:55:39.822 INFO:tasks.workunit.client.0.vm05.stdout:0/519: read df/d1f/d85/d2b/d27/d32/f44 [87218,129864] 0 2026-03-10T08:55:39.823 INFO:tasks.workunit.client.0.vm05.stdout:3/580: write d9/f4a [3515331,107503] 0 2026-03-10T08:55:39.825 INFO:tasks.workunit.client.0.vm05.stdout:3/581: fdatasync d9/d2b/d2f/f33 0 2026-03-10T08:55:39.829 INFO:tasks.workunit.client.0.vm05.stdout:4/529: dread d0/d2c/f2f [0,4194304] 0 2026-03-10T08:55:39.841 INFO:tasks.workunit.client.1.vm08.stdout:1/748: rename d1/da/d20/d91/c55 to d1/da/d20/d3f/c102 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:4/770: mknod d5/d23/d36/d99/db2/d5d/dae/ddf/c11d 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:3/655: creat d4/d15/fe3 x:0 0 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:4/771: unlink d5/d5f/l9b 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:6/726: symlink d9/dc/d11/d23/d2c/d81/lee 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:3/656: mknod d4/d15/d8/d2c/d9b/d79/d20/ce4 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:4/772: rmdir d5/d23/d36/d76/d103 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:4/773: write d5/d23/d36/d76/fa5 [1273417,49197] 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.1.vm08.stdout:4/774: creat d5/d23/d36/d99/db2/d5d/de3/df8/f11e x:0 0 0 2026-03-10T08:55:39.880 INFO:tasks.workunit.client.0.vm05.stdout:2/435: dread d0/f8 [0,4194304] 0 2026-03-10T08:55:39.880 
INFO:tasks.workunit.client.0.vm05.stdout:4/530: mkdir d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/d5b/dae 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:4/531: fsync d0/f9 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:2/436: creat d0/d9/d1e/d20/d21/d45/d6c/d6e/d6d/f7b x:0 0 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:2/437: truncate d0/d9/d1e/d20/d21/f23 1419034 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:3/582: rmdir d9/d2b/d3a/da4 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:3/583: unlink d9/fa 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:2/438: truncate d0/d9/d1e/d20/d21/d45/d6c/d6e/f64 1740018 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:3/584: chown d9/d2b/l47 984346 1 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:3/585: unlink d9/f78 0 2026-03-10T08:55:39.881 INFO:tasks.workunit.client.0.vm05.stdout:2/439: creat d0/d9/d1e/d20/f7c x:0 0 0 2026-03-10T08:55:39.886 INFO:tasks.workunit.client.0.vm05.stdout:4/532: dread d0/d2e/d42/d45/d4a/d36/d37/d9c/f61 [0,4194304] 0 2026-03-10T08:55:39.886 INFO:tasks.workunit.client.0.vm05.stdout:4/533: chown d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/c8f 661815610 1 2026-03-10T08:55:39.914 INFO:tasks.workunit.client.1.vm08.stdout:6/727: read d9/d10/fab [3567060,49673] 0 2026-03-10T08:55:39.933 INFO:tasks.workunit.client.1.vm08.stdout:7/759: sync 2026-03-10T08:55:39.954 INFO:tasks.workunit.client.1.vm08.stdout:7/760: getdents d0/d11/d1f/d29/d3b 0 2026-03-10T08:55:39.954 INFO:tasks.workunit.client.1.vm08.stdout:7/761: mkdir d0/d11/d1f/df0/df4 0 2026-03-10T08:55:39.954 INFO:tasks.workunit.client.1.vm08.stdout:7/762: rename d0/d11/d1f/f90 to d0/d11/d4a/d95/ff5 0 2026-03-10T08:55:39.954 INFO:tasks.workunit.client.1.vm08.stdout:7/763: mkdir d0/d11/d1f/d29/d3d/df6 0 2026-03-10T08:55:39.954 INFO:tasks.workunit.client.1.vm08.stdout:7/764: rmdir d0/d11/d1f/d29/d3b/d80/dd3 39 
2026-03-10T08:55:39.985 INFO:tasks.workunit.client.1.vm08.stdout:0/657: dwrite d6/dd/d13/d17/f29 [0,4194304] 0 2026-03-10T08:55:39.988 INFO:tasks.workunit.client.1.vm08.stdout:8/734: dwrite d1/f8 [0,4194304] 0 2026-03-10T08:55:39.994 INFO:tasks.workunit.client.1.vm08.stdout:0/658: rename d6/f25 to d6/dd/d13/d8f/fd9 0 2026-03-10T08:55:39.994 INFO:tasks.workunit.client.1.vm08.stdout:8/735: truncate d1/d10/fad 756599 0 2026-03-10T08:55:39.999 INFO:tasks.workunit.client.1.vm08.stdout:8/736: mknod d1/d10/d9/d4d/d9f/c110 0 2026-03-10T08:55:40.010 INFO:tasks.workunit.client.0.vm05.stdout:3/586: sync 2026-03-10T08:55:40.039 INFO:tasks.workunit.client.1.vm08.stdout:0/659: dread f5 [0,4194304] 0 2026-03-10T08:55:40.041 INFO:tasks.workunit.client.1.vm08.stdout:0/660: creat d6/dd/d13/d17/d1f/d2d/d38/d98/fda x:0 0 0 2026-03-10T08:55:40.045 INFO:tasks.workunit.client.1.vm08.stdout:0/661: dwrite d6/dd/d13/d17/d1f/d2d/d85/d95/fb9 [0,4194304] 0 2026-03-10T08:55:40.055 INFO:tasks.workunit.client.1.vm08.stdout:8/737: sync 2026-03-10T08:55:40.098 INFO:tasks.workunit.client.0.vm05.stdout:6/567: write d4/d7/f34 [2136310,116942] 0 2026-03-10T08:55:40.100 INFO:tasks.workunit.client.0.vm05.stdout:5/438: write d5/d86/d24/d2c/d41/f87 [673313,99309] 0 2026-03-10T08:55:40.104 INFO:tasks.workunit.client.0.vm05.stdout:1/585: dwrite dd/f16 [0,4194304] 0 2026-03-10T08:55:40.109 INFO:tasks.workunit.client.0.vm05.stdout:8/504: write d2/db/f22 [4185826,53913] 0 2026-03-10T08:55:40.114 INFO:tasks.workunit.client.0.vm05.stdout:8/505: dread - d2/db/d1f/f84 zero size 2026-03-10T08:55:40.115 INFO:tasks.workunit.client.0.vm05.stdout:3/587: read d9/d8f/d55/f6b [573974,56408] 0 2026-03-10T08:55:40.120 INFO:tasks.workunit.client.0.vm05.stdout:5/439: link d5/d86/f1b d5/d86/d24/d2c/d41/d74/f9f 0 2026-03-10T08:55:40.129 INFO:tasks.workunit.client.0.vm05.stdout:1/586: creat dd/d10/d18/d2d/d51/d58/d71/d62/fd4 x:0 0 0 2026-03-10T08:55:40.135 INFO:tasks.workunit.client.0.vm05.stdout:6/568: symlink 
d4/d2d/d51/d62/da9/lb9 0 2026-03-10T08:55:40.137 INFO:tasks.workunit.client.0.vm05.stdout:9/466: truncate d6/f7 6311210 0 2026-03-10T08:55:40.139 INFO:tasks.workunit.client.0.vm05.stdout:5/440: write d5/d86/d39/f77 [4168974,64353] 0 2026-03-10T08:55:40.147 INFO:tasks.workunit.client.0.vm05.stdout:6/569: sync 2026-03-10T08:55:40.151 INFO:tasks.workunit.client.0.vm05.stdout:8/506: creat d2/dd/d2c/d2e/d31/d3e/d5d/d9d/db3/fb6 x:0 0 0 2026-03-10T08:55:40.153 INFO:tasks.workunit.client.0.vm05.stdout:7/458: write d18/d66/d25/d2e/f48 [675913,94667] 0 2026-03-10T08:55:40.156 INFO:tasks.workunit.client.1.vm08.stdout:9/686: write d2/dd/d15/d1e/d39/d4e/fcf [991116,894] 0 2026-03-10T08:55:40.156 INFO:tasks.workunit.client.0.vm05.stdout:7/459: sync 2026-03-10T08:55:40.161 INFO:tasks.workunit.client.0.vm05.stdout:3/588: truncate d9/d2b/f3b 347604 0 2026-03-10T08:55:40.165 INFO:tasks.workunit.client.0.vm05.stdout:9/467: unlink d6/d19/d21/f31 0 2026-03-10T08:55:40.175 INFO:tasks.workunit.client.0.vm05.stdout:6/570: dread d4/f30 [0,4194304] 0 2026-03-10T08:55:40.176 INFO:tasks.workunit.client.0.vm05.stdout:9/468: rmdir d6/d19/d2a/d8d 39 2026-03-10T08:55:40.179 INFO:tasks.workunit.client.0.vm05.stdout:9/469: chown d6/d15/d3c/d4b/f76 35 1 2026-03-10T08:55:40.180 INFO:tasks.workunit.client.0.vm05.stdout:5/441: symlink d5/d86/la0 0 2026-03-10T08:55:40.187 INFO:tasks.workunit.client.0.vm05.stdout:3/589: rename d9/d2b/d3a/c54 to d9/d8f/d50/d5f/ca5 0 2026-03-10T08:55:40.192 INFO:tasks.workunit.client.1.vm08.stdout:5/658: dwrite d0/d11/d27/d68/d7c/d4b/d4e/d84/fbb [0,4194304] 0 2026-03-10T08:55:40.201 INFO:tasks.workunit.client.0.vm05.stdout:9/470: dread d6/f8 [0,4194304] 0 2026-03-10T08:55:40.219 INFO:tasks.workunit.client.1.vm08.stdout:2/786: dwrite d1/da/d10/d42/f89 [0,4194304] 0 2026-03-10T08:55:40.227 INFO:tasks.workunit.client.0.vm05.stdout:9/471: symlink d6/la0 0 2026-03-10T08:55:40.227 INFO:tasks.workunit.client.0.vm05.stdout:5/442: getdents d5/d86/d21/d71 0 2026-03-10T08:55:40.227 
INFO:tasks.workunit.client.0.vm05.stdout:5/443: stat d5/d86/d21/l63 0 2026-03-10T08:55:40.236 INFO:tasks.workunit.client.0.vm05.stdout:9/472: fdatasync d6/fb 0 2026-03-10T08:55:40.237 INFO:tasks.workunit.client.1.vm08.stdout:5/659: write d0/d11/d18/faf [796280,3614] 0 2026-03-10T08:55:40.244 INFO:tasks.workunit.client.1.vm08.stdout:2/787: creat d1/dd5/f103 x:0 0 0 2026-03-10T08:55:40.246 INFO:tasks.workunit.client.0.vm05.stdout:5/444: symlink d5/d86/la1 0 2026-03-10T08:55:40.246 INFO:tasks.workunit.client.0.vm05.stdout:9/473: symlink d6/d19/d2a/d8d/la1 0 2026-03-10T08:55:40.252 INFO:tasks.workunit.client.0.vm05.stdout:9/474: mkdir d6/d12/d3a/da2 0 2026-03-10T08:55:40.255 INFO:tasks.workunit.client.1.vm08.stdout:5/660: creat d0/fc7 x:0 0 0 2026-03-10T08:55:40.258 INFO:tasks.workunit.client.0.vm05.stdout:9/475: rmdir d6/d12/d3a 39 2026-03-10T08:55:40.263 INFO:tasks.workunit.client.0.vm05.stdout:5/445: dread d5/d86/d66/f7b [0,4194304] 0 2026-03-10T08:55:40.264 INFO:tasks.workunit.client.1.vm08.stdout:2/788: dread d1/d43/f5d [0,4194304] 0 2026-03-10T08:55:40.266 INFO:tasks.workunit.client.1.vm08.stdout:5/661: rename d0/d11/d27/d68/c9f to d0/d11/d27/d68/dc1/cc8 0 2026-03-10T08:55:40.268 INFO:tasks.workunit.client.1.vm08.stdout:1/749: write d1/da/de/f12 [4604786,32698] 0 2026-03-10T08:55:40.272 INFO:tasks.workunit.client.0.vm05.stdout:0/520: truncate df/f17 2485115 0 2026-03-10T08:55:40.278 INFO:tasks.workunit.client.0.vm05.stdout:5/446: creat d5/df/fa2 x:0 0 0 2026-03-10T08:55:40.280 INFO:tasks.workunit.client.1.vm08.stdout:5/662: chown d0/d11/d18/f23 568 1 2026-03-10T08:55:40.283 INFO:tasks.workunit.client.0.vm05.stdout:9/476: link d6/d15/d35/c66 d6/d19/ca3 0 2026-03-10T08:55:40.283 INFO:tasks.workunit.client.0.vm05.stdout:9/477: dread - d6/d15/d3c/d4b/f76 zero size 2026-03-10T08:55:40.284 INFO:tasks.workunit.client.0.vm05.stdout:9/478: fsync d6/d15/f96 0 2026-03-10T08:55:40.287 INFO:tasks.workunit.client.0.vm05.stdout:9/479: chown d6/d19/d21/c75 416052 1 
2026-03-10T08:55:40.288 INFO:tasks.workunit.client.1.vm08.stdout:1/750: creat d1/da/de/d24/d3d/d40/f103 x:0 0 0 2026-03-10T08:55:40.289 INFO:tasks.workunit.client.0.vm05.stdout:9/480: symlink d6/d15/d35/la4 0 2026-03-10T08:55:40.290 INFO:tasks.workunit.client.0.vm05.stdout:9/481: write d6/d19/d2c/f78 [138985,6665] 0 2026-03-10T08:55:40.292 INFO:tasks.workunit.client.1.vm08.stdout:5/663: dread d0/fe [0,4194304] 0 2026-03-10T08:55:40.293 INFO:tasks.workunit.client.0.vm05.stdout:5/447: sync 2026-03-10T08:55:40.299 INFO:tasks.workunit.client.0.vm05.stdout:9/482: creat d6/d12/d3a/d48/fa5 x:0 0 0 2026-03-10T08:55:40.300 INFO:tasks.workunit.client.0.vm05.stdout:0/521: dread df/d1f/d85/d19/d5b/f76 [0,4194304] 0 2026-03-10T08:55:40.302 INFO:tasks.workunit.client.1.vm08.stdout:1/751: creat d1/da/de/d24/d26/d5d/f104 x:0 0 0 2026-03-10T08:55:40.302 INFO:tasks.workunit.client.1.vm08.stdout:5/664: creat d0/d11/d18/d52/db9/fc9 x:0 0 0 2026-03-10T08:55:40.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:40 vm08.local ceph-mon[57559]: pgmap v159: 65 pgs: 65 active+clean; 2.6 GiB data, 9.1 GiB used, 111 GiB / 120 GiB avail; 49 MiB/s rd, 124 MiB/s wr, 280 op/s 2026-03-10T08:55:40.308 INFO:tasks.workunit.client.0.vm05.stdout:0/522: creat df/d1f/d85/d19/f99 x:0 0 0 2026-03-10T08:55:40.309 INFO:tasks.workunit.client.0.vm05.stdout:9/483: creat d6/d27/fa6 x:0 0 0 2026-03-10T08:55:40.312 INFO:tasks.workunit.client.0.vm05.stdout:9/484: creat d6/d19/d2a/d4a/d8c/fa7 x:0 0 0 2026-03-10T08:55:40.316 INFO:tasks.workunit.client.1.vm08.stdout:5/665: creat d0/d11/d18/fca x:0 0 0 2026-03-10T08:55:40.321 INFO:tasks.workunit.client.0.vm05.stdout:0/523: dwrite df/d1f/d85/f29 [0,4194304] 0 2026-03-10T08:55:40.325 INFO:tasks.workunit.client.0.vm05.stdout:9/485: fdatasync d6/d15/f25 0 2026-03-10T08:55:40.332 INFO:tasks.workunit.client.0.vm05.stdout:9/486: dwrite d6/d19/d21/f7d [0,4194304] 0 2026-03-10T08:55:40.333 INFO:tasks.workunit.client.0.vm05.stdout:0/524: creat df/d1f/d85/d2b/f9a x:0 0 0 
2026-03-10T08:55:40.333 INFO:tasks.workunit.client.0.vm05.stdout:9/487: write d6/d19/d2a/d4a/d8c/fa7 [513296,5787] 0 2026-03-10T08:55:40.334 INFO:tasks.workunit.client.0.vm05.stdout:9/488: chown d6/d19/d2a/d4a/l79 0 1 2026-03-10T08:55:40.335 INFO:tasks.workunit.client.1.vm08.stdout:3/657: write d4/d15/f1a [911729,75892] 0 2026-03-10T08:55:40.335 INFO:tasks.workunit.client.1.vm08.stdout:5/666: unlink d0/c44 0 2026-03-10T08:55:40.335 INFO:tasks.workunit.client.1.vm08.stdout:4/775: write d5/d23/d49/d8f/f10c [236995,21730] 0 2026-03-10T08:55:40.335 INFO:tasks.workunit.client.0.vm05.stdout:0/525: chown df/d59/f3f 847771 1 2026-03-10T08:55:40.335 INFO:tasks.workunit.client.0.vm05.stdout:0/526: readlink df/l77 0 2026-03-10T08:55:40.337 INFO:tasks.workunit.client.1.vm08.stdout:5/667: rename d0/d11/d27/d68/d7c to d0/d11/d27/d68/d7c/d8e/dcb 22 2026-03-10T08:55:40.343 INFO:tasks.workunit.client.1.vm08.stdout:5/668: dwrite d0/d11/d18/fc6 [0,4194304] 0 2026-03-10T08:55:40.365 INFO:tasks.workunit.client.1.vm08.stdout:5/669: creat d0/d1b/d67/d80/fcc x:0 0 0 2026-03-10T08:55:40.368 INFO:tasks.workunit.client.1.vm08.stdout:4/776: getdents d5/d23/d36 0 2026-03-10T08:55:40.368 INFO:tasks.workunit.client.1.vm08.stdout:4/777: dread - d5/d23/d36/d76/f100 zero size 2026-03-10T08:55:40.370 INFO:tasks.workunit.client.1.vm08.stdout:5/670: creat d0/d11/d27/d68/d7c/d4b/d87/db5/fcd x:0 0 0 2026-03-10T08:55:40.373 INFO:tasks.workunit.client.1.vm08.stdout:4/778: creat d5/d23/d36/d99/db2/d5a/d69/d11b/def/df2/f11f x:0 0 0 2026-03-10T08:55:40.375 INFO:tasks.workunit.client.1.vm08.stdout:4/779: mkdir d5/d23/d36/d99/dc6/dc8/d120 0 2026-03-10T08:55:40.449 INFO:tasks.workunit.client.0.vm05.stdout:2/440: write d0/d9/d1e/d20/d21/f44 [1922691,116403] 0 2026-03-10T08:55:40.451 INFO:tasks.workunit.client.0.vm05.stdout:2/441: dread - d0/d9/f4e zero size 2026-03-10T08:55:40.453 INFO:tasks.workunit.client.0.vm05.stdout:2/442: creat d0/d9/d1e/d20/d21/d45/d6c/d6e/f7d x:0 0 0 2026-03-10T08:55:40.453 
INFO:tasks.workunit.client.0.vm05.stdout:2/443: chown d0/f8 724 1 2026-03-10T08:55:40.454 INFO:tasks.workunit.client.0.vm05.stdout:2/444: chown d0/cc 11 1 2026-03-10T08:55:40.455 INFO:tasks.workunit.client.0.vm05.stdout:2/445: rmdir d0/d9/d1e/d20/d21/d45 39 2026-03-10T08:55:40.457 INFO:tasks.workunit.client.0.vm05.stdout:2/446: mkdir d0/d9/d1e/d20/d21/d45/d6c/d6e/d7e 0 2026-03-10T08:55:40.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:40 vm05.local ceph-mon[49713]: pgmap v159: 65 pgs: 65 active+clean; 2.6 GiB data, 9.1 GiB used, 111 GiB / 120 GiB avail; 49 MiB/s rd, 124 MiB/s wr, 280 op/s 2026-03-10T08:55:40.474 INFO:tasks.workunit.client.0.vm05.stdout:4/534: dwrite d0/d2c/d6a/f75 [0,4194304] 0 2026-03-10T08:55:40.474 INFO:tasks.workunit.client.1.vm08.stdout:6/728: write d9/d50/fa3 [258100,41078] 0 2026-03-10T08:55:40.476 INFO:tasks.workunit.client.1.vm08.stdout:6/729: creat d9/d50/d95/fef x:0 0 0 2026-03-10T08:55:40.477 INFO:tasks.workunit.client.0.vm05.stdout:4/535: chown d0/d2e/d42/d45/d4a/d36/d37/d9c/f61 7 1 2026-03-10T08:55:40.480 INFO:tasks.workunit.client.1.vm08.stdout:6/730: rename d9/dc/d11/d23/l98 to d9/d10/d1e/d32/lf0 0 2026-03-10T08:55:40.534 INFO:tasks.workunit.client.0.vm05.stdout:9/489: read d6/d12/f34 [1000403,114204] 0 2026-03-10T08:55:40.536 INFO:tasks.workunit.client.1.vm08.stdout:7/765: write d0/d11/d1f/d29/d3b/f9f [811978,26063] 0 2026-03-10T08:55:40.536 INFO:tasks.workunit.client.0.vm05.stdout:9/490: write d6/d15/f96 [762907,87563] 0 2026-03-10T08:55:40.541 INFO:tasks.workunit.client.0.vm05.stdout:9/491: creat d6/d12/d3a/d48/fa8 x:0 0 0 2026-03-10T08:55:40.553 INFO:tasks.workunit.client.1.vm08.stdout:0/662: dwrite d6/dd/d13/d17/d1f/d20/d2f/d24/fab [0,4194304] 0 2026-03-10T08:55:40.555 INFO:tasks.workunit.client.0.vm05.stdout:9/492: dwrite d6/fb [0,4194304] 0 2026-03-10T08:55:40.561 INFO:tasks.workunit.client.1.vm08.stdout:8/738: dwrite d1/d10/d9/dd/d13/fa4 [0,4194304] 0 2026-03-10T08:55:40.563 
INFO:tasks.workunit.client.1.vm08.stdout:8/739: chown d1/d10/d9/d4d/d9f/c110 22 1 2026-03-10T08:55:40.564 INFO:tasks.workunit.client.0.vm05.stdout:9/493: write d6/d12/d3a/d48/fa8 [416056,94239] 0 2026-03-10T08:55:40.565 INFO:tasks.workunit.client.0.vm05.stdout:9/494: stat d6/d15/f86 0 2026-03-10T08:55:40.565 INFO:tasks.workunit.client.0.vm05.stdout:9/495: chown d6/d19/d2c/f54 7 1 2026-03-10T08:55:40.573 INFO:tasks.workunit.client.0.vm05.stdout:9/496: fsync d6/d12/d3a/f62 0 2026-03-10T08:55:40.583 INFO:tasks.workunit.client.1.vm08.stdout:8/740: dread d1/d10/d9/dd/d25/d27/d44/d21/d51/f72 [0,4194304] 0 2026-03-10T08:55:40.583 INFO:tasks.workunit.client.0.vm05.stdout:9/497: creat d6/d12/d3a/fa9 x:0 0 0 2026-03-10T08:55:40.585 INFO:tasks.workunit.client.1.vm08.stdout:8/741: symlink d1/d10/d9/dd/d25/d27/d44/d97/d7d/l111 0 2026-03-10T08:55:40.592 INFO:tasks.workunit.client.1.vm08.stdout:8/742: mkdir d1/d10/d9/d4d/d112 0 2026-03-10T08:55:40.594 INFO:tasks.workunit.client.1.vm08.stdout:8/743: creat d1/d10/d9/dd/d25/d27/d44/d21/d5f/f113 x:0 0 0 2026-03-10T08:55:40.611 INFO:tasks.workunit.client.0.vm05.stdout:5/448: fsync d5/d86/f1b 0 2026-03-10T08:55:40.628 INFO:tasks.workunit.client.0.vm05.stdout:1/587: dwrite dd/f9e [0,4194304] 0 2026-03-10T08:55:40.629 INFO:tasks.workunit.client.0.vm05.stdout:1/588: write dd/d10/fb5 [106007,25230] 0 2026-03-10T08:55:40.634 INFO:tasks.workunit.client.0.vm05.stdout:1/589: rename dd/d55 to dd/d10/d18/dd5 0 2026-03-10T08:55:40.635 INFO:tasks.workunit.client.0.vm05.stdout:1/590: creat dd/d10/d18/d20/fd6 x:0 0 0 2026-03-10T08:55:40.638 INFO:tasks.workunit.client.0.vm05.stdout:5/449: sync 2026-03-10T08:55:40.640 INFO:tasks.workunit.client.0.vm05.stdout:1/591: creat dd/d10/d18/d2d/fd7 x:0 0 0 2026-03-10T08:55:40.642 INFO:tasks.workunit.client.0.vm05.stdout:5/450: mknod d5/d3a/d43/ca3 0 2026-03-10T08:55:40.648 INFO:tasks.workunit.client.1.vm08.stdout:0/663: dread d6/dd/d13/d17/d1f/d2d/d85/d93/f7e [0,4194304] 0 2026-03-10T08:55:40.649 
INFO:tasks.workunit.client.0.vm05.stdout:7/460: write f9 [2835,91829] 0 2026-03-10T08:55:40.656 INFO:tasks.workunit.client.0.vm05.stdout:1/592: dwrite dd/d21/d3f/f57 [0,4194304] 0 2026-03-10T08:55:40.663 INFO:tasks.workunit.client.0.vm05.stdout:8/507: dwrite d2/dd/d2c/d2e/d31/f89 [4194304,4194304] 0 2026-03-10T08:55:40.663 INFO:tasks.workunit.client.0.vm05.stdout:6/571: dwrite d4/d7/f4d [0,4194304] 0 2026-03-10T08:55:40.663 INFO:tasks.workunit.client.0.vm05.stdout:3/590: dwrite d9/d4d/f5e [0,4194304] 0 2026-03-10T08:55:40.676 INFO:tasks.workunit.client.1.vm08.stdout:9/687: dwrite d2/d41/d4c/d66/fad [0,4194304] 0 2026-03-10T08:55:40.688 INFO:tasks.workunit.client.0.vm05.stdout:7/461: unlink c11 0 2026-03-10T08:55:40.688 INFO:tasks.workunit.client.0.vm05.stdout:3/591: dwrite d9/d2b/d3a/d43/d71/f91 [0,4194304] 0 2026-03-10T08:55:40.697 INFO:tasks.workunit.client.1.vm08.stdout:9/688: mknod d2/d41/d4c/dd2/cec 0 2026-03-10T08:55:40.706 INFO:tasks.workunit.client.0.vm05.stdout:6/572: fdatasync d4/d7/d10/f65 0 2026-03-10T08:55:40.706 INFO:tasks.workunit.client.1.vm08.stdout:9/689: chown d2/dd/d15/d1e/d25/f5f 7 1 2026-03-10T08:55:40.729 INFO:tasks.workunit.client.0.vm05.stdout:7/462: rmdir d18/d66/d25/d2e/d32 39 2026-03-10T08:55:40.744 INFO:tasks.workunit.client.1.vm08.stdout:9/690: getdents d2/dd/d15/d1e/d24 0 2026-03-10T08:55:40.744 INFO:tasks.workunit.client.0.vm05.stdout:8/508: dread d2/dd/d2c/d2e/f3b [4194304,4194304] 0 2026-03-10T08:55:40.744 INFO:tasks.workunit.client.0.vm05.stdout:3/592: fdatasync d9/d4d/f52 0 2026-03-10T08:55:40.744 INFO:tasks.workunit.client.0.vm05.stdout:8/509: mknod d2/dd/d2c/d2e/d31/d4c/d63/cb7 0 2026-03-10T08:55:40.746 INFO:tasks.workunit.client.0.vm05.stdout:7/463: fdatasync d18/d66/d25/d2e/f49 0 2026-03-10T08:55:40.751 INFO:tasks.workunit.client.1.vm08.stdout:9/691: dwrite d2/dd/d15/d1e/d25/dae/f8f [0,4194304] 0 2026-03-10T08:55:40.758 INFO:tasks.workunit.client.0.vm05.stdout:8/510: unlink d2/dd/d2c/d2e/d31/d3e/c48 0 2026-03-10T08:55:40.790 
INFO:tasks.workunit.client.1.vm08.stdout:9/692: dread d2/f9f [0,4194304] 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.1.vm08.stdout:9/693: stat d2/dd/d15/d1e/d25/d32/c6f 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.1.vm08.stdout:9/694: symlink d2/d41/d4c/de2/led 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:8/511: fdatasync d2/dd/d2c/d2e/d31/d4f/d80/f9f 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:3/593: creat d9/d8f/fa6 x:0 0 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:3/594: write d9/d2b/d2f/f33 [4197563,85935] 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:7/464: dwrite d18/d66/d25/d2e/d42/d53/f7e [0,4194304] 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:3/595: readlink d9/l76 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:3/596: dwrite d9/ff [0,4194304] 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:7/465: rename d18/d66/d25/l6a to d18/d66/d79/l8e 0 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:7/466: read - d18/d66/d78/f8b zero size 2026-03-10T08:55:40.790 INFO:tasks.workunit.client.0.vm05.stdout:3/597: dread - d9/d2b/d2f/f5d zero size 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/467: dread - d18/d66/d25/d2e/d42/d74/f7b zero size 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/468: rmdir d18/d66/d25/d2e 39 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/469: rename d18 to d18/d66/d25/d2e/d2f/d8f 22 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/470: dread d18/d66/d25/d2e/d42/d53/f7e [0,4194304] 0 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/471: chown d18/d66/d25/d2e/d32/d7f 1710 1 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/472: read d18/f4a [2642524,85332] 0 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/473: dread 
d18/d66/d25/d2e/d42/d53/f7e [0,4194304] 0 2026-03-10T08:55:40.791 INFO:tasks.workunit.client.0.vm05.stdout:7/474: dwrite d18/d1b/f30 [0,4194304] 0 2026-03-10T08:55:40.794 INFO:tasks.workunit.client.0.vm05.stdout:7/475: rmdir d18/d38/d43 39 2026-03-10T08:55:40.794 INFO:tasks.workunit.client.0.vm05.stdout:7/476: write d18/d38/f82 [501513,61108] 0 2026-03-10T08:55:40.802 INFO:tasks.workunit.client.0.vm05.stdout:7/477: link c14 d18/d38/d43/c90 0 2026-03-10T08:55:40.803 INFO:tasks.workunit.client.0.vm05.stdout:7/478: chown d18/c19 124259866 1 2026-03-10T08:55:40.830 INFO:tasks.workunit.client.0.vm05.stdout:8/512: dread d2/db/d47/f51 [0,4194304] 0 2026-03-10T08:55:40.984 INFO:tasks.workunit.client.1.vm08.stdout:9/695: dread d2/d41/d4c/d66/d99/fa0 [0,4194304] 0 2026-03-10T08:55:40.988 INFO:tasks.workunit.client.1.vm08.stdout:9/696: creat d2/d54/d8e/da6/dd0/dc8/fee x:0 0 0 2026-03-10T08:55:40.991 INFO:tasks.workunit.client.1.vm08.stdout:9/697: dread d2/dd/d15/d1e/d25/d32/d5c/dc2/fcb [0,4194304] 0 2026-03-10T08:55:40.994 INFO:tasks.workunit.client.1.vm08.stdout:9/698: symlink d2/dd/d15/de0/lef 0 2026-03-10T08:55:40.995 INFO:tasks.workunit.client.1.vm08.stdout:9/699: mknod d2/dd/d61/cf0 0 2026-03-10T08:55:40.997 INFO:tasks.workunit.client.1.vm08.stdout:9/700: stat d2/d41/d4c/de2 0 2026-03-10T08:55:40.997 INFO:tasks.workunit.client.1.vm08.stdout:2/789: dwrite d1/da/d10/f7e [0,4194304] 0 2026-03-10T08:55:41.000 INFO:tasks.workunit.client.1.vm08.stdout:9/701: stat d2/dd/d15/f22 0 2026-03-10T08:55:41.000 INFO:tasks.workunit.client.1.vm08.stdout:9/702: chown d2/dd/d61/f67 976204 1 2026-03-10T08:55:41.006 INFO:tasks.workunit.client.1.vm08.stdout:2/790: creat d1/da/d10/d42/d93/d22/f104 x:0 0 0 2026-03-10T08:55:41.006 INFO:tasks.workunit.client.1.vm08.stdout:9/703: rmdir d2/dd 39 2026-03-10T08:55:41.006 INFO:tasks.workunit.client.1.vm08.stdout:2/791: write d1/dd5/fe3 [357263,53373] 0 2026-03-10T08:55:41.009 INFO:tasks.workunit.client.1.vm08.stdout:9/704: mkdir d2/dd/d15/d4f/df1 0 
2026-03-10T08:55:41.010 INFO:tasks.workunit.client.1.vm08.stdout:9/705: write d2/dd/d15/d1e/d24/f3f [4791779,110479] 0 2026-03-10T08:55:41.010 INFO:tasks.workunit.client.1.vm08.stdout:2/792: creat d1/da/d10/d42/d93/f105 x:0 0 0 2026-03-10T08:55:41.011 INFO:tasks.workunit.client.1.vm08.stdout:2/793: readlink d1/da/d78/l77 0 2026-03-10T08:55:41.012 INFO:tasks.workunit.client.1.vm08.stdout:2/794: creat d1/d5b/dc5/f106 x:0 0 0 2026-03-10T08:55:41.013 INFO:tasks.workunit.client.1.vm08.stdout:9/706: rename d2/d41/d4c/d66/d99 to d2/dd/d15/d1e/d39/d69/de4/df2 0 2026-03-10T08:55:41.014 INFO:tasks.workunit.client.1.vm08.stdout:9/707: stat d2/d41/d4c 0 2026-03-10T08:55:41.016 INFO:tasks.workunit.client.1.vm08.stdout:9/708: creat d2/d41/ff3 x:0 0 0 2026-03-10T08:55:41.057 INFO:tasks.workunit.client.1.vm08.stdout:9/709: sync 2026-03-10T08:55:41.058 INFO:tasks.workunit.client.1.vm08.stdout:9/710: write d2/dd/faf [802125,125371] 0 2026-03-10T08:55:41.061 INFO:tasks.workunit.client.1.vm08.stdout:9/711: creat d2/dd/d15/d1e/d39/d4e/ff4 x:0 0 0 2026-03-10T08:55:41.062 INFO:tasks.workunit.client.1.vm08.stdout:9/712: symlink d2/d54/d8e/da6/dd0/dc8/lf5 0 2026-03-10T08:55:41.117 INFO:tasks.workunit.client.0.vm05.stdout:8/513: read d2/dd/d2c/d2e/f64 [1331126,32678] 0 2026-03-10T08:55:41.121 INFO:tasks.workunit.client.0.vm05.stdout:8/514: dwrite d2/db/d47/f51 [0,4194304] 0 2026-03-10T08:55:41.124 INFO:tasks.workunit.client.0.vm05.stdout:8/515: truncate d2/dd/d2c/d2e/f7d 222974 0 2026-03-10T08:55:41.125 INFO:tasks.workunit.client.0.vm05.stdout:8/516: mkdir d2/dd/d2c/d2e/d31/d4c/d63/db8 0 2026-03-10T08:55:41.129 INFO:tasks.workunit.client.0.vm05.stdout:8/517: write d2/dd/d2c/d2e/d31/d3e/d5d/d9d/db3/fb6 [793358,93353] 0 2026-03-10T08:55:41.135 INFO:tasks.workunit.client.0.vm05.stdout:8/518: symlink d2/dd/d2c/d2e/d31/lb9 0 2026-03-10T08:55:41.140 INFO:tasks.workunit.client.0.vm05.stdout:8/519: dread d2/db/d1f/f44 [0,4194304] 0 2026-03-10T08:55:41.141 
INFO:tasks.workunit.client.1.vm08.stdout:1/752: write d1/da/fd6 [101016,29450] 0 2026-03-10T08:55:41.143 INFO:tasks.workunit.client.0.vm05.stdout:8/520: fsync d2/db/f9a 0 2026-03-10T08:55:41.146 INFO:tasks.workunit.client.0.vm05.stdout:0/527: write df/f1d [908610,119452] 0 2026-03-10T08:55:41.146 INFO:tasks.workunit.client.0.vm05.stdout:0/528: stat df/d1f/d85/d19/d5b/f72 0 2026-03-10T08:55:41.147 INFO:tasks.workunit.client.0.vm05.stdout:9/498: unlink d6/d27/c98 0 2026-03-10T08:55:41.149 INFO:tasks.workunit.client.0.vm05.stdout:8/521: mknod d2/dd/d74/cba 0 2026-03-10T08:55:41.151 INFO:tasks.workunit.client.1.vm08.stdout:1/753: unlink d1/da/d18/d3a/f3c 0 2026-03-10T08:55:41.151 INFO:tasks.workunit.client.0.vm05.stdout:0/529: dread - df/d1f/d85/d19/d39/f6f zero size 2026-03-10T08:55:41.158 INFO:tasks.workunit.client.0.vm05.stdout:8/522: creat d2/dd/d2c/d2e/fbb x:0 0 0 2026-03-10T08:55:41.159 INFO:tasks.workunit.client.0.vm05.stdout:0/530: creat df/d1f/d85/d2b/d27/d32/f9b x:0 0 0 2026-03-10T08:55:41.160 INFO:tasks.workunit.client.0.vm05.stdout:8/523: creat d2/db/d1f/d67/fbc x:0 0 0 2026-03-10T08:55:41.162 INFO:tasks.workunit.client.0.vm05.stdout:0/531: link df/d1f/f21 df/d1f/d85/d19/d47/d84/d8a/f9c 0 2026-03-10T08:55:41.163 INFO:tasks.workunit.client.1.vm08.stdout:3/658: dwrite d4/d15/d8/d2c/d9b/f4d [0,4194304] 0 2026-03-10T08:55:41.165 INFO:tasks.workunit.client.0.vm05.stdout:0/532: truncate df/d1f/d85/f53 3495528 0 2026-03-10T08:55:41.166 INFO:tasks.workunit.client.0.vm05.stdout:8/524: creat d2/fbd x:0 0 0 2026-03-10T08:55:41.168 INFO:tasks.workunit.client.1.vm08.stdout:3/659: creat d4/d15/d8/d2c/d9b/d79/d20/fe5 x:0 0 0 2026-03-10T08:55:41.170 INFO:tasks.workunit.client.1.vm08.stdout:3/660: readlink d4/d15/d8/d1d/d4f/lc8 0 2026-03-10T08:55:41.170 INFO:tasks.workunit.client.1.vm08.stdout:5/671: dwrite d0/d11/d27/d50/f9d [0,4194304] 0 2026-03-10T08:55:41.170 INFO:tasks.workunit.client.1.vm08.stdout:3/661: stat d4/l74 0 2026-03-10T08:55:41.172 
INFO:tasks.workunit.client.1.vm08.stdout:4/780: dwrite d5/d23/d49/f4d [0,4194304] 0 2026-03-10T08:55:41.173 INFO:tasks.workunit.client.0.vm05.stdout:2/447: dwrite d0/d9/d1e/d20/f22 [4194304,4194304] 0 2026-03-10T08:55:41.185 INFO:tasks.workunit.client.1.vm08.stdout:4/781: dread d5/d23/d36/d76/fa5 [0,4194304] 0 2026-03-10T08:55:41.191 INFO:tasks.workunit.client.1.vm08.stdout:5/672: creat d0/d11/d27/d68/dc1/fce x:0 0 0 2026-03-10T08:55:41.201 INFO:tasks.workunit.client.0.vm05.stdout:2/448: mkdir d0/d9/d7f 0 2026-03-10T08:55:41.203 INFO:tasks.workunit.client.1.vm08.stdout:5/673: truncate d0/d11/d27/d68/d7c/f42 4797206 0 2026-03-10T08:55:41.207 INFO:tasks.workunit.client.1.vm08.stdout:5/674: rename d0/d11/d27/d68/d7c/d4b/d4e/c74 to d0/d1b/d67/d7a/ccf 0 2026-03-10T08:55:41.220 INFO:tasks.workunit.client.1.vm08.stdout:5/675: mknod d0/d1b/cd0 0 2026-03-10T08:55:41.220 INFO:tasks.workunit.client.1.vm08.stdout:5/676: rename d0/d1b/cd0 to d0/d1b/d67/cd1 0 2026-03-10T08:55:41.220 INFO:tasks.workunit.client.1.vm08.stdout:5/677: chown d0/d11/d18/d52/c66 55 1 2026-03-10T08:55:41.220 INFO:tasks.workunit.client.1.vm08.stdout:5/678: chown d0/d11/d18/f23 4438 1 2026-03-10T08:55:41.226 INFO:tasks.workunit.client.0.vm05.stdout:2/449: dread d0/d9/d1e/d20/d24/f29 [0,4194304] 0 2026-03-10T08:55:41.228 INFO:tasks.workunit.client.0.vm05.stdout:2/450: truncate d0/d9/f19 1830182 0 2026-03-10T08:55:41.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:41 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:41.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:41 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:41.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:41 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:55:41.276 
INFO:tasks.workunit.client.0.vm05.stdout:4/536: dwrite d0/d2e/d42/d45/d4a/d36/f88 [0,4194304] 0 2026-03-10T08:55:41.281 INFO:tasks.workunit.client.0.vm05.stdout:4/537: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/f29 [0,4194304] 0 2026-03-10T08:55:41.286 INFO:tasks.workunit.client.0.vm05.stdout:4/538: link d0/d2e/d42/d45/d4a/d36/f3d d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/faf 0 2026-03-10T08:55:41.289 INFO:tasks.workunit.client.0.vm05.stdout:4/539: rmdir d0/d2e/d42/d45/d4a/d36/d37/d9c/d32 39 2026-03-10T08:55:41.293 INFO:tasks.workunit.client.1.vm08.stdout:8/744: write d1/d10/d9/dd/d25/d27/fd3 [1045090,92676] 0 2026-03-10T08:55:41.297 INFO:tasks.workunit.client.1.vm08.stdout:6/731: write d9/d10/f9d [898992,62255] 0 2026-03-10T08:55:41.301 INFO:tasks.workunit.client.1.vm08.stdout:8/745: mknod d1/d10/d9/dd/d9a/da6/c114 0 2026-03-10T08:55:41.301 INFO:tasks.workunit.client.0.vm05.stdout:5/451: write d5/d86/d24/d2c/f79 [979114,31257] 0 2026-03-10T08:55:41.301 INFO:tasks.workunit.client.0.vm05.stdout:5/452: chown d5/d48 4220389 1 2026-03-10T08:55:41.302 INFO:tasks.workunit.client.1.vm08.stdout:0/664: write d6/dd/d13/d17/d1f/d2d/d39/f87 [756939,78319] 0 2026-03-10T08:55:41.302 INFO:tasks.workunit.client.1.vm08.stdout:7/766: dwrite d0/f7a [0,4194304] 0 2026-03-10T08:55:41.306 INFO:tasks.workunit.client.1.vm08.stdout:6/732: fsync d9/dc/d11/d23/d2c/f97 0 2026-03-10T08:55:41.308 INFO:tasks.workunit.client.0.vm05.stdout:5/453: creat d5/d86/d66/d76/fa4 x:0 0 0 2026-03-10T08:55:41.309 INFO:tasks.workunit.client.0.vm05.stdout:9/499: creat d6/d19/d2a/d4a/faa x:0 0 0 2026-03-10T08:55:41.319 INFO:tasks.workunit.client.0.vm05.stdout:9/500: unlink d6/d15/d3c/d4b/d82/f92 0 2026-03-10T08:55:41.323 INFO:tasks.workunit.client.1.vm08.stdout:6/733: creat d9/dc/d11/d23/d2c/d81/d63/ff1 x:0 0 0 2026-03-10T08:55:41.323 INFO:tasks.workunit.client.1.vm08.stdout:8/746: rename d1/d10/d9/dd/d25/d27/d44/d21/dce/ce6 to d1/d10/d9/dd/d18/dff/c115 0 2026-03-10T08:55:41.324 
INFO:tasks.workunit.client.1.vm08.stdout:8/747: stat d1/d10/d9/d4d/le9 0 2026-03-10T08:55:41.325 INFO:tasks.workunit.client.1.vm08.stdout:0/665: getdents d6/dd/d13/d17/d1f/d2d/d85 0 2026-03-10T08:55:41.326 INFO:tasks.workunit.client.1.vm08.stdout:8/748: fdatasync d1/d10/d9/d4d/db2/fda 0 2026-03-10T08:55:41.329 INFO:tasks.workunit.client.1.vm08.stdout:0/666: symlink d6/dd/d13/ldb 0 2026-03-10T08:55:41.331 INFO:tasks.workunit.client.1.vm08.stdout:0/667: write d6/dd/d13/d17/d1f/d2d/d39/f3b [382806,127703] 0 2026-03-10T08:55:41.331 INFO:tasks.workunit.client.1.vm08.stdout:8/749: unlink d1/d10/d9/dd/d25/d27/d44/l48 0 2026-03-10T08:55:41.332 INFO:tasks.workunit.client.1.vm08.stdout:0/668: chown d6/dd/d13/d17/d1f/d2d/d39 1 1 2026-03-10T08:55:41.337 INFO:tasks.workunit.client.1.vm08.stdout:8/750: rename d1/d10/d9/dd/d25/d27/d44/d21/dce/ffd to d1/dd9/f116 0 2026-03-10T08:55:41.338 INFO:tasks.workunit.client.1.vm08.stdout:0/669: creat d6/dd/d13/d17/d1f/d20/d2f/d26/d56/fdc x:0 0 0 2026-03-10T08:55:41.338 INFO:tasks.workunit.client.1.vm08.stdout:8/751: readlink d1/d10/d9/le7 0 2026-03-10T08:55:41.341 INFO:tasks.workunit.client.1.vm08.stdout:0/670: fsync d6/dd/d13/d17/d1f/f67 0 2026-03-10T08:55:41.342 INFO:tasks.workunit.client.1.vm08.stdout:8/752: fdatasync d1/d10/d9/dd/d25/d27/d44/d21/d5f/fd4 0 2026-03-10T08:55:41.343 INFO:tasks.workunit.client.1.vm08.stdout:0/671: creat d6/dd/d13/d17/d1f/d2d/d38/fdd x:0 0 0 2026-03-10T08:55:41.355 INFO:tasks.workunit.client.1.vm08.stdout:6/734: sync 2026-03-10T08:55:41.358 INFO:tasks.workunit.client.0.vm05.stdout:1/593: truncate dd/d21/d37/d45/fce 786168 0 2026-03-10T08:55:41.359 INFO:tasks.workunit.client.1.vm08.stdout:6/735: write d9/d50/fa3 [396017,28857] 0 2026-03-10T08:55:41.360 INFO:tasks.workunit.client.0.vm05.stdout:1/594: dread dd/d10/fb5 [0,4194304] 0 2026-03-10T08:55:41.363 INFO:tasks.workunit.client.0.vm05.stdout:6/573: write d4/d7/d10/d1a/f1e [3774433,50135] 0 2026-03-10T08:55:41.363 
INFO:tasks.workunit.client.0.vm05.stdout:1/595: symlink dd/d10/d18/ld8 0 2026-03-10T08:55:41.364 INFO:tasks.workunit.client.0.vm05.stdout:1/596: write dd/f9e [4298077,24540] 0 2026-03-10T08:55:41.368 INFO:tasks.workunit.client.0.vm05.stdout:6/574: mknod d4/d2d/cba 0 2026-03-10T08:55:41.368 INFO:tasks.workunit.client.1.vm08.stdout:6/736: fdatasync d9/dc/d11/d23/d2c/d81/f62 0 2026-03-10T08:55:41.369 INFO:tasks.workunit.client.0.vm05.stdout:1/597: creat dd/d10/d18/d20/fd9 x:0 0 0 2026-03-10T08:55:41.369 INFO:tasks.workunit.client.0.vm05.stdout:1/598: chown dd/d10/f22 1231614604 1 2026-03-10T08:55:41.372 INFO:tasks.workunit.client.0.vm05.stdout:1/599: getdents dd/d10/d19/d4d/d7d 0 2026-03-10T08:55:41.373 INFO:tasks.workunit.client.0.vm05.stdout:6/575: link d4/d7/d10/d15/d1b/l9b d4/d8d/lbb 0 2026-03-10T08:55:41.376 INFO:tasks.workunit.client.1.vm08.stdout:6/737: creat d9/dc/de0/ff2 x:0 0 0 2026-03-10T08:55:41.377 INFO:tasks.workunit.client.0.vm05.stdout:1/600: dwrite dd/d10/d19/d27/fc8 [0,4194304] 0 2026-03-10T08:55:41.378 INFO:tasks.workunit.client.0.vm05.stdout:6/576: symlink d4/d8d/lbc 0 2026-03-10T08:55:41.378 INFO:tasks.workunit.client.0.vm05.stdout:1/601: readlink dd/d10/d18/dd5/l86 0 2026-03-10T08:55:41.378 INFO:tasks.workunit.client.1.vm08.stdout:6/738: readlink d9/d10/l5b 0 2026-03-10T08:55:41.379 INFO:tasks.workunit.client.1.vm08.stdout:8/753: dread f0 [0,4194304] 0 2026-03-10T08:55:41.384 INFO:tasks.workunit.client.1.vm08.stdout:8/754: sync 2026-03-10T08:55:41.393 INFO:tasks.workunit.client.1.vm08.stdout:6/739: truncate d9/dc/d84/d80/fc1 764925 0 2026-03-10T08:55:41.394 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:41 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:41.394 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:41 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:41.394 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:55:41 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:55:41.396 INFO:tasks.workunit.client.1.vm08.stdout:8/755: creat d1/d10/d9/dd/d18/d34/f117 x:0 0 0 2026-03-10T08:55:41.396 INFO:tasks.workunit.client.1.vm08.stdout:6/740: read - d9/d10/d1e/d32/f9f zero size 2026-03-10T08:55:41.400 INFO:tasks.workunit.client.1.vm08.stdout:8/756: creat d1/d10/d9/dd/d25/f118 x:0 0 0 2026-03-10T08:55:41.409 INFO:tasks.workunit.client.1.vm08.stdout:6/741: dwrite d9/d13/f70 [0,4194304] 0 2026-03-10T08:55:41.427 INFO:tasks.workunit.client.1.vm08.stdout:6/742: fdatasync d9/f77 0 2026-03-10T08:55:41.434 INFO:tasks.workunit.client.0.vm05.stdout:3/598: dwrite d9/d8f/f8a [0,4194304] 0 2026-03-10T08:55:41.440 INFO:tasks.workunit.client.0.vm05.stdout:3/599: creat d9/d2b/d53/fa7 x:0 0 0 2026-03-10T08:55:41.440 INFO:tasks.workunit.client.1.vm08.stdout:8/757: getdents d1/d10 0 2026-03-10T08:55:41.440 INFO:tasks.workunit.client.0.vm05.stdout:3/600: write d9/d2b/d53/d61/f69 [1115180,77768] 0 2026-03-10T08:55:41.450 INFO:tasks.workunit.client.1.vm08.stdout:6/743: rmdir d9/dc/dc9 0 2026-03-10T08:55:41.452 INFO:tasks.workunit.client.1.vm08.stdout:8/758: fdatasync d1/d10/d9/dd/f91 0 2026-03-10T08:55:41.457 INFO:tasks.workunit.client.0.vm05.stdout:7/479: truncate d18/f1d 3856243 0 2026-03-10T08:55:41.467 INFO:tasks.workunit.client.1.vm08.stdout:8/759: dread d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fab [0,4194304] 0 2026-03-10T08:55:41.467 INFO:tasks.workunit.client.0.vm05.stdout:7/480: unlink d18/c21 0 2026-03-10T08:55:41.468 INFO:tasks.workunit.client.1.vm08.stdout:8/760: read - d1/d10/d9/d4d/db2/f103 zero size 2026-03-10T08:55:41.476 INFO:tasks.workunit.client.1.vm08.stdout:8/761: creat d1/d10/d9/dd/d18/dff/f119 x:0 0 0 2026-03-10T08:55:41.479 INFO:tasks.workunit.client.1.vm08.stdout:8/762: mknod d1/d10/d9/dd/d18/d3c/c11a 0 2026-03-10T08:55:41.481 
INFO:tasks.workunit.client.1.vm08.stdout:8/763: rename d1/d10/d9/dd/f70 to d1/d4f/d60/f11b 0 2026-03-10T08:55:41.482 INFO:tasks.workunit.client.1.vm08.stdout:8/764: chown d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/ccc 864357 1 2026-03-10T08:55:41.491 INFO:tasks.workunit.client.0.vm05.stdout:3/601: dread d9/d2b/d2f/f3f [0,4194304] 0 2026-03-10T08:55:41.492 INFO:tasks.workunit.client.0.vm05.stdout:3/602: write d9/d2b/d3a/d43/d71/f91 [3433238,128869] 0 2026-03-10T08:55:41.492 INFO:tasks.workunit.client.0.vm05.stdout:3/603: stat d9/d8f/d50/f7c 0 2026-03-10T08:55:41.495 INFO:tasks.workunit.client.0.vm05.stdout:3/604: truncate d9/d8f/d55/f79 440286 0 2026-03-10T08:55:41.540 INFO:tasks.workunit.client.1.vm08.stdout:8/765: rename d1/d10/d9/dd/d25/d27/d44/d97/d7d/l111 to d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfe/l11c 0 2026-03-10T08:55:41.540 INFO:tasks.workunit.client.1.vm08.stdout:8/766: getdents d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb 0 2026-03-10T08:55:41.540 INFO:tasks.workunit.client.1.vm08.stdout:8/767: creat d1/d10/d9/dd/d9a/da6/f11d x:0 0 0 2026-03-10T08:55:41.561 INFO:tasks.workunit.client.0.vm05.stdout:7/481: sync 2026-03-10T08:55:41.561 INFO:tasks.workunit.client.0.vm05.stdout:3/605: sync 2026-03-10T08:55:41.563 INFO:tasks.workunit.client.0.vm05.stdout:7/482: mknod d18/d66/d25/d2e/d42/d53/c91 0 2026-03-10T08:55:41.566 INFO:tasks.workunit.client.0.vm05.stdout:7/483: sync 2026-03-10T08:55:41.568 INFO:tasks.workunit.client.0.vm05.stdout:7/484: symlink d18/d66/d25/d2e/d42/l92 0 2026-03-10T08:55:41.568 INFO:tasks.workunit.client.0.vm05.stdout:7/485: fdatasync f9 0 2026-03-10T08:55:41.569 INFO:tasks.workunit.client.0.vm05.stdout:7/486: write d18/d66/d78/f8b [233444,74411] 0 2026-03-10T08:55:41.573 INFO:tasks.workunit.client.0.vm05.stdout:7/487: rmdir d18/d38/d43/d6e 39 2026-03-10T08:55:41.577 INFO:tasks.workunit.client.0.vm05.stdout:7/488: rmdir d18/d66/d25/d2e/d32/d7f 0 2026-03-10T08:55:41.587 INFO:tasks.workunit.client.1.vm08.stdout:2/795: write d1/da/f50 [4739401,101610] 
0 2026-03-10T08:55:41.591 INFO:tasks.workunit.client.1.vm08.stdout:2/796: creat d1/d43/d5c/de7/f107 x:0 0 0 2026-03-10T08:55:41.595 INFO:tasks.workunit.client.1.vm08.stdout:9/713: dwrite d2/dd/d15/d4f/fa5 [0,4194304] 0 2026-03-10T08:55:41.597 INFO:tasks.workunit.client.0.vm05.stdout:0/533: dwrite df/d1f/d85/f2a [0,4194304] 0 2026-03-10T08:55:41.598 INFO:tasks.workunit.client.0.vm05.stdout:8/525: dwrite d2/dd/d2c/f34 [4194304,4194304] 0 2026-03-10T08:55:41.612 INFO:tasks.workunit.client.1.vm08.stdout:3/662: truncate d4/d15/d8/d2c/f32 3588548 0 2026-03-10T08:55:41.613 INFO:tasks.workunit.client.1.vm08.stdout:3/663: stat f1 0 2026-03-10T08:55:41.616 INFO:tasks.workunit.client.1.vm08.stdout:4/782: dwrite d5/d23/d36/d99/db2/d5a/d69/d11b/f72 [0,4194304] 0 2026-03-10T08:55:41.616 INFO:tasks.workunit.client.0.vm05.stdout:2/451: dwrite d0/f4 [0,4194304] 0 2026-03-10T08:55:41.617 INFO:tasks.workunit.client.1.vm08.stdout:1/754: dwrite d1/fc [0,4194304] 0 2026-03-10T08:55:41.625 INFO:tasks.workunit.client.1.vm08.stdout:5/679: dwrite d0/d11/f86 [0,4194304] 0 2026-03-10T08:55:41.627 INFO:tasks.workunit.client.1.vm08.stdout:5/680: stat d0/d11/d27/d68/d7c/d8e 0 2026-03-10T08:55:41.637 INFO:tasks.workunit.client.0.vm05.stdout:8/526: symlink d2/dd/d2c/d2e/d31/d4c/lbe 0 2026-03-10T08:55:41.638 INFO:tasks.workunit.client.1.vm08.stdout:3/664: creat d4/d15/d8/d1d/fe6 x:0 0 0 2026-03-10T08:55:41.645 INFO:tasks.workunit.client.0.vm05.stdout:8/527: getdents d2/dd 0 2026-03-10T08:55:41.650 INFO:tasks.workunit.client.1.vm08.stdout:1/755: fsync d1/da/de/d24/d26/f94 0 2026-03-10T08:55:41.650 INFO:tasks.workunit.client.1.vm08.stdout:5/681: creat d0/d11/d27/d68/d7c/d4b/d87/db5/fd2 x:0 0 0 2026-03-10T08:55:41.650 INFO:tasks.workunit.client.0.vm05.stdout:8/528: write d2/dd/d2c/d2e/d31/f89 [5739370,55685] 0 2026-03-10T08:55:41.650 INFO:tasks.workunit.client.0.vm05.stdout:8/529: symlink d2/dd/d2c/d2e/d31/d4f/da3/lbf 0 2026-03-10T08:55:41.650 INFO:tasks.workunit.client.0.vm05.stdout:8/530: dread - 
d2/db/d1f/d67/fbc zero size 2026-03-10T08:55:41.651 INFO:tasks.workunit.client.0.vm05.stdout:8/531: creat d2/dd/d2c/d2e/d31/d3e/d5d/fc0 x:0 0 0 2026-03-10T08:55:41.652 INFO:tasks.workunit.client.1.vm08.stdout:4/783: rename d5/d23/d36/d99/db2/d5a/d69/f8c to d5/d23/d36/d99/db2/d5a/d69/d11b/d114/f121 0 2026-03-10T08:55:41.652 INFO:tasks.workunit.client.1.vm08.stdout:5/682: chown d0/d11/d27/f64 1042 1 2026-03-10T08:55:41.655 INFO:tasks.workunit.client.1.vm08.stdout:4/784: read d5/d23/d49/d8f/da4/f10a [1128934,118856] 0 2026-03-10T08:55:41.655 INFO:tasks.workunit.client.1.vm08.stdout:1/756: creat d1/f105 x:0 0 0 2026-03-10T08:55:41.656 INFO:tasks.workunit.client.1.vm08.stdout:5/683: symlink d0/d1b/d67/ld3 0 2026-03-10T08:55:41.656 INFO:tasks.workunit.client.0.vm05.stdout:2/452: sync 2026-03-10T08:55:41.667 INFO:tasks.workunit.client.0.vm05.stdout:8/532: link d2/dd/d2c/d2e/d31/d4c/d63/l7a d2/lc1 0 2026-03-10T08:55:41.670 INFO:tasks.workunit.client.1.vm08.stdout:5/684: creat d0/d11/d18/d52/db9/fd4 x:0 0 0 2026-03-10T08:55:41.676 INFO:tasks.workunit.client.0.vm05.stdout:2/453: dread d0/d9/d1e/d20/d21/d45/d6c/d6e/f64 [0,4194304] 0 2026-03-10T08:55:41.677 INFO:tasks.workunit.client.0.vm05.stdout:2/454: stat d0/d9/d1e/d20/d24/c25 0 2026-03-10T08:55:41.677 INFO:tasks.workunit.client.0.vm05.stdout:2/455: stat d0/f10 0 2026-03-10T08:55:41.678 INFO:tasks.workunit.client.0.vm05.stdout:2/456: write d0/f56 [1532013,35825] 0 2026-03-10T08:55:41.679 INFO:tasks.workunit.client.1.vm08.stdout:1/757: mknod d1/da/de/d24/d35/d6d/d82/c106 0 2026-03-10T08:55:41.679 INFO:tasks.workunit.client.0.vm05.stdout:2/457: fdatasync d0/d9/d1e/d20/d21/d45/d6c/d6e/f66 0 2026-03-10T08:55:41.684 INFO:tasks.workunit.client.0.vm05.stdout:2/458: unlink d0/fb 0 2026-03-10T08:55:41.695 INFO:tasks.workunit.client.1.vm08.stdout:5/685: dwrite d0/d11/d27/d68/d7c/d4b/d4e/fbd [0,4194304] 0 2026-03-10T08:55:41.695 INFO:tasks.workunit.client.0.vm05.stdout:2/459: creat d0/d9/d7f/f80 x:0 0 0 2026-03-10T08:55:41.701 
INFO:tasks.workunit.client.1.vm08.stdout:5/686: dread - d0/d11/d27/d68/d7c/d4b/d87/db5/fd2 zero size 2026-03-10T08:55:41.701 INFO:tasks.workunit.client.1.vm08.stdout:5/687: stat d0/lc 0 2026-03-10T08:55:41.701 INFO:tasks.workunit.client.1.vm08.stdout:5/688: stat d0/cbc 0 2026-03-10T08:55:41.727 INFO:tasks.workunit.client.1.vm08.stdout:3/665: dread f1 [0,4194304] 0 2026-03-10T08:55:41.753 INFO:tasks.workunit.client.1.vm08.stdout:2/797: dread d1/da/d10/d42/d93/d1e/f1f [0,4194304] 0 2026-03-10T08:55:41.753 INFO:tasks.workunit.client.1.vm08.stdout:2/798: chown d1/d5b/da7/ff7 3391 1 2026-03-10T08:55:41.769 INFO:tasks.workunit.client.1.vm08.stdout:2/799: sync 2026-03-10T08:55:41.808 INFO:tasks.workunit.client.1.vm08.stdout:5/689: fsync d0/d11/d27/d68/d7c/d4b/d4e/fbd 0 2026-03-10T08:55:41.855 INFO:tasks.workunit.client.0.vm05.stdout:4/540: dwrite d0/d1d/f24 [0,4194304] 0 2026-03-10T08:55:41.858 INFO:tasks.workunit.client.0.vm05.stdout:5/454: dwrite d5/df/d37/f73 [0,4194304] 0 2026-03-10T08:55:41.860 INFO:tasks.workunit.client.1.vm08.stdout:7/767: dwrite d0/d11/d1f/d29/f8d [0,4194304] 0 2026-03-10T08:55:41.860 INFO:tasks.workunit.client.0.vm05.stdout:9/501: chown f4 0 1 2026-03-10T08:55:41.860 INFO:tasks.workunit.client.0.vm05.stdout:5/455: chown d5/d86/f1a 44381 1 2026-03-10T08:55:41.876 INFO:tasks.workunit.client.1.vm08.stdout:0/672: write d6/dd/d13/d17/d1f/d20/d2f/d24/f37 [1027630,119021] 0 2026-03-10T08:55:41.877 INFO:tasks.workunit.client.0.vm05.stdout:9/502: symlink d6/d12/d3a/d48/lab 0 2026-03-10T08:55:41.877 INFO:tasks.workunit.client.0.vm05.stdout:5/456: creat d5/d86/d66/fa5 x:0 0 0 2026-03-10T08:55:41.878 INFO:tasks.workunit.client.1.vm08.stdout:7/768: fsync d0/d11/d4a/d95/ff5 0 2026-03-10T08:55:41.879 INFO:tasks.workunit.client.1.vm08.stdout:7/769: stat d0/d11/d1f/d2c/f30 0 2026-03-10T08:55:41.882 INFO:tasks.workunit.client.1.vm08.stdout:7/770: read - d0/d11/d1f/d29/d3b/d80/fa2 zero size 2026-03-10T08:55:41.885 INFO:tasks.workunit.client.1.vm08.stdout:7/771: 
mkdir d0/d11/df7 0 2026-03-10T08:55:41.886 INFO:tasks.workunit.client.1.vm08.stdout:7/772: write d0/d14/f12 [1487445,44605] 0 2026-03-10T08:55:41.894 INFO:tasks.workunit.client.1.vm08.stdout:7/773: rmdir d0/d11/d1f/d29/d3d/d89 39 2026-03-10T08:55:41.895 INFO:tasks.workunit.client.0.vm05.stdout:5/457: truncate d5/d86/f1a 1353032 0 2026-03-10T08:55:41.896 INFO:tasks.workunit.client.1.vm08.stdout:0/673: dread d6/dd/d13/d17/f82 [0,4194304] 0 2026-03-10T08:55:41.898 INFO:tasks.workunit.client.0.vm05.stdout:5/458: rename d5/df/fa2 to d5/d86/fa6 0 2026-03-10T08:55:41.900 INFO:tasks.workunit.client.1.vm08.stdout:0/674: fdatasync d6/f5f 0 2026-03-10T08:55:41.901 INFO:tasks.workunit.client.1.vm08.stdout:7/774: symlink d0/d11/d1f/d29/d3b/da1/daa/lf8 0 2026-03-10T08:55:41.904 INFO:tasks.workunit.client.1.vm08.stdout:7/775: creat d0/d11/d1f/df0/ff9 x:0 0 0 2026-03-10T08:55:41.905 INFO:tasks.workunit.client.1.vm08.stdout:0/675: dwrite d6/dd/d13/d61/fbd [0,4194304] 0 2026-03-10T08:55:41.905 INFO:tasks.workunit.client.0.vm05.stdout:6/577: write d4/d2c/d84/f6b [407696,2849] 0 2026-03-10T08:55:41.907 INFO:tasks.workunit.client.0.vm05.stdout:1/602: dwrite dd/f44 [0,4194304] 0 2026-03-10T08:55:41.911 INFO:tasks.workunit.client.1.vm08.stdout:7/776: truncate d0/d11/d1f/d29/d3b/f86 2272795 0 2026-03-10T08:55:41.911 INFO:tasks.workunit.client.1.vm08.stdout:0/676: rmdir d6/dd/d13/d17/d1f/da3 39 2026-03-10T08:55:41.913 INFO:tasks.workunit.client.0.vm05.stdout:6/578: creat d4/d2c/d84/d4a/fbd x:0 0 0 2026-03-10T08:55:41.914 INFO:tasks.workunit.client.0.vm05.stdout:1/603: symlink dd/d10/d18/d20/d69/lda 0 2026-03-10T08:55:41.920 INFO:tasks.workunit.client.1.vm08.stdout:0/677: truncate d6/d8b/faa 1838515 0 2026-03-10T08:55:41.920 INFO:tasks.workunit.client.1.vm08.stdout:7/777: read d0/d11/d1f/d29/d3d/d40/f24 [3607039,122294] 0 2026-03-10T08:55:41.921 INFO:tasks.workunit.client.1.vm08.stdout:7/778: read d0/d11/f6a [2037233,100356] 0 2026-03-10T08:55:41.922 
INFO:tasks.workunit.client.0.vm05.stdout:6/579: symlink d4/d2c/d84/db6/lbe 0 2026-03-10T08:55:41.922 INFO:tasks.workunit.client.0.vm05.stdout:1/604: dread dd/d10/d19/d4d/f74 [0,4194304] 0 2026-03-10T08:55:41.925 INFO:tasks.workunit.client.1.vm08.stdout:7/779: symlink d0/d11/d1f/d29/d36/lfa 0 2026-03-10T08:55:41.925 INFO:tasks.workunit.client.1.vm08.stdout:0/678: mkdir d6/dd/d13/d61/dc7/dc8/dde 0 2026-03-10T08:55:41.929 INFO:tasks.workunit.client.0.vm05.stdout:6/580: symlink d4/d7/d10/lbf 0 2026-03-10T08:55:41.930 INFO:tasks.workunit.client.0.vm05.stdout:6/581: chown d4/d2d/d51 140334 1 2026-03-10T08:55:41.932 INFO:tasks.workunit.client.0.vm05.stdout:5/459: dread d5/d86/d24/f25 [0,4194304] 0 2026-03-10T08:55:41.932 INFO:tasks.workunit.client.1.vm08.stdout:0/679: rmdir d6/dd/d13/d17/d1f/d2d/d38 39 2026-03-10T08:55:41.935 INFO:tasks.workunit.client.0.vm05.stdout:6/582: rename d4/d7/d10/d15/d1b/f3f to d4/d2d/d7f/fc0 0 2026-03-10T08:55:41.942 INFO:tasks.workunit.client.0.vm05.stdout:5/460: symlink d5/d86/d24/d84/la7 0 2026-03-10T08:55:41.942 INFO:tasks.workunit.client.1.vm08.stdout:0/680: fsync d6/f9 0 2026-03-10T08:55:41.942 INFO:tasks.workunit.client.1.vm08.stdout:0/681: dwrite d6/dd/d13/d17/d1f/d20/f43 [4194304,4194304] 0 2026-03-10T08:55:41.945 INFO:tasks.workunit.client.0.vm05.stdout:6/583: rename d4/d2d/d5f/f9f to d4/d7/d10/d1a/d89/fc1 0 2026-03-10T08:55:41.946 INFO:tasks.workunit.client.0.vm05.stdout:5/461: creat d5/d86/d24/d2c/d41/d74/fa8 x:0 0 0 2026-03-10T08:55:41.951 INFO:tasks.workunit.client.1.vm08.stdout:0/682: fdatasync d6/dd/f3f 0 2026-03-10T08:55:41.953 INFO:tasks.workunit.client.0.vm05.stdout:5/462: mkdir d5/d86/d24/d2c/d41/d74/da9 0 2026-03-10T08:55:41.955 INFO:tasks.workunit.client.1.vm08.stdout:6/744: truncate f5 3509277 0 2026-03-10T08:55:41.955 INFO:tasks.workunit.client.0.vm05.stdout:5/463: mknod d5/d48/d64/caa 0 2026-03-10T08:55:41.957 INFO:tasks.workunit.client.1.vm08.stdout:6/745: symlink d9/dc/d11/d23/d2c/d7a/lf3 0 2026-03-10T08:55:41.958 
INFO:tasks.workunit.client.0.vm05.stdout:5/464: creat d5/d86/d24/d2c/fab x:0 0 0 2026-03-10T08:55:41.959 INFO:tasks.workunit.client.1.vm08.stdout:6/746: mknod d9/dc/d11/d23/cf4 0 2026-03-10T08:55:41.959 INFO:tasks.workunit.client.0.vm05.stdout:5/465: mkdir d5/d48/d64/d95/dac 0 2026-03-10T08:55:41.962 INFO:tasks.workunit.client.0.vm05.stdout:5/466: creat d5/d86/d24/d2c/d41/fad x:0 0 0 2026-03-10T08:55:41.963 INFO:tasks.workunit.client.0.vm05.stdout:5/467: write d5/d3a/d43/f6d [391731,50647] 0 2026-03-10T08:55:41.973 INFO:tasks.workunit.client.0.vm05.stdout:5/468: dread d5/df/d37/f47 [0,4194304] 0 2026-03-10T08:55:41.976 INFO:tasks.workunit.client.0.vm05.stdout:5/469: creat d5/df/d37/d68/fae x:0 0 0 2026-03-10T08:55:41.977 INFO:tasks.workunit.client.0.vm05.stdout:5/470: write d5/d86/d21/f9e [170821,126841] 0 2026-03-10T08:55:41.978 INFO:tasks.workunit.client.1.vm08.stdout:6/747: dread d9/dc/d11/fbd [0,4194304] 0 2026-03-10T08:55:41.980 INFO:tasks.workunit.client.1.vm08.stdout:6/748: rmdir d9/d10/d1e 39 2026-03-10T08:55:41.987 INFO:tasks.workunit.client.0.vm05.stdout:3/606: truncate d9/d2b/d3a/d43/d7a/f7f 2927929 0 2026-03-10T08:55:41.987 INFO:tasks.workunit.client.0.vm05.stdout:3/607: rmdir d9/d2b/d3a/d43/d7a 39 2026-03-10T08:55:41.988 INFO:tasks.workunit.client.1.vm08.stdout:8/768: dwrite d1/d10/d9/dd/d18/d3c/fd8 [0,4194304] 0 2026-03-10T08:55:41.989 INFO:tasks.workunit.client.0.vm05.stdout:3/608: mknod d9/d2b/d3a/d43/d71/ca8 0 2026-03-10T08:55:41.991 INFO:tasks.workunit.client.0.vm05.stdout:7/489: write fd [5556350,44737] 0 2026-03-10T08:55:41.993 INFO:tasks.workunit.client.1.vm08.stdout:8/769: chown d1/d10/d9/dd/d25/d27/f3a 58897178 1 2026-03-10T08:55:41.995 INFO:tasks.workunit.client.1.vm08.stdout:8/770: truncate d1/d10/d9/dd/d18/f80 2575888 0 2026-03-10T08:55:41.998 INFO:tasks.workunit.client.0.vm05.stdout:5/471: read d5/f3b [1800408,51775] 0 2026-03-10T08:55:42.006 INFO:tasks.workunit.client.1.vm08.stdout:8/771: fsync d1/d10/d9/f73 0 2026-03-10T08:55:42.006 
INFO:tasks.workunit.client.0.vm05.stdout:5/472: chown d5/c14 51 1 2026-03-10T08:55:42.006 INFO:tasks.workunit.client.1.vm08.stdout:8/772: chown d1/d10/d9/dd/f41 98668 1 2026-03-10T08:55:42.007 INFO:tasks.workunit.client.0.vm05.stdout:0/534: write f5 [2724310,75781] 0 2026-03-10T08:55:42.008 INFO:tasks.workunit.client.0.vm05.stdout:0/535: chown df/d1f/d85/d19/d39/d74/d67/d7b 151 1 2026-03-10T08:55:42.011 INFO:tasks.workunit.client.1.vm08.stdout:9/714: write d2/dd/d15/d1e/d25/d32/f60 [1481643,33870] 0 2026-03-10T08:55:42.012 INFO:tasks.workunit.client.0.vm05.stdout:5/473: creat d5/d86/d66/d76/faf x:0 0 0 2026-03-10T08:55:42.016 INFO:tasks.workunit.client.1.vm08.stdout:8/773: dread - d1/da8/f102 zero size 2026-03-10T08:55:42.017 INFO:tasks.workunit.client.0.vm05.stdout:0/536: symlink df/d1f/d85/d19/d39/d74/l9d 0 2026-03-10T08:55:42.019 INFO:tasks.workunit.client.1.vm08.stdout:6/749: dread d9/d10/d1e/fba [0,4194304] 0 2026-03-10T08:55:42.022 INFO:tasks.workunit.client.0.vm05.stdout:0/537: fdatasync df/d1f/d85/d19/d39/f6f 0 2026-03-10T08:55:42.025 INFO:tasks.workunit.client.1.vm08.stdout:9/715: link d2/dd/d15/d1e/d21/f3a d2/d41/d4c/d66/d82/ff6 0 2026-03-10T08:55:42.028 INFO:tasks.workunit.client.1.vm08.stdout:9/716: unlink d2/dd/d15/d1e/d25/d32/cd1 0 2026-03-10T08:55:42.037 INFO:tasks.workunit.client.1.vm08.stdout:4/785: dwrite d5/d23/d36/f57 [0,4194304] 0 2026-03-10T08:55:42.037 INFO:tasks.workunit.client.1.vm08.stdout:1/758: write d1/fac [317431,81910] 0 2026-03-10T08:55:42.037 INFO:tasks.workunit.client.0.vm05.stdout:8/533: dwrite d2/dd/d2c/d2e/f3b [0,4194304] 0 2026-03-10T08:55:42.037 INFO:tasks.workunit.client.0.vm05.stdout:8/534: dread - d2/dd/d2c/d2e/d31/d4f/da3/faa zero size 2026-03-10T08:55:42.037 INFO:tasks.workunit.client.0.vm05.stdout:8/535: readlink d2/db/d1f/l82 0 2026-03-10T08:55:42.037 INFO:tasks.workunit.client.0.vm05.stdout:0/538: mkdir df/d1f/d85/d9e 0 2026-03-10T08:55:42.038 INFO:tasks.workunit.client.0.vm05.stdout:2/460: dread d0/d9/f19 [0,4194304] 
0 2026-03-10T08:55:42.038 INFO:tasks.workunit.client.0.vm05.stdout:0/539: fsync df/d1f/d85/f29 0 2026-03-10T08:55:42.044 INFO:tasks.workunit.client.0.vm05.stdout:8/536: dread d2/db/f22 [0,4194304] 0 2026-03-10T08:55:42.047 INFO:tasks.workunit.client.0.vm05.stdout:8/537: dwrite d2/dd/d2c/f86 [0,4194304] 0 2026-03-10T08:55:42.053 INFO:tasks.workunit.client.1.vm08.stdout:6/750: sync 2026-03-10T08:55:42.053 INFO:tasks.workunit.client.0.vm05.stdout:5/474: sync 2026-03-10T08:55:42.054 INFO:tasks.workunit.client.1.vm08.stdout:6/751: readlink d9/dc/d11/d23/d2c/d81/d63/l7f 0 2026-03-10T08:55:42.055 INFO:tasks.workunit.client.1.vm08.stdout:6/752: chown d9/dc/d11/d23/d2c/d7a 8 1 2026-03-10T08:55:42.062 INFO:tasks.workunit.client.1.vm08.stdout:8/774: link d1/d2c/fdb d1/d10/d9/f11e 0 2026-03-10T08:55:42.066 INFO:tasks.workunit.client.1.vm08.stdout:1/759: truncate d1/da/de/d5c/fcc 3628272 0 2026-03-10T08:55:42.067 INFO:tasks.workunit.client.1.vm08.stdout:3/666: write d4/d15/d8/d2c/d9b/d79/d8f/f91 [1812015,115235] 0 2026-03-10T08:55:42.072 INFO:tasks.workunit.client.1.vm08.stdout:8/775: truncate d1/d10/d9/d4d/db2/fda 1008758 0 2026-03-10T08:55:42.073 INFO:tasks.workunit.client.0.vm05.stdout:5/475: creat d5/d86/d24/d84/fb0 x:0 0 0 2026-03-10T08:55:42.073 INFO:tasks.workunit.client.1.vm08.stdout:2/800: write d1/d5b/d66/f20 [225116,22459] 0 2026-03-10T08:55:42.075 INFO:tasks.workunit.client.1.vm08.stdout:3/667: unlink d4/d15/f4b 0 2026-03-10T08:55:42.077 INFO:tasks.workunit.client.1.vm08.stdout:5/690: dwrite d0/d11/f29 [0,4194304] 0 2026-03-10T08:55:42.077 INFO:tasks.workunit.client.1.vm08.stdout:1/760: creat d1/da/de/d24/d3d/d40/d8e/f107 x:0 0 0 2026-03-10T08:55:42.080 INFO:tasks.workunit.client.0.vm05.stdout:4/541: dwrite d0/d2e/d42/f59 [0,4194304] 0 2026-03-10T08:55:42.091 INFO:tasks.workunit.client.0.vm05.stdout:4/542: sync 2026-03-10T08:55:42.094 INFO:tasks.workunit.client.1.vm08.stdout:8/776: unlink d1/d10/d9/dd/d13/cb1 0 2026-03-10T08:55:42.094 
INFO:tasks.workunit.client.0.vm05.stdout:5/476: creat d5/d86/d24/d2c/d41/d74/fb1 x:0 0 0 2026-03-10T08:55:42.095 INFO:tasks.workunit.client.1.vm08.stdout:8/777: readlink d1/d10/d9/dd/d25/lf0 0 2026-03-10T08:55:42.095 INFO:tasks.workunit.client.1.vm08.stdout:2/801: fsync d1/da/d78/f95 0 2026-03-10T08:55:42.099 INFO:tasks.workunit.client.1.vm08.stdout:3/668: creat d4/d15/d8/d1d/da8/fe7 x:0 0 0 2026-03-10T08:55:42.100 INFO:tasks.workunit.client.0.vm05.stdout:4/543: write d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/fa5 [116956,52182] 0 2026-03-10T08:55:42.102 INFO:tasks.workunit.client.0.vm05.stdout:4/544: read d0/d2e/d42/d45/f5f [947774,95964] 0 2026-03-10T08:55:42.105 INFO:tasks.workunit.client.1.vm08.stdout:1/761: mknod d1/da/de/d24/d3d/d40/d8e/dd2/d7f/c108 0 2026-03-10T08:55:42.107 INFO:tasks.workunit.client.1.vm08.stdout:4/786: getdents d5/d23/d36/d76 0 2026-03-10T08:55:42.109 INFO:tasks.workunit.client.1.vm08.stdout:2/802: rmdir d1/da/d10/d42/d93/d1e/dce/d52/db3 39 2026-03-10T08:55:42.110 INFO:tasks.workunit.client.1.vm08.stdout:3/669: read - d4/d15/d8/fbb zero size 2026-03-10T08:55:42.114 INFO:tasks.workunit.client.0.vm05.stdout:4/545: dwrite d0/d1d/f50 [0,4194304] 0 2026-03-10T08:55:42.115 INFO:tasks.workunit.client.1.vm08.stdout:5/691: dread d0/d11/d3e/f4d [0,4194304] 0 2026-03-10T08:55:42.116 INFO:tasks.workunit.client.1.vm08.stdout:8/778: sync 2026-03-10T08:55:42.116 INFO:tasks.workunit.client.1.vm08.stdout:5/692: write d0/d11/d18/d52/db9/fd4 [1010653,49649] 0 2026-03-10T08:55:42.123 INFO:tasks.workunit.client.0.vm05.stdout:8/538: link d2/dd/d2c/d2e/d31/d4f/l7c d2/dd/d2c/d2e/d31/lc2 0 2026-03-10T08:55:42.124 INFO:tasks.workunit.client.0.vm05.stdout:8/539: truncate d2/db/d28/f2d 1161312 0 2026-03-10T08:55:42.127 INFO:tasks.workunit.client.1.vm08.stdout:6/753: dread d9/dc/d11/d23/d2c/f4f [0,4194304] 0 2026-03-10T08:55:42.135 INFO:tasks.workunit.client.0.vm05.stdout:5/477: getdents d5/d86/d21 0 2026-03-10T08:55:42.135 
INFO:tasks.workunit.client.0.vm05.stdout:5/478: readlink d5/d86/d24/d84/la7 0 2026-03-10T08:55:42.136 INFO:tasks.workunit.client.0.vm05.stdout:5/479: fsync d5/d86/f9d 0 2026-03-10T08:55:42.139 INFO:tasks.workunit.client.0.vm05.stdout:4/546: rmdir d0/d2e/da8 0 2026-03-10T08:55:42.143 INFO:tasks.workunit.client.0.vm05.stdout:8/540: symlink d2/dd/d2c/d2e/d31/lc3 0 2026-03-10T08:55:42.159 INFO:tasks.workunit.client.1.vm08.stdout:4/787: readlink d5/d23/lb8 0 2026-03-10T08:55:42.164 INFO:tasks.workunit.client.0.vm05.stdout:8/541: creat d2/dd/d2c/d2e/d31/fc4 x:0 0 0 2026-03-10T08:55:42.166 INFO:tasks.workunit.client.0.vm05.stdout:8/542: creat d2/db/d1f/d67/fc5 x:0 0 0 2026-03-10T08:55:42.167 INFO:tasks.workunit.client.1.vm08.stdout:3/670: unlink d4/c6c 0 2026-03-10T08:55:42.168 INFO:tasks.workunit.client.1.vm08.stdout:8/779: mkdir d1/d10/d9/dd/d9a/d11f 0 2026-03-10T08:55:42.168 INFO:tasks.workunit.client.0.vm05.stdout:8/543: symlink d2/dd/d2c/d2e/d31/d4c/lc6 0 2026-03-10T08:55:42.169 INFO:tasks.workunit.client.1.vm08.stdout:2/803: creat d1/d43/dcd/f108 x:0 0 0 2026-03-10T08:55:42.171 INFO:tasks.workunit.client.1.vm08.stdout:4/788: rmdir d5/d23/d36/d99/db2/d5d/de3/df8 39 2026-03-10T08:55:42.182 INFO:tasks.workunit.client.1.vm08.stdout:3/671: mknod d4/d15/d8/d2c/d9b/d79/d20/ce8 0 2026-03-10T08:55:42.182 INFO:tasks.workunit.client.1.vm08.stdout:2/804: link d1/la3 d1/da/d10/d42/d93/d1e/l109 0 2026-03-10T08:55:42.189 INFO:tasks.workunit.client.1.vm08.stdout:8/780: link d1/d4f/d60/dbf/cf9 d1/d10/d9/dd/d25/d27/d44/d21/d5f/c120 0 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:4/789: dread d5/d23/d49/d8f/da4/f10a [0,4194304] 0 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:4/790: read - d5/d23/d49/d83/fd5 zero size 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:8/781: symlink d1/d10/d9/dd/l121 0 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:2/805: rename d1/da/d10/d1b/d6a/cfe to 
d1/da/d10/d42/d93/d1e/dce/d52/db3/def/c10a 0 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:4/791: truncate d5/f85 3533125 0 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:2/806: creat d1/d43/f10b x:0 0 0 2026-03-10T08:55:42.199 INFO:tasks.workunit.client.1.vm08.stdout:4/792: truncate d5/d23/d49/d8f/da4/fc3 797400 0 2026-03-10T08:55:42.200 INFO:tasks.workunit.client.1.vm08.stdout:2/807: truncate d1/da/d10/d42/f79 1407092 0 2026-03-10T08:55:42.201 INFO:tasks.workunit.client.1.vm08.stdout:4/793: truncate d5/d23/d36/d99/db2/d5a/d69/d11b/f6d 2215528 0 2026-03-10T08:55:42.207 INFO:tasks.workunit.client.0.vm05.stdout:9/503: dwrite d6/d19/d2c/f3d [0,4194304] 0 2026-03-10T08:55:42.209 INFO:tasks.workunit.client.1.vm08.stdout:2/808: symlink d1/da/d10/d42/d93/d1e/dce/d52/l10c 0 2026-03-10T08:55:42.213 INFO:tasks.workunit.client.1.vm08.stdout:2/809: truncate d1/da/d10/d42/d93/d1e/d7b/fb8 937510 0 2026-03-10T08:55:42.218 INFO:tasks.workunit.client.0.vm05.stdout:9/504: dread d6/d19/d21/f8a [0,4194304] 0 2026-03-10T08:55:42.219 INFO:tasks.workunit.client.0.vm05.stdout:9/505: write d6/d15/f86 [533015,82025] 0 2026-03-10T08:55:42.222 INFO:tasks.workunit.client.1.vm08.stdout:2/810: unlink d1/da/d10/d42/dd0/lf4 0 2026-03-10T08:55:42.246 INFO:tasks.workunit.client.1.vm08.stdout:2/811: sync 2026-03-10T08:55:42.248 INFO:tasks.workunit.client.1.vm08.stdout:2/812: mkdir d1/d97/d10d 0 2026-03-10T08:55:42.253 INFO:tasks.workunit.client.1.vm08.stdout:2/813: getdents d1/da/d10/d1b 0 2026-03-10T08:55:42.254 INFO:tasks.workunit.client.1.vm08.stdout:2/814: chown d1/da/d10/d42/d93/d1e/dce/ce4 53 1 2026-03-10T08:55:42.277 INFO:tasks.workunit.client.1.vm08.stdout:2/815: dread d1/da/f64 [0,4194304] 0 2026-03-10T08:55:42.277 INFO:tasks.workunit.client.0.vm05.stdout:1/605: dwrite dd/d10/d18/d2d/d5c/dac/fcd [0,4194304] 0 2026-03-10T08:55:42.286 INFO:tasks.workunit.client.1.vm08.stdout:7/780: dwrite d0/d51/f5d [0,4194304] 0 2026-03-10T08:55:42.289 
INFO:tasks.workunit.client.1.vm08.stdout:2/816: dwrite d1/da/d10/d42/d93/fcb [0,4194304] 0 2026-03-10T08:55:42.291 INFO:tasks.workunit.client.0.vm05.stdout:6/584: write d4/d2c/d84/d4a/f63 [6925774,3523] 0 2026-03-10T08:55:42.292 INFO:tasks.workunit.client.0.vm05.stdout:6/585: truncate d4/d2c/d84/f6b 1217840 0 2026-03-10T08:55:42.293 INFO:tasks.workunit.client.0.vm05.stdout:6/586: fdatasync d4/d7/d10/d15/d1b/fb8 0 2026-03-10T08:55:42.297 INFO:tasks.workunit.client.0.vm05.stdout:1/606: mkdir dd/d10/d18/d2d/d51/d58/ddb 0 2026-03-10T08:55:42.297 INFO:tasks.workunit.client.0.vm05.stdout:6/587: dread - d4/d2c/d84/fb2 zero size 2026-03-10T08:55:42.300 INFO:tasks.workunit.client.0.vm05.stdout:1/607: mkdir dd/d10/d18/d20/d52/ddc 0 2026-03-10T08:55:42.301 INFO:tasks.workunit.client.0.vm05.stdout:6/588: creat d4/d7/d10/d1a/d8c/fc2 x:0 0 0 2026-03-10T08:55:42.301 INFO:tasks.workunit.client.1.vm08.stdout:7/781: getdents d0/d11/d4a 0 2026-03-10T08:55:42.302 INFO:tasks.workunit.client.1.vm08.stdout:0/683: dwrite d6/dd/d13/d17/d1f/d2d/d39/f4a [0,4194304] 0 2026-03-10T08:55:42.306 INFO:tasks.workunit.client.0.vm05.stdout:6/589: mkdir d4/d7/d10/dc3 0 2026-03-10T08:55:42.312 INFO:tasks.workunit.client.1.vm08.stdout:7/782: rename d0/d11/d4a/d95/ff5 to d0/d11/d1f/d29/d3b/d80/dd3/ffb 0 2026-03-10T08:55:42.319 INFO:tasks.workunit.client.1.vm08.stdout:7/783: chown d0/d11/d1f/d29/d36/d75/fb9 1768274 1 2026-03-10T08:55:42.325 INFO:tasks.workunit.client.0.vm05.stdout:7/490: read d18/f1d [2644910,34976] 0 2026-03-10T08:55:42.327 INFO:tasks.workunit.client.0.vm05.stdout:3/609: rmdir d9/d2b/d3a/d43/d71 39 2026-03-10T08:55:42.329 INFO:tasks.workunit.client.0.vm05.stdout:7/491: dwrite f9 [0,4194304] 0 2026-03-10T08:55:42.343 INFO:tasks.workunit.client.0.vm05.stdout:7/492: symlink d18/d66/d25/d2e/d42/d74/l93 0 2026-03-10T08:55:42.343 INFO:tasks.workunit.client.0.vm05.stdout:7/493: fdatasync d18/d66/d25/d2e/f6f 0 2026-03-10T08:55:42.344 INFO:tasks.workunit.client.0.vm05.stdout:7/494: chown 
d18/d66/f70 277605046 1 2026-03-10T08:55:42.347 INFO:tasks.workunit.client.0.vm05.stdout:7/495: rmdir d18/d66/d78 39 2026-03-10T08:55:42.349 INFO:tasks.workunit.client.0.vm05.stdout:0/540: getdents df/d1f/d85 0 2026-03-10T08:55:42.349 INFO:tasks.workunit.client.0.vm05.stdout:0/541: chown df/f15 1974731 1 2026-03-10T08:55:42.350 INFO:tasks.workunit.client.0.vm05.stdout:3/610: getdents d9/d4d/d51/d64 0 2026-03-10T08:55:42.353 INFO:tasks.workunit.client.0.vm05.stdout:3/611: dwrite d9/d2b/f2d [0,4194304] 0 2026-03-10T08:55:42.354 INFO:tasks.workunit.client.0.vm05.stdout:3/612: truncate d9/f29 4684195 0 2026-03-10T08:55:42.356 INFO:tasks.workunit.client.1.vm08.stdout:9/717: write d2/dd/d15/d1e/d24/f9e [457033,30017] 0 2026-03-10T08:55:42.357 INFO:tasks.workunit.client.1.vm08.stdout:9/718: chown d2/dd/d15/d1e/d39 65287 1 2026-03-10T08:55:42.357 INFO:tasks.workunit.client.0.vm05.stdout:0/542: mkdir df/d1f/d85/d19/d39/d4d/d9f 0 2026-03-10T08:55:42.363 INFO:tasks.workunit.client.0.vm05.stdout:2/461: dwrite d0/d9/d1e/d20/d21/d45/d6c/d6e/f38 [0,4194304] 0 2026-03-10T08:55:42.375 INFO:tasks.workunit.client.0.vm05.stdout:3/613: mknod d9/d4d/d51/d64/ca9 0 2026-03-10T08:55:42.379 INFO:tasks.workunit.client.0.vm05.stdout:0/543: rename df/d1f/d85/d2b/d27/d32/f9b to df/d1f/d85/d2b/d65/d6e/fa0 0 2026-03-10T08:55:42.383 INFO:tasks.workunit.client.1.vm08.stdout:9/719: dread d2/dd/d15/d1e/d21/fc5 [0,4194304] 0 2026-03-10T08:55:42.383 INFO:tasks.workunit.client.0.vm05.stdout:2/462: readlink d0/d9/d1e/d20/d21/d45/l73 0 2026-03-10T08:55:42.383 INFO:tasks.workunit.client.0.vm05.stdout:0/544: dread - df/d1f/d85/d19/d39/f42 zero size 2026-03-10T08:55:42.383 INFO:tasks.workunit.client.0.vm05.stdout:2/463: creat d0/d9/d1e/d20/d21/d45/d6c/d6e/d6d/f81 x:0 0 0 2026-03-10T08:55:42.384 INFO:tasks.workunit.client.0.vm05.stdout:2/464: read - d0/d9/d1e/d20/d21/d45/d6c/d6e/f67 zero size 2026-03-10T08:55:42.386 INFO:tasks.workunit.client.0.vm05.stdout:7/496: link d18/f31 d18/d1b/f94 0 
2026-03-10T08:55:42.386 INFO:tasks.workunit.client.0.vm05.stdout:0/545: truncate df/f15 1948184 0 2026-03-10T08:55:42.388 INFO:tasks.workunit.client.0.vm05.stdout:2/465: mknod d0/d9/d1e/d20/d21/d45/d4b/d70/c82 0 2026-03-10T08:55:42.388 INFO:tasks.workunit.client.0.vm05.stdout:2/466: chown d0/l28 201745728 1 2026-03-10T08:55:42.389 INFO:tasks.workunit.client.0.vm05.stdout:3/614: rmdir d9/d4d/d51/d6f 0 2026-03-10T08:55:42.391 INFO:tasks.workunit.client.1.vm08.stdout:9/720: truncate d2/dd/d15/d1e/d39/d69/fda 1260565 0 2026-03-10T08:55:42.391 INFO:tasks.workunit.client.0.vm05.stdout:2/467: truncate d0/d9/f1d 1711619 0 2026-03-10T08:55:42.392 INFO:tasks.workunit.client.0.vm05.stdout:2/468: readlink d0/d9/d1e/d20/d21/d45/d4b/d70/l79 0 2026-03-10T08:55:42.393 INFO:tasks.workunit.client.0.vm05.stdout:3/615: creat d9/d4d/d51/faa x:0 0 0 2026-03-10T08:55:42.396 INFO:tasks.workunit.client.0.vm05.stdout:2/469: dwrite d0/d9/f1b [0,4194304] 0 2026-03-10T08:55:42.398 INFO:tasks.workunit.client.0.vm05.stdout:0/546: symlink df/d1f/d85/d19/d39/d4d/d9f/la1 0 2026-03-10T08:55:42.400 INFO:tasks.workunit.client.0.vm05.stdout:7/497: creat d18/f95 x:0 0 0 2026-03-10T08:55:42.400 INFO:tasks.workunit.client.0.vm05.stdout:7/498: fsync d18/d66/d25/f47 0 2026-03-10T08:55:42.401 INFO:tasks.workunit.client.1.vm08.stdout:0/684: dread d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 [4194304,4194304] 0 2026-03-10T08:55:42.401 INFO:tasks.workunit.client.0.vm05.stdout:0/547: symlink df/d1f/d85/d19/d47/d84/la2 0 2026-03-10T08:55:42.402 INFO:tasks.workunit.client.0.vm05.stdout:3/616: fdatasync d9/d2b/f3b 0 2026-03-10T08:55:42.402 INFO:tasks.workunit.client.0.vm05.stdout:0/548: dread - df/d1f/d85/d19/f99 zero size 2026-03-10T08:55:42.404 INFO:tasks.workunit.client.0.vm05.stdout:0/549: chown df/d1f/d85/d19/d39/f42 11596802 1 2026-03-10T08:55:42.405 INFO:tasks.workunit.client.0.vm05.stdout:7/499: dwrite d18/d1b/f84 [0,4194304] 0 2026-03-10T08:55:42.408 INFO:tasks.workunit.client.0.vm05.stdout:0/550: read 
df/d1f/d85/d19/d5b/f6c [233969,22836] 0 2026-03-10T08:55:42.408 INFO:tasks.workunit.client.0.vm05.stdout:2/470: symlink d0/d9/d1e/l83 0 2026-03-10T08:55:42.410 INFO:tasks.workunit.client.0.vm05.stdout:3/617: creat d9/d8f/d55/fab x:0 0 0 2026-03-10T08:55:42.412 INFO:tasks.workunit.client.0.vm05.stdout:7/500: chown f15 968224439 1 2026-03-10T08:55:42.412 INFO:tasks.workunit.client.0.vm05.stdout:2/471: symlink d0/d55/l84 0 2026-03-10T08:55:42.412 INFO:tasks.workunit.client.0.vm05.stdout:7/501: dread - d18/d38/f5d zero size 2026-03-10T08:55:42.413 INFO:tasks.workunit.client.0.vm05.stdout:0/551: mkdir df/d1f/d85/d19/d47/da3 0 2026-03-10T08:55:42.415 INFO:tasks.workunit.client.1.vm08.stdout:0/685: fsync d6/dd/f35 0 2026-03-10T08:55:42.417 INFO:tasks.workunit.client.0.vm05.stdout:3/618: creat d9/d2b/d3a/d43/d71/fac x:0 0 0 2026-03-10T08:55:42.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:42 vm05.local ceph-mon[49713]: pgmap v160: 65 pgs: 65 active+clean; 2.6 GiB data, 9.1 GiB used, 111 GiB / 120 GiB avail; 45 MiB/s rd, 112 MiB/s wr, 263 op/s 2026-03-10T08:55:42.524 INFO:tasks.workunit.client.0.vm05.stdout:5/480: dwrite d5/d86/d24/d2c/d41/f4d [0,4194304] 0 2026-03-10T08:55:42.526 INFO:tasks.workunit.client.0.vm05.stdout:5/481: symlink d5/d86/d21/d89/lb2 0 2026-03-10T08:55:42.532 INFO:tasks.workunit.client.0.vm05.stdout:4/547: write d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/f7a [508424,59363] 0 2026-03-10T08:55:42.536 INFO:tasks.workunit.client.1.vm08.stdout:1/762: write d1/f1f [853093,73884] 0 2026-03-10T08:55:42.536 INFO:tasks.workunit.client.1.vm08.stdout:5/693: write d0/f6c [992805,61182] 0 2026-03-10T08:55:42.539 INFO:tasks.workunit.client.1.vm08.stdout:1/763: dread - d1/da/d20/d91/d83/f100 zero size 2026-03-10T08:55:42.543 INFO:tasks.workunit.client.1.vm08.stdout:1/764: truncate d1/da/f25 1673578 0 2026-03-10T08:55:42.546 INFO:tasks.workunit.client.1.vm08.stdout:5/694: getdents d0/d1b/d67/d80 0 2026-03-10T08:55:42.552 
INFO:tasks.workunit.client.1.vm08.stdout:5/695: read d0/d11/d27/d68/d7c/d4b/f82 [965280,27992] 0 2026-03-10T08:55:42.557 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:42 vm08.local ceph-mon[57559]: pgmap v160: 65 pgs: 65 active+clean; 2.6 GiB data, 9.1 GiB used, 111 GiB / 120 GiB avail; 45 MiB/s rd, 112 MiB/s wr, 263 op/s 2026-03-10T08:55:42.575 INFO:tasks.workunit.client.1.vm08.stdout:6/754: write d9/d10/d1e/f91 [1968685,126413] 0 2026-03-10T08:55:42.577 INFO:tasks.workunit.client.1.vm08.stdout:6/755: creat d9/d13/d4e/ff5 x:0 0 0 2026-03-10T08:55:42.580 INFO:tasks.workunit.client.1.vm08.stdout:6/756: rmdir d9/dc/d11/d23/d2c/d7a/dce/d69/da2 39 2026-03-10T08:55:42.582 INFO:tasks.workunit.client.1.vm08.stdout:6/757: symlink d9/dc/d11/d23/d2c/lf6 0 2026-03-10T08:55:42.587 INFO:tasks.workunit.client.1.vm08.stdout:6/758: fsync d9/d10/d1e/f2a 0 2026-03-10T08:55:42.602 INFO:tasks.workunit.client.1.vm08.stdout:6/759: sync 2026-03-10T08:55:42.621 INFO:tasks.workunit.client.0.vm05.stdout:8/544: write d2/db/d47/f58 [799528,123809] 0 2026-03-10T08:55:42.624 INFO:tasks.workunit.client.0.vm05.stdout:8/545: unlink d2/dd/d2c/d2e/d31/d3e/f6b 0 2026-03-10T08:55:42.625 INFO:tasks.workunit.client.1.vm08.stdout:3/672: write d4/d6f/d85/f87 [3477935,130567] 0 2026-03-10T08:55:42.627 INFO:tasks.workunit.client.0.vm05.stdout:8/546: dwrite d2/dd/d2c/d2e/d31/d4f/d7b/d9e/fab [0,4194304] 0 2026-03-10T08:55:42.639 INFO:tasks.workunit.client.1.vm08.stdout:3/673: dread d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:42.639 INFO:tasks.workunit.client.0.vm05.stdout:8/547: mkdir d2/db/da4/dc7 0 2026-03-10T08:55:42.641 INFO:tasks.workunit.client.0.vm05.stdout:8/548: creat d2/dd/d2c/d2e/d31/d4c/d63/fc8 x:0 0 0 2026-03-10T08:55:42.642 INFO:tasks.workunit.client.0.vm05.stdout:8/549: chown d2/dd/d2c/f4d 18241155 1 2026-03-10T08:55:42.643 INFO:tasks.workunit.client.0.vm05.stdout:8/550: mknod d2/dd/d2c/d2e/d31/d4c/d63/cc9 0 2026-03-10T08:55:42.648 INFO:tasks.workunit.client.1.vm08.stdout:3/674: dwrite 
d4/d15/d8/d2c/d9b/d79/d8f/f91 [4194304,4194304] 0 2026-03-10T08:55:42.648 INFO:tasks.workunit.client.0.vm05.stdout:8/551: rename d2/dd/d74/d78/la2 to d2/db/da4/lca 0 2026-03-10T08:55:42.648 INFO:tasks.workunit.client.0.vm05.stdout:8/552: mknod d2/dd/d2c/d2e/d31/d4f/d80/ccb 0 2026-03-10T08:55:42.650 INFO:tasks.workunit.client.1.vm08.stdout:8/782: dwrite d1/d10/d9/dd/d25/d27/d44/fa7 [0,4194304] 0 2026-03-10T08:55:42.657 INFO:tasks.workunit.client.0.vm05.stdout:8/553: creat d2/dd/d74/fcc x:0 0 0 2026-03-10T08:55:42.657 INFO:tasks.workunit.client.0.vm05.stdout:8/554: write d2/dd/d2c/d2e/d31/d4c/f85 [364591,43039] 0 2026-03-10T08:55:42.661 INFO:tasks.workunit.client.0.vm05.stdout:8/555: truncate d2/db/d1f/f84 361385 0 2026-03-10T08:55:42.662 INFO:tasks.workunit.client.0.vm05.stdout:8/556: write d2/dd/d2c/d2e/d31/d4f/f9c [246517,104462] 0 2026-03-10T08:55:42.664 INFO:tasks.workunit.client.0.vm05.stdout:8/557: truncate d2/dd/d2c/d2e/d93/f9b 441139 0 2026-03-10T08:55:42.665 INFO:tasks.workunit.client.0.vm05.stdout:8/558: write d2/dd/d2c/d2e/d31/d4f/d80/f9f [2944747,30812] 0 2026-03-10T08:55:42.669 INFO:tasks.workunit.client.0.vm05.stdout:8/559: dwrite d2/dd/d2c/d2e/d31/d4f/d80/f9f [0,4194304] 0 2026-03-10T08:55:42.688 INFO:tasks.workunit.client.1.vm08.stdout:3/675: dwrite d4/d6f/d85/dd3/fdb [0,4194304] 0 2026-03-10T08:55:42.708 INFO:tasks.workunit.client.1.vm08.stdout:3/676: symlink d4/d15/d8/le9 0 2026-03-10T08:55:42.720 INFO:tasks.workunit.client.1.vm08.stdout:3/677: unlink d4/d15/d8/d2c/d9b/l39 0 2026-03-10T08:55:42.721 INFO:tasks.workunit.client.1.vm08.stdout:3/678: dread - d4/d15/d8/fbb zero size 2026-03-10T08:55:42.727 INFO:tasks.workunit.client.1.vm08.stdout:3/679: symlink d4/d15/d8/d2c/d9b/d79/d8f/de2/lea 0 2026-03-10T08:55:42.730 INFO:tasks.workunit.client.1.vm08.stdout:3/680: rename d4/d15/d8/d2c/d9b/d79/d8f/f91 to d4/d15/d8/d2c/d55/feb 0 2026-03-10T08:55:42.736 INFO:tasks.workunit.client.1.vm08.stdout:4/794: dwrite d5/d23/d36/d99/db2/d5a/d69/d11b/d96/ff6 
[0,4194304] 0 2026-03-10T08:55:42.736 INFO:tasks.workunit.client.1.vm08.stdout:4/795: stat d5/d23/d49/l111 0 2026-03-10T08:55:42.736 INFO:tasks.workunit.client.1.vm08.stdout:3/681: rename d4/d15/d8/fbb to d4/d15/d8/fec 0 2026-03-10T08:55:42.738 INFO:tasks.workunit.client.1.vm08.stdout:3/682: rename d4/d15/d8/d1d/f6e to d4/d6f/d85/fed 0 2026-03-10T08:55:42.740 INFO:tasks.workunit.client.1.vm08.stdout:3/683: unlink d4/d15/d8/d2c/f90 0 2026-03-10T08:55:42.741 INFO:tasks.workunit.client.1.vm08.stdout:3/684: chown d4/d15/d8/f41 605635574 1 2026-03-10T08:55:42.743 INFO:tasks.workunit.client.1.vm08.stdout:3/685: creat d4/d15/d8/d1d/d4f/fee x:0 0 0 2026-03-10T08:55:42.751 INFO:tasks.workunit.client.1.vm08.stdout:4/796: sync 2026-03-10T08:55:42.756 INFO:tasks.workunit.client.1.vm08.stdout:3/686: dread d4/d15/d8/d2c/d6d/f9d [0,4194304] 0 2026-03-10T08:55:42.757 INFO:tasks.workunit.client.1.vm08.stdout:3/687: stat d4/d15/d8/fad 0 2026-03-10T08:55:42.758 INFO:tasks.workunit.client.1.vm08.stdout:4/797: link d5/d23/d36/d99/db2/d5a/d69/d11b/lde d5/d23/d36/d99/db2/d5d/de3/df8/l122 0 2026-03-10T08:55:42.762 INFO:tasks.workunit.client.1.vm08.stdout:3/688: creat d4/d15/d8/d2c/d9b/d79/fef x:0 0 0 2026-03-10T08:55:42.766 INFO:tasks.workunit.client.0.vm05.stdout:9/506: write d6/d19/f29 [2634473,69210] 0 2026-03-10T08:55:42.767 INFO:tasks.workunit.client.1.vm08.stdout:2/817: write d1/d5b/d66/f5e [998574,95347] 0 2026-03-10T08:55:42.768 INFO:tasks.workunit.client.0.vm05.stdout:9/507: mknod d6/d15/d37/cac 0 2026-03-10T08:55:42.768 INFO:tasks.workunit.client.0.vm05.stdout:9/508: fsync d6/d12/d3a/d48/fa8 0 2026-03-10T08:55:42.769 INFO:tasks.workunit.client.0.vm05.stdout:9/509: stat d6/d15/l68 0 2026-03-10T08:55:42.769 INFO:tasks.workunit.client.0.vm05.stdout:9/510: write d6/d12/d43/f91 [994513,79250] 0 2026-03-10T08:55:42.771 INFO:tasks.workunit.client.0.vm05.stdout:9/511: rmdir d6/d15/d3c/d4b/d90 39 2026-03-10T08:55:42.784 INFO:tasks.workunit.client.0.vm05.stdout:9/512: fdatasync 
d6/d12/d3a/f62 0 2026-03-10T08:55:42.784 INFO:tasks.workunit.client.1.vm08.stdout:2/818: creat d1/da/d10/d2d/db6/f10e x:0 0 0 2026-03-10T08:55:42.784 INFO:tasks.workunit.client.1.vm08.stdout:2/819: stat d1/d43/f4b 0 2026-03-10T08:55:42.789 INFO:tasks.workunit.client.0.vm05.stdout:8/560: read d2/dd/f3f [166273,46643] 0 2026-03-10T08:55:42.789 INFO:tasks.workunit.client.1.vm08.stdout:6/760: dread d9/dc/d11/d23/f8b [0,4194304] 0 2026-03-10T08:55:42.793 INFO:tasks.workunit.client.1.vm08.stdout:6/761: rename d9/d13/d4e/fcb to d9/d50/de9/ff7 0 2026-03-10T08:55:42.795 INFO:tasks.workunit.client.1.vm08.stdout:6/762: creat d9/dc/d11/ff8 x:0 0 0 2026-03-10T08:55:42.801 INFO:tasks.workunit.client.0.vm05.stdout:1/608: write dd/d21/d37/d45/d8d/fae [574764,78094] 0 2026-03-10T08:55:42.802 INFO:tasks.workunit.client.1.vm08.stdout:6/763: rename d9/d10/l5b to d9/dc/d11/d23/d2c/d7a/dce/d69/lf9 0 2026-03-10T08:55:42.805 INFO:tasks.workunit.client.0.vm05.stdout:1/609: dwrite dd/d10/d18/d2d/d5c/fc5 [0,4194304] 0 2026-03-10T08:55:42.809 INFO:tasks.workunit.client.0.vm05.stdout:1/610: rename dd/d10/d18/d2d/d51/d58 to dd/d10/d18/d2d/d51/d58/d71/d62/ddd 22 2026-03-10T08:55:42.810 INFO:tasks.workunit.client.0.vm05.stdout:1/611: write dd/d21/d3f/f83 [2749448,52222] 0 2026-03-10T08:55:42.825 INFO:tasks.workunit.client.0.vm05.stdout:6/590: truncate d4/d2c/d84/f6b 817947 0 2026-03-10T08:55:42.825 INFO:tasks.workunit.client.1.vm08.stdout:7/784: getdents d0/d11/d1f/d29/d3b/d80/dd3 0 2026-03-10T08:55:42.825 INFO:tasks.workunit.client.0.vm05.stdout:6/591: chown d4/d7/d10/d15/d20 457 1 2026-03-10T08:55:42.827 INFO:tasks.workunit.client.0.vm05.stdout:0/552: rmdir df/d1f/d85/d2b/d27 39 2026-03-10T08:55:42.832 INFO:tasks.workunit.client.0.vm05.stdout:6/592: dread d4/d7/f54 [0,4194304] 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.1.vm08.stdout:9/721: write d2/dd/d15/d1e/d25/f4b [4800002,110248] 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:0/553: dwrite 
df/d1f/d85/d19/d5b/f72 [0,4194304] 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:6/593: mkdir d4/d7/dc4 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:0/554: mknod df/d59/ca4 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:3/619: getdents d9/d8f/d55 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:0/555: unlink df/d1f/d85/d19/d5b/f76 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:3/620: rename d9/d4d/d51/d64/c7e to d9/d2b/d2f/d96/cad 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:3/621: mknod d9/d2b/d2f/cae 0 2026-03-10T08:55:42.870 INFO:tasks.workunit.client.0.vm05.stdout:3/622: symlink d9/d2b/d2f/d96/laf 0 2026-03-10T08:55:42.872 INFO:tasks.workunit.client.0.vm05.stdout:3/623: symlink d9/d2b/d3a/d43/da3/lb0 0 2026-03-10T08:55:42.874 INFO:tasks.workunit.client.0.vm05.stdout:3/624: creat d9/d2b/fb1 x:0 0 0 2026-03-10T08:55:42.875 INFO:tasks.workunit.client.0.vm05.stdout:3/625: creat d9/d2b/d3a/d6c/fb2 x:0 0 0 2026-03-10T08:55:42.875 INFO:tasks.workunit.client.0.vm05.stdout:3/626: chown d9/d2b/d3a/d6c/f74 57500 1 2026-03-10T08:55:42.878 INFO:tasks.workunit.client.0.vm05.stdout:1/612: sync 2026-03-10T08:55:42.879 INFO:tasks.workunit.client.0.vm05.stdout:1/613: dread - dd/d10/d18/d2d/d51/d58/d71/d62/fd4 zero size 2026-03-10T08:55:42.879 INFO:tasks.workunit.client.0.vm05.stdout:3/627: mknod d9/d4d/cb3 0 2026-03-10T08:55:42.880 INFO:tasks.workunit.client.0.vm05.stdout:1/614: chown dd/d10/d19/d4d/d88 333183 1 2026-03-10T08:55:42.882 INFO:tasks.workunit.client.0.vm05.stdout:1/615: mkdir dd/d21/d37/d7c/dab/db7/dde 0 2026-03-10T08:55:42.884 INFO:tasks.workunit.client.0.vm05.stdout:1/616: fsync dd/d10/d19/d27/f9c 0 2026-03-10T08:55:42.886 INFO:tasks.workunit.client.0.vm05.stdout:3/628: link d9/d2b/fb1 d9/fb4 0 2026-03-10T08:55:42.887 INFO:tasks.workunit.client.0.vm05.stdout:3/629: chown d9/d4d/cb3 1 1 2026-03-10T08:55:42.888 
INFO:tasks.workunit.client.0.vm05.stdout:7/502: write d18/d66/f6c [390878,69290] 0 2026-03-10T08:55:42.890 INFO:tasks.workunit.client.0.vm05.stdout:2/472: write d0/d9/d1e/d20/d21/d45/f68 [249942,81013] 0 2026-03-10T08:55:42.893 INFO:tasks.workunit.client.0.vm05.stdout:3/630: write d9/d2b/d53/f60 [1912497,73583] 0 2026-03-10T08:55:42.893 INFO:tasks.workunit.client.0.vm05.stdout:3/631: fdatasync d9/d2b/d3a/d43/d71/f91 0 2026-03-10T08:55:42.894 INFO:tasks.workunit.client.0.vm05.stdout:3/632: fsync d9/d2b/d53/d61/f69 0 2026-03-10T08:55:42.908 INFO:tasks.workunit.client.1.vm08.stdout:0/686: write d6/dd/d13/d17/f6d [1326930,17531] 0 2026-03-10T08:55:42.909 INFO:tasks.workunit.client.0.vm05.stdout:7/503: dread d18/f24 [0,4194304] 0 2026-03-10T08:55:42.909 INFO:tasks.workunit.client.0.vm05.stdout:7/504: write fd [650773,51955] 0 2026-03-10T08:55:42.911 INFO:tasks.workunit.client.0.vm05.stdout:3/633: sync 2026-03-10T08:55:42.912 INFO:tasks.workunit.client.0.vm05.stdout:3/634: write d9/d8f/d55/fab [277382,85218] 0 2026-03-10T08:55:42.914 INFO:tasks.workunit.client.0.vm05.stdout:2/473: symlink d0/d55/l85 0 2026-03-10T08:55:42.914 INFO:tasks.workunit.client.0.vm05.stdout:7/505: dwrite d18/d66/f6c [0,4194304] 0 2026-03-10T08:55:42.915 INFO:tasks.workunit.client.0.vm05.stdout:2/474: chown d0/d9/d1e/d20/d24/c25 242200425 1 2026-03-10T08:55:42.920 INFO:tasks.workunit.client.1.vm08.stdout:0/687: creat d6/dd/d13/d61/dc7/fdf x:0 0 0 2026-03-10T08:55:42.923 INFO:tasks.workunit.client.0.vm05.stdout:4/548: write d0/d2e/d42/d45/f5f [3442329,114909] 0 2026-03-10T08:55:42.923 INFO:tasks.workunit.client.0.vm05.stdout:5/482: dwrite d5/d48/f93 [0,4194304] 0 2026-03-10T08:55:42.931 INFO:tasks.workunit.client.0.vm05.stdout:3/635: rmdir d9/d8f 39 2026-03-10T08:55:42.932 INFO:tasks.workunit.client.0.vm05.stdout:7/506: chown d18/d38/d43/d6e/f76 25 1 2026-03-10T08:55:42.933 INFO:tasks.workunit.client.0.vm05.stdout:7/507: write d18/d1b/f30 [2026838,1921] 0 2026-03-10T08:55:42.940 
INFO:tasks.workunit.client.0.vm05.stdout:2/475: fsync d0/d9/d1e/d20/d21/f3d 0 2026-03-10T08:55:42.940 INFO:tasks.workunit.client.1.vm08.stdout:0/688: dwrite d6/dd/d13/d17/d50/f71 [0,4194304] 0 2026-03-10T08:55:42.942 INFO:tasks.workunit.client.0.vm05.stdout:1/617: getdents dd/d10/d18/d2d/d51/d58/d71/d62 0 2026-03-10T08:55:42.943 INFO:tasks.workunit.client.0.vm05.stdout:5/483: creat d5/df/d37/d68/fb3 x:0 0 0 2026-03-10T08:55:42.944 INFO:tasks.workunit.client.0.vm05.stdout:3/636: truncate d9/d4d/f88 2474234 0 2026-03-10T08:55:42.946 INFO:tasks.workunit.client.0.vm05.stdout:2/476: rmdir d0/d9/d1e/d20/d24 39 2026-03-10T08:55:42.947 INFO:tasks.workunit.client.0.vm05.stdout:2/477: write d0/d9/d1e/d20/d21/d45/d6c/d6e/d6d/f81 [786338,6111] 0 2026-03-10T08:55:42.949 INFO:tasks.workunit.client.0.vm05.stdout:1/618: symlink dd/d10/d19/d4d/ldf 0 2026-03-10T08:55:42.950 INFO:tasks.workunit.client.0.vm05.stdout:2/478: dread d0/f4 [0,4194304] 0 2026-03-10T08:55:42.952 INFO:tasks.workunit.client.0.vm05.stdout:3/637: creat d9/d2b/d3a/d6c/fb5 x:0 0 0 2026-03-10T08:55:42.953 INFO:tasks.workunit.client.0.vm05.stdout:7/508: symlink d18/d66/l96 0 2026-03-10T08:55:42.956 INFO:tasks.workunit.client.0.vm05.stdout:5/484: mknod d5/d48/d64/d95/dac/cb4 0 2026-03-10T08:55:42.957 INFO:tasks.workunit.client.0.vm05.stdout:5/485: write d5/d86/d21/f5a [104025,74394] 0 2026-03-10T08:55:42.958 INFO:tasks.workunit.client.1.vm08.stdout:0/689: symlink d6/dd/d13/d17/d1f/d2d/d38/le0 0 2026-03-10T08:55:42.972 INFO:tasks.workunit.client.1.vm08.stdout:1/765: dwrite d1/da/de/fad [0,4194304] 0 2026-03-10T08:55:42.975 INFO:tasks.workunit.client.0.vm05.stdout:1/619: rename dd/d10/d18/d2d/d5c/fc5 to dd/d10/d18/d2d/fe0 0 2026-03-10T08:55:42.975 INFO:tasks.workunit.client.1.vm08.stdout:5/696: dwrite d0/d11/d27/d68/d7c/d4b/fa0 [0,4194304] 0 2026-03-10T08:55:42.978 INFO:tasks.workunit.client.1.vm08.stdout:0/690: creat d6/fe1 x:0 0 0 2026-03-10T08:55:42.979 INFO:tasks.workunit.client.1.vm08.stdout:1/766: mkdir 
d1/da/de/d24/d35/d43/d109 0 2026-03-10T08:55:42.981 INFO:tasks.workunit.client.0.vm05.stdout:3/638: rename d9/d2b/d53/c62 to d9/d2b/d3a/d6c/cb6 0 2026-03-10T08:55:42.987 INFO:tasks.workunit.client.0.vm05.stdout:1/620: creat dd/d10/d19/d9b/fe1 x:0 0 0 2026-03-10T08:55:42.997 INFO:tasks.workunit.client.1.vm08.stdout:1/767: rename d1/da/de/c2e to d1/da/d20/d91/d83/df4/d4e/c10a 0 2026-03-10T08:55:43.001 INFO:tasks.workunit.client.1.vm08.stdout:0/691: rename d6/dd/d13/d17/d1f/d20/d2f/d57/f58 to d6/dd/d13/d17/d1f/d20/d2f/fe2 0 2026-03-10T08:55:43.003 INFO:tasks.workunit.client.0.vm05.stdout:1/621: dread dd/d10/d19/d4d/fc4 [0,4194304] 0 2026-03-10T08:55:43.009 INFO:tasks.workunit.client.0.vm05.stdout:1/622: read dd/d10/d19/f2e [161626,128661] 0 2026-03-10T08:55:43.052 INFO:tasks.workunit.client.1.vm08.stdout:1/768: sync 2026-03-10T08:55:43.058 INFO:tasks.workunit.client.0.vm05.stdout:1/623: dread dd/d21/d37/f8c [0,4194304] 0 2026-03-10T08:55:43.062 INFO:tasks.workunit.client.1.vm08.stdout:8/783: write d1/d10/d9/dd/d18/fe5 [460277,40583] 0 2026-03-10T08:55:43.065 INFO:tasks.workunit.client.1.vm08.stdout:8/784: fsync d1/d10/d9/dd/d13/f6a 0 2026-03-10T08:55:43.070 INFO:tasks.workunit.client.1.vm08.stdout:8/785: rmdir d1/d10 39 2026-03-10T08:55:43.071 INFO:tasks.workunit.client.0.vm05.stdout:1/624: dread dd/d21/d37/d45/fce [0,4194304] 0 2026-03-10T08:55:43.072 INFO:tasks.workunit.client.1.vm08.stdout:8/786: sync 2026-03-10T08:55:43.072 INFO:tasks.workunit.client.0.vm05.stdout:1/625: creat dd/d10/d18/d20/d69/fe2 x:0 0 0 2026-03-10T08:55:43.073 INFO:tasks.workunit.client.1.vm08.stdout:4/798: rmdir d5/d23/d36/d99/db2 39 2026-03-10T08:55:43.074 INFO:tasks.workunit.client.1.vm08.stdout:3/689: write d4/d15/d8/d2c/d9b/d79/f80 [2035280,87617] 0 2026-03-10T08:55:43.075 INFO:tasks.workunit.client.1.vm08.stdout:8/787: dread - d1/d10/d9/dd/d25/d27/d44/d21/d5f/f113 zero size 2026-03-10T08:55:43.076 INFO:tasks.workunit.client.1.vm08.stdout:3/690: chown d4/d15/d8/d1d/da8/fc9 377942 1 
2026-03-10T08:55:43.077 INFO:tasks.workunit.client.0.vm05.stdout:1/626: unlink dd/d10/d18/d20/f6c 0 2026-03-10T08:55:43.077 INFO:tasks.workunit.client.0.vm05.stdout:8/561: truncate d2/dd/f26 4571487 0 2026-03-10T08:55:43.078 INFO:tasks.workunit.client.1.vm08.stdout:8/788: dread d1/d10/d9/dd/d25/f6e [0,4194304] 0 2026-03-10T08:55:43.080 INFO:tasks.workunit.client.0.vm05.stdout:9/513: dwrite d6/d15/d35/f38 [4194304,4194304] 0 2026-03-10T08:55:43.081 INFO:tasks.workunit.client.0.vm05.stdout:9/514: readlink d6/d15/d35/la4 0 2026-03-10T08:55:43.085 INFO:tasks.workunit.client.1.vm08.stdout:2/820: dwrite d1/da/d10/d42/d93/d22/f45 [4194304,4194304] 0 2026-03-10T08:55:43.088 INFO:tasks.workunit.client.1.vm08.stdout:2/821: chown d1/d43/dcd/f108 1406584 1 2026-03-10T08:55:43.100 INFO:tasks.workunit.client.0.vm05.stdout:9/515: mknod d6/d15/d35/cad 0 2026-03-10T08:55:43.100 INFO:tasks.workunit.client.0.vm05.stdout:9/516: truncate d6/d12/f74 26149 0 2026-03-10T08:55:43.101 INFO:tasks.workunit.client.0.vm05.stdout:9/517: readlink d6/d15/d3c/l3e 0 2026-03-10T08:55:43.101 INFO:tasks.workunit.client.1.vm08.stdout:8/789: mkdir d1/d10/d9/dd/d18/d122 0 2026-03-10T08:55:43.101 INFO:tasks.workunit.client.0.vm05.stdout:9/518: chown f3 74 1 2026-03-10T08:55:43.101 INFO:tasks.workunit.client.1.vm08.stdout:4/799: rmdir d5/d23/d36/d99/db2/d5a/d69/d11b/dea 39 2026-03-10T08:55:43.101 INFO:tasks.workunit.client.0.vm05.stdout:9/519: fdatasync d6/d12/f74 0 2026-03-10T08:55:43.102 INFO:tasks.workunit.client.1.vm08.stdout:6/764: dwrite d9/d13/d4e/f6b [4194304,4194304] 0 2026-03-10T08:55:43.105 INFO:tasks.workunit.client.0.vm05.stdout:9/520: dwrite d6/f7f [0,4194304] 0 2026-03-10T08:55:43.112 INFO:tasks.workunit.client.0.vm05.stdout:9/521: getdents d6/d15 0 2026-03-10T08:55:43.114 INFO:tasks.workunit.client.1.vm08.stdout:3/691: rename d4/d15/d8/d2c/d9b/c2b to d4/d15/d8/d1d/d4f/cf0 0 2026-03-10T08:55:43.114 INFO:tasks.workunit.client.1.vm08.stdout:4/800: unlink d5/d23/d36/d99/db2/d5a/d69/d11b/f72 0 
2026-03-10T08:55:43.114 INFO:tasks.workunit.client.1.vm08.stdout:8/790: symlink d1/d10/d9/dd/d25/d27/d44/d21/l123 0 2026-03-10T08:55:43.115 INFO:tasks.workunit.client.0.vm05.stdout:9/522: dwrite d6/d15/d3c/d4b/f67 [0,4194304] 0 2026-03-10T08:55:43.118 INFO:tasks.workunit.client.1.vm08.stdout:5/697: dread d0/d11/d27/d50/f55 [4194304,4194304] 0 2026-03-10T08:55:43.121 INFO:tasks.workunit.client.1.vm08.stdout:3/692: dread d4/d15/fda [0,4194304] 0 2026-03-10T08:55:43.132 INFO:tasks.workunit.client.1.vm08.stdout:8/791: mkdir d1/d10/d9/dd/d18/d34/dd0/d124 0 2026-03-10T08:55:43.134 INFO:tasks.workunit.client.1.vm08.stdout:5/698: truncate d0/d11/d3e/d45/f4a 3023310 0 2026-03-10T08:55:43.134 INFO:tasks.workunit.client.1.vm08.stdout:3/693: mkdir d4/d6f/d85/df1 0 2026-03-10T08:55:43.134 INFO:tasks.workunit.client.1.vm08.stdout:4/801: dread - d5/d23/d36/d99/db2/fda zero size 2026-03-10T08:55:43.135 INFO:tasks.workunit.client.1.vm08.stdout:8/792: rename d1/f8 to d1/d10/d9/dd/d25/f125 0 2026-03-10T08:55:43.139 INFO:tasks.workunit.client.1.vm08.stdout:5/699: creat d0/d11/d3e/d45/fd5 x:0 0 0 2026-03-10T08:55:43.139 INFO:tasks.workunit.client.1.vm08.stdout:7/785: write d0/d11/d1f/d29/d3b/fc7 [4500539,111840] 0 2026-03-10T08:55:43.140 INFO:tasks.workunit.client.1.vm08.stdout:4/802: read d5/d23/d36/d99/db2/ff7 [1109735,87239] 0 2026-03-10T08:55:43.141 INFO:tasks.workunit.client.0.vm05.stdout:6/594: write d4/d92/f9d [777231,73810] 0 2026-03-10T08:55:43.141 INFO:tasks.workunit.client.1.vm08.stdout:9/722: write d2/d41/d4c/d66/fb0 [3720150,33992] 0 2026-03-10T08:55:43.147 INFO:tasks.workunit.client.0.vm05.stdout:0/556: dwrite df/d1f/d85/d2b/d27/f60 [0,4194304] 0 2026-03-10T08:55:43.150 INFO:tasks.workunit.client.0.vm05.stdout:9/523: sync 2026-03-10T08:55:43.152 INFO:tasks.workunit.client.0.vm05.stdout:9/524: write d6/d12/d3a/d48/fa5 [682521,28500] 0 2026-03-10T08:55:43.153 INFO:tasks.workunit.client.0.vm05.stdout:9/525: fdatasync d6/d12/d43/f52 0 2026-03-10T08:55:43.154 
INFO:tasks.workunit.client.1.vm08.stdout:8/793: creat d1/dd9/f126 x:0 0 0 2026-03-10T08:55:43.156 INFO:tasks.workunit.client.0.vm05.stdout:0/557: dwrite df/d1f/d85/d2b/d65/d6e/d96/f7e [0,4194304] 0 2026-03-10T08:55:43.185 INFO:tasks.workunit.client.1.vm08.stdout:5/700: fsync d0/d11/d27/fb3 0 2026-03-10T08:55:43.192 INFO:tasks.workunit.client.1.vm08.stdout:4/803: creat d5/d23/d49/f123 x:0 0 0 2026-03-10T08:55:43.207 INFO:tasks.workunit.client.0.vm05.stdout:0/558: dread df/d59/f57 [0,4194304] 0 2026-03-10T08:55:43.207 INFO:tasks.workunit.client.0.vm05.stdout:9/526: fdatasync d6/d15/d35/f9a 0 2026-03-10T08:55:43.207 INFO:tasks.workunit.client.1.vm08.stdout:4/804: dread - d5/d23/d36/f113 zero size 2026-03-10T08:55:43.207 INFO:tasks.workunit.client.1.vm08.stdout:9/723: fdatasync d2/d41/d4c/f80 0 2026-03-10T08:55:43.208 INFO:tasks.workunit.client.1.vm08.stdout:4/805: dread d5/d23/d36/f7d [0,4194304] 0 2026-03-10T08:55:43.208 INFO:tasks.workunit.client.0.vm05.stdout:9/527: symlink d6/d12/d3a/d9c/lae 0 2026-03-10T08:55:43.209 INFO:tasks.workunit.client.0.vm05.stdout:9/528: rmdir d6/d15/d35 39 2026-03-10T08:55:43.213 INFO:tasks.workunit.client.0.vm05.stdout:9/529: mkdir d6/d15/daf 0 2026-03-10T08:55:43.213 INFO:tasks.workunit.client.1.vm08.stdout:4/806: dwrite d5/d23/d36/d99/db2/d5d/dae/ddf/fbe [0,4194304] 0 2026-03-10T08:55:43.220 INFO:tasks.workunit.client.1.vm08.stdout:4/807: write d5/d23/d49/d8f/f10c [586378,104894] 0 2026-03-10T08:55:43.223 INFO:tasks.workunit.client.1.vm08.stdout:4/808: link d5/d23/d36/d99/db2/lcd d5/d23/d49/d8f/da4/d118/l124 0 2026-03-10T08:55:43.284 INFO:tasks.workunit.client.1.vm08.stdout:6/765: read d9/d50/de9/ff7 [372573,116797] 0 2026-03-10T08:55:43.331 INFO:tasks.workunit.client.0.vm05.stdout:4/549: write d0/fc [1290684,11141] 0 2026-03-10T08:55:43.338 INFO:tasks.workunit.client.0.vm05.stdout:4/550: symlink d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/d5b/dae/lb0 0 2026-03-10T08:55:43.338 INFO:tasks.workunit.client.0.vm05.stdout:4/551: fsync d0/fc 0 
2026-03-10T08:55:43.338 INFO:tasks.workunit.client.0.vm05.stdout:4/552: chown d0 32980 1 2026-03-10T08:55:43.341 INFO:tasks.workunit.client.0.vm05.stdout:4/553: readlink d0/la 0 2026-03-10T08:55:43.342 INFO:tasks.workunit.client.0.vm05.stdout:4/554: truncate d0/d2e/d42/d45/d4a/f47 6926865 0 2026-03-10T08:55:43.344 INFO:tasks.workunit.client.0.vm05.stdout:4/555: creat d0/d2e/d42/d45/fb1 x:0 0 0 2026-03-10T08:55:43.345 INFO:tasks.workunit.client.0.vm05.stdout:2/479: write d0/d9/d1e/d20/d21/d45/d6c/d6e/f63 [249189,22108] 0 2026-03-10T08:55:43.351 INFO:tasks.workunit.client.0.vm05.stdout:4/556: mknod d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/cb2 0 2026-03-10T08:55:43.352 INFO:tasks.workunit.client.0.vm05.stdout:4/557: dread - d0/d2e/d42/d45/d4a/d36/d37/fac zero size 2026-03-10T08:55:43.353 INFO:tasks.workunit.client.0.vm05.stdout:4/558: write d0/d2e/d42/d45/f5f [2049534,85624] 0 2026-03-10T08:55:43.364 INFO:tasks.workunit.client.0.vm05.stdout:2/480: sync 2026-03-10T08:55:43.367 INFO:tasks.workunit.client.0.vm05.stdout:2/481: rmdir d0/d9/d1e/d20/d21 39 2026-03-10T08:55:43.369 INFO:tasks.workunit.client.0.vm05.stdout:2/482: mknod d0/d9/d1e/d20/d21/d45/d6c/d6e/d6d/c86 0 2026-03-10T08:55:43.371 INFO:tasks.workunit.client.0.vm05.stdout:2/483: symlink d0/d9/d1e/d20/d21/d45/d6c/d6e/d7e/l87 0 2026-03-10T08:55:43.372 INFO:tasks.workunit.client.0.vm05.stdout:2/484: creat d0/d9/d1e/d20/d21/d45/d6c/f88 x:0 0 0 2026-03-10T08:55:43.388 INFO:tasks.workunit.client.0.vm05.stdout:5/486: dread d5/f3b [0,4194304] 0 2026-03-10T08:55:43.389 INFO:tasks.workunit.client.0.vm05.stdout:5/487: creat d5/d86/d66/d76/fb5 x:0 0 0 2026-03-10T08:55:43.390 INFO:tasks.workunit.client.0.vm05.stdout:5/488: getdents d5/d3a/d43/d60 0 2026-03-10T08:55:43.391 INFO:tasks.workunit.client.0.vm05.stdout:5/489: mkdir d5/df/d37/d68/db6 0 2026-03-10T08:55:43.393 INFO:tasks.workunit.client.0.vm05.stdout:5/490: symlink d5/d86/d24/d2c/lb7 0 2026-03-10T08:55:43.394 INFO:tasks.workunit.client.0.vm05.stdout:5/491: mkdir 
d5/d86/d24/d84/db8 0 2026-03-10T08:55:43.441 INFO:tasks.workunit.client.0.vm05.stdout:7/509: truncate d18/d66/d25/d2e/d42/d53/f7e 79182 0 2026-03-10T08:55:43.448 INFO:tasks.workunit.client.1.vm08.stdout:5/701: dread d0/d11/d27/d68/d7c/f42 [0,4194304] 0 2026-03-10T08:55:43.448 INFO:tasks.workunit.client.0.vm05.stdout:7/510: sync 2026-03-10T08:55:43.449 INFO:tasks.workunit.client.0.vm05.stdout:7/511: stat d18/d1b/f69 0 2026-03-10T08:55:43.521 INFO:tasks.workunit.client.0.vm05.stdout:3/639: dwrite d9/d2b/d3a/d6c/f74 [0,4194304] 0 2026-03-10T08:55:43.532 INFO:tasks.workunit.client.0.vm05.stdout:3/640: symlink d9/d8f/d50/d5f/d7b/lb7 0 2026-03-10T08:55:43.533 INFO:tasks.workunit.client.0.vm05.stdout:3/641: dread - d9/d2b/d2f/d57/f77 zero size 2026-03-10T08:55:43.537 INFO:tasks.workunit.client.0.vm05.stdout:3/642: rmdir d9/d2b/d53 39 2026-03-10T08:55:43.540 INFO:tasks.workunit.client.1.vm08.stdout:0/692: write d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 [7328259,109853] 0 2026-03-10T08:55:43.540 INFO:tasks.workunit.client.0.vm05.stdout:3/643: chown d9/d2b/d53/fa7 188474812 1 2026-03-10T08:55:43.541 INFO:tasks.workunit.client.1.vm08.stdout:0/693: chown d6/l7b 99 1 2026-03-10T08:55:43.550 INFO:tasks.workunit.client.1.vm08.stdout:1/769: dwrite d1/da/d20/d3f/d49/d9c/fd1 [0,4194304] 0 2026-03-10T08:55:43.562 INFO:tasks.workunit.client.0.vm05.stdout:1/627: dwrite dd/f5e [0,4194304] 0 2026-03-10T08:55:43.569 INFO:tasks.workunit.client.0.vm05.stdout:1/628: fsync dd/d10/d18/d20/f34 0 2026-03-10T08:55:43.571 INFO:tasks.workunit.client.0.vm05.stdout:1/629: symlink dd/d21/d37/d7c/dc9/le3 0 2026-03-10T08:55:43.584 INFO:tasks.workunit.client.0.vm05.stdout:8/562: truncate d2/dd/d2c/d2e/d31/d3e/d5d/d9d/db3/fb6 764119 0 2026-03-10T08:55:43.589 INFO:tasks.workunit.client.1.vm08.stdout:2/822: dwrite d1/d5b/d66/f62 [0,4194304] 0 2026-03-10T08:55:43.601 INFO:tasks.workunit.client.0.vm05.stdout:8/563: rmdir d2/db/da4/dc7 0 2026-03-10T08:55:43.602 INFO:tasks.workunit.client.1.vm08.stdout:2/823: creat 
d1/d5b/dc5/f10f x:0 0 0 2026-03-10T08:55:43.605 INFO:tasks.workunit.client.0.vm05.stdout:8/564: dwrite d2/dd/d2c/d2e/d31/d4f/f9c [0,4194304] 0 2026-03-10T08:55:43.614 INFO:tasks.workunit.client.1.vm08.stdout:2/824: mknod d1/da/d10/c110 0 2026-03-10T08:55:43.614 INFO:tasks.workunit.client.0.vm05.stdout:8/565: symlink d2/dd/d2c/d2e/d31/d4c/d63/lcd 0 2026-03-10T08:55:43.616 INFO:tasks.workunit.client.1.vm08.stdout:2/825: read d1/da/d10/d1b/fac [70349,16166] 0 2026-03-10T08:55:43.625 INFO:tasks.workunit.client.1.vm08.stdout:2/826: mkdir d1/da/d10/d1b/d111 0 2026-03-10T08:55:43.634 INFO:tasks.workunit.client.1.vm08.stdout:2/827: creat d1/da/d10/d42/d93/d1e/dce/d52/f112 x:0 0 0 2026-03-10T08:55:43.652 INFO:tasks.workunit.client.1.vm08.stdout:2/828: dread d1/da/d10/d42/d93/f3b [0,4194304] 0 2026-03-10T08:55:43.681 INFO:tasks.workunit.client.1.vm08.stdout:3/694: dwrite d4/d15/fc [0,4194304] 0 2026-03-10T08:55:43.692 INFO:tasks.workunit.client.0.vm05.stdout:6/595: write d4/d7/d10/d1a/d89/f93 [1591840,13392] 0 2026-03-10T08:55:43.696 INFO:tasks.workunit.client.0.vm05.stdout:0/559: write df/d1f/d85/d2b/d27/f60 [4819408,25105] 0 2026-03-10T08:55:43.703 INFO:tasks.workunit.client.1.vm08.stdout:7/786: dwrite d0/d14/d43/f6e [0,4194304] 0 2026-03-10T08:55:43.714 INFO:tasks.workunit.client.1.vm08.stdout:8/794: write d1/d10/f23 [1466768,27031] 0 2026-03-10T08:55:43.715 INFO:tasks.workunit.client.1.vm08.stdout:8/795: symlink d1/d4f/l127 0 2026-03-10T08:55:43.720 INFO:tasks.workunit.client.0.vm05.stdout:6/596: dread d4/d7/f4d [0,4194304] 0 2026-03-10T08:55:43.730 INFO:tasks.workunit.client.1.vm08.stdout:9/724: dwrite d2/dd/d15/d1e/d39/d4e/f55 [0,4194304] 0 2026-03-10T08:55:43.736 INFO:tasks.workunit.client.1.vm08.stdout:9/725: stat d2/dd/d15/f44 0 2026-03-10T08:55:43.739 INFO:tasks.workunit.client.1.vm08.stdout:9/726: dread d2/f9f [0,4194304] 0 2026-03-10T08:55:43.739 INFO:tasks.workunit.client.1.vm08.stdout:8/796: dread d1/d10/d9/f73 [0,4194304] 0 2026-03-10T08:55:43.743 
INFO:tasks.workunit.client.1.vm08.stdout:8/797: mkdir d1/d10/d9/dd/d25/dca/d128 0 2026-03-10T08:55:43.746 INFO:tasks.workunit.client.0.vm05.stdout:9/530: getdents d6/d15/d3c/d4b/d90 0 2026-03-10T08:55:43.746 INFO:tasks.workunit.client.1.vm08.stdout:8/798: dread - d1/d10/d9/dd/d25/d27/d44/d21/d51/dd6/ffa zero size 2026-03-10T08:55:43.747 INFO:tasks.workunit.client.0.vm05.stdout:9/531: truncate d6/d12/d3a/d48/fa8 1128537 0 2026-03-10T08:55:43.748 INFO:tasks.workunit.client.0.vm05.stdout:9/532: truncate f4 1195287 0 2026-03-10T08:55:43.748 INFO:tasks.workunit.client.0.vm05.stdout:9/533: chown d6/d15/d3c 3322293 1 2026-03-10T08:55:43.757 INFO:tasks.workunit.client.1.vm08.stdout:4/809: dwrite d5/d23/d36/d99/db2/d5d/ffe [0,4194304] 0 2026-03-10T08:55:43.764 INFO:tasks.workunit.client.1.vm08.stdout:6/766: write d9/d50/fb8 [1308295,61470] 0 2026-03-10T08:55:43.778 INFO:tasks.workunit.client.0.vm05.stdout:4/559: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/f85 [0,4194304] 0 2026-03-10T08:55:43.780 INFO:tasks.workunit.client.0.vm05.stdout:2/485: dwrite d0/d9/d1e/d20/f71 [0,4194304] 0 2026-03-10T08:55:43.780 INFO:tasks.workunit.client.0.vm05.stdout:4/560: stat d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/f51 0 2026-03-10T08:55:43.790 INFO:tasks.workunit.client.1.vm08.stdout:6/767: truncate d9/d10/d1e/d32/fb2 4510648 0 2026-03-10T08:55:43.796 INFO:tasks.workunit.client.0.vm05.stdout:4/561: rmdir d0/d2e/d9d 39 2026-03-10T08:55:43.812 INFO:tasks.workunit.client.0.vm05.stdout:5/492: dwrite d5/d86/d24/f25 [4194304,4194304] 0 2026-03-10T08:55:43.813 INFO:tasks.workunit.client.0.vm05.stdout:5/493: chown d5/d86/d24/d2c/d41/f87 2466104 1 2026-03-10T08:55:43.815 INFO:tasks.workunit.client.1.vm08.stdout:6/768: creat d9/d10/d1e/d92/ffa x:0 0 0 2026-03-10T08:55:43.818 INFO:tasks.workunit.client.0.vm05.stdout:4/562: mknod d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/d5b/cb3 0 2026-03-10T08:55:43.828 INFO:tasks.workunit.client.0.vm05.stdout:9/534: dread d6/d12/d43/f47 [0,4194304] 0 
2026-03-10T08:55:43.853 INFO:tasks.workunit.client.0.vm05.stdout:4/563: rmdir d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41 39 2026-03-10T08:55:43.853 INFO:tasks.workunit.client.0.vm05.stdout:4/564: chown d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58 137 1 2026-03-10T08:55:43.853 INFO:tasks.workunit.client.0.vm05.stdout:5/494: link d5/d86/l5e d5/d86/d24/d2c/d41/d74/lb9 0 2026-03-10T08:55:43.853 INFO:tasks.workunit.client.0.vm05.stdout:4/565: fdatasync d0/d1d/f22 0 2026-03-10T08:55:43.853 INFO:tasks.workunit.client.0.vm05.stdout:4/566: unlink d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/c8f 0 2026-03-10T08:55:43.864 INFO:tasks.workunit.client.0.vm05.stdout:4/567: dread d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/f72 [0,4194304] 0 2026-03-10T08:55:43.875 INFO:tasks.workunit.client.0.vm05.stdout:9/535: dread d6/d19/d21/f7d [0,4194304] 0 2026-03-10T08:55:43.884 INFO:tasks.workunit.client.0.vm05.stdout:7/512: dwrite d18/d66/f2d [0,4194304] 0 2026-03-10T08:55:43.888 INFO:tasks.workunit.client.0.vm05.stdout:4/568: dread d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/f3e [0,4194304] 0 2026-03-10T08:55:43.903 INFO:tasks.workunit.client.0.vm05.stdout:7/513: rmdir d18/d38 39 2026-03-10T08:55:43.918 INFO:tasks.workunit.client.0.vm05.stdout:5/495: dread d5/d86/f1b [4194304,4194304] 0 2026-03-10T08:55:43.922 INFO:tasks.workunit.client.0.vm05.stdout:5/496: mknod d5/d3a/d43/cba 0 2026-03-10T08:55:43.928 INFO:tasks.workunit.client.0.vm05.stdout:3/644: dwrite d9/d2b/d2f/d57/f77 [0,4194304] 0 2026-03-10T08:55:43.932 INFO:tasks.workunit.client.0.vm05.stdout:3/645: truncate d9/d2b/d3a/d43/d6e/f9f 276475 0 2026-03-10T08:55:43.938 INFO:tasks.workunit.client.0.vm05.stdout:3/646: creat d9/d2b/d3a/d43/d71/d86/fb8 x:0 0 0 2026-03-10T08:55:43.941 INFO:tasks.workunit.client.0.vm05.stdout:3/647: unlink d9/d2b/f34 0 2026-03-10T08:55:43.943 INFO:tasks.workunit.client.1.vm08.stdout:0/694: write d6/dd/d13/d17/d1f/d2d/d39/f8a [172646,87034] 0 2026-03-10T08:55:43.959 INFO:tasks.workunit.client.1.vm08.stdout:0/695: rmdir 
d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/dd8 0 2026-03-10T08:55:43.975 INFO:tasks.workunit.client.0.vm05.stdout:9/536: sync 2026-03-10T08:55:43.975 INFO:tasks.workunit.client.0.vm05.stdout:3/648: sync 2026-03-10T08:55:43.977 INFO:tasks.workunit.client.1.vm08.stdout:2/829: dwrite d1/da/d78/fed [0,4194304] 0 2026-03-10T08:55:43.990 INFO:tasks.workunit.client.0.vm05.stdout:3/649: creat d9/d2b/d2f/fb9 x:0 0 0 2026-03-10T08:55:43.991 INFO:tasks.workunit.client.0.vm05.stdout:3/650: write d9/d8f/f8a [3348053,19609] 0 2026-03-10T08:55:43.995 INFO:tasks.workunit.client.0.vm05.stdout:3/651: readlink d9/d2b/l2e 0 2026-03-10T08:55:43.999 INFO:tasks.workunit.client.0.vm05.stdout:3/652: mkdir d9/d2b/d3a/d43/d6e/dba 0 2026-03-10T08:55:44.000 INFO:tasks.workunit.client.0.vm05.stdout:9/537: read d6/f16 [1144513,127607] 0 2026-03-10T08:55:44.006 INFO:tasks.workunit.client.1.vm08.stdout:2/830: getdents d1/d43/d5c 0 2026-03-10T08:55:44.017 INFO:tasks.workunit.client.1.vm08.stdout:3/695: dwrite f1 [0,4194304] 0 2026-03-10T08:55:44.020 INFO:tasks.workunit.client.0.vm05.stdout:3/653: sync 2026-03-10T08:55:44.028 INFO:tasks.workunit.client.1.vm08.stdout:3/696: write d4/d15/d8/d71/fce [905526,3242] 0 2026-03-10T08:55:44.037 INFO:tasks.workunit.client.0.vm05.stdout:3/654: mknod d9/d2b/d2f/d57/cbb 0 2026-03-10T08:55:44.040 INFO:tasks.workunit.client.1.vm08.stdout:2/831: getdents d1/da/d10/d42/d93/d1e 0 2026-03-10T08:55:44.050 INFO:tasks.workunit.client.1.vm08.stdout:2/832: dread d1/d5b/d66/f20 [0,4194304] 0 2026-03-10T08:55:44.057 INFO:tasks.workunit.client.1.vm08.stdout:8/799: write d1/d10/d9/d8a/f99 [1590477,4220] 0 2026-03-10T08:55:44.062 INFO:tasks.workunit.client.1.vm08.stdout:9/727: dwrite d2/d41/d4c/f80 [4194304,4194304] 0 2026-03-10T08:55:44.063 INFO:tasks.workunit.client.1.vm08.stdout:3/697: dread d4/d15/d8/d2c/d6d/fc3 [0,4194304] 0 2026-03-10T08:55:44.077 INFO:tasks.workunit.client.1.vm08.stdout:8/800: rmdir d1/d10/d9/dd/d25/d27/d44/d21/dce 39 2026-03-10T08:55:44.082 
INFO:tasks.workunit.client.1.vm08.stdout:8/801: dread d1/d10/d9/dd/d25/d27/d44/fa7 [0,4194304] 0 2026-03-10T08:55:44.087 INFO:tasks.workunit.client.1.vm08.stdout:8/802: creat d1/dd9/f129 x:0 0 0 2026-03-10T08:55:44.088 INFO:tasks.workunit.client.0.vm05.stdout:2/486: dwrite d0/d9/d1e/d20/d21/f3d [0,4194304] 0 2026-03-10T08:55:44.097 INFO:tasks.workunit.client.1.vm08.stdout:6/769: write d9/dc/d11/d23/d2c/d41/fd6 [885077,73190] 0 2026-03-10T08:55:44.105 INFO:tasks.workunit.client.0.vm05.stdout:2/487: mkdir d0/d9/d89 0 2026-03-10T08:55:44.124 INFO:tasks.workunit.client.0.vm05.stdout:2/488: mkdir d0/d9/d1e/d20/d21/d8a 0 2026-03-10T08:55:44.124 INFO:tasks.workunit.client.0.vm05.stdout:2/489: read - d0/d9/d1e/d20/d21/d45/d6c/d6e/f7d zero size 2026-03-10T08:55:44.128 INFO:tasks.workunit.client.0.vm05.stdout:2/490: truncate d0/d9/d1e/d20/d21/f31 4791039 0 2026-03-10T08:55:44.136 INFO:tasks.workunit.client.0.vm05.stdout:2/491: dwrite d0/d9/f19 [0,4194304] 0 2026-03-10T08:55:44.141 INFO:tasks.workunit.client.0.vm05.stdout:2/492: dread d0/d55/f60 [0,4194304] 0 2026-03-10T08:55:44.148 INFO:tasks.workunit.client.0.vm05.stdout:1/630: creat dd/d21/fe4 x:0 0 0 2026-03-10T08:55:44.148 INFO:tasks.workunit.client.0.vm05.stdout:1/631: fsync dd/d10/d19/d27/fc8 0 2026-03-10T08:55:44.149 INFO:tasks.workunit.client.0.vm05.stdout:1/632: readlink dd/d10/d18/d20/d69/la7 0 2026-03-10T08:55:44.149 INFO:tasks.workunit.client.0.vm05.stdout:1/633: stat dd/d13/l30 0 2026-03-10T08:55:44.150 INFO:tasks.workunit.client.0.vm05.stdout:1/634: chown dd/d10/d19/d4d/d7d 2751 1 2026-03-10T08:55:44.155 INFO:tasks.workunit.client.0.vm05.stdout:8/566: unlink d2/db/d1f/fb2 0 2026-03-10T08:55:44.156 INFO:tasks.workunit.client.0.vm05.stdout:8/567: chown d2/dd/d2c/d2e/d31/d4f/d7b/c8c 343910 1 2026-03-10T08:55:44.162 INFO:tasks.workunit.client.0.vm05.stdout:2/493: fsync d0/f2 0 2026-03-10T08:55:44.173 INFO:tasks.workunit.client.0.vm05.stdout:1/635: creat dd/fe5 x:0 0 0 2026-03-10T08:55:44.178 
INFO:tasks.workunit.client.0.vm05.stdout:1/636: mknod dd/d21/d37/d7c/dab/db7/ce6 0 2026-03-10T08:55:44.180 INFO:tasks.workunit.client.0.vm05.stdout:2/494: getdents d0/d9/d1e/d20/d21/d45/d4b 0 2026-03-10T08:55:44.195 INFO:tasks.workunit.client.0.vm05.stdout:1/637: dread dd/f1c [4194304,4194304] 0 2026-03-10T08:55:44.197 INFO:tasks.workunit.client.0.vm05.stdout:0/560: rename df/d1f/d85/d2b/d27/d32/d4e/f8c to df/d1f/d85/d19/d47/fa5 0 2026-03-10T08:55:44.201 INFO:tasks.workunit.client.1.vm08.stdout:5/702: creat d0/d11/d27/d68/d7c/fd6 x:0 0 0 2026-03-10T08:55:44.202 INFO:tasks.workunit.client.0.vm05.stdout:2/495: dread d0/d9/d1e/d20/d21/d45/d6c/d6e/f54 [0,4194304] 0 2026-03-10T08:55:44.207 INFO:tasks.workunit.client.0.vm05.stdout:4/569: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/f76 [0,4194304] 0 2026-03-10T08:55:44.210 INFO:tasks.workunit.client.0.vm05.stdout:6/597: rename d4/d7/d10/d15/d1b/fb8 to d4/d7/d10/d15/fc5 0 2026-03-10T08:55:44.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:43 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:44.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:43 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:44.214 INFO:tasks.workunit.client.0.vm05.stdout:5/497: dwrite d5/d86/d21/d89/f90 [0,4194304] 0 2026-03-10T08:55:44.216 INFO:tasks.workunit.client.0.vm05.stdout:5/498: chown d5/d86/d66/d76/fb5 23693491 1 2026-03-10T08:55:44.221 INFO:tasks.workunit.client.0.vm05.stdout:0/561: rmdir df/d1f/d85/d2b/d65/d6e 39 2026-03-10T08:55:44.224 INFO:tasks.workunit.client.1.vm08.stdout:5/703: dwrite d0/d11/d3e/d45/fad [0,4194304] 0 2026-03-10T08:55:44.226 INFO:tasks.workunit.client.1.vm08.stdout:1/770: rename d1/da/d18/d3b/lb4 to d1/da/d20/l10b 0 2026-03-10T08:55:44.232 INFO:tasks.workunit.client.1.vm08.stdout:5/704: creat d0/d11/d18/d52/fd7 x:0 0 0 2026-03-10T08:55:44.232 INFO:tasks.workunit.client.0.vm05.stdout:6/598: 
fsync d4/d7/ff 0 2026-03-10T08:55:44.238 INFO:tasks.workunit.client.1.vm08.stdout:3/698: rmdir d4/d15/d8/d2c/d9b/d79 39 2026-03-10T08:55:44.241 INFO:tasks.workunit.client.1.vm08.stdout:5/705: symlink d0/d11/d27/d68/d7c/ld8 0 2026-03-10T08:55:44.241 INFO:tasks.workunit.client.1.vm08.stdout:0/696: dwrite d6/dd/d13/d17/d1f/da3/fb0 [0,4194304] 0 2026-03-10T08:55:44.243 INFO:tasks.workunit.client.0.vm05.stdout:5/499: truncate d5/fd 1073854 0 2026-03-10T08:55:44.252 INFO:tasks.workunit.client.1.vm08.stdout:7/787: unlink d0/c35 0 2026-03-10T08:55:44.252 INFO:tasks.workunit.client.1.vm08.stdout:3/699: fdatasync d4/d15/d8/fa0 0 2026-03-10T08:55:44.253 INFO:tasks.workunit.client.0.vm05.stdout:0/562: dread - df/d1f/d85/d19/d39/f63 zero size 2026-03-10T08:55:44.253 INFO:tasks.workunit.client.0.vm05.stdout:6/599: fsync d4/d7/d10/d1a/d1f/f4b 0 2026-03-10T08:55:44.255 INFO:tasks.workunit.client.1.vm08.stdout:8/803: symlink d1/d10/l12a 0 2026-03-10T08:55:44.258 INFO:tasks.workunit.client.1.vm08.stdout:7/788: chown d0/d14/d43/c73 23 1 2026-03-10T08:55:44.261 INFO:tasks.workunit.client.1.vm08.stdout:3/700: fdatasync d4/d15/d8/d2c/d9b/f95 0 2026-03-10T08:55:44.262 INFO:tasks.workunit.client.1.vm08.stdout:3/701: write d4/d15/d8/d2c/d9b/f4d [4148499,62457] 0 2026-03-10T08:55:44.263 INFO:tasks.workunit.client.0.vm05.stdout:2/496: creat d0/d9/d1e/d20/f8b x:0 0 0 2026-03-10T08:55:44.264 INFO:tasks.workunit.client.0.vm05.stdout:9/538: creat d6/fb0 x:0 0 0 2026-03-10T08:55:44.266 INFO:tasks.workunit.client.1.vm08.stdout:2/833: write d1/da/d10/d42/d93/d23/f99 [58968,100208] 0 2026-03-10T08:55:44.267 INFO:tasks.workunit.client.1.vm08.stdout:4/810: rename d5/d23/d36/c88 to d5/d23/d49/c125 0 2026-03-10T08:55:44.267 INFO:tasks.workunit.client.1.vm08.stdout:5/706: unlink d0/d1b/f2f 0 2026-03-10T08:55:44.268 INFO:tasks.workunit.client.1.vm08.stdout:4/811: chown d5/d23/d36/d99/dc6/l117 2797 1 2026-03-10T08:55:44.268 INFO:tasks.workunit.client.0.vm05.stdout:7/514: rename d18/d66/d25/d2e/l73 to 
d18/l97 0 2026-03-10T08:55:44.270 INFO:tasks.workunit.client.0.vm05.stdout:7/515: write d18/d66/d25/d2e/f6f [265129,78931] 0 2026-03-10T08:55:44.273 INFO:tasks.workunit.client.0.vm05.stdout:7/516: dwrite d18/f95 [0,4194304] 0 2026-03-10T08:55:44.275 INFO:tasks.workunit.client.1.vm08.stdout:0/697: mkdir d6/dd/d13/d17/d1f/d2d/de3 0 2026-03-10T08:55:44.280 INFO:tasks.workunit.client.1.vm08.stdout:8/804: fdatasync d1/da8/f108 0 2026-03-10T08:55:44.280 INFO:tasks.workunit.client.0.vm05.stdout:6/600: mkdir d4/d2c/d84/db6/dc6 0 2026-03-10T08:55:44.281 INFO:tasks.workunit.client.1.vm08.stdout:8/805: dread - d1/d10/d9/d4d/db2/f103 zero size 2026-03-10T08:55:44.287 INFO:tasks.workunit.client.0.vm05.stdout:9/539: creat d6/d15/d3c/d4b/d90/fb1 x:0 0 0 2026-03-10T08:55:44.289 INFO:tasks.workunit.client.0.vm05.stdout:0/563: mknod df/d1f/d85/d2b/d65/d6e/ca6 0 2026-03-10T08:55:44.289 INFO:tasks.workunit.client.1.vm08.stdout:3/702: dread d4/d6f/dca/fcc [0,4194304] 0 2026-03-10T08:55:44.290 INFO:tasks.workunit.client.1.vm08.stdout:9/728: write d2/dd/d15/d1e/d39/d69/fda [1908743,120323] 0 2026-03-10T08:55:44.297 INFO:tasks.workunit.client.0.vm05.stdout:7/517: truncate d18/d38/f82 1524210 0 2026-03-10T08:55:44.309 INFO:tasks.workunit.client.1.vm08.stdout:7/789: creat d0/d11/d1f/df0/df4/ffc x:0 0 0 2026-03-10T08:55:44.310 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:43 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:44.310 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:43 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:44.310 INFO:tasks.workunit.client.0.vm05.stdout:0/564: mknod df/d1f/d85/d2b/d27/ca7 0 2026-03-10T08:55:44.310 INFO:tasks.workunit.client.0.vm05.stdout:3/655: rename d9/d2b/c41 to d9/d8f/d50/d5f/cbc 0 2026-03-10T08:55:44.310 INFO:tasks.workunit.client.0.vm05.stdout:7/518: symlink d18/d66/d79/l98 0 2026-03-10T08:55:44.310 
INFO:tasks.workunit.client.1.vm08.stdout:8/806: truncate d1/d10/d9/dd/d25/d27/d44/f22 2507364 0 2026-03-10T08:55:44.310 INFO:tasks.workunit.client.1.vm08.stdout:9/729: fdatasync d2/dd/d15/d1e/d25/d32/d5c/dc2/fcb 0 2026-03-10T08:55:44.310 INFO:tasks.workunit.client.1.vm08.stdout:6/770: dwrite d9/d10/d1e/fba [0,4194304] 0 2026-03-10T08:55:44.316 INFO:tasks.workunit.client.1.vm08.stdout:6/771: dwrite d9/d10/d1e/f91 [0,4194304] 0 2026-03-10T08:55:44.322 INFO:tasks.workunit.client.0.vm05.stdout:3/656: symlink d9/d4d/d51/lbd 0 2026-03-10T08:55:44.322 INFO:tasks.workunit.client.1.vm08.stdout:2/834: creat d1/da/d10/d42/d93/de2/f113 x:0 0 0 2026-03-10T08:55:44.323 INFO:tasks.workunit.client.1.vm08.stdout:2/835: dread - d1/d43/d5c/de7/f107 zero size 2026-03-10T08:55:44.325 INFO:tasks.workunit.client.0.vm05.stdout:3/657: dwrite d9/d4d/d51/d64/d89/fa1 [0,4194304] 0 2026-03-10T08:55:44.332 INFO:tasks.workunit.client.1.vm08.stdout:0/698: truncate d6/dd/d13/d17/fb4 1134044 0 2026-03-10T08:55:44.334 INFO:tasks.workunit.client.0.vm05.stdout:8/568: rename d2/dd/d2c/d2e/d31/d3e/d5d/d9d/ca9 to d2/d45/cce 0 2026-03-10T08:55:44.335 INFO:tasks.workunit.client.0.vm05.stdout:1/638: write dd/d10/d19/d4d/d88/fbe [574420,67341] 0 2026-03-10T08:55:44.341 INFO:tasks.workunit.client.1.vm08.stdout:8/807: stat d1/d10/d9/dd/d25/c42 0 2026-03-10T08:55:44.345 INFO:tasks.workunit.client.0.vm05.stdout:7/519: mknod d18/d38/c99 0 2026-03-10T08:55:44.353 INFO:tasks.workunit.client.1.vm08.stdout:9/730: mknod d2/dd/d15/d1e/d25/d32/d5c/dc2/cf7 0 2026-03-10T08:55:44.356 INFO:tasks.workunit.client.0.vm05.stdout:8/569: unlink d2/db/d1f/d67/fc5 0 2026-03-10T08:55:44.360 INFO:tasks.workunit.client.0.vm05.stdout:7/520: unlink ce 0 2026-03-10T08:55:44.365 INFO:tasks.workunit.client.1.vm08.stdout:1/771: dwrite d1/da/de/d24/d35/d6d/d82/f7b [0,4194304] 0 2026-03-10T08:55:44.367 INFO:tasks.workunit.client.1.vm08.stdout:0/699: creat d6/dd/d13/d17/d1f/d2d/d39/fe4 x:0 0 0 2026-03-10T08:55:44.368 
INFO:tasks.workunit.client.0.vm05.stdout:2/497: dwrite d0/d9/d1e/d20/f3a [0,4194304] 0 2026-03-10T08:55:44.370 INFO:tasks.workunit.client.0.vm05.stdout:2/498: stat d0/d9/l42 0 2026-03-10T08:55:44.377 INFO:tasks.workunit.client.0.vm05.stdout:4/570: rename d0/d2e/d42/d45/d4a/d36/d37/d9c/f29 to d0/d2e/d71/d7c/fb4 0 2026-03-10T08:55:44.393 INFO:tasks.workunit.client.1.vm08.stdout:8/808: symlink d1/d10/d9/dd/d18/dff/l12b 0 2026-03-10T08:55:44.396 INFO:tasks.workunit.client.1.vm08.stdout:5/707: dwrite d0/d11/f1e [0,4194304] 0 2026-03-10T08:55:44.396 INFO:tasks.workunit.client.1.vm08.stdout:9/731: mknod d2/dd/d15/d1e/d39/d4e/cf8 0 2026-03-10T08:55:44.398 INFO:tasks.workunit.client.0.vm05.stdout:9/540: dwrite d6/d15/d3c/f6b [0,4194304] 0 2026-03-10T08:55:44.398 INFO:tasks.workunit.client.0.vm05.stdout:7/521: creat d18/d38/d43/d6e/f9a x:0 0 0 2026-03-10T08:55:44.410 INFO:tasks.workunit.client.1.vm08.stdout:0/700: mknod d6/dd/d13/d61/d6f/ce5 0 2026-03-10T08:55:44.437 INFO:tasks.workunit.client.1.vm08.stdout:6/772: creat d9/dc/d11/d23/d2c/d7a/dce/d69/ffb x:0 0 0 2026-03-10T08:55:44.437 INFO:tasks.workunit.client.1.vm08.stdout:5/708: truncate d0/d11/d18/d52/f91 844432 0 2026-03-10T08:55:44.437 INFO:tasks.workunit.client.1.vm08.stdout:2/836: creat d1/da/d10/d1b/d6a/f114 x:0 0 0 2026-03-10T08:55:44.437 INFO:tasks.workunit.client.1.vm08.stdout:3/703: link d4/d15/d8/d2c/d55/fc7 d4/d15/d8/d2c/d9b/d79/ff2 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:8/570: link d2/db/d47/f51 d2/dd/d74/d78/fcf 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:7/522: dread - d18/d66/d25/f56 zero size 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:6/601: rename d4/d2c/d84/d4a/c68 to d4/d7/d10/d1a/db1/cc7 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:9/541: mkdir d6/d12/db2 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:2/499: creat d0/d9/d1e/d20/d21/d45/f8c x:0 0 0 2026-03-10T08:55:44.438 
INFO:tasks.workunit.client.0.vm05.stdout:2/500: readlink d0/l1a 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:7/523: rename fd to d18/d38/d43/d6e/f9b 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:7/524: chown d18/d66/d25/d2e/d42/f52 122 1 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:8/571: getdents d2/dd/d2c/da5 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:8/572: readlink d2/dd/d2c/l65 0 2026-03-10T08:55:44.438 INFO:tasks.workunit.client.0.vm05.stdout:9/542: mkdir d6/d19/d2c/db3 0 2026-03-10T08:55:44.439 INFO:tasks.workunit.client.0.vm05.stdout:2/501: mkdir d0/d9/d1e/d20/d21/d45/d4b/d8d 0 2026-03-10T08:55:44.440 INFO:tasks.workunit.client.1.vm08.stdout:6/773: dwrite d9/d13/f70 [0,4194304] 0 2026-03-10T08:55:44.441 INFO:tasks.workunit.client.1.vm08.stdout:2/837: rmdir d1/d43/d5c/de7 39 2026-03-10T08:55:44.442 INFO:tasks.workunit.client.0.vm05.stdout:7/525: mkdir d18/d66/d25/d2e/d42/d9c 0 2026-03-10T08:55:44.442 INFO:tasks.workunit.client.0.vm05.stdout:7/526: stat d18/d66/d25/d2e/d42/d53/c64 0 2026-03-10T08:55:44.443 INFO:tasks.workunit.client.0.vm05.stdout:8/573: read - d2/dd/d2c/d2e/d31/d3e/f95 zero size 2026-03-10T08:55:44.446 INFO:tasks.workunit.client.1.vm08.stdout:9/732: creat d2/dd/d15/d1e/ff9 x:0 0 0 2026-03-10T08:55:44.455 INFO:tasks.workunit.client.0.vm05.stdout:5/500: dread d5/df/f2f [0,4194304] 0 2026-03-10T08:55:44.461 INFO:tasks.workunit.client.1.vm08.stdout:0/701: rename d6/dd/d13/d17/d1f/d2d/d38/fdd to d6/dd/d13/fe6 0 2026-03-10T08:55:44.471 INFO:tasks.workunit.client.0.vm05.stdout:7/527: symlink d18/d66/d25/l9d 0 2026-03-10T08:55:44.471 INFO:tasks.workunit.client.1.vm08.stdout:0/702: readlink d6/dd/d13/d17/d1f/d20/d2f/d57/l63 0 2026-03-10T08:55:44.471 INFO:tasks.workunit.client.1.vm08.stdout:0/703: dread d6/dd/d13/d17/d1f/d20/f43 [4194304,4194304] 0 2026-03-10T08:55:44.471 INFO:tasks.workunit.client.1.vm08.stdout:0/704: stat d6/dd/d13/d17/d1f/d20/f3e 0 
2026-03-10T08:55:44.471 INFO:tasks.workunit.client.1.vm08.stdout:0/705: creat d6/dd/d13/d17/d1f/d2d/d38/fe7 x:0 0 0 2026-03-10T08:55:44.473 INFO:tasks.workunit.client.1.vm08.stdout:0/706: rename d6/dd/d13/d17/d1f/d20/d2f/d24/dc2 to d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/de8 22 2026-03-10T08:55:44.474 INFO:tasks.workunit.client.0.vm05.stdout:0/565: write df/f12 [2323904,54132] 0 2026-03-10T08:55:44.479 INFO:tasks.workunit.client.0.vm05.stdout:0/566: mknod df/d1f/d85/d9e/ca8 0 2026-03-10T08:55:44.483 INFO:tasks.workunit.client.1.vm08.stdout:3/704: dread d4/d15/d8/d2c/d9b/d79/f34 [0,4194304] 0 2026-03-10T08:55:44.484 INFO:tasks.workunit.client.1.vm08.stdout:3/705: chown d4/d15/d8/d1d/da8/fe7 952825586 1 2026-03-10T08:55:44.484 INFO:tasks.workunit.client.1.vm08.stdout:3/706: fdatasync d4/d15/d8/d1d/fe6 0 2026-03-10T08:55:44.488 INFO:tasks.workunit.client.1.vm08.stdout:3/707: symlink d4/d15/d8/d2c/d9b/d79/d20/lf3 0 2026-03-10T08:55:44.492 INFO:tasks.workunit.client.1.vm08.stdout:3/708: creat d4/d15/d8/d1d/da8/ff4 x:0 0 0 2026-03-10T08:55:44.492 INFO:tasks.workunit.client.1.vm08.stdout:6/774: dread d9/d13/d4e/fa8 [0,4194304] 0 2026-03-10T08:55:44.493 INFO:tasks.workunit.client.1.vm08.stdout:6/775: chown d9/d13/l28 6419 1 2026-03-10T08:55:44.494 INFO:tasks.workunit.client.1.vm08.stdout:3/709: chown d4/d15/d8/d2c/d55/cb7 214353236 1 2026-03-10T08:55:44.500 INFO:tasks.workunit.client.1.vm08.stdout:6/776: mkdir d9/d50/de9/dea/dfc 0 2026-03-10T08:55:44.519 INFO:tasks.workunit.client.1.vm08.stdout:2/838: read d1/da/d10/d42/d93/d1e/dce/f74 [116899,27523] 0 2026-03-10T08:55:44.543 INFO:tasks.workunit.client.0.vm05.stdout:7/528: dread d18/f1d [0,4194304] 0 2026-03-10T08:55:44.555 INFO:tasks.workunit.client.1.vm08.stdout:3/710: dread d4/d15/d8/d2c/f3d [4194304,4194304] 0 2026-03-10T08:55:44.556 INFO:tasks.workunit.client.1.vm08.stdout:3/711: symlink d4/d15/d8/d2c/d6d/lf5 0 2026-03-10T08:55:44.559 INFO:tasks.workunit.client.1.vm08.stdout:3/712: dread d4/d6f/d85/dd3/fdb [0,4194304] 0 
2026-03-10T08:55:44.562 INFO:tasks.workunit.client.0.vm05.stdout:4/571: sync 2026-03-10T08:55:44.562 INFO:tasks.workunit.client.0.vm05.stdout:5/501: sync 2026-03-10T08:55:44.562 INFO:tasks.workunit.client.0.vm05.stdout:0/567: sync 2026-03-10T08:55:44.562 INFO:tasks.workunit.client.1.vm08.stdout:3/713: truncate d4/d15/fe3 520501 0 2026-03-10T08:55:44.562 INFO:tasks.workunit.client.1.vm08.stdout:2/839: sync 2026-03-10T08:55:44.563 INFO:tasks.workunit.client.1.vm08.stdout:3/714: chown d4/d15/d8/d71/f8e 4 1 2026-03-10T08:55:44.566 INFO:tasks.workunit.client.0.vm05.stdout:4/572: symlink d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/lb5 0 2026-03-10T08:55:44.567 INFO:tasks.workunit.client.0.vm05.stdout:0/568: creat df/d1f/d85/d19/d55/fa9 x:0 0 0 2026-03-10T08:55:44.570 INFO:tasks.workunit.client.1.vm08.stdout:2/840: dwrite d1/da/d10/d2d/fb7 [0,4194304] 0 2026-03-10T08:55:44.570 INFO:tasks.workunit.client.0.vm05.stdout:5/502: dread d5/d86/f59 [0,4194304] 0 2026-03-10T08:55:44.571 INFO:tasks.workunit.client.0.vm05.stdout:5/503: chown d5/f40 64621442 1 2026-03-10T08:55:44.571 INFO:tasks.workunit.client.0.vm05.stdout:5/504: stat d5/d86/d24/d2c/d41/c58 0 2026-03-10T08:55:44.576 INFO:tasks.workunit.client.1.vm08.stdout:2/841: link d1/da/d10/d42/d93/d1e/fb2 d1/da/d10/d42/d93/d1e/d7b/f115 0 2026-03-10T08:55:44.583 INFO:tasks.workunit.client.0.vm05.stdout:5/505: read d5/d48/f69 [349988,76731] 0 2026-03-10T08:55:44.583 INFO:tasks.workunit.client.0.vm05.stdout:4/573: sync 2026-03-10T08:55:44.585 INFO:tasks.workunit.client.0.vm05.stdout:5/506: rename d5/d3a to d5/df/dbb 0 2026-03-10T08:55:44.586 INFO:tasks.workunit.client.0.vm05.stdout:5/507: write d5/d86/d66/fa5 [156956,101902] 0 2026-03-10T08:55:44.586 INFO:tasks.workunit.client.0.vm05.stdout:5/508: write d5/d86/d21/f5a [249995,51031] 0 2026-03-10T08:55:44.590 INFO:tasks.workunit.client.0.vm05.stdout:4/574: creat d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/fb6 x:0 0 0 2026-03-10T08:55:44.590 
INFO:tasks.workunit.client.1.vm08.stdout:3/715: dread d4/d15/d8/d71/fce [0,4194304] 0 2026-03-10T08:55:44.591 INFO:tasks.workunit.client.0.vm05.stdout:4/575: fdatasync d0/d2e/d42/d45/d4a/d36/d37/f97 0 2026-03-10T08:55:44.591 INFO:tasks.workunit.client.0.vm05.stdout:4/576: chown d0/d2e/d42/d45/d4a/c4c 25200 1 2026-03-10T08:55:44.592 INFO:tasks.workunit.client.0.vm05.stdout:4/577: write d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/f76 [2736516,21655] 0 2026-03-10T08:55:44.598 INFO:tasks.workunit.client.1.vm08.stdout:3/716: mknod d4/d15/d8/d2c/d55/cf6 0 2026-03-10T08:55:44.605 INFO:tasks.workunit.client.0.vm05.stdout:4/578: rename d0/d2e/d42/d45/d4a/d36/d37/d9c/f61 to d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/fb7 0 2026-03-10T08:55:44.606 INFO:tasks.workunit.client.1.vm08.stdout:7/790: write d0/d51/f78 [111312,3253] 0 2026-03-10T08:55:44.609 INFO:tasks.workunit.client.0.vm05.stdout:4/579: rmdir d0/d2e/d9d 39 2026-03-10T08:55:44.612 INFO:tasks.workunit.client.1.vm08.stdout:7/791: chown d0/d11/d1f/d29 97139473 1 2026-03-10T08:55:44.612 INFO:tasks.workunit.client.0.vm05.stdout:4/580: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/f85 [0,4194304] 0 2026-03-10T08:55:44.613 INFO:tasks.workunit.client.1.vm08.stdout:3/717: sync 2026-03-10T08:55:44.614 INFO:tasks.workunit.client.1.vm08.stdout:2/842: dread d1/da/fc3 [0,4194304] 0 2026-03-10T08:55:44.615 INFO:tasks.workunit.client.1.vm08.stdout:4/812: dwrite d5/d23/d49/d8f/da4/fe5 [0,4194304] 0 2026-03-10T08:55:44.617 INFO:tasks.workunit.client.0.vm05.stdout:4/581: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/fb6 [0,4194304] 0 2026-03-10T08:55:44.622 INFO:tasks.workunit.client.1.vm08.stdout:3/718: dread d4/d15/d8/d71/fce [0,4194304] 0 2026-03-10T08:55:44.628 INFO:tasks.workunit.client.0.vm05.stdout:3/658: write d9/d4d/f52 [1334339,13717] 0 2026-03-10T08:55:44.631 INFO:tasks.workunit.client.0.vm05.stdout:1/639: dwrite dd/d21/f6f [0,4194304] 0 2026-03-10T08:55:44.639 INFO:tasks.workunit.client.0.vm05.stdout:8/574: getdents 
d2/dd/d74/d78 0 2026-03-10T08:55:44.652 INFO:tasks.workunit.client.1.vm08.stdout:1/772: dwrite d1/da/de/d24/d35/d6d/fc8 [0,4194304] 0 2026-03-10T08:55:44.657 INFO:tasks.workunit.client.1.vm08.stdout:7/792: mkdir d0/d14/d43/d9d/dfd 0 2026-03-10T08:55:44.668 INFO:tasks.workunit.client.0.vm05.stdout:8/575: dread d2/db/f19 [0,4194304] 0 2026-03-10T08:55:44.673 INFO:tasks.workunit.client.1.vm08.stdout:1/773: dread d1/da/de/d24/d3d/d40/ffe [0,4194304] 0 2026-03-10T08:55:44.684 INFO:tasks.workunit.client.1.vm08.stdout:4/813: rename d5/d23/d36/d99/db2/fda to d5/d23/d36/d99/dc6/dc8/f126 0 2026-03-10T08:55:44.686 INFO:tasks.workunit.client.0.vm05.stdout:3/659: mkdir d9/d2b/d3a/d6c/dbe 0 2026-03-10T08:55:44.687 INFO:tasks.workunit.client.0.vm05.stdout:3/660: write d9/d2b/d3a/d43/d6e/f9f [486981,8475] 0 2026-03-10T08:55:44.688 INFO:tasks.workunit.client.0.vm05.stdout:3/661: write d9/d4d/f5e [1477161,112387] 0 2026-03-10T08:55:44.690 INFO:tasks.workunit.client.1.vm08.stdout:2/843: truncate d1/da/d10/d42/d93/d23/f70 1003072 0 2026-03-10T08:55:44.696 INFO:tasks.workunit.client.0.vm05.stdout:6/602: dwrite d4/d7/ff [4194304,4194304] 0 2026-03-10T08:55:44.696 INFO:tasks.workunit.client.0.vm05.stdout:6/603: fdatasync d4/d7/f34 0 2026-03-10T08:55:44.696 INFO:tasks.workunit.client.0.vm05.stdout:6/604: fdatasync d4/d7/d10/d15/d1b/d22/fa4 0 2026-03-10T08:55:44.696 INFO:tasks.workunit.client.0.vm05.stdout:9/543: write d6/d12/d3a/f5e [1195196,34918] 0 2026-03-10T08:55:44.699 INFO:tasks.workunit.client.0.vm05.stdout:8/576: rename d2/dd/d2c/d2e/d31/d3e/d5d/d9d/db3 to d2/dd/d2c/d2e/d31/d4f/d80/dd0 0 2026-03-10T08:55:44.699 INFO:tasks.workunit.client.0.vm05.stdout:2/502: dwrite d0/d9/d1e/d20/d21/f77 [0,4194304] 0 2026-03-10T08:55:44.702 INFO:tasks.workunit.client.0.vm05.stdout:2/503: stat d0/d9/d1e/d20/d21/c6f 0 2026-03-10T08:55:44.707 INFO:tasks.workunit.client.0.vm05.stdout:4/582: mknod d0/cb8 0 2026-03-10T08:55:44.708 INFO:tasks.workunit.client.1.vm08.stdout:5/709: dwrite d0/d1b/f77 
[4194304,4194304] 0
2026-03-10T08:55:44.711 INFO:tasks.workunit.client.1.vm08.stdout:8/809: dwrite d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10f [0,4194304] 0
2026-03-10T08:55:44.722 INFO:tasks.workunit.client.0.vm05.stdout:7/529: dwrite d18/d1b/f94 [0,4194304] 0
2026-03-10T08:55:44.725 INFO:tasks.workunit.client.1.vm08.stdout:9/733: dwrite d2/d41/d53/f6d [0,4194304] 0
2026-03-10T08:55:44.725 INFO:tasks.workunit.client.1.vm08.stdout:0/707: dwrite d6/dd/d13/d61/d6f/faf [0,4194304] 0
2026-03-10T08:55:44.726 INFO:tasks.workunit.client.0.vm05.stdout:1/640: fdatasync dd/d10/d18/f82 0
2026-03-10T08:55:44.727 INFO:tasks.workunit.client.0.vm05.stdout:1/641: stat dd/d10/d19/d4d/d88/fbe 0
2026-03-10T08:55:44.727 INFO:tasks.workunit.client.0.vm05.stdout:4/583: sync
2026-03-10T08:55:44.735 INFO:tasks.workunit.client.1.vm08.stdout:7/793: fsync d0/d11/d1f/d29/d36/d75/fef 0
2026-03-10T08:55:44.739 INFO:tasks.workunit.client.1.vm08.stdout:6/777: dwrite d9/f77 [0,4194304] 0
2026-03-10T08:55:44.746 INFO:tasks.workunit.client.1.vm08.stdout:4/814: mknod d5/d23/d49/d83/c127 0
2026-03-10T08:55:44.754 INFO:tasks.workunit.client.0.vm05.stdout:0/569: dwrite df/f1a [4194304,4194304] 0
2026-03-10T08:55:44.755 INFO:tasks.workunit.client.0.vm05.stdout:0/570: chown df/d1f/d85/d2b/d65/d6e/d96/f8b 4 1
2026-03-10T08:55:44.756 INFO:tasks.workunit.client.1.vm08.stdout:5/710: creat d0/d1b/d67/d80/fd9 x:0 0 0
2026-03-10T08:55:44.757 INFO:tasks.workunit.client.1.vm08.stdout:5/711: dread - d0/d11/d27/d68/d7c/fd6 zero size
2026-03-10T08:55:44.763 INFO:tasks.workunit.client.1.vm08.stdout:8/810: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/f12c x:0 0 0
2026-03-10T08:55:44.763 INFO:tasks.workunit.client.0.vm05.stdout:5/509: truncate d5/d86/d21/f5a 193989 0
2026-03-10T08:55:44.769 INFO:tasks.workunit.client.1.vm08.stdout:0/708: creat d6/dd/d13/d8f/fe9 x:0 0 0
2026-03-10T08:55:44.769 INFO:tasks.workunit.client.0.vm05.stdout:8/577: fsync d2/db/f22 0
2026-03-10T08:55:44.774 INFO:tasks.workunit.client.1.vm08.stdout:9/734: creat d2/d41/ffa x:0 0 0
2026-03-10T08:55:44.776 INFO:tasks.workunit.client.0.vm05.stdout:2/504: unlink d0/d9/d1e/d20/d21/f44 0
2026-03-10T08:55:44.785 INFO:tasks.workunit.client.0.vm05.stdout:1/642: symlink dd/d21/d37/d7c/dc9/le7 0
2026-03-10T08:55:44.786 INFO:tasks.workunit.client.1.vm08.stdout:4/815: fsync d5/d23/d49/f101 0
2026-03-10T08:55:44.787 INFO:tasks.workunit.client.1.vm08.stdout:2/844: symlink d1/d5b/d66/df6/l116 0
2026-03-10T08:55:44.789 INFO:tasks.workunit.client.0.vm05.stdout:3/662: rename d9/d2b/d3a/d43/d7a to d9/d2b/d3a/d6c/dbf 0
2026-03-10T08:55:44.791 INFO:tasks.workunit.client.1.vm08.stdout:3/719: dwrite d4/d15/d8/d2c/d9b/d79/d20/f99 [0,4194304] 0
2026-03-10T08:55:44.795 INFO:tasks.workunit.client.1.vm08.stdout:2/845: truncate d1/da/d10/d42/d93/d22/f104 751474 0
2026-03-10T08:55:44.796 INFO:tasks.workunit.client.1.vm08.stdout:0/709: creat d6/dd/d13/d17/d1f/d2d/d85/d93/fea x:0 0 0
2026-03-10T08:55:44.799 INFO:tasks.workunit.client.1.vm08.stdout:0/710: readlink d6/dd/d13/d17/d1f/d20/d2f/l4d 0
2026-03-10T08:55:44.799 INFO:tasks.workunit.client.0.vm05.stdout:0/571: dread df/d1f/f2d [0,4194304] 0
2026-03-10T08:55:44.800 INFO:tasks.workunit.client.1.vm08.stdout:3/720: dwrite f1 [0,4194304] 0
2026-03-10T08:55:44.805 INFO:tasks.workunit.client.0.vm05.stdout:2/505: dread d0/d9/d1e/d20/f22 [0,4194304] 0
2026-03-10T08:55:44.810 INFO:tasks.workunit.client.1.vm08.stdout:1/774: rename d1/da/de/d24/d3d/d40/d56/d6b to d1/da/de/d24/d3d/d10c 0
2026-03-10T08:55:44.816 INFO:tasks.workunit.client.1.vm08.stdout:5/712: symlink d0/d11/d18/lda 0
2026-03-10T08:55:44.838 INFO:tasks.workunit.client.1.vm08.stdout:2/846: dread - d1/da/d10/d42/d93/d1e/dce/fd6 zero size
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.1.vm08.stdout:6/778: rename d9/fa to d9/dc/d11/d23/d2c/d7a/dce/ffd 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.1.vm08.stdout:3/721: mknod d4/d15/d8/d2c/d9b/cf7 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.1.vm08.stdout:6/779: rename d9/d13/d4e/ff5 to d9/d10/dd0/ffe 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.1.vm08.stdout:7/794: link d0/d11/d1f/d29/d3d/d40/c63 d0/d11/d1f/d29/cfe 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:1/643: dwrite dd/d10/d18/d2d/fe0 [4194304,4194304] 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:1/644: fdatasync dd/f44 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:1/645: write dd/fe5 [414132,72676] 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:0/572: fsync df/d1f/d85/d19/d39/f63 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:7/530: creat d18/d66/d25/d2e/f9e x:0 0 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:4/584: link d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/l56 d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/lb9 0
2026-03-10T08:55:44.839 INFO:tasks.workunit.client.0.vm05.stdout:1/646: mknod dd/d10/d19/d27/ce8 0
2026-03-10T08:55:44.852 INFO:tasks.workunit.client.1.vm08.stdout:0/711: sync
2026-03-10T08:55:44.855 INFO:tasks.workunit.client.1.vm08.stdout:3/722: creat d4/d6f/dca/ff8 x:0 0 0
2026-03-10T08:55:44.856 INFO:tasks.workunit.client.1.vm08.stdout:3/723: stat d4/d15/d8/d2c/d9b/d79/fef 0
2026-03-10T08:55:44.857 INFO:tasks.workunit.client.0.vm05.stdout:0/573: mknod df/d1f/d85/caa 0
2026-03-10T08:55:44.857 INFO:tasks.workunit.client.0.vm05.stdout:7/531: fdatasync d18/d66/d25/d2e/d2f/f59 0
2026-03-10T08:55:44.858 INFO:tasks.workunit.client.0.vm05.stdout:7/532: chown d18/d1b/f69 13 1
2026-03-10T08:55:44.862 INFO:tasks.workunit.client.0.vm05.stdout:9/544: rename d6/d19/f5c to d6/d15/fb4 0
2026-03-10T08:55:44.862 INFO:tasks.workunit.client.1.vm08.stdout:9/735: dread d2/dd/d15/f17 [0,4194304] 0
2026-03-10T08:55:44.863 INFO:tasks.workunit.client.0.vm05.stdout:9/545: write d6/d12/d3a/d48/fa8 [1534458,35534] 0
2026-03-10T08:55:44.864 INFO:tasks.workunit.client.0.vm05.stdout:8/578: getdents d2/dd 0
2026-03-10T08:55:44.864 INFO:tasks.workunit.client.1.vm08.stdout:2/847: sync
2026-03-10T08:55:44.865 INFO:tasks.workunit.client.0.vm05.stdout:8/579: write d2/db/f9a [1013155,92142] 0
2026-03-10T08:55:44.866 INFO:tasks.workunit.client.1.vm08.stdout:4/816: link d5/d23/d36/d99/db2/d5a/d69/d11b/d96/cd8 d5/d23/d36/d99/db2/d5a/d69/d11b/c128 0
2026-03-10T08:55:44.872 INFO:tasks.workunit.client.0.vm05.stdout:1/647: creat dd/d21/d37/d7c/d60/fe9 x:0 0 0
2026-03-10T08:55:44.872 INFO:tasks.workunit.client.1.vm08.stdout:3/724: rmdir d4/d15/d8/d2c/d9b/d79 39
2026-03-10T08:55:44.872 INFO:tasks.workunit.client.1.vm08.stdout:7/795: dread d0/d11/d4a/f4f [0,4194304] 0
2026-03-10T08:55:44.872 INFO:tasks.workunit.client.0.vm05.stdout:7/533: mknod d18/d38/d43/d5c/c9f 0
2026-03-10T08:55:44.874 INFO:tasks.workunit.client.1.vm08.stdout:0/712: truncate d6/dd/d13/d17/d1f/d2d/d85/d95/fd6 1707256 0
2026-03-10T08:55:44.875 INFO:tasks.workunit.client.1.vm08.stdout:2/848: creat d1/da/d10/d42/d93/d1e/f117 x:0 0 0
2026-03-10T08:55:44.886 INFO:tasks.workunit.client.1.vm08.stdout:3/725: mkdir d4/d6f/dca/df9 0
2026-03-10T08:55:44.887 INFO:tasks.workunit.client.0.vm05.stdout:8/580: dread d2/dd/d2c/d2e/f7d [0,4194304] 0
2026-03-10T08:55:44.887 INFO:tasks.workunit.client.0.vm05.stdout:8/581: chown d2/dd/d2c/d2e/c39 1538 1
2026-03-10T08:55:44.888 INFO:tasks.workunit.client.1.vm08.stdout:9/736: truncate d2/dd/d15/d1e/d94/fd7 870280 0
2026-03-10T08:55:44.889 INFO:tasks.workunit.client.1.vm08.stdout:0/713: mknod d6/dd/d13/d17/d1f/d2d/d38/d98/ceb 0
2026-03-10T08:55:44.895 INFO:tasks.workunit.client.1.vm08.stdout:7/796: mkdir d0/d11/d4a/d95/dc5/dff 0
2026-03-10T08:55:44.896 INFO:tasks.workunit.client.1.vm08.stdout:3/726: mkdir d4/d15/d8/d2c/d6d/dfa 0
2026-03-10T08:55:44.896 INFO:tasks.workunit.client.1.vm08.stdout:9/737: truncate f1 457147 0
2026-03-10T08:55:44.898 INFO:tasks.workunit.client.1.vm08.stdout:4/817: creat d5/d23/d36/d99/db2/d5d/f129 x:0 0 0
2026-03-10T08:55:44.898 INFO:tasks.workunit.client.1.vm08.stdout:9/738: read - d2/dd/d15/d1e/d39/d4e/ff4 zero size
2026-03-10T08:55:44.899 INFO:tasks.workunit.client.1.vm08.stdout:7/797: mkdir d0/d11/d4a/d95/dc5/d100 0
2026-03-10T08:55:44.904 INFO:tasks.workunit.client.1.vm08.stdout:3/727: dread d4/d15/fda [0,4194304] 0
2026-03-10T08:55:44.904 INFO:tasks.workunit.client.1.vm08.stdout:2/849: rename d1/da/d10/d1b/c53 to d1/da/d10/c118 0
2026-03-10T08:55:44.907 INFO:tasks.workunit.client.1.vm08.stdout:9/739: dwrite d2/d41/ffa [0,4194304] 0
2026-03-10T08:55:44.910 INFO:tasks.workunit.client.0.vm05.stdout:1/648: mknod dd/d10/d18/d2d/d51/d58/d71/cea 0
2026-03-10T08:55:44.910 INFO:tasks.workunit.client.0.vm05.stdout:1/649: chown dd/d21/d37/d45 82373100 1
2026-03-10T08:55:44.916 INFO:tasks.workunit.client.1.vm08.stdout:0/714: dread d6/f5f [0,4194304] 0
2026-03-10T08:55:44.916 INFO:tasks.workunit.client.0.vm05.stdout:7/534: mkdir d18/d66/d25/d2e/d2f/da0 0
2026-03-10T08:55:44.916 INFO:tasks.workunit.client.0.vm05.stdout:8/582: truncate d2/db/d1f/d67/f75 580718 0
2026-03-10T08:55:44.928 INFO:tasks.workunit.client.0.vm05.stdout:1/650: rename dd/d10/d18/d2d/d51/d58/ddb to dd/d10/d18/d2d/d51/d58/deb 0
2026-03-10T08:55:44.929 INFO:tasks.workunit.client.1.vm08.stdout:4/818: truncate d5/d23/d36/d99/db2/d5a/d69/d11b/d96/fbb 760275 0
2026-03-10T08:55:44.931 INFO:tasks.workunit.client.0.vm05.stdout:8/583: creat d2/db/d47/fd1 x:0 0 0
2026-03-10T08:55:44.931 INFO:tasks.workunit.client.1.vm08.stdout:9/740: mknod d2/dd/d15/d1e/d39/d4e/cfb 0
2026-03-10T08:55:44.932 INFO:tasks.workunit.client.0.vm05.stdout:8/584: chown d2/dd/d2c/d2e/f6a 10162297 1
2026-03-10T08:55:44.933 INFO:tasks.workunit.client.1.vm08.stdout:0/715: unlink d6/dd/d13/cce 0
2026-03-10T08:55:44.934 INFO:tasks.workunit.client.1.vm08.stdout:3/728: mknod d4/d15/d8/d2c/d9b/d79/d8f/cfb 0
2026-03-10T08:55:44.935 INFO:tasks.workunit.client.1.vm08.stdout:7/798: link d0/d11/db2/l83 d0/d11/d4a/d95/l101 0
2026-03-10T08:55:44.935 INFO:tasks.workunit.client.0.vm05.stdout:8/585: read d2/dd/d2c/d2e/f37 [3538311,113440] 0
2026-03-10T08:55:44.936 INFO:tasks.workunit.client.0.vm05.stdout:1/651: symlink dd/d10/d18/dd5/da9/lec 0
2026-03-10T08:55:44.939 INFO:tasks.workunit.client.0.vm05.stdout:8/586: dwrite d2/db/d47/fd1 [0,4194304] 0
2026-03-10T08:55:44.950 INFO:tasks.workunit.client.0.vm05.stdout:8/587: dwrite d2/db/d47/f58 [0,4194304] 0
2026-03-10T08:55:44.951 INFO:tasks.workunit.client.0.vm05.stdout:8/588: write d2/dd/d74/fcc [956814,110835] 0
2026-03-10T08:55:44.951 INFO:tasks.workunit.client.1.vm08.stdout:9/741: rename d2/dd/d15/d1e/d24/f9e to d2/d54/d8e/ffc 0
2026-03-10T08:55:44.951 INFO:tasks.workunit.client.1.vm08.stdout:4/819: mknod d5/df5/c12a 0
2026-03-10T08:55:44.951 INFO:tasks.workunit.client.1.vm08.stdout:3/729: dread d4/d15/fe3 [0,4194304] 0
2026-03-10T08:55:44.952 INFO:tasks.workunit.client.0.vm05.stdout:8/589: chown d2/dd/d2c/d2e/d31/d3e/d5d/d9d 6048 1
2026-03-10T08:55:44.956 INFO:tasks.workunit.client.1.vm08.stdout:0/716: mknod d6/dd/d13/d17/d1f/d2d/d38/d98/cec 0
2026-03-10T08:55:44.961 INFO:tasks.workunit.client.0.vm05.stdout:1/652: dread f6 [0,4194304] 0
2026-03-10T08:55:44.962 INFO:tasks.workunit.client.0.vm05.stdout:8/590: readlink d2/dd/d2c/d2e/d31/d4f/l7c 0
2026-03-10T08:55:44.964 INFO:tasks.workunit.client.1.vm08.stdout:0/717: truncate d6/dd/d13/fe6 614808 0
2026-03-10T08:55:44.967 INFO:tasks.workunit.client.0.vm05.stdout:0/574: sync
2026-03-10T08:55:44.968 INFO:tasks.workunit.client.1.vm08.stdout:9/742: sync
2026-03-10T08:55:44.977 INFO:tasks.workunit.client.1.vm08.stdout:4/820: mkdir d5/d23/d36/d99/db2/d5a/ddb/d12b 0
2026-03-10T08:55:44.985 INFO:tasks.workunit.client.1.vm08.stdout:7/799: dread d0/d14/f12 [0,4194304] 0
2026-03-10T08:55:44.990 INFO:tasks.workunit.client.0.vm05.stdout:4/585: dread d0/d2e/d42/d45/f5f [0,4194304] 0
2026-03-10T08:55:45.003 INFO:tasks.workunit.client.0.vm05.stdout:1/653: unlink dd/c68 0
2026-03-10T08:55:45.003 INFO:tasks.workunit.client.0.vm05.stdout:4/586: mkdir d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/dba 0
2026-03-10T08:55:45.003 INFO:tasks.workunit.client.0.vm05.stdout:1/654: truncate dd/d10/d18/d2d/d51/d58/d71/d73/fbb 785832 0
2026-03-10T08:55:45.003 INFO:tasks.workunit.client.0.vm05.stdout:7/535: sync
2026-03-10T08:55:45.003 INFO:tasks.workunit.client.0.vm05.stdout:8/591: sync
2026-03-10T08:55:45.006 INFO:tasks.workunit.client.1.vm08.stdout:0/718: dread d6/dd/d13/d17/d1f/d2d/d39/f87 [0,4194304] 0
2026-03-10T08:55:45.010 INFO:tasks.workunit.client.0.vm05.stdout:0/575: dread - df/d1f/d85/d2b/d27/f91 zero size
2026-03-10T08:55:45.011 INFO:tasks.workunit.client.0.vm05.stdout:0/576: chown df/d1f/d85/d2b/d65 66 1
2026-03-10T08:55:45.019 INFO:tasks.workunit.client.0.vm05.stdout:4/587: dwrite d0/d2e/d71/d7c/fb4 [0,4194304] 0
2026-03-10T08:55:45.022 INFO:tasks.workunit.client.0.vm05.stdout:7/536: sync
2026-03-10T08:55:45.024 INFO:tasks.workunit.client.1.vm08.stdout:7/800: unlink d0/fe 0
2026-03-10T08:55:45.025 INFO:tasks.workunit.client.1.vm08.stdout:3/730: creat d4/d15/d8/d2c/d6d/dfa/ffc x:0 0 0
2026-03-10T08:55:45.026 INFO:tasks.workunit.client.1.vm08.stdout:0/719: fsync d6/dd/d13/d17/d1f/d20/d2f/d57/f5c 0
2026-03-10T08:55:45.033 INFO:tasks.workunit.client.0.vm05.stdout:1/655: mknod dd/d10/d18/d20/d69/ced 0
2026-03-10T08:55:45.042 INFO:tasks.workunit.client.1.vm08.stdout:3/731: unlink d4/d15/d8/d2c/d9b/f4d 0
2026-03-10T08:55:45.049 INFO:tasks.workunit.client.0.vm05.stdout:4/588: rename d0/c14 to d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/cbb 0
2026-03-10T08:55:45.051 INFO:tasks.workunit.client.0.vm05.stdout:4/589: chown d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d58/d66/d79/fa5 646 1
2026-03-10T08:55:45.052 INFO:tasks.workunit.client.0.vm05.stdout:4/590: stat d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/d4f/cb2 0
2026-03-10T08:55:45.055 INFO:tasks.workunit.client.1.vm08.stdout:0/720: dwrite d6/dd/d13/d17/d1f/d20/f43 [4194304,4194304] 0
2026-03-10T08:55:45.066 INFO:tasks.workunit.client.0.vm05.stdout:4/591: creat d0/d78/fbc x:0 0 0
2026-03-10T08:55:45.070 INFO:tasks.workunit.client.0.vm05.stdout:8/592: dread d2/dd/d2c/d2e/f5a [0,4194304] 0
2026-03-10T08:55:45.073 INFO:tasks.workunit.client.1.vm08.stdout:0/721: rmdir d6/dd/d13/d32 39
2026-03-10T08:55:45.075 INFO:tasks.workunit.client.0.vm05.stdout:4/592: truncate d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/fb7 3966880 0
2026-03-10T08:55:45.076 INFO:tasks.workunit.client.1.vm08.stdout:3/732: mkdir d4/d15/dfd 0
2026-03-10T08:55:45.077 INFO:tasks.workunit.client.1.vm08.stdout:0/722: creat d6/dd/d13/d17/d1f/d20/d2f/d24/fed x:0 0 0
2026-03-10T08:55:45.082 INFO:tasks.workunit.client.0.vm05.stdout:6/605: dwrite d4/d7/d10/d15/f17 [0,4194304] 0
2026-03-10T08:55:45.084 INFO:tasks.workunit.client.0.vm05.stdout:4/593: chown d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/d67/d7b/l7f 904 1
2026-03-10T08:55:45.084 INFO:tasks.workunit.client.0.vm05.stdout:4/594: chown d0/d2e/d42/d45/d4a/f26 1859393 1
2026-03-10T08:55:45.102 INFO:tasks.workunit.client.0.vm05.stdout:4/595: truncate d0/d2e/d42/d45/d4a/d36/d37/d9c/f28 3222131 0
2026-03-10T08:55:45.103 INFO:tasks.workunit.client.1.vm08.stdout:3/733: link d4/d15/d8/d2c/d9b/d79/d20/l72 d4/d15/d8/d2c/lfe 0
2026-03-10T08:55:45.105 INFO:tasks.workunit.client.0.vm05.stdout:4/596: truncate d0/d1d/f22 2009418 0
2026-03-10T08:55:45.107 INFO:tasks.workunit.client.0.vm05.stdout:4/597: mkdir d0/d2e/d42/d45/d4a/d36/d37/d9c/d32/d41/dba/dbd 0
2026-03-10T08:55:45.111 INFO:tasks.workunit.client.0.vm05.stdout:5/510: dwrite d5/d86/f8f [0,4194304] 0
2026-03-10T08:55:45.113 INFO:tasks.workunit.client.0.vm05.stdout:8/593: sync
2026-03-10T08:55:45.124 INFO:tasks.workunit.client.0.vm05.stdout:3/663: dwrite d9/d4d/d51/d64/f85 [0,4194304] 0
2026-03-10T08:55:45.129 INFO:tasks.workunit.client.1.vm08.stdout:8/811: dwrite d1/d4f/fcb [0,4194304] 0
2026-03-10T08:55:45.132 INFO:tasks.workunit.client.1.vm08.stdout:8/812: dread - d1/da8/f108 zero size
2026-03-10T08:55:45.134 INFO:tasks.workunit.client.0.vm05.stdout:8/594: sync
2026-03-10T08:55:45.146 INFO:tasks.workunit.client.0.vm05.stdout:5/511: mknod d5/d86/d21/d71/cbc 0
2026-03-10T08:55:45.153 INFO:tasks.workunit.client.0.vm05.stdout:2/506: dwrite d0/d9/d1e/d20/d21/d45/d6c/d6e/f64 [0,4194304] 0
2026-03-10T08:55:45.164 INFO:tasks.workunit.client.0.vm05.stdout:8/595: creat d2/dd/d2c/d2e/d31/d4f/d80/fd2 x:0 0 0
2026-03-10T08:55:45.178 INFO:tasks.workunit.client.1.vm08.stdout:1/775: dwrite d1/da/de/d5c/fb5 [0,4194304] 0
2026-03-10T08:55:45.178 INFO:tasks.workunit.client.0.vm05.stdout:5/512: write d5/d86/fa6 [709244,83291] 0
2026-03-10T08:55:45.182 INFO:tasks.workunit.client.1.vm08.stdout:6/780: dwrite d9/d10/f8c [0,4194304] 0
2026-03-10T08:55:45.184 INFO:tasks.workunit.client.1.vm08.stdout:6/781: chown d9/d10/c59 596928 1
2026-03-10T08:55:45.191 INFO:tasks.workunit.client.0.vm05.stdout:5/513: creat d5/d86/d21/d89/fbd x:0 0 0
2026-03-10T08:55:45.191 INFO:tasks.workunit.client.0.vm05.stdout:5/514: chown d5/d86/d21/l63 107917988 1
2026-03-10T08:55:45.191 INFO:tasks.workunit.client.0.vm05.stdout:5/515: fsync d5/d86/d24/d2c/f79 0
2026-03-10T08:55:45.192 INFO:tasks.workunit.client.0.vm05.stdout:5/516: dread d5/d86/f59 [0,4194304] 0
2026-03-10T08:55:45.201 INFO:tasks.workunit.client.0.vm05.stdout:9/546: dwrite d6/d19/d21/f32 [0,4194304] 0
2026-03-10T08:55:45.224 INFO:tasks.workunit.client.0.vm05.stdout:8/596: creat d2/dd/d2c/fd3 x:0 0 0
2026-03-10T08:55:45.225 INFO:tasks.workunit.client.0.vm05.stdout:8/597: chown d2/db/d1f/d67/d8d/fad 294 1
2026-03-10T08:55:45.227 INFO:tasks.workunit.client.0.vm05.stdout:9/547: readlink d6/d15/l68 0
2026-03-10T08:55:45.232 INFO:tasks.workunit.client.0.vm05.stdout:8/598: symlink d2/dd/d2c/d2e/d31/d4c/d63/daf/ld4 0
2026-03-10T08:55:45.245 INFO:tasks.workunit.client.1.vm08.stdout:6/782: mknod d9/dc/d11/d23/d2c/d41/cff 0
2026-03-10T08:55:45.253 INFO:tasks.workunit.client.1.vm08.stdout:1/776: rename d1/da/d20/d91/d83/df4/d4e/c10a to d1/da/d20/d91/d83/df4/c10d 0
2026-03-10T08:55:45.253 INFO:tasks.workunit.client.1.vm08.stdout:6/783: dread - d9/dc/d84/d80/f9e zero size
2026-03-10T08:55:45.253 INFO:tasks.workunit.client.1.vm08.stdout:6/784: symlink d9/d10/dd0/l100 0
2026-03-10T08:55:45.255 INFO:tasks.workunit.client.0.vm05.stdout:8/599: getdents d2/db/d47 0
2026-03-10T08:55:45.255 INFO:tasks.workunit.client.0.vm05.stdout:9/548: sync
2026-03-10T08:55:45.256 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:45 vm08.local ceph-mon[57559]: pgmap v161: 65 pgs: 65 active+clean; 2.7 GiB data, 9.5 GiB used, 111 GiB / 120 GiB avail; 52 MiB/s rd, 127 MiB/s wr, 334 op/s
2026-03-10T08:55:45.256 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:45 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:55:45.256 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:45 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:55:45.269 INFO:tasks.workunit.client.1.vm08.stdout:1/777: dwrite d1/da/de/d24/d3d/d40/ffe [0,4194304] 0
2026-03-10T08:55:45.272 INFO:tasks.workunit.client.0.vm05.stdout:8/600: creat d2/db/d28/d99/fd5 x:0 0 0
2026-03-10T08:55:45.280 INFO:tasks.workunit.client.1.vm08.stdout:2/850: rename d1/da/d10/d42/f58 to d1/da/d10/d42/d93/d1e/dce/d52/f119 0
2026-03-10T08:55:45.280 INFO:tasks.workunit.client.0.vm05.stdout:9/549: symlink d6/d12/d3a/da2/lb5 0
2026-03-10T08:55:45.284 INFO:tasks.workunit.client.1.vm08.stdout:8/813: dread d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fc2 [0,4194304] 0
2026-03-10T08:55:45.285 INFO:tasks.workunit.client.0.vm05.stdout:8/601: creat d2/dd/d2c/d2e/d31/d4f/d7b/fd6 x:0 0 0
2026-03-10T08:55:45.285 INFO:tasks.workunit.client.0.vm05.stdout:8/602: chown d2/db/d1f/d67/d8d/fad 94199950 1
2026-03-10T08:55:45.287 INFO:tasks.workunit.client.0.vm05.stdout:9/550: creat d6/d12/d3a/d9c/fb6 x:0 0 0
2026-03-10T08:55:45.290 INFO:tasks.workunit.client.1.vm08.stdout:1/778: dread - d1/da/d18/f72 zero size
2026-03-10T08:55:45.290 INFO:tasks.workunit.client.0.vm05.stdout:8/603: mknod d2/dd/d74/cd7 0
2026-03-10T08:55:45.301 INFO:tasks.workunit.client.0.vm05.stdout:2/507: read d0/d9/d1e/d20/f32 [1266870,91663] 0
2026-03-10T08:55:45.306 INFO:tasks.workunit.client.1.vm08.stdout:2/851: creat d1/da/d10/d42/dd0/f11a x:0 0 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.0.vm05.stdout:7/537: write d18/d66/d25/d2e/d42/d53/f7e [400539,110849] 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.0.vm05.stdout:8/604: creat d2/dd/d2c/d2e/d31/d4f/d7b/fd8 x:0 0 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.0.vm05.stdout:0/577: dwrite df/d1f/d85/d2b/d27/d32/d4e/d87/f8d [0,4194304] 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.1.vm08.stdout:7/801: write d0/d11/d1f/d29/d3d/d89/fee [417986,113950] 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.1.vm08.stdout:6/785: creat d9/d10/d1e/d32/f101 x:0 0 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.1.vm08.stdout:4/821: dwrite d5/d23/d36/d99/db2/d5d/de3/df8/fff [0,4194304] 0
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.1.vm08.stdout:7/802: chown d0/d14/f98 134 1
2026-03-10T08:55:45.315 INFO:tasks.workunit.client.1.vm08.stdout:9/743: dwrite d2/dd/d15/d1e/d21/f3a [0,4194304] 0
2026-03-10T08:55:45.320 INFO:tasks.workunit.client.1.vm08.stdout:9/744: chown d2/dd/d15/d1e/d21/fc5 3 1
2026-03-10T08:55:45.325 INFO:tasks.workunit.client.1.vm08.stdout:8/814: truncate d1/da8/f102 722694 0
2026-03-10T08:55:45.329 INFO:tasks.workunit.client.0.vm05.stdout:0/578: sync
2026-03-10T08:55:45.336 INFO:tasks.workunit.client.0.vm05.stdout:6/606: dwrite d4/d7/f52 [0,4194304] 0
2026-03-10T08:55:45.336 INFO:tasks.workunit.client.1.vm08.stdout:0/723: dwrite d6/dd/d13/d17/d1f/d20/f21 [0,4194304] 0
2026-03-10T08:55:45.338 INFO:tasks.workunit.client.0.vm05.stdout:6/607: chown d4/d7/d10/d15/d1b/c28 19 1
2026-03-10T08:55:45.346 INFO:tasks.workunit.client.0.vm05.stdout:2/508: dread d0/f10 [0,4194304] 0
2026-03-10T08:55:45.347 INFO:tasks.workunit.client.0.vm05.stdout:2/509: rename d0/d9 to d0/d9/d1e/d20/d21/d45/d4b/d8e 22
2026-03-10T08:55:45.355 INFO:tasks.workunit.client.1.vm08.stdout:1/779: creat d1/da/de/d24/d3d/d40/d8e/dd2/d7f/f10e x:0 0 0
2026-03-10T08:55:45.366 INFO:tasks.workunit.client.0.vm05.stdout:7/538: fsync d18/d1b/f50 0
2026-03-10T08:55:45.371 INFO:tasks.workunit.client.1.vm08.stdout:7/803: unlink d0/f25 0
2026-03-10T08:55:45.371 INFO:tasks.workunit.client.0.vm05.stdout:7/539: chown d18/d66/d25/d2e/d42/d53/c64 1065972508 1
2026-03-10T08:55:45.372 INFO:tasks.workunit.client.0.vm05.stdout:8/605: mkdir d2/dd/d2c/d2e/d31/d3e/d5d/d9d/dd9 0
2026-03-10T08:55:45.372 INFO:tasks.workunit.client.1.vm08.stdout:3/734: dwrite d4/d15/d8/f68 [4194304,4194304] 0
2026-03-10T08:55:45.376 INFO:tasks.workunit.client.0.vm05.stdout:4/598: dwrite d0/d2e/d42/d45/d4a/d36/d37/d9c/d49/faf [0,4194304] 0
2026-03-10T08:55:45.388 INFO:tasks.workunit.client.1.vm08.stdout:9/745: fsync d2/d54/d8e/fba 0
2026-03-10T08:55:45.388 INFO:tasks.workunit.client.1.vm08.stdout:8/815: symlink d1/d4f/d60/dbf/l12d 0
2026-03-10T08:55:45.392 INFO:tasks.workunit.client.1.vm08.stdout:0/724: unlink d6/dd/d13/d17/d1f/d20/c7f 0
2026-03-10T08:55:45.398 INFO:tasks.workunit.client.0.vm05.stdout:7/540: fdatasync d18/d38/d43/d5c/f67 0
2026-03-10T08:55:45.398 INFO:tasks.workunit.client.1.vm08.stdout:1/780: rename d1/da/d18/d3b/f5e to d1/da/d20/d3f/d49/f10f 0
2026-03-10T08:55:45.398 INFO:tasks.workunit.client.1.vm08.stdout:2/852: mknod d1/da/d10/d42/d93/c11b 0
2026-03-10T08:55:45.398 INFO:tasks.workunit.client.1.vm08.stdout:7/804: mkdir d0/d14/d43/d62/d102 0
2026-03-10T08:55:45.400 INFO:tasks.workunit.client.0.vm05.stdout:7/541: dwrite d18/d38/f5d [0,4194304] 0
2026-03-10T08:55:45.416 INFO:tasks.workunit.client.0.vm05.stdout:3/664: dwrite d9/d8f/d50/f72 [0,4194304] 0
2026-03-10T08:55:45.418 INFO:tasks.workunit.client.0.vm05.stdout:6/608: dread d4/d7/d10/d1a/f1e [0,4194304] 0
2026-03-10T08:55:45.418 INFO:tasks.workunit.client.0.vm05.stdout:4/599: rmdir d0/d2c 39
2026-03-10T08:55:45.420 INFO:tasks.workunit.client.1.vm08.stdout:8/816: mkdir d1/d4f/d12e 0
2026-03-10T08:55:45.421 INFO:tasks.workunit.client.1.vm08.stdout:8/817: chown d1/d10/d9/dd/d25/lf0 172 1
2026-03-10T08:55:45.434 INFO:tasks.workunit.client.0.vm05.stdout:2/510: getdents d0/d9/d1e/d20/d21/d45/d6c/d6e/d7a 0
2026-03-10T08:55:45.434 INFO:tasks.workunit.client.0.vm05.stdout:5/517: dwrite d5/df/dbb/f4a [4194304,4194304] 0
2026-03-10T08:55:45.434 INFO:tasks.workunit.client.1.vm08.stdout:8/818: readlink d1/d10/d9/dd/d13/d40/l7b 0
2026-03-10T08:55:45.434 INFO:tasks.workunit.client.1.vm08.stdout:1/781: creat d1/da/d20/d3f/d49/d9c/f110 x:0 0 0
2026-03-10T08:55:45.434 INFO:tasks.workunit.client.1.vm08.stdout:2/853: mkdir d1/d5b/da7/d11c 0
2026-03-10T08:55:45.434 INFO:tasks.workunit.client.1.vm08.stdout:9/746: sync
2026-03-10T08:55:45.435 INFO:tasks.workunit.client.1.vm08.stdout:9/747: chown d2/dd/d15/d1e/d25/d32/d5c/de5 145 1
2026-03-10T08:55:45.453 INFO:tasks.workunit.client.0.vm05.stdout:7/542: creat d18/d38/d43/d5c/fa1 x:0 0 0
2026-03-10T08:55:45.460 INFO:tasks.workunit.client.1.vm08.stdout:8/819: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f12f x:0 0 0
2026-03-10T08:55:45.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:45 vm05.local ceph-mon[49713]: pgmap v161: 65 pgs: 65 active+clean; 2.7 GiB data, 9.5 GiB used, 111 GiB / 120 GiB avail; 52 MiB/s rd, 127 MiB/s wr, 334 op/s
2026-03-10T08:55:45.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:45 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:55:45.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:45 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc'
2026-03-10T08:55:45.463 INFO:tasks.workunit.client.0.vm05.stdout:3/665: creat d9/d2b/d2f/d57/fc0 x:0 0 0
2026-03-10T08:55:45.469 INFO:tasks.workunit.client.1.vm08.stdout:9/748: creat d2/d41/d53/ffd x:0 0 0
2026-03-10T08:55:45.473 INFO:tasks.workunit.client.0.vm05.stdout:2/511: rename d0/d9/d1e/d20/d21/d45/d6c/d6e to d0/d9/d7f/d8f 0
2026-03-10T08:55:45.473 INFO:tasks.workunit.client.0.vm05.stdout:9/551: link d6/d19/d2a/f53 d6/d19/d21/fb7 0
2026-03-10T08:55:45.473 INFO:tasks.workunit.client.0.vm05.stdout:9/552: write d6/fb [1172673,35805] 0
2026-03-10T08:55:45.481 INFO:tasks.workunit.client.1.vm08.stdout:7/805: symlink d0/d11/d1f/d29/d3d/l103 0
2026-03-10T08:55:45.487 INFO:tasks.workunit.client.1.vm08.stdout:8/820: symlink d1/d10/d9/dd/d25/l130 0
2026-03-10T08:55:45.488 INFO:tasks.workunit.client.0.vm05.stdout:4/600: dread d0/fb [0,4194304] 0
2026-03-10T08:55:45.491 INFO:tasks.workunit.client.0.vm05.stdout:5/518: mknod d5/d86/d39/cbe 0
2026-03-10T08:55:45.492 INFO:tasks.workunit.client.1.vm08.stdout:2/854: dread d1/d43/f4b [0,4194304] 0
2026-03-10T08:55:45.495 INFO:tasks.workunit.client.1.vm08.stdout:9/749: fdatasync d2/dd/d15/d1e/d39/d4e/f71 0
2026-03-10T08:55:45.497 INFO:tasks.workunit.client.1.vm08.stdout:7/806: creat d0/d11/d4a/da3/f104 x:0 0 0
2026-03-10T08:55:45.498 INFO:tasks.workunit.client.0.vm05.stdout:6/609: mkdir d4/d2c/dc8 0
2026-03-10T08:55:45.507 INFO:tasks.workunit.client.1.vm08.stdout:9/750: truncate d2/dd/d15/d1e/d21/fc5 1547507 0
2026-03-10T08:55:45.519 INFO:tasks.workunit.client.0.vm05.stdout:4/601: rename d0/d2e/d42/d45/d4a/d36/d37/d9c to d0/d2e/d42/d45/d4a/d36/dbe 0
2026-03-10T08:55:45.523 INFO:tasks.workunit.client.0.vm05.stdout:4/602: dread d0/d2e/d42/d45/d4a/d36/f3d [0,4194304] 0
2026-03-10T08:55:45.526 INFO:tasks.workunit.client.1.vm08.stdout:7/807: fdatasync d0/d11/d1f/d29/d3b/f4c 0
2026-03-10T08:55:45.527 INFO:tasks.workunit.client.0.vm05.stdout:5/519: fdatasync d5/df/dbb/f4e 0
2026-03-10T08:55:45.537 INFO:tasks.workunit.client.0.vm05.stdout:7/543: mknod d18/d1b/ca2 0
2026-03-10T08:55:45.537 INFO:tasks.workunit.client.1.vm08.stdout:8/821: creat d1/d10/d9/dd/d25/dca/d128/f131 x:0 0 0
2026-03-10T08:55:45.537 INFO:tasks.workunit.client.1.vm08.stdout:8/822: readlink d1/d10/d9/dd/d25/d27/d44/d21/d5f/l10c 0
2026-03-10T08:55:45.537 INFO:tasks.workunit.client.1.vm08.stdout:7/808: creat d0/d51/f105 x:0 0 0
2026-03-10T08:55:45.543 INFO:tasks.workunit.client.1.vm08.stdout:7/809: readlink d0/d11/d1f/l28 0
2026-03-10T08:55:45.544 INFO:tasks.workunit.client.0.vm05.stdout:3/666: creat d9/d8f/d50/fc1 x:0 0 0
2026-03-10T08:55:45.550 INFO:tasks.workunit.client.0.vm05.stdout:0/579: write df/d1f/d85/d2b/d27/d32/d4e/d87/f8d [4274651,81304] 0
2026-03-10T08:55:45.552 INFO:tasks.workunit.client.1.vm08.stdout:4/822: write d5/d23/d49/d83/f93 [176367,114368] 0
2026-03-10T08:55:45.554 INFO:tasks.workunit.client.1.vm08.stdout:6/786: dwrite d9/dc/d84/f89 [0,4194304] 0
2026-03-10T08:55:45.557 INFO:tasks.workunit.client.1.vm08.stdout:9/751: getdents d2/dd/d15/d1e/d21 0
2026-03-10T08:55:45.558 INFO:tasks.workunit.client.1.vm08.stdout:9/752: chown d2/dd/d15/d1e/d39/fd8 1378 1
2026-03-10T08:55:45.558 INFO:tasks.workunit.client.1.vm08.stdout:7/810: mknod d0/d14/c106 0
2026-03-10T08:55:45.566 INFO:tasks.workunit.client.0.vm05.stdout:2/512: link d0/d9/d1e/d20/d21/f23 d0/d9/d1e/d20/d21/d45/d4b/f90 0
2026-03-10T08:55:45.570 INFO:tasks.workunit.client.0.vm05.stdout:9/553: rename d6/d12/d3a/l70 to d6/d15/d3c/d4b/lb8 0
2026-03-10T08:55:45.577 INFO:tasks.workunit.client.1.vm08.stdout:6/787: rename d9/d10/c59 to d9/dc/c102 0
2026-03-10T08:55:45.578 INFO:tasks.workunit.client.0.vm05.stdout:6/610: link d4/d7/d10/d1a/db1/fb3 d4/d2c/d84/db6/dc6/fc9 0
2026-03-10T08:55:45.578 INFO:tasks.workunit.client.1.vm08.stdout:6/788: readlink d9/d10/dd0/l100 0
2026-03-10T08:55:45.581 INFO:tasks.workunit.client.0.vm05.stdout:9/554: creat d6/d12/d43/fb9 x:0 0 0
2026-03-10T08:55:45.582 INFO:tasks.workunit.client.1.vm08.stdout:4/823: sync
2026-03-10T08:55:45.585 INFO:tasks.workunit.client.1.vm08.stdout:6/789: creat d9/dc/d11/d23/d2c/f103 x:0 0 0
2026-03-10T08:55:45.587 INFO:tasks.workunit.client.0.vm05.stdout:0/580: sync
2026-03-10T08:55:45.587 INFO:tasks.workunit.client.0.vm05.stdout:9/555: sync
2026-03-10T08:55:45.591 INFO:tasks.workunit.client.0.vm05.stdout:2/513: getdents d0/d9/d1e/d20/d21/d45/d4b/d75 0
2026-03-10T08:55:45.592 INFO:tasks.workunit.client.1.vm08.stdout:4/824: stat d5/d23/d49/d8f/da4/d118/l124 0
2026-03-10T08:55:45.601 INFO:tasks.workunit.client.0.vm05.stdout:4/603: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/dba to d0/d2e/d42/d45/d4a/d36/dbe/dbf 0
2026-03-10T08:55:45.607 INFO:tasks.workunit.client.0.vm05.stdout:0/581: dread - df/d1f/d85/d19/d47/d84/d8a/f93 zero size
2026-03-10T08:55:45.609 INFO:tasks.workunit.client.0.vm05.stdout:5/520: rename d5/d86/d66/d76/l6e to d5/d48/d64/d95/dac/lbf 0
2026-03-10T08:55:45.613 INFO:tasks.workunit.client.0.vm05.stdout:4/604: mknod d0/d78/cc0 0
2026-03-10T08:55:45.617 INFO:tasks.workunit.client.0.vm05.stdout:8/606: dwrite d2/dd/d2c/d2e/d31/d3e/f73 [4194304,4194304] 0
2026-03-10T08:55:45.617 INFO:tasks.workunit.client.1.vm08.stdout:3/735: dwrite d4/d15/d8/d1d/d4f/fa2 [4194304,4194304] 0
2026-03-10T08:55:45.618 INFO:tasks.workunit.client.0.vm05.stdout:8/607: write d2/dd/d2c/d2e/d31/d4f/da3/faa [87605,56346] 0
2026-03-10T08:55:45.619 INFO:tasks.workunit.client.1.vm08.stdout:7/811: dread d0/d11/d1f/d29/d36/fb4 [0,4194304] 0
2026-03-10T08:55:45.629 INFO:tasks.workunit.client.1.vm08.stdout:1/782: write d1/da/d20/d91/ff6 [81521,101341] 0
2026-03-10T08:55:45.635 INFO:tasks.workunit.client.0.vm05.stdout:3/667: rmdir d9/d2b 39
2026-03-10T08:55:45.637 INFO:tasks.workunit.client.1.vm08.stdout:9/753: chown f1 12601 1
2026-03-10T08:55:45.640 INFO:tasks.workunit.client.0.vm05.stdout:9/556: dread d6/d19/d2a/d4a/d8c/fa7 [0,4194304] 0
2026-03-10T08:55:45.647 INFO:tasks.workunit.client.1.vm08.stdout:2/855: write d1/d5b/f80 [2486247,107360] 0
2026-03-10T08:55:45.650 INFO:tasks.workunit.client.0.vm05.stdout:7/544: rename d18/d66/d25/d2e/d42/d74/f7b to d18/d66/d25/d2e/d42/fa3 0
2026-03-10T08:55:45.652 INFO:tasks.workunit.client.0.vm05.stdout:4/605: mknod d0/d2e/d71/d7c/cc1 0
2026-03-10T08:55:45.657 INFO:tasks.workunit.client.1.vm08.stdout:7/812: creat d0/d11/db2/d8e/f107 x:0 0 0
2026-03-10T08:55:45.657 INFO:tasks.workunit.client.1.vm08.stdout:1/783: mkdir d1/da/de/d24/d26/d86/d111 0
2026-03-10T08:55:45.657 INFO:tasks.workunit.client.1.vm08.stdout:1/784: chown d1/da/d20/d3f/l45 23351136 1
2026-03-10T08:55:45.660 INFO:tasks.workunit.client.0.vm05.stdout:3/668: chown d9/d2b/d2f/d57/c81 4 1
2026-03-10T08:55:45.660 INFO:tasks.workunit.client.0.vm05.stdout:3/669: readlink d9/d2b/l47 0
2026-03-10T08:55:45.663 INFO:tasks.workunit.client.1.vm08.stdout:9/754: mkdir d2/d41/d4c/d66/d82/dfe 0
2026-03-10T08:55:45.681 INFO:tasks.workunit.client.1.vm08.stdout:9/755: truncate d2/dd/d15/d1e/d21/f90 3047366 0
2026-03-10T08:55:45.685 INFO:tasks.workunit.client.1.vm08.stdout:8/823: dwrite d1/d10/d9/d4d/fe3 [0,4194304] 0
2026-03-10T08:55:45.692 INFO:tasks.workunit.client.0.vm05.stdout:5/521: creat d5/fc0 x:0 0 0
2026-03-10T08:55:45.701 INFO:tasks.workunit.client.0.vm05.stdout:6/611: write d4/d7/d10/d1a/d1f/f4b [9133462,60227] 0
2026-03-10T08:55:45.706 INFO:tasks.workunit.client.1.vm08.stdout:6/790: dwrite d9/dc/d11/d23/d2c/d7a/dce/ffd [0,4194304] 0
2026-03-10T08:55:45.706 INFO:tasks.workunit.client.1.vm08.stdout:4/825: dwrite d5/d23/d36/d99/dc6/f108 [0,4194304] 0
2026-03-10T08:55:45.707 INFO:tasks.workunit.client.0.vm05.stdout:0/582: dwrite df/d1f/d85/d19/d39/f6f [0,4194304] 0
2026-03-10T08:55:45.719 INFO:tasks.workunit.client.1.vm08.stdout:3/736: dwrite d4/d15/d8/d2c/d9b/f86 [0,4194304] 0
2026-03-10T08:55:45.720 INFO:tasks.workunit.client.0.vm05.stdout:9/557: creat d6/d12/db2/fba x:0 0 0
2026-03-10T08:55:45.722 INFO:tasks.workunit.client.1.vm08.stdout:9/756: mknod d2/dd/d15/d1e/d39/d69/de4/cff 0
2026-03-10T08:55:45.724 INFO:tasks.workunit.client.1.vm08.stdout:1/785: write d1/f8 [8181769,70085] 0
2026-03-10T08:55:45.724 INFO:tasks.workunit.client.1.vm08.stdout:8/824: fdatasync d1/d10/fac 0
2026-03-10T08:55:45.725 INFO:tasks.workunit.client.0.vm05.stdout:4/606: symlink d0/d2e/d42/d45/d4a/d36/dbe/d32/lc2 0
2026-03-10T08:55:45.729 INFO:tasks.workunit.client.1.vm08.stdout:6/791: unlink d9/d50/d95/cb5 0
2026-03-10T08:55:45.729 INFO:tasks.workunit.client.1.vm08.stdout:1/786: write d1/da/de/d24/d26/d5d/f104 [458185,84882] 0
2026-03-10T08:55:45.731 INFO:tasks.workunit.client.1.vm08.stdout:2/856: dwrite d1/d43/f4b [0,4194304] 0
2026-03-10T08:55:45.737 INFO:tasks.workunit.client.1.vm08.stdout:3/737: creat d4/d15/d8/d1d/fff x:0 0 0
2026-03-10T08:55:45.737 INFO:tasks.workunit.client.1.vm08.stdout:2/857: rename d1/da/d10/d42/d93/d23 to d1/da/d10/d42/d93/d23/d9e/ddc/d11d 22
2026-03-10T08:55:45.737 INFO:tasks.workunit.client.1.vm08.stdout:2/858: stat d1/da/f50 0
2026-03-10T08:55:45.737 INFO:tasks.workunit.client.1.vm08.stdout:9/757: fsync d2/dd/d15/d1e/d21/f50 0
2026-03-10T08:55:45.738 INFO:tasks.workunit.client.0.vm05.stdout:2/514: rename d0/c69 to d0/d9/d1e/d20/c91 0
2026-03-10T08:55:45.739 INFO:tasks.workunit.client.0.vm05.stdout:2/515: write d0/f56 [951924,112879] 0
2026-03-10T08:55:45.743 INFO:tasks.workunit.client.0.vm05.stdout:2/516: dread d0/d9/f19 [0,4194304] 0
2026-03-10T08:55:45.748 INFO:tasks.workunit.client.1.vm08.stdout:7/813: dread d0/f7a [0,4194304] 0
2026-03-10T08:55:45.748 INFO:tasks.workunit.client.0.vm05.stdout:7/545: creat d18/d66/d25/d2e/fa4 x:0 0 0
2026-03-10T08:55:45.750 INFO:tasks.workunit.client.1.vm08.stdout:1/787: sync
2026-03-10T08:55:45.755 INFO:tasks.workunit.client.1.vm08.stdout:1/788: read d1/fac [332173,54253] 0
2026-03-10T08:55:45.755 INFO:tasks.workunit.client.1.vm08.stdout:8/825: dread d1/d10/d9/dd/d25/d27/d44/d21/d5f/fbd [0,4194304] 0
2026-03-10T08:55:45.767 INFO:tasks.workunit.client.1.vm08.stdout:2/859: rename d1/dd5 to d1/da/d78/df5/d11e 0
2026-03-10T08:55:45.770 INFO:tasks.workunit.client.1.vm08.stdout:9/758: creat d2/dd/d15/d1e/d39/f100 x:0 0 0
2026-03-10T08:55:45.771 INFO:tasks.workunit.client.0.vm05.stdout:8/608: rename d2/db/d1f/c35 to d2/dd/d2c/d2e/d31/d4c/d63/daf/cda 0
2026-03-10T08:55:45.776 INFO:tasks.workunit.client.1.vm08.stdout:6/792: mkdir d9/d10/d1e/d104 0
2026-03-10T08:55:45.776 INFO:tasks.workunit.client.0.vm05.stdout:7/546: dread - d18/d66/d25/f8d zero size
2026-03-10T08:55:45.777 INFO:tasks.workunit.client.0.vm05.stdout:7/547: write f9 [4865804,104949] 0
2026-03-10T08:55:45.777 INFO:tasks.workunit.client.0.vm05.stdout:3/670: getdents d9/d2b/d2f/d57 0
2026-03-10T08:55:45.781 INFO:tasks.workunit.client.1.vm08.stdout:8/826: rename d1/d4f/d60/fc4 to d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f132 0
2026-03-10T08:55:45.781 INFO:tasks.workunit.client.0.vm05.stdout:9/558: rename d6/d12/l85 to d6/d19/d2c/d84/lbb 0
2026-03-10T08:55:45.797 INFO:tasks.workunit.client.0.vm05.stdout:2/517: mkdir d0/d9/d1e/d20/d21/d8a/d92 0
2026-03-10T08:55:45.808 INFO:tasks.workunit.client.0.vm05.stdout:3/671: mkdir d9/d4d/d51/d64/d89/dc2 0
2026-03-10T08:55:45.808 INFO:tasks.workunit.client.0.vm05.stdout:3/672: chown d9/d8f/d50 206254 1
2026-03-10T08:55:45.811 INFO:tasks.workunit.client.1.vm08.stdout:8/827: rmdir d1/d10/d9/dd/d25/d27/d44/d21 39
2026-03-10T08:55:45.816 INFO:tasks.workunit.client.0.vm05.stdout:9/559: dread d6/f4e [0,4194304] 0
2026-03-10T08:55:45.821 INFO:tasks.workunit.client.0.vm05.stdout:7/548: symlink d18/d66/d78/la5 0
2026-03-10T08:55:45.821 INFO:tasks.workunit.client.0.vm05.stdout:8/609: rename d2/l1d to d2/db/d1f/d67/d8d/ldb 0
2026-03-10T08:55:45.821 INFO:tasks.workunit.client.1.vm08.stdout:7/814: link d0/d11/d4a/f87 d0/d11/d1f/d29/d3d/df6/f108 0
2026-03-10T08:55:45.822 INFO:tasks.workunit.client.1.vm08.stdout:3/738: getdents d4/d15/d8/d2c/d6d/dfa 0
2026-03-10T08:55:45.822 INFO:tasks.workunit.client.1.vm08.stdout:3/739: chown d4/d15/d8/d2c/d55/f75 21 1
2026-03-10T08:55:45.823 INFO:tasks.workunit.client.0.vm05.stdout:2/518: creat d0/d9/d89/f93 x:0 0 0
2026-03-10T08:55:45.825 INFO:tasks.workunit.client.0.vm05.stdout:2/519: read d0/d9/d7f/d8f/f63 [54953,112029] 0
2026-03-10T08:55:45.828 INFO:tasks.workunit.client.0.vm05.stdout:3/673: mknod d9/d2b/cc3 0
2026-03-10T08:55:45.829 INFO:tasks.workunit.client.0.vm05.stdout:8/610: fsync d2/dd/d2c/d2e/d31/d3e/d5d/f92 0
2026-03-10T08:55:45.831 INFO:tasks.workunit.client.0.vm05.stdout:8/611: read d2/dd/d2c/d2e/d31/f89 [6265010,11209] 0
2026-03-10T08:55:45.839 INFO:tasks.workunit.client.0.vm05.stdout:9/560: mkdir d6/d19/d2a/dbc 0
2026-03-10T08:55:45.843 INFO:tasks.workunit.client.1.vm08.stdout:3/740: mkdir d4/d15/d8/d2c/d6d/dfa/d100 0
2026-03-10T08:55:45.844 INFO:tasks.workunit.client.1.vm08.stdout:3/741: stat d4/d15/d8/d2c/d55/cb7 0
2026-03-10T08:55:45.846 INFO:tasks.workunit.client.0.vm05.stdout:2/520: rmdir d0/d9/d1e/d20/d21/d45/d4b/d70 39
2026-03-10T08:55:45.852 INFO:tasks.workunit.client.1.vm08.stdout:9/759: getdents d2/dd/d15/d1e/d39/d4e 0
2026-03-10T08:55:45.854 INFO:tasks.workunit.client.1.vm08.stdout:7/815: mknod d0/c109 0
2026-03-10T08:55:45.855 INFO:tasks.workunit.client.1.vm08.stdout:7/816: dread - d0/d11/d1f/d29/fba zero size
2026-03-10T08:55:45.861 INFO:tasks.workunit.client.0.vm05.stdout:3/674: mknod d9/d2b/d2f/cc4 0
2026-03-10T08:55:45.864 INFO:tasks.workunit.client.1.vm08.stdout:3/742: symlink d4/d15/d8/d2c/d55/l101 0
2026-03-10T08:55:45.864 INFO:tasks.workunit.client.1.vm08.stdout:3/743: stat d4/d6f/d85/f88 0
2026-03-10T08:55:45.868 INFO:tasks.workunit.client.1.vm08.stdout:0/725: dread d6/dd/d13/d17/d1f/d2d/d39/f4a [0,4194304] 0
2026-03-10T08:55:45.870 INFO:tasks.workunit.client.1.vm08.stdout:4/826: write d5/d23/d36/d99/db2/d5d/fc5 [944559,40582] 0
2026-03-10T08:55:45.876 INFO:tasks.workunit.client.1.vm08.stdout:6/793: dread d9/dc/d84/fae [0,4194304] 0
2026-03-10T08:55:45.877 INFO:tasks.workunit.client.1.vm08.stdout:6/794: chown d9/d10/dd0/ffe 1394375 1
2026-03-10T08:55:45.878 INFO:tasks.workunit.client.0.vm05.stdout:6/612: dwrite d4/d7/d10/d15/d1b/d22/f5c [0,4194304] 0
2026-03-10T08:55:45.878 INFO:tasks.workunit.client.0.vm05.stdout:0/583: dwrite df/d1f/d85/d2b/d65/d6e/d96/f8b [0,4194304] 0
2026-03-10T08:55:45.883 INFO:tasks.workunit.client.0.vm05.stdout:5/522: truncate d5/d86/d21/d89/f90 502100 0 2026-03-10T08:55:45.894 INFO:tasks.workunit.client.0.vm05.stdout:4/607: dwrite d0/f1 [4194304,4194304] 0 2026-03-10T08:55:45.894 INFO:tasks.workunit.client.0.vm05.stdout:4/608: chown d0/d2e/d42/d45/d4a/d36/dbe/dbf 205 1 2026-03-10T08:55:45.895 INFO:tasks.workunit.client.1.vm08.stdout:3/744: read d4/d15/d8/d2c/d9b/d79/d20/f84 [8197894,67514] 0 2026-03-10T08:55:45.895 INFO:tasks.workunit.client.1.vm08.stdout:1/789: write d1/da/d20/f21 [812872,87940] 0 2026-03-10T08:55:45.895 INFO:tasks.workunit.client.1.vm08.stdout:3/745: chown d4/d15/d8/d1d/d4f 989 1 2026-03-10T08:55:45.895 INFO:tasks.workunit.client.0.vm05.stdout:9/561: read d6/d27/f2b [2656124,43534] 0 2026-03-10T08:55:45.902 INFO:tasks.workunit.client.1.vm08.stdout:4/827: fdatasync d5/d23/d36/d99/db2/d5a/d69/f97 0 2026-03-10T08:55:45.909 INFO:tasks.workunit.client.1.vm08.stdout:7/817: fsync d0/d11/d1f/d29/fcf 0 2026-03-10T08:55:45.912 INFO:tasks.workunit.client.1.vm08.stdout:2/860: dwrite d1/da/d10/d42/d93/d22/f8a [0,4194304] 0 2026-03-10T08:55:45.918 INFO:tasks.workunit.client.1.vm08.stdout:4/828: dread d5/f1d [0,4194304] 0 2026-03-10T08:55:45.919 INFO:tasks.workunit.client.1.vm08.stdout:0/726: mknod d6/dd/d13/d17/d1f/d20/cee 0 2026-03-10T08:55:45.925 INFO:tasks.workunit.client.0.vm05.stdout:7/549: dwrite d18/d66/d25/d2e/d42/f52 [0,4194304] 0 2026-03-10T08:55:45.930 INFO:tasks.workunit.client.1.vm08.stdout:7/818: unlink d0/d11/c1b 0 2026-03-10T08:55:45.931 INFO:tasks.workunit.client.1.vm08.stdout:3/746: mknod d4/d15/d8/d2c/d9b/d79/c102 0 2026-03-10T08:55:45.932 INFO:tasks.workunit.client.1.vm08.stdout:2/861: rename d1/d43/d5c to d1/d97/d11f 0 2026-03-10T08:55:45.940 INFO:tasks.workunit.client.1.vm08.stdout:9/760: write d2/d54/d8e/fba [490312,76307] 0 2026-03-10T08:55:45.945 INFO:tasks.workunit.client.1.vm08.stdout:8/828: dread d1/d10/d9/dd/fc5 [0,4194304] 0 2026-03-10T08:55:45.947 
INFO:tasks.workunit.client.1.vm08.stdout:6/795: creat d9/dc/f105 x:0 0 0 2026-03-10T08:55:45.959 INFO:tasks.workunit.client.1.vm08.stdout:2/862: mknod d1/da/d10/d42/d93/daa/c120 0 2026-03-10T08:55:45.965 INFO:tasks.workunit.client.1.vm08.stdout:9/761: mkdir d2/dd/d15/d1e/d25/d32/d5c/dc2/d101 0 2026-03-10T08:55:45.969 INFO:tasks.workunit.client.1.vm08.stdout:9/762: write d2/dd/faf [48631,85772] 0 2026-03-10T08:55:45.970 INFO:tasks.workunit.client.1.vm08.stdout:1/790: dwrite d1/da/de/d24/d3d/d40/d8e/dd2/fdc [0,4194304] 0 2026-03-10T08:55:45.979 INFO:tasks.workunit.client.1.vm08.stdout:8/829: read d1/d10/d9/dd/f62 [720457,75877] 0 2026-03-10T08:55:45.979 INFO:tasks.workunit.client.1.vm08.stdout:6/796: truncate d9/d10/d1e/d32/f64 299626 0 2026-03-10T08:55:45.992 INFO:tasks.workunit.client.1.vm08.stdout:2/863: read - d1/fd2 zero size 2026-03-10T08:55:45.992 INFO:tasks.workunit.client.1.vm08.stdout:2/864: dread - d1/fd2 zero size 2026-03-10T08:55:45.994 INFO:tasks.workunit.client.1.vm08.stdout:9/763: read - d2/dd/d15/d4f/fd3 zero size 2026-03-10T08:55:45.996 INFO:tasks.workunit.client.0.vm05.stdout:2/521: mknod d0/c94 0 2026-03-10T08:55:45.997 INFO:tasks.workunit.client.0.vm05.stdout:3/675: mknod d9/d4d/d51/d64/d89/dc2/cc5 0 2026-03-10T08:55:46.002 INFO:tasks.workunit.client.1.vm08.stdout:0/727: rmdir d6/dd/d13/d17/d1f/d2d/de3 0 2026-03-10T08:55:46.008 INFO:tasks.workunit.client.1.vm08.stdout:1/791: mknod d1/da/de/d24/d3d/d40/d5b/c112 0 2026-03-10T08:55:46.009 INFO:tasks.workunit.client.1.vm08.stdout:1/792: write d1/da/de/d24/d35/d6d/fc8 [2459428,116121] 0 2026-03-10T08:55:46.012 INFO:tasks.workunit.client.0.vm05.stdout:6/613: creat d4/d7/dc4/fca x:0 0 0 2026-03-10T08:55:46.018 INFO:tasks.workunit.client.1.vm08.stdout:6/797: rmdir d9/dc/d11/d23/d2c/d81/d63/dcf 39 2026-03-10T08:55:46.023 INFO:tasks.workunit.client.1.vm08.stdout:4/829: dwrite d5/d23/d36/fce [0,4194304] 0 2026-03-10T08:55:46.027 INFO:tasks.workunit.client.1.vm08.stdout:2/865: creat d1/d5b/f121 x:0 0 0 
2026-03-10T08:55:46.033 INFO:tasks.workunit.client.1.vm08.stdout:3/747: dwrite d4/d15/d8/fad [0,4194304] 0 2026-03-10T08:55:46.033 INFO:tasks.workunit.client.1.vm08.stdout:7/819: dwrite d0/d11/d1f/d29/d3d/d89/fa6 [0,4194304] 0 2026-03-10T08:55:46.046 INFO:tasks.workunit.client.1.vm08.stdout:0/728: symlink d6/dd/d13/d17/d1f/d2d/d85/d95/lef 0 2026-03-10T08:55:46.048 INFO:tasks.workunit.client.0.vm05.stdout:4/609: write d0/d2e/d42/d45/d4a/d36/d37/fac [456598,31658] 0 2026-03-10T08:55:46.051 INFO:tasks.workunit.client.0.vm05.stdout:0/584: dwrite df/d1f/d85/d2b/d27/f4f [0,4194304] 0 2026-03-10T08:55:46.053 INFO:tasks.workunit.client.1.vm08.stdout:3/748: dwrite d4/d15/d8/d1d/d4f/fee [0,4194304] 0 2026-03-10T08:55:46.057 INFO:tasks.workunit.client.0.vm05.stdout:7/550: dwrite d18/d66/d25/f8d [0,4194304] 0 2026-03-10T08:55:46.058 INFO:tasks.workunit.client.0.vm05.stdout:2/522: rmdir d0/d9/d1e 39 2026-03-10T08:55:46.061 INFO:tasks.workunit.client.0.vm05.stdout:3/676: symlink d9/d2b/d3a/d43/d71/d86/lc6 0 2026-03-10T08:55:46.063 INFO:tasks.workunit.client.0.vm05.stdout:3/677: truncate d9/d2b/d2f/d57/fc0 414556 0 2026-03-10T08:55:46.067 INFO:tasks.workunit.client.1.vm08.stdout:6/798: rename d9/d13/d4e to d9/dc/d11/d106 0 2026-03-10T08:55:46.067 INFO:tasks.workunit.client.1.vm08.stdout:4/830: fsync d5/f8a 0 2026-03-10T08:55:46.068 INFO:tasks.workunit.client.0.vm05.stdout:6/614: fsync d4/f11 0 2026-03-10T08:55:46.070 INFO:tasks.workunit.client.0.vm05.stdout:5/523: creat d5/fc1 x:0 0 0 2026-03-10T08:55:46.071 INFO:tasks.workunit.client.0.vm05.stdout:5/524: write d5/d86/f20 [10124,46598] 0 2026-03-10T08:55:46.085 INFO:tasks.workunit.client.1.vm08.stdout:3/749: symlink d4/d15/d8/d2c/d6d/dfa/l103 0 2026-03-10T08:55:46.090 INFO:tasks.workunit.client.1.vm08.stdout:1/793: mkdir d1/da/d20/d91/d83/df4/d113 0 2026-03-10T08:55:46.106 INFO:tasks.workunit.client.0.vm05.stdout:3/678: creat d9/d2b/d3a/d43/d71/d86/fc7 x:0 0 0 2026-03-10T08:55:46.107 
INFO:tasks.workunit.client.0.vm05.stdout:5/525: rename d5/d86/f8f to d5/d86/d24/fc2 0 2026-03-10T08:55:46.107 INFO:tasks.workunit.client.1.vm08.stdout:4/831: creat d5/d23/d36/d99/dc6/dc8/f12c x:0 0 0 2026-03-10T08:55:46.107 INFO:tasks.workunit.client.1.vm08.stdout:6/799: truncate d9/d10/d1e/d32/f12 4320162 0 2026-03-10T08:55:46.109 INFO:tasks.workunit.client.1.vm08.stdout:8/830: creat d1/d10/d9/dd/d25/d27/d44/d21/f133 x:0 0 0 2026-03-10T08:55:46.110 INFO:tasks.workunit.client.1.vm08.stdout:8/831: chown d1/d10/d9/dd/d18/d34/f117 0 1 2026-03-10T08:55:46.113 INFO:tasks.workunit.client.0.vm05.stdout:3/679: mknod d9/d2b/d2f/d57/cc8 0 2026-03-10T08:55:46.117 INFO:tasks.workunit.client.0.vm05.stdout:3/680: dwrite d9/d4d/d51/faa [0,4194304] 0 2026-03-10T08:55:46.122 INFO:tasks.workunit.client.0.vm05.stdout:0/585: creat df/fab x:0 0 0 2026-03-10T08:55:46.125 INFO:tasks.workunit.client.1.vm08.stdout:2/866: creat d1/da/d10/d1b/f122 x:0 0 0 2026-03-10T08:55:46.126 INFO:tasks.workunit.client.1.vm08.stdout:3/750: dread d4/d15/d8/d1d/d4f/fb0 [0,4194304] 0 2026-03-10T08:55:46.134 INFO:tasks.workunit.client.0.vm05.stdout:0/586: dwrite df/d1f/d85/d19/d55/fa9 [0,4194304] 0 2026-03-10T08:55:46.142 INFO:tasks.workunit.client.1.vm08.stdout:1/794: mkdir d1/da/d20/d114 0 2026-03-10T08:55:46.143 INFO:tasks.workunit.client.0.vm05.stdout:3/681: symlink d9/d2b/d3a/lc9 0 2026-03-10T08:55:46.148 INFO:tasks.workunit.client.0.vm05.stdout:3/682: dwrite d9/d2b/d3a/d6c/f74 [4194304,4194304] 0 2026-03-10T08:55:46.150 INFO:tasks.workunit.client.0.vm05.stdout:3/683: dread - d9/d2b/d53/fa7 zero size 2026-03-10T08:55:46.154 INFO:tasks.workunit.client.0.vm05.stdout:5/526: symlink d5/df/d37/d68/db6/lc3 0 2026-03-10T08:55:46.155 INFO:tasks.workunit.client.0.vm05.stdout:5/527: truncate d5/d86/d21/f9e 343886 0 2026-03-10T08:55:46.159 INFO:tasks.workunit.client.0.vm05.stdout:0/587: fdatasync df/d1f/f2d 0 2026-03-10T08:55:46.162 INFO:tasks.workunit.client.0.vm05.stdout:3/684: truncate d9/d2b/d53/d61/f99 997164 
0 2026-03-10T08:55:46.181 INFO:tasks.workunit.client.0.vm05.stdout:6/615: dread d4/d2c/f7a [0,4194304] 0 2026-03-10T08:55:46.183 INFO:tasks.workunit.client.0.vm05.stdout:6/616: mknod d4/d7/d10/d1a/d1f/ccb 0 2026-03-10T08:55:46.191 INFO:tasks.workunit.client.1.vm08.stdout:8/832: rmdir d1/d10/d9/dd/d25/d27/d44/d21/d5f 39 2026-03-10T08:55:46.193 INFO:tasks.workunit.client.1.vm08.stdout:9/764: dwrite d2/dd/d15/d1e/d21/fc7 [0,4194304] 0 2026-03-10T08:55:46.195 INFO:tasks.workunit.client.1.vm08.stdout:6/800: mknod d9/dc/d11/d23/d2c/d7a/dce/d69/da2/c107 0 2026-03-10T08:55:46.195 INFO:tasks.workunit.client.0.vm05.stdout:6/617: link d4/d2d/d51/d87/c90 d4/d7/d10/d15/d1b/d22/ccc 0 2026-03-10T08:55:46.197 INFO:tasks.workunit.client.1.vm08.stdout:3/751: symlink d4/d15/d8/d2c/d89/l104 0 2026-03-10T08:55:46.200 INFO:tasks.workunit.client.0.vm05.stdout:0/588: dread df/d1f/d85/f2a [0,4194304] 0 2026-03-10T08:55:46.206 INFO:tasks.workunit.client.0.vm05.stdout:0/589: dwrite df/d1f/d85/d2b/d27/f60 [4194304,4194304] 0 2026-03-10T08:55:46.211 INFO:tasks.workunit.client.1.vm08.stdout:1/795: mknod d1/da/de/d24/d3d/d40/d56/d7a/c115 0 2026-03-10T08:55:46.211 INFO:tasks.workunit.client.1.vm08.stdout:1/796: chown d1/fac 37910 1 2026-03-10T08:55:46.212 INFO:tasks.workunit.client.1.vm08.stdout:1/797: chown d1/da/de/d24/d3d/d40/ffe 124543831 1 2026-03-10T08:55:46.213 INFO:tasks.workunit.client.1.vm08.stdout:8/833: symlink d1/d10/l134 0 2026-03-10T08:55:46.214 INFO:tasks.workunit.client.0.vm05.stdout:0/590: symlink df/d1f/d85/d2b/d27/lac 0 2026-03-10T08:55:46.215 INFO:tasks.workunit.client.1.vm08.stdout:4/832: getdents d5/d23/d36/d99/db2/d5a/d69/d11b 0 2026-03-10T08:55:46.217 INFO:tasks.workunit.client.0.vm05.stdout:0/591: mknod df/d1f/d85/d19/d55/cad 0 2026-03-10T08:55:46.217 INFO:tasks.workunit.client.1.vm08.stdout:1/798: rename d1/da/d20/d3f/d49 to d1/da/de/d24/d35/d6d/d116 0 2026-03-10T08:55:46.218 INFO:tasks.workunit.client.0.vm05.stdout:0/592: unlink df/d1f/d85/d2b/d27/d32/l58 0 
2026-03-10T08:55:46.219 INFO:tasks.workunit.client.1.vm08.stdout:6/801: sync 2026-03-10T08:55:46.223 INFO:tasks.workunit.client.1.vm08.stdout:4/833: unlink d5/l12 0 2026-03-10T08:55:46.223 INFO:tasks.workunit.client.0.vm05.stdout:0/593: dread df/d1f/d48/f75 [0,4194304] 0 2026-03-10T08:55:46.226 INFO:tasks.workunit.client.1.vm08.stdout:9/765: dread d2/dd/d15/d1e/d25/f4b [0,4194304] 0 2026-03-10T08:55:46.228 INFO:tasks.workunit.client.1.vm08.stdout:6/802: truncate d9/dc/d11/d23/d2c/d7a/fd3 991870 0 2026-03-10T08:55:46.229 INFO:tasks.workunit.client.0.vm05.stdout:9/562: truncate d6/d12/d43/f91 605667 0 2026-03-10T08:55:46.229 INFO:tasks.workunit.client.1.vm08.stdout:1/799: dread d1/da/de/d5c/fb5 [0,4194304] 0 2026-03-10T08:55:46.231 INFO:tasks.workunit.client.1.vm08.stdout:6/803: chown d9/d10/d1e/f58 101 1 2026-03-10T08:55:46.231 INFO:tasks.workunit.client.1.vm08.stdout:4/834: rmdir d5/d23/d36/d99/db2/d5a 39 2026-03-10T08:55:46.234 INFO:tasks.workunit.client.0.vm05.stdout:9/563: getdents d6/d19/d2a/d8d 0 2026-03-10T08:55:46.237 INFO:tasks.workunit.client.0.vm05.stdout:9/564: creat d6/d19/d2c/fbd x:0 0 0 2026-03-10T08:55:46.238 INFO:tasks.workunit.client.0.vm05.stdout:9/565: symlink d6/d19/d2c/d84/lbe 0 2026-03-10T08:55:46.243 INFO:tasks.workunit.client.0.vm05.stdout:9/566: rmdir d6/d15/daf 0 2026-03-10T08:55:46.243 INFO:tasks.workunit.client.1.vm08.stdout:6/804: link d9/dc/d84/d80/fc1 d9/dc/d11/d23/d2c/d81/d63/f108 0 2026-03-10T08:55:46.244 INFO:tasks.workunit.client.1.vm08.stdout:9/766: dread d2/d41/d4c/d66/d82/ff6 [0,4194304] 0 2026-03-10T08:55:46.244 INFO:tasks.workunit.client.0.vm05.stdout:9/567: mknod d6/cbf 0 2026-03-10T08:55:46.246 INFO:tasks.workunit.client.1.vm08.stdout:9/767: write d2/d41/d4c/f80 [1635300,3934] 0 2026-03-10T08:55:46.247 INFO:tasks.workunit.client.0.vm05.stdout:9/568: creat d6/d15/d35/fc0 x:0 0 0 2026-03-10T08:55:46.248 INFO:tasks.workunit.client.0.vm05.stdout:9/569: creat d6/d15/fc1 x:0 0 0 2026-03-10T08:55:46.249 
INFO:tasks.workunit.client.0.vm05.stdout:9/570: dread d6/d19/d2a/d4a/d8c/fa7 [0,4194304] 0 2026-03-10T08:55:46.252 INFO:tasks.workunit.client.1.vm08.stdout:3/752: dread d4/d15/d8/f41 [0,4194304] 0 2026-03-10T08:55:46.258 INFO:tasks.workunit.client.1.vm08.stdout:3/753: fsync d4/d15/d8/d2c/d9b/d79/f34 0 2026-03-10T08:55:46.258 INFO:tasks.workunit.client.1.vm08.stdout:3/754: dwrite d4/d15/d8/d1d/fff [0,4194304] 0 2026-03-10T08:55:46.268 INFO:tasks.workunit.client.1.vm08.stdout:9/768: sync 2026-03-10T08:55:46.314 INFO:tasks.workunit.client.1.vm08.stdout:0/729: write d6/dd/d13/d17/d1f/d20/d2f/d57/fcd [1012569,55311] 0 2026-03-10T08:55:46.320 INFO:tasks.workunit.client.1.vm08.stdout:0/730: symlink d6/dd/d13/d17/d1f/d20/d2f/d57/lf0 0 2026-03-10T08:55:46.322 INFO:tasks.workunit.client.1.vm08.stdout:0/731: getdents d6/dd/d13/d61/dc7/dc8/dde 0 2026-03-10T08:55:46.328 INFO:tasks.workunit.client.0.vm05.stdout:7/551: dwrite d18/d66/d25/d2e/f48 [0,4194304] 0 2026-03-10T08:55:46.331 INFO:tasks.workunit.client.0.vm05.stdout:2/523: write d0/d9/d7f/d8f/f54 [1886003,77075] 0 2026-03-10T08:55:46.331 INFO:tasks.workunit.client.1.vm08.stdout:0/732: creat d6/dd/d13/d17/d1f/ff1 x:0 0 0 2026-03-10T08:55:46.340 INFO:tasks.workunit.client.1.vm08.stdout:7/820: write d0/d11/d1f/d29/d36/d75/f85 [3013184,9271] 0 2026-03-10T08:55:46.341 INFO:tasks.workunit.client.0.vm05.stdout:4/610: truncate d0/d2c/f2f 3356029 0 2026-03-10T08:55:46.341 INFO:tasks.workunit.client.1.vm08.stdout:7/821: dread - d0/d11/d1f/d29/d3b/d80/fa2 zero size 2026-03-10T08:55:46.342 INFO:tasks.workunit.client.1.vm08.stdout:7/822: chown d0/d11/d4a/da3 28431891 1 2026-03-10T08:55:46.355 INFO:tasks.workunit.client.0.vm05.stdout:2/524: chown d0/d9/d1e/l83 52286688 1 2026-03-10T08:55:46.355 INFO:tasks.workunit.client.0.vm05.stdout:2/525: write d0/d9/d1e/d20/f8b [393175,87658] 0 2026-03-10T08:55:46.357 INFO:tasks.workunit.client.0.vm05.stdout:7/552: dread d18/d66/d25/d2e/d42/f5a [0,4194304] 0 2026-03-10T08:55:46.362 
INFO:tasks.workunit.client.0.vm05.stdout:4/611: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/lc3 0 2026-03-10T08:55:46.366 INFO:tasks.workunit.client.1.vm08.stdout:0/733: creat d6/dd/d13/d61/dc7/dc8/dde/ff2 x:0 0 0 2026-03-10T08:55:46.366 INFO:tasks.workunit.client.0.vm05.stdout:3/685: rename d9/d2b/d53/d61 to d9/d4d/dca 0 2026-03-10T08:55:46.367 INFO:tasks.workunit.client.1.vm08.stdout:0/734: read - d6/dd/d13/d17/d1f/d20/f6a zero size 2026-03-10T08:55:46.368 INFO:tasks.workunit.client.0.vm05.stdout:2/526: dread d0/d9/f19 [0,4194304] 0 2026-03-10T08:55:46.369 INFO:tasks.workunit.client.0.vm05.stdout:2/527: write d0/d9/d1e/d20/f3a [1029953,122757] 0 2026-03-10T08:55:46.371 INFO:tasks.workunit.client.0.vm05.stdout:5/528: write d5/d86/d24/d2c/d41/d74/f9f [8582642,51054] 0 2026-03-10T08:55:46.372 INFO:tasks.workunit.client.0.vm05.stdout:5/529: chown d5/df/l1d 19976690 1 2026-03-10T08:55:46.374 INFO:tasks.workunit.client.1.vm08.stdout:0/735: symlink d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/lf3 0 2026-03-10T08:55:46.374 INFO:tasks.workunit.client.1.vm08.stdout:0/736: chown d6/dd/d13/ldb 2 1 2026-03-10T08:55:46.374 INFO:tasks.workunit.client.0.vm05.stdout:7/553: creat d18/d66/d78/fa6 x:0 0 0 2026-03-10T08:55:46.380 INFO:tasks.workunit.client.1.vm08.stdout:0/737: sync 2026-03-10T08:55:46.396 INFO:tasks.workunit.client.1.vm08.stdout:2/867: dwrite d1/da/d78/f95 [0,4194304] 0 2026-03-10T08:55:46.396 INFO:tasks.workunit.client.0.vm05.stdout:7/554: creat d18/d38/d43/d5c/fa7 x:0 0 0 2026-03-10T08:55:46.397 INFO:tasks.workunit.client.0.vm05.stdout:7/555: readlink d18/d66/d78/la5 0 2026-03-10T08:55:46.400 INFO:tasks.workunit.client.0.vm05.stdout:6/618: rename d4/d92/f96 to d4/d7/d10/d15/d1b/fcd 0 2026-03-10T08:55:46.400 INFO:tasks.workunit.client.1.vm08.stdout:0/738: rename d6/dd/d13/d17/d1f/d20/d2f/d26/f73 to d6/dd/d13/d17/d1f/d2d/ff4 0 2026-03-10T08:55:46.401 INFO:tasks.workunit.client.0.vm05.stdout:6/619: truncate d4/d7/d10/d15/fc5 961265 0 2026-03-10T08:55:46.401 
INFO:tasks.workunit.client.1.vm08.stdout:0/739: fsync d6/dd/d13/d17/d1f/d20/d2f/d24/fed 0 2026-03-10T08:55:46.406 INFO:tasks.workunit.client.1.vm08.stdout:8/834: write d1/d10/d9/f73 [124242,13111] 0 2026-03-10T08:55:46.412 INFO:tasks.workunit.client.0.vm05.stdout:7/556: mknod d18/d66/d25/d2e/d42/d74/ca8 0 2026-03-10T08:55:46.414 INFO:tasks.workunit.client.1.vm08.stdout:0/740: creat d6/dd/d13/d17/d1f/d2d/d85/d95/ff5 x:0 0 0 2026-03-10T08:55:46.415 INFO:tasks.workunit.client.0.vm05.stdout:0/594: write df/f79 [3676553,97437] 0 2026-03-10T08:55:46.415 INFO:tasks.workunit.client.0.vm05.stdout:0/595: chown df/d1f 29386678 1 2026-03-10T08:55:46.427 INFO:tasks.workunit.client.1.vm08.stdout:4/835: dwrite d5/d23/d36/d76/ff0 [0,4194304] 0 2026-03-10T08:55:46.431 INFO:tasks.workunit.client.1.vm08.stdout:8/835: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d135 0 2026-03-10T08:55:46.434 INFO:tasks.workunit.client.1.vm08.stdout:6/805: dwrite d9/d10/d1e/d7b/fbc [0,4194304] 0 2026-03-10T08:55:46.436 INFO:tasks.workunit.client.1.vm08.stdout:6/806: chown d9/d10/l20 88 1 2026-03-10T08:55:46.437 INFO:tasks.workunit.client.1.vm08.stdout:1/800: dwrite d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fe9 [0,4194304] 0 2026-03-10T08:55:46.439 INFO:tasks.workunit.client.0.vm05.stdout:9/571: dwrite d6/d19/d21/f8a [0,4194304] 0 2026-03-10T08:55:46.440 INFO:tasks.workunit.client.0.vm05.stdout:4/612: dread d0/d2e/d42/d45/d4a/f47 [4194304,4194304] 0 2026-03-10T08:55:46.442 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:46 vm05.local ceph-mon[49713]: pgmap v162: 65 pgs: 65 active+clean; 2.8 GiB data, 9.6 GiB used, 110 GiB / 120 GiB avail; 48 MiB/s rd, 111 MiB/s wr, 262 op/s 2026-03-10T08:55:46.442 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:46 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:46.453 INFO:tasks.workunit.client.0.vm05.stdout:7/557: link d18/d66/d25/d2e/d42/l92 
d18/d66/d25/d2e/d2f/da0/la9 0 2026-03-10T08:55:46.457 INFO:tasks.workunit.client.1.vm08.stdout:9/769: dwrite d2/dd/d15/d1e/d24/f5e [0,4194304] 0 2026-03-10T08:55:46.458 INFO:tasks.workunit.client.1.vm08.stdout:3/755: dwrite d4/d15/f7 [0,4194304] 0 2026-03-10T08:55:46.462 INFO:tasks.workunit.client.1.vm08.stdout:0/741: stat d6/dd/d13/d32/cc5 0 2026-03-10T08:55:46.465 INFO:tasks.workunit.client.0.vm05.stdout:6/620: getdents d4/d7 0 2026-03-10T08:55:46.471 INFO:tasks.workunit.client.1.vm08.stdout:7/823: write d0/d11/d1f/d29/d3b/fac [814752,22459] 0 2026-03-10T08:55:46.472 INFO:tasks.workunit.client.1.vm08.stdout:0/742: dread d6/dd/d13/d17/d1f/d2d/d39/f4a [0,4194304] 0 2026-03-10T08:55:46.495 INFO:tasks.workunit.client.0.vm05.stdout:9/572: read - d6/d19/d2a/f87 zero size 2026-03-10T08:55:46.500 INFO:tasks.workunit.client.0.vm05.stdout:6/621: dread d4/d7/ff [0,4194304] 0 2026-03-10T08:55:46.503 INFO:tasks.workunit.client.1.vm08.stdout:9/770: dread d2/dd/d15/d1e/d39/f57 [0,4194304] 0 2026-03-10T08:55:46.503 INFO:tasks.workunit.client.0.vm05.stdout:4/613: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/fc4 x:0 0 0 2026-03-10T08:55:46.508 INFO:tasks.workunit.client.0.vm05.stdout:5/530: dwrite d5/d86/d21/d71/f9a [0,4194304] 0 2026-03-10T08:55:46.515 INFO:tasks.workunit.client.1.vm08.stdout:5/713: dread d0/d11/d27/f3b [0,4194304] 0 2026-03-10T08:55:46.516 INFO:tasks.workunit.client.0.vm05.stdout:2/528: dwrite d0/d9/d1e/d20/d21/f23 [0,4194304] 0 2026-03-10T08:55:46.517 INFO:tasks.workunit.client.1.vm08.stdout:5/714: chown d0/d11/d27/d68/d7c/d4b/d4e/c5c 62918140 1 2026-03-10T08:55:46.523 INFO:tasks.workunit.client.0.vm05.stdout:1/656: dread dd/d10/d18/d2d/d51/f6e [0,4194304] 0 2026-03-10T08:55:46.531 INFO:tasks.workunit.client.1.vm08.stdout:3/756: creat d4/d15/d8/d1d/d4f/f105 x:0 0 0 2026-03-10T08:55:46.531 INFO:tasks.workunit.client.1.vm08.stdout:3/757: fdatasync f1 0 2026-03-10T08:55:46.531 INFO:tasks.workunit.client.1.vm08.stdout:2/868: write d1/d43/f5d [482116,84982] 0 
2026-03-10T08:55:46.535 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:46 vm08.local ceph-mon[57559]: pgmap v162: 65 pgs: 65 active+clean; 2.8 GiB data, 9.6 GiB used, 110 GiB / 120 GiB avail; 48 MiB/s rd, 111 MiB/s wr, 262 op/s 2026-03-10T08:55:46.536 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:46 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:46.536 INFO:tasks.workunit.client.1.vm08.stdout:4/836: mkdir d5/d23/d109/d12d 0 2026-03-10T08:55:46.542 INFO:tasks.workunit.client.0.vm05.stdout:4/614: unlink d0/d1d/f24 0 2026-03-10T08:55:46.545 INFO:tasks.workunit.client.0.vm05.stdout:6/622: creat d4/d7/d10/d1a/d1f/fce x:0 0 0 2026-03-10T08:55:46.547 INFO:tasks.workunit.client.0.vm05.stdout:7/558: symlink d18/d38/d43/laa 0 2026-03-10T08:55:46.549 INFO:tasks.workunit.client.1.vm08.stdout:8/836: mknod d1/d10/d9/c136 0 2026-03-10T08:55:46.550 INFO:tasks.workunit.client.0.vm05.stdout:5/531: mkdir d5/d48/d64/dc4 0 2026-03-10T08:55:46.550 INFO:tasks.workunit.client.0.vm05.stdout:5/532: stat d5/df/d37/l54 0 2026-03-10T08:55:46.551 INFO:tasks.workunit.client.1.vm08.stdout:8/837: chown d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/fc2 51 1 2026-03-10T08:55:46.552 INFO:tasks.workunit.client.0.vm05.stdout:2/529: symlink d0/d9/d89/l95 0 2026-03-10T08:55:46.553 INFO:tasks.workunit.client.1.vm08.stdout:8/838: chown d1/d10/d9/dd/d25/d27/d44/d21 4 1 2026-03-10T08:55:46.553 INFO:tasks.workunit.client.0.vm05.stdout:2/530: write d0/d9/d7f/f80 [593892,124222] 0 2026-03-10T08:55:46.553 INFO:tasks.workunit.client.1.vm08.stdout:5/715: fsync d0/d11/f25 0 2026-03-10T08:55:46.554 INFO:tasks.workunit.client.0.vm05.stdout:1/657: truncate dd/d13/f42 1193814 0 2026-03-10T08:55:46.555 INFO:tasks.workunit.client.1.vm08.stdout:0/743: dwrite d6/dd/d13/d17/f6d [0,4194304] 0 2026-03-10T08:55:46.556 INFO:tasks.workunit.client.0.vm05.stdout:9/573: mknod d6/d27/cc2 0 
2026-03-10T08:55:46.558 INFO:tasks.workunit.client.1.vm08.stdout:2/869: creat d1/da/d10/d42/d93/de2/f123 x:0 0 0 2026-03-10T08:55:46.559 INFO:tasks.workunit.client.1.vm08.stdout:4/837: mknod d5/d23/d109/c12e 0 2026-03-10T08:55:46.560 INFO:tasks.workunit.client.0.vm05.stdout:4/615: fdatasync d0/fb 0 2026-03-10T08:55:46.572 INFO:tasks.workunit.client.0.vm05.stdout:6/623: creat d4/d7/d10/d15/d1b/d22/fcf x:0 0 0 2026-03-10T08:55:46.572 INFO:tasks.workunit.client.1.vm08.stdout:9/771: fdatasync d2/dd/d15/d1e/d21/f90 0 2026-03-10T08:55:46.573 INFO:tasks.workunit.client.0.vm05.stdout:6/624: write d4/d7/d10/d15/d1b/d22/fa4 [2235934,130726] 0 2026-03-10T08:55:46.579 INFO:tasks.workunit.client.0.vm05.stdout:5/533: symlink d5/df/lc5 0 2026-03-10T08:55:46.582 INFO:tasks.workunit.client.1.vm08.stdout:1/801: rmdir d1/da/d20/d114 0 2026-03-10T08:55:46.586 INFO:tasks.workunit.client.1.vm08.stdout:8/839: unlink d1/d10/d9/dd/d18/d34/f10b 0 2026-03-10T08:55:46.589 INFO:tasks.workunit.client.0.vm05.stdout:9/574: symlink d6/d19/d2c/d84/lc3 0 2026-03-10T08:55:46.594 INFO:tasks.workunit.client.0.vm05.stdout:5/534: fsync d5/d48/f7e 0 2026-03-10T08:55:46.595 INFO:tasks.workunit.client.0.vm05.stdout:5/535: stat d5/df/f2f 0 2026-03-10T08:55:46.598 INFO:tasks.workunit.client.0.vm05.stdout:5/536: dread d5/df/dbb/f4a [4194304,4194304] 0 2026-03-10T08:55:46.600 INFO:tasks.workunit.client.0.vm05.stdout:1/658: creat dd/d10/d19/d9b/dc3/fee x:0 0 0 2026-03-10T08:55:46.604 INFO:tasks.workunit.client.1.vm08.stdout:5/716: symlink d0/d11/d18/ldb 0 2026-03-10T08:55:46.611 INFO:tasks.workunit.client.1.vm08.stdout:3/758: creat d4/f106 x:0 0 0 2026-03-10T08:55:46.611 INFO:tasks.workunit.client.1.vm08.stdout:2/870: truncate d1/da/d10/d42/d93/d1e/dce/d52/f119 807296 0 2026-03-10T08:55:46.611 INFO:tasks.workunit.client.0.vm05.stdout:9/575: read - d6/d27/fa6 zero size 2026-03-10T08:55:46.611 INFO:tasks.workunit.client.0.vm05.stdout:6/625: mknod d4/d2c/d84/db6/dc6/cd0 0 2026-03-10T08:55:46.612 
INFO:tasks.workunit.client.0.vm05.stdout:5/537: mkdir d5/d48/d64/d95/dac/dc6 0 2026-03-10T08:55:46.612 INFO:tasks.workunit.client.0.vm05.stdout:5/538: write d5/d48/f93 [2235753,2690] 0 2026-03-10T08:55:46.612 INFO:tasks.workunit.client.1.vm08.stdout:9/772: mkdir d2/dd/d15/d4f/df1/d102 0 2026-03-10T08:55:46.612 INFO:tasks.workunit.client.1.vm08.stdout:9/773: chown d2/dd 3 1 2026-03-10T08:55:46.615 INFO:tasks.workunit.client.1.vm08.stdout:8/840: symlink d1/d10/d9/dd/d25/d27/d44/d21/d135/l137 0 2026-03-10T08:55:46.616 INFO:tasks.workunit.client.1.vm08.stdout:3/759: mkdir d4/d15/d8/d1d/d107 0 2026-03-10T08:55:46.617 INFO:tasks.workunit.client.1.vm08.stdout:3/760: stat d4/d15/d8/d2c/d6d/c94 0 2026-03-10T08:55:46.618 INFO:tasks.workunit.client.1.vm08.stdout:3/761: read - d4/d15/d8/d1d/fe6 zero size 2026-03-10T08:55:46.620 INFO:tasks.workunit.client.1.vm08.stdout:1/802: rename d1/da/d20/d91/lcb to d1/da/d20/d91/d83/df4/l117 0 2026-03-10T08:55:46.625 INFO:tasks.workunit.client.1.vm08.stdout:9/774: mkdir d2/d41/d53/d103 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.0.vm05.stdout:6/626: symlink d4/d7/d10/dc3/ld1 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.0.vm05.stdout:6/627: write d4/d7/f4d [381248,123920] 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:3/762: rmdir d4/d15/d8 39 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:2/871: mknod d1/da/d10/d42/d93/d1e/d7b/c124 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:5/717: link d0/d11/d18/d52/db9/fc9 d0/d11/d27/fdc 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:9/775: creat d2/dd/d15/d1e/d39/d69/de4/f104 x:0 0 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:6/807: dread d9/d10/d1e/d32/f4d [0,4194304] 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:5/718: creat d0/d11/d3e/fdd x:0 0 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:1/803: truncate d1/da/de/d24/d35/d6d/d116/f10f 
1019975 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:4/838: rename d5/d23/d36/d99/db2/d5a/d69/d11b/d96/f11a to d5/d23/d36/d99/db2/f12f 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:3/763: fdatasync d4/d15/d8/d2c/d9b/d79/f34 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:6/808: mknod d9/dc/d11/d23/d2c/d7a/dce/d69/da2/c109 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:6/809: chown d9/dc/d11/f8d 23528388 1 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:3/764: creat d4/d15/d8/d2c/f108 x:0 0 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:3/765: fsync d4/d15/d8/d1d/fff 0 2026-03-10T08:55:46.699 INFO:tasks.workunit.client.1.vm08.stdout:3/766: dwrite d4/d15/d8/d1d/da8/ff4 [0,4194304] 0 2026-03-10T08:55:46.758 INFO:tasks.workunit.client.1.vm08.stdout:3/767: read d4/d15/d8/d2c/f8c [85486,119924] 0 2026-03-10T08:55:46.760 INFO:tasks.workunit.client.1.vm08.stdout:3/768: creat d4/d6f/d85/df1/f109 x:0 0 0 2026-03-10T08:55:46.761 INFO:tasks.workunit.client.1.vm08.stdout:3/769: read d4/d15/fa [2875092,87030] 0 2026-03-10T08:55:46.764 INFO:tasks.workunit.client.0.vm05.stdout:4/616: sync 2026-03-10T08:55:46.775 INFO:tasks.workunit.client.0.vm05.stdout:4/617: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/fc4 33096799 1 2026-03-10T08:55:46.776 INFO:tasks.workunit.client.0.vm05.stdout:4/618: mknod d0/d2e/d42/d45/d4a/d36/cc5 0 2026-03-10T08:55:46.776 INFO:tasks.workunit.client.0.vm05.stdout:4/619: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/c6d 2039850 1 2026-03-10T08:55:46.776 INFO:tasks.workunit.client.0.vm05.stdout:4/620: link d0/d2e/d42/d45/c8a d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/cc6 0 2026-03-10T08:55:46.776 INFO:tasks.workunit.client.1.vm08.stdout:8/841: sync 2026-03-10T08:55:46.776 INFO:tasks.workunit.client.1.vm08.stdout:2/872: sync 2026-03-10T08:55:46.783 INFO:tasks.workunit.client.1.vm08.stdout:2/873: write d1/da/d10/d42/d93/d1e/f1f [4402358,55096] 0 
2026-03-10T08:55:46.787 INFO:tasks.workunit.client.0.vm05.stdout:7/559: write d18/d38/d43/d6e/f76 [347522,93184] 0 2026-03-10T08:55:46.790 INFO:tasks.workunit.client.0.vm05.stdout:2/531: write d0/d9/d1e/d20/d21/f46 [1010602,29333] 0 2026-03-10T08:55:46.793 INFO:tasks.workunit.client.1.vm08.stdout:0/744: write d6/dd/d13/d8f/fbb [357244,89411] 0 2026-03-10T08:55:46.796 INFO:tasks.workunit.client.0.vm05.stdout:2/532: getdents d0/d9/d1e/d20/d21/d45/d6c 0 2026-03-10T08:55:46.796 INFO:tasks.workunit.client.1.vm08.stdout:2/874: creat d1/da/d10/d1b/d111/f125 x:0 0 0 2026-03-10T08:55:46.797 INFO:tasks.workunit.client.0.vm05.stdout:2/533: creat d0/d9/d89/f96 x:0 0 0 2026-03-10T08:55:46.806 INFO:tasks.workunit.client.1.vm08.stdout:2/875: rmdir d1/da/d10/d42/d93/d1e 39 2026-03-10T08:55:46.811 INFO:tasks.workunit.client.1.vm08.stdout:2/876: truncate d1/da/d10/d42/d93/d1e/dce/d52/f112 927922 0 2026-03-10T08:55:46.811 INFO:tasks.workunit.client.0.vm05.stdout:1/659: write dd/d10/d18/d2d/d51/d58/fa0 [1532462,15520] 0 2026-03-10T08:55:46.811 INFO:tasks.workunit.client.0.vm05.stdout:5/539: write d5/d86/f59 [956836,9545] 0 2026-03-10T08:55:46.812 INFO:tasks.workunit.client.0.vm05.stdout:9/576: write d6/d15/f25 [1423267,77314] 0 2026-03-10T08:55:46.813 INFO:tasks.workunit.client.1.vm08.stdout:2/877: rename d1/d5b/dc5/f106 to d1/da/d78/df5/f126 0 2026-03-10T08:55:46.814 INFO:tasks.workunit.client.0.vm05.stdout:6/628: write d4/fa8 [460409,92710] 0 2026-03-10T08:55:46.816 INFO:tasks.workunit.client.0.vm05.stdout:7/560: sync 2026-03-10T08:55:46.816 INFO:tasks.workunit.client.0.vm05.stdout:7/561: stat d18/d38/d43/d6e/f9b 0 2026-03-10T08:55:46.823 INFO:tasks.workunit.client.1.vm08.stdout:9/776: write d2/dd/d15/d1e/d39/d69/de4/df2/fa0 [364742,107716] 0 2026-03-10T08:55:46.824 INFO:tasks.workunit.client.1.vm08.stdout:9/777: fdatasync d2/d41/ffa 0 2026-03-10T08:55:46.825 INFO:tasks.workunit.client.0.vm05.stdout:9/577: readlink d6/d19/d2a/l99 0 2026-03-10T08:55:46.827 
INFO:tasks.workunit.client.1.vm08.stdout:1/804: write d1/da/de/d24/d3d/d40/d84/fa5 [453670,51877] 0 2026-03-10T08:55:46.827 INFO:tasks.workunit.client.0.vm05.stdout:6/629: creat d4/d7/d10/d1a/d89/fd2 x:0 0 0 2026-03-10T08:55:46.829 INFO:tasks.workunit.client.1.vm08.stdout:5/719: dwrite d0/d11/d27/d68/d7c/d4b/d4e/f56 [0,4194304] 0 2026-03-10T08:55:46.831 INFO:tasks.workunit.client.1.vm08.stdout:5/720: chown d0/d11/d27/d68/d7c/f42 47 1 2026-03-10T08:55:46.834 INFO:tasks.workunit.client.0.vm05.stdout:7/562: symlink d18/d66/d25/d2e/d2f/lab 0 2026-03-10T08:55:46.835 INFO:tasks.workunit.client.1.vm08.stdout:6/810: write d9/d10/f25 [5012652,17200] 0 2026-03-10T08:55:46.836 INFO:tasks.workunit.client.0.vm05.stdout:5/540: symlink d5/lc7 0 2026-03-10T08:55:46.836 INFO:tasks.workunit.client.0.vm05.stdout:5/541: dread - d5/fc0 zero size 2026-03-10T08:55:46.838 INFO:tasks.workunit.client.1.vm08.stdout:4/839: dwrite d5/d23/d36/d99/db2/d5a/ddb/fe9 [0,4194304] 0 2026-03-10T08:55:46.840 INFO:tasks.workunit.client.0.vm05.stdout:7/563: rename d18/d66/d25/d2e/d32 to d18/d66/d25/d2e/d42/d9c/dac 0 2026-03-10T08:55:46.840 INFO:tasks.workunit.client.1.vm08.stdout:9/778: creat d2/dd/d15/d1e/d25/d32/d5c/f105 x:0 0 0 2026-03-10T08:55:46.840 INFO:tasks.workunit.client.0.vm05.stdout:7/564: dread - d18/d66/d25/d2e/d42/fa3 zero size 2026-03-10T08:55:46.844 INFO:tasks.workunit.client.0.vm05.stdout:5/542: unlink d5/d48/d64/d95/c99 0 2026-03-10T08:55:46.846 INFO:tasks.workunit.client.1.vm08.stdout:3/770: rmdir d4 39 2026-03-10T08:55:46.847 INFO:tasks.workunit.client.0.vm05.stdout:1/660: write dd/d21/d37/f8c [20450,59651] 0 2026-03-10T08:55:46.849 INFO:tasks.workunit.client.0.vm05.stdout:2/534: dwrite d0/d9/d1e/d20/d21/d45/d4b/f6b [0,4194304] 0 2026-03-10T08:55:46.852 INFO:tasks.workunit.client.1.vm08.stdout:6/811: fsync d9/dc/d11/d106/fa8 0 2026-03-10T08:55:46.855 INFO:tasks.workunit.client.1.vm08.stdout:5/721: dread d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:46.856 
INFO:tasks.workunit.client.0.vm05.stdout:5/543: sync 2026-03-10T08:55:46.856 INFO:tasks.workunit.client.1.vm08.stdout:2/878: dwrite d1/da/d10/d2d/db6/ff3 [0,4194304] 0 2026-03-10T08:55:46.856 INFO:tasks.workunit.client.1.vm08.stdout:8/842: dwrite d1/d10/fac [0,4194304] 0 2026-03-10T08:55:46.858 INFO:tasks.workunit.client.1.vm08.stdout:5/722: chown d0/d11/d27/f3d 474170 1 2026-03-10T08:55:46.859 INFO:tasks.workunit.client.0.vm05.stdout:1/661: mknod dd/d10/d19/d9b/cef 0 2026-03-10T08:55:46.862 INFO:tasks.workunit.client.0.vm05.stdout:5/544: dread - d5/d86/d24/d2c/f88 zero size 2026-03-10T08:55:46.863 INFO:tasks.workunit.client.0.vm05.stdout:5/545: chown d5/d86/f9d 103 1 2026-03-10T08:55:46.864 INFO:tasks.workunit.client.0.vm05.stdout:5/546: write d5/d86/d66/fa5 [181820,2499] 0 2026-03-10T08:55:46.864 INFO:tasks.workunit.client.0.vm05.stdout:5/547: dread - d5/d86/d24/f51 zero size 2026-03-10T08:55:46.869 INFO:tasks.workunit.client.1.vm08.stdout:4/840: symlink d5/l130 0 2026-03-10T08:55:46.883 INFO:tasks.workunit.client.1.vm08.stdout:6/812: creat d9/dc/d11/d23/d2c/dc0/f10a x:0 0 0 2026-03-10T08:55:46.883 INFO:tasks.workunit.client.1.vm08.stdout:3/771: dwrite d4/d15/d8/d1d/da8/fe7 [0,4194304] 0 2026-03-10T08:55:46.888 INFO:tasks.workunit.client.1.vm08.stdout:2/879: creat d1/da/d10/d42/d93/d1e/dce/d52/db3/def/f127 x:0 0 0 2026-03-10T08:55:46.889 INFO:tasks.workunit.client.1.vm08.stdout:4/841: mkdir d5/d23/d36/d99/db2/d5a/d69/d11b/d114/d131 0 2026-03-10T08:55:46.889 INFO:tasks.workunit.client.1.vm08.stdout:5/723: write d0/d11/d27/f61 [3366426,94451] 0 2026-03-10T08:55:46.889 INFO:tasks.workunit.client.1.vm08.stdout:1/805: link d1/da/d18/d3b/d62/fc7 d1/da/de/f118 0 2026-03-10T08:55:46.896 INFO:tasks.workunit.client.0.vm05.stdout:2/535: creat d0/d9/d1e/d20/d21/d45/d4b/f97 x:0 0 0 2026-03-10T08:55:46.902 INFO:tasks.workunit.client.1.vm08.stdout:2/880: rmdir d1/da/d10/d42/d93/d1e/dce/d52/db3 39 2026-03-10T08:55:46.910 INFO:tasks.workunit.client.1.vm08.stdout:6/813: rename 
d9/dc/d11/f31 to d9/d10/d1e/d32/f10b 0 2026-03-10T08:55:46.913 INFO:tasks.workunit.client.1.vm08.stdout:3/772: sync 2026-03-10T08:55:46.918 INFO:tasks.workunit.client.0.vm05.stdout:2/536: dread d0/d9/d7f/d8f/f38 [0,4194304] 0 2026-03-10T08:55:46.921 INFO:tasks.workunit.client.0.vm05.stdout:2/537: fdatasync d0/d9/d7f/d8f/f67 0 2026-03-10T08:55:46.922 INFO:tasks.workunit.client.0.vm05.stdout:2/538: chown d0/d9/d7f/d8f/f37 62991 1 2026-03-10T08:55:46.934 INFO:tasks.workunit.client.1.vm08.stdout:4/842: dread d5/d23/d49/d8f/fb1 [0,4194304] 0 2026-03-10T08:55:46.941 INFO:tasks.workunit.client.0.vm05.stdout:6/630: dread d4/d7/f54 [0,4194304] 0 2026-03-10T08:55:46.941 INFO:tasks.workunit.client.0.vm05.stdout:6/631: mknod d4/d7/d10/d1a/d8c/cd3 0 2026-03-10T08:55:46.942 INFO:tasks.workunit.client.1.vm08.stdout:4/843: unlink d5/d23/d49/f101 0 2026-03-10T08:55:46.942 INFO:tasks.workunit.client.1.vm08.stdout:4/844: chown d5/d23/d36/d99/db2/d5a/d69/d11b/c33 398539558 1 2026-03-10T08:55:46.942 INFO:tasks.workunit.client.1.vm08.stdout:4/845: dwrite d5/d23/d36/d99/db2/d5d/f129 [0,4194304] 0 2026-03-10T08:55:46.957 INFO:tasks.workunit.client.0.vm05.stdout:7/565: write d18/d66/d25/d2e/d42/d9c/dac/f4c [712694,122936] 0 2026-03-10T08:55:46.959 INFO:tasks.workunit.client.1.vm08.stdout:9/779: write d2/dd/d61/f67 [646386,63179] 0 2026-03-10T08:55:46.960 INFO:tasks.workunit.client.0.vm05.stdout:7/566: rename d18/d66/f6c to d18/d66/d25/d2e/d2f/fad 0 2026-03-10T08:55:46.966 INFO:tasks.workunit.client.1.vm08.stdout:9/780: link d2/d41/d4c/d66/fb0 d2/dd/d15/d1e/d94/f106 0 2026-03-10T08:55:46.968 INFO:tasks.workunit.client.0.vm05.stdout:7/567: creat d18/d66/fae x:0 0 0 2026-03-10T08:55:46.970 INFO:tasks.workunit.client.0.vm05.stdout:7/568: write d18/d66/d25/d2e/d42/f52 [515904,15073] 0 2026-03-10T08:55:46.973 INFO:tasks.workunit.client.0.vm05.stdout:7/569: mkdir d18/d38/d43/d5c/daf 0 2026-03-10T08:55:46.988 INFO:tasks.workunit.client.0.vm05.stdout:7/570: chown d18/d66/c62 405705109 1 
2026-03-10T08:55:46.988 INFO:tasks.workunit.client.0.vm05.stdout:7/571: rename l13 to d18/d66/d25/d2e/d2f/d6d/lb0 0 2026-03-10T08:55:46.988 INFO:tasks.workunit.client.0.vm05.stdout:1/662: dwrite dd/d10/d18/d20/fd6 [0,4194304] 0 2026-03-10T08:55:46.988 INFO:tasks.workunit.client.0.vm05.stdout:7/572: unlink d18/d66/d25/d2e/d2f/f33 0 2026-03-10T08:55:46.989 INFO:tasks.workunit.client.0.vm05.stdout:1/663: symlink dd/d10/d18/d2d/d51/d58/d71/d62/lf0 0 2026-03-10T08:55:46.991 INFO:tasks.workunit.client.0.vm05.stdout:0/596: read fe [4501919,3617] 0 2026-03-10T08:55:46.992 INFO:tasks.workunit.client.0.vm05.stdout:3/686: read d9/ff [433004,95176] 0 2026-03-10T08:55:46.995 INFO:tasks.workunit.client.0.vm05.stdout:7/573: fsync d18/d38/d43/d5c/f5f 0 2026-03-10T08:55:46.999 INFO:tasks.workunit.client.0.vm05.stdout:9/578: dread d6/d15/d35/f9a [0,4194304] 0 2026-03-10T08:55:47.004 INFO:tasks.workunit.client.0.vm05.stdout:9/579: dwrite d6/d19/d21/f8a [0,4194304] 0 2026-03-10T08:55:47.011 INFO:tasks.workunit.client.0.vm05.stdout:9/580: dwrite d6/d12/db2/fba [0,4194304] 0 2026-03-10T08:55:47.017 INFO:tasks.workunit.client.1.vm08.stdout:7/824: dread d0/d11/d1f/fb7 [0,4194304] 0 2026-03-10T08:55:47.017 INFO:tasks.workunit.client.0.vm05.stdout:0/597: chown df/d1f/d85/f24 13 1 2026-03-10T08:55:47.018 INFO:tasks.workunit.client.0.vm05.stdout:0/598: chown df/d1f/d85/d19/d62 30132 1 2026-03-10T08:55:47.021 INFO:tasks.workunit.client.0.vm05.stdout:3/687: creat d9/d4d/d51/d64/fcb x:0 0 0 2026-03-10T08:55:47.030 INFO:tasks.workunit.client.1.vm08.stdout:0/745: dread d6/dd/d13/d17/d1f/d20/d2f/d26/f80 [0,4194304] 0 2026-03-10T08:55:47.030 INFO:tasks.workunit.client.1.vm08.stdout:2/881: dread d1/da/d10/d42/d93/d1e/fb2 [0,4194304] 0 2026-03-10T08:55:47.033 INFO:tasks.workunit.client.1.vm08.stdout:7/825: mknod d0/d11/d4a/d95/dc5/d100/c10a 0 2026-03-10T08:55:47.034 INFO:tasks.workunit.client.1.vm08.stdout:7/826: write d0/d14/d43/f6e [4491349,49439] 0 2026-03-10T08:55:47.037 
INFO:tasks.workunit.client.1.vm08.stdout:5/724: dread d0/d11/d27/d68/d7c/f42 [0,4194304] 0 2026-03-10T08:55:47.038 INFO:tasks.workunit.client.1.vm08.stdout:7/827: dwrite d0/d11/db2/d8e/f107 [0,4194304] 0 2026-03-10T08:55:47.042 INFO:tasks.workunit.client.1.vm08.stdout:8/843: write d1/d10/d9/d4d/db2/fda [1536740,68643] 0 2026-03-10T08:55:47.044 INFO:tasks.workunit.client.1.vm08.stdout:1/806: write d1/da/de/d24/d35/d6d/d82/da2/fcd [741446,64422] 0 2026-03-10T08:55:47.046 INFO:tasks.workunit.client.0.vm05.stdout:5/548: dwrite d5/d86/f1a [0,4194304] 0 2026-03-10T08:55:47.049 INFO:tasks.workunit.client.1.vm08.stdout:6/814: dwrite d9/d10/d1e/d32/f27 [0,4194304] 0 2026-03-10T08:55:47.050 INFO:tasks.workunit.client.1.vm08.stdout:3/773: dwrite d4/d15/d8/fec [0,4194304] 0 2026-03-10T08:55:47.052 INFO:tasks.workunit.client.1.vm08.stdout:0/746: mkdir d6/dd/d13/d17/d1f/d2d/d85/df6 0 2026-03-10T08:55:47.066 INFO:tasks.workunit.client.0.vm05.stdout:0/599: rmdir df/d1f/d85/d2b/d27/d32/d4e/d87 39 2026-03-10T08:55:47.072 INFO:tasks.workunit.client.0.vm05.stdout:4/621: dread d0/d2e/d42/d45/d4a/d36/dbe/d32/f3e [0,4194304] 0 2026-03-10T08:55:47.075 INFO:tasks.workunit.client.1.vm08.stdout:7/828: read - d0/d11/db2/f8c zero size 2026-03-10T08:55:47.078 INFO:tasks.workunit.client.0.vm05.stdout:8/612: dread d2/dd/d2c/d2e/d31/d4f/d80/f9f [0,4194304] 0 2026-03-10T08:55:47.079 INFO:tasks.workunit.client.0.vm05.stdout:4/622: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fa5 [0,4194304] 0 2026-03-10T08:55:47.084 INFO:tasks.workunit.client.0.vm05.stdout:2/539: dwrite d0/f4 [0,4194304] 0 2026-03-10T08:55:47.085 INFO:tasks.workunit.client.0.vm05.stdout:2/540: chown d0/l28 7072 1 2026-03-10T08:55:47.088 INFO:tasks.workunit.client.1.vm08.stdout:3/774: sync 2026-03-10T08:55:47.096 INFO:tasks.workunit.client.1.vm08.stdout:5/725: truncate d0/d11/d3e/f48 2607626 0 2026-03-10T08:55:47.102 INFO:tasks.workunit.client.1.vm08.stdout:8/844: creat d1/d4f/f138 x:0 0 0 2026-03-10T08:55:47.109 
INFO:tasks.workunit.client.0.vm05.stdout:5/549: mkdir d5/df/d37/dc8 0 2026-03-10T08:55:47.109 INFO:tasks.workunit.client.1.vm08.stdout:1/807: read - d1/da/de/d24/d3d/d40/d92/f99 zero size 2026-03-10T08:55:47.111 INFO:tasks.workunit.client.0.vm05.stdout:1/664: rename dd/d13/f40 to dd/d10/d18/d2d/ff1 0 2026-03-10T08:55:47.116 INFO:tasks.workunit.client.0.vm05.stdout:6/632: dwrite d4/d7/d10/f65 [0,4194304] 0 2026-03-10T08:55:47.118 INFO:tasks.workunit.client.1.vm08.stdout:9/781: truncate d2/dd/d15/d1e/d21/f2d 4863972 0 2026-03-10T08:55:47.121 INFO:tasks.workunit.client.1.vm08.stdout:4/846: dwrite d5/d23/d36/d99/dc6/df1/f104 [0,4194304] 0 2026-03-10T08:55:47.127 INFO:tasks.workunit.client.1.vm08.stdout:6/815: rename d9/d10/d1e/d7e/fb0 to d9/d10/d1e/d92/f10c 0 2026-03-10T08:55:47.128 INFO:tasks.workunit.client.0.vm05.stdout:0/600: fsync fe 0 2026-03-10T08:55:47.131 INFO:tasks.workunit.client.0.vm05.stdout:3/688: mknod d9/d2b/d3a/d6c/dbe/ccc 0 2026-03-10T08:55:47.142 INFO:tasks.workunit.client.1.vm08.stdout:8/845: mkdir d1/da8/d139 0 2026-03-10T08:55:47.142 INFO:tasks.workunit.client.1.vm08.stdout:5/726: fdatasync d0/d11/d18/fc6 0 2026-03-10T08:55:47.142 INFO:tasks.workunit.client.0.vm05.stdout:7/574: creat d18/fb1 x:0 0 0 2026-03-10T08:55:47.143 INFO:tasks.workunit.client.1.vm08.stdout:3/775: dread d4/d15/d8/d1d/f98 [0,4194304] 0 2026-03-10T08:55:47.144 INFO:tasks.workunit.client.1.vm08.stdout:1/808: mknod d1/da/d18/d3b/d62/c119 0 2026-03-10T08:55:47.145 INFO:tasks.workunit.client.0.vm05.stdout:4/623: mknod d0/d2e/d42/d45/cc7 0 2026-03-10T08:55:47.145 INFO:tasks.workunit.client.0.vm05.stdout:4/624: chown d0/d2c/d6a/f75 25320583 1 2026-03-10T08:55:47.158 INFO:tasks.workunit.client.0.vm05.stdout:0/601: mkdir df/d1f/d85/d19/d47/d84/dae 0 2026-03-10T08:55:47.159 INFO:tasks.workunit.client.0.vm05.stdout:8/613: symlink d2/dd/d2c/d2e/d31/d3e/ldc 0 2026-03-10T08:55:47.159 INFO:tasks.workunit.client.0.vm05.stdout:3/689: creat d9/d4d/d51/d64/d89/dc2/fcd x:0 0 0 
2026-03-10T08:55:47.160 INFO:tasks.workunit.client.0.vm05.stdout:7/575: symlink d18/d38/d43/d5c/lb2 0 2026-03-10T08:55:47.161 INFO:tasks.workunit.client.0.vm05.stdout:7/576: fsync d18/f95 0 2026-03-10T08:55:47.164 INFO:tasks.workunit.client.0.vm05.stdout:2/541: rename d0/f61 to d0/d9/d1e/d20/d21/d45/d4b/d70/f98 0 2026-03-10T08:55:47.176 INFO:tasks.workunit.client.0.vm05.stdout:3/690: creat d9/d2b/d3a/d6c/dbf/fce x:0 0 0 2026-03-10T08:55:47.176 INFO:tasks.workunit.client.0.vm05.stdout:3/691: chown d9/d2b/d53/fa7 3107829 1 2026-03-10T08:55:47.176 INFO:tasks.workunit.client.0.vm05.stdout:3/692: truncate d9/d2b/d3a/d43/d71/d86/fb8 22036 0 2026-03-10T08:55:47.177 INFO:tasks.workunit.client.0.vm05.stdout:8/614: chown d2/dd/d2c/d2e/d31/d4f/d80/dd0/fb6 28555092 1 2026-03-10T08:55:47.185 INFO:tasks.workunit.client.0.vm05.stdout:6/633: read d4/d7/d10/d15/d1b/d22/f36 [3163667,116357] 0 2026-03-10T08:55:47.185 INFO:tasks.workunit.client.0.vm05.stdout:7/577: rename d18/d66/d25/d2e/d2f/d6d/l7d to d18/d66/d25/d2e/d2f/da0/lb3 0 2026-03-10T08:55:47.189 INFO:tasks.workunit.client.0.vm05.stdout:5/550: link d5/d86/d24/l27 d5/df/d37/d68/d85/lc9 0 2026-03-10T08:55:47.198 INFO:tasks.workunit.client.0.vm05.stdout:0/602: truncate df/f17 1777329 0 2026-03-10T08:55:47.202 INFO:tasks.workunit.client.0.vm05.stdout:0/603: dwrite df/fab [0,4194304] 0 2026-03-10T08:55:47.208 INFO:tasks.workunit.client.0.vm05.stdout:3/693: mknod d9/d4d/d51/d64/ccf 0 2026-03-10T08:55:47.219 INFO:tasks.workunit.client.0.vm05.stdout:6/634: chown d4/d7/d10/d1a/d8c/caa 5183597 1 2026-03-10T08:55:47.219 INFO:tasks.workunit.client.0.vm05.stdout:6/635: fdatasync d4/fa8 0 2026-03-10T08:55:47.220 INFO:tasks.workunit.client.0.vm05.stdout:6/636: write d4/d7/f34 [6193592,81864] 0 2026-03-10T08:55:47.224 INFO:tasks.workunit.client.0.vm05.stdout:5/551: mkdir d5/d86/d24/d2c/d41/dca 0 2026-03-10T08:55:47.245 INFO:tasks.workunit.client.0.vm05.stdout:0/604: mknod df/d1f/d85/d2b/d65/d6e/d96/caf 0 2026-03-10T08:55:47.248 
INFO:tasks.workunit.client.0.vm05.stdout:0/605: dread df/d1f/d85/d2b/d27/f60 [4194304,4194304] 0 2026-03-10T08:55:47.254 INFO:tasks.workunit.client.0.vm05.stdout:3/694: chown d9/c25 104 1 2026-03-10T08:55:47.259 INFO:tasks.workunit.client.0.vm05.stdout:3/695: dread d9/d4d/f52 [0,4194304] 0 2026-03-10T08:55:47.265 INFO:tasks.workunit.client.0.vm05.stdout:6/637: truncate d4/d7/f5d 5112267 0 2026-03-10T08:55:47.269 INFO:tasks.workunit.client.1.vm08.stdout:9/782: creat d2/d54/d8e/da6/dd0/dc8/f107 x:0 0 0 2026-03-10T08:55:47.271 INFO:tasks.workunit.client.0.vm05.stdout:7/578: getdents d18/d38/d43/d5c/daf 0 2026-03-10T08:55:47.271 INFO:tasks.workunit.client.0.vm05.stdout:7/579: write f9 [5924131,77089] 0 2026-03-10T08:55:47.276 INFO:tasks.workunit.client.0.vm05.stdout:5/552: symlink d5/d48/d64/d95/lcb 0 2026-03-10T08:55:47.276 INFO:tasks.workunit.client.0.vm05.stdout:5/553: fdatasync d5/d86/fa6 0 2026-03-10T08:55:47.279 INFO:tasks.workunit.client.0.vm05.stdout:4/625: getdents d0/d78 0 2026-03-10T08:55:47.279 INFO:tasks.workunit.client.0.vm05.stdout:4/626: stat d0/l11 0 2026-03-10T08:55:47.281 INFO:tasks.workunit.client.1.vm08.stdout:6/816: unlink d9/dc/d11/d23/cf4 0 2026-03-10T08:55:47.305 INFO:tasks.workunit.client.0.vm05.stdout:3/696: mkdir d9/d2b/d2f/d57/dd0 0 2026-03-10T08:55:47.305 INFO:tasks.workunit.client.0.vm05.stdout:3/697: dread - d9/fb4 zero size 2026-03-10T08:55:47.320 INFO:tasks.workunit.client.1.vm08.stdout:1/809: truncate d1/da/d18/d3b/faf 4615771 0 2026-03-10T08:55:47.321 INFO:tasks.workunit.client.1.vm08.stdout:5/727: dread d0/d11/d18/f5a [0,4194304] 0 2026-03-10T08:55:47.322 INFO:tasks.workunit.client.1.vm08.stdout:1/810: write d1/da/de/d24/d3d/d40/ffe [496721,116163] 0 2026-03-10T08:55:47.326 INFO:tasks.workunit.client.1.vm08.stdout:2/882: dwrite d1/da/d10/d42/d93/daa/fdb [0,4194304] 0 2026-03-10T08:55:47.329 INFO:tasks.workunit.client.1.vm08.stdout:1/811: sync 2026-03-10T08:55:47.335 INFO:tasks.workunit.client.1.vm08.stdout:7/829: write 
d0/d11/d1f/d29/d3d/d89/f96 [2946321,69061] 0 2026-03-10T08:55:47.335 INFO:tasks.workunit.client.1.vm08.stdout:9/783: rmdir d2/dd/d15/de0 39 2026-03-10T08:55:47.344 INFO:tasks.workunit.client.0.vm05.stdout:9/581: write d6/d19/d21/fb7 [3948076,106503] 0 2026-03-10T08:55:47.349 INFO:tasks.workunit.client.1.vm08.stdout:5/728: dread d0/d11/d18/f23 [0,4194304] 0 2026-03-10T08:55:47.361 INFO:tasks.workunit.client.0.vm05.stdout:1/665: write dd/d21/d37/f39 [4738600,10524] 0 2026-03-10T08:55:47.363 INFO:tasks.workunit.client.1.vm08.stdout:8/846: mknod d1/d10/d9/d4d/d112/c13a 0 2026-03-10T08:55:47.370 INFO:tasks.workunit.client.1.vm08.stdout:3/776: truncate d4/d15/d8/d2c/f32 3707676 0 2026-03-10T08:55:47.370 INFO:tasks.workunit.client.1.vm08.stdout:3/777: write d4/d15/d8/d2c/d9b/d79/d20/f99 [624705,48387] 0 2026-03-10T08:55:47.370 INFO:tasks.workunit.client.1.vm08.stdout:3/778: read d4/d6f/dca/fcc [136804,118726] 0 2026-03-10T08:55:47.376 INFO:tasks.workunit.client.0.vm05.stdout:9/582: dread d6/d12/f1c [0,4194304] 0 2026-03-10T08:55:47.386 INFO:tasks.workunit.client.0.vm05.stdout:5/554: mkdir d5/d86/d24/d84/db8/dcc 0 2026-03-10T08:55:47.386 INFO:tasks.workunit.client.0.vm05.stdout:5/555: chown d5/d86/d24/d84 4 1 2026-03-10T08:55:47.388 INFO:tasks.workunit.client.1.vm08.stdout:1/812: fdatasync d1/da/de/d24/d35/d6d/d82/da2/dbb/fd8 0 2026-03-10T08:55:47.389 INFO:tasks.workunit.client.1.vm08.stdout:1/813: chown d1/da/de/d24/d3d/d40/d84 3392714 1 2026-03-10T08:55:47.394 INFO:tasks.workunit.client.1.vm08.stdout:2/883: dread d1/da/d10/d2d/fb7 [0,4194304] 0 2026-03-10T08:55:47.403 INFO:tasks.workunit.client.0.vm05.stdout:1/666: symlink dd/d10/d18/d20/d52/lf2 0 2026-03-10T08:55:47.403 INFO:tasks.workunit.client.0.vm05.stdout:1/667: stat fb 0 2026-03-10T08:55:47.406 INFO:tasks.workunit.client.0.vm05.stdout:2/542: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:55:47.420 INFO:tasks.workunit.client.0.vm05.stdout:3/698: rename d9/d2b/d2f/d96/cad to d9/d2b/d53/cd1 0 2026-03-10T08:55:47.423 
INFO:tasks.workunit.client.1.vm08.stdout:0/747: getdents d6/dd/d13/d17/d1f/d20/d2f/d26/d56 0 2026-03-10T08:55:47.427 INFO:tasks.workunit.client.0.vm05.stdout:5/556: mkdir d5/df/dbb/d43/dcd 0 2026-03-10T08:55:47.427 INFO:tasks.workunit.client.0.vm05.stdout:1/668: mkdir dd/d10/d18/d20/df3 0 2026-03-10T08:55:47.430 INFO:tasks.workunit.client.0.vm05.stdout:8/615: dwrite d2/dd/d2c/d2e/d31/d3e/d5d/f92 [0,4194304] 0 2026-03-10T08:55:47.430 INFO:tasks.workunit.client.1.vm08.stdout:0/748: sync 2026-03-10T08:55:47.437 INFO:tasks.workunit.client.0.vm05.stdout:8/616: read d2/dd/f3f [111064,93755] 0 2026-03-10T08:55:47.456 INFO:tasks.workunit.client.0.vm05.stdout:0/606: getdents df/d1f/d85/d19/d5b 0 2026-03-10T08:55:47.465 INFO:tasks.workunit.client.1.vm08.stdout:8/847: dread d1/d10/d9/dd/d25/d27/d44/f22 [0,4194304] 0 2026-03-10T08:55:47.465 INFO:tasks.workunit.client.0.vm05.stdout:9/583: mknod d6/d19/d2a/dbc/cc4 0 2026-03-10T08:55:47.487 INFO:tasks.workunit.client.0.vm05.stdout:5/557: truncate d5/d86/d24/d2c/d41/f4c 5149431 0 2026-03-10T08:55:47.500 INFO:tasks.workunit.client.1.vm08.stdout:2/884: rename d1/d43/dcd to d1/da/d10/d42/d93/d23/d128 0 2026-03-10T08:55:47.501 INFO:tasks.workunit.client.1.vm08.stdout:1/814: dread d1/da/d20/d91/d83/df4/d4e/f51 [0,4194304] 0 2026-03-10T08:55:47.515 INFO:tasks.workunit.client.1.vm08.stdout:5/729: creat d0/d11/d27/d68/d7c/d8e/fde x:0 0 0 2026-03-10T08:55:47.517 INFO:tasks.workunit.client.1.vm08.stdout:5/730: sync 2026-03-10T08:55:47.522 INFO:tasks.workunit.client.1.vm08.stdout:5/731: dwrite d0/d11/d27/f61 [0,4194304] 0 2026-03-10T08:55:47.524 INFO:tasks.workunit.client.1.vm08.stdout:5/732: sync 2026-03-10T08:55:47.524 INFO:tasks.workunit.client.1.vm08.stdout:0/749: creat d6/dd/d13/d32/ff7 x:0 0 0 2026-03-10T08:55:47.524 INFO:tasks.workunit.client.1.vm08.stdout:5/733: chown d0/d11/f86 144451 1 2026-03-10T08:55:47.527 INFO:tasks.workunit.client.1.vm08.stdout:3/779: mkdir d4/d15/d8/d1d/d107/d10a 0 2026-03-10T08:55:47.528 
INFO:tasks.workunit.client.0.vm05.stdout:8/617: truncate d2/dd/d2c/d2e/d31/d4c/d63/f6c 2462592 0 2026-03-10T08:55:47.529 INFO:tasks.workunit.client.0.vm05.stdout:6/638: write d4/f6c [4071202,77986] 0 2026-03-10T08:55:47.529 INFO:tasks.workunit.client.1.vm08.stdout:4/847: dwrite d5/f8a [0,4194304] 0 2026-03-10T08:55:47.534 INFO:tasks.workunit.client.0.vm05.stdout:0/607: fsync df/d1f/d85/d19/d47/f8f 0 2026-03-10T08:55:47.535 INFO:tasks.workunit.client.0.vm05.stdout:0/608: readlink df/d1f/d85/d19/d39/d4d/l98 0 2026-03-10T08:55:47.544 INFO:tasks.workunit.client.1.vm08.stdout:8/848: rename d1/d4f/cf3 to d1/d10/d9/dd/d13/d40/c13b 0 2026-03-10T08:55:47.545 INFO:tasks.workunit.client.1.vm08.stdout:6/817: write d9/dc/d11/d23/f6f [891274,100642] 0 2026-03-10T08:55:47.547 INFO:tasks.workunit.client.0.vm05.stdout:4/627: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f6b [0,4194304] 0 2026-03-10T08:55:47.569 INFO:tasks.workunit.client.1.vm08.stdout:7/830: dwrite d0/d11/d1f/d29/d3b/f9f [0,4194304] 0 2026-03-10T08:55:47.577 INFO:tasks.workunit.client.1.vm08.stdout:0/750: rmdir d6/dd/d13/d17/d50 39 2026-03-10T08:55:47.577 INFO:tasks.workunit.client.1.vm08.stdout:0/751: chown d6/dd/d13/d17/fbf 37 1 2026-03-10T08:55:47.578 INFO:tasks.workunit.client.1.vm08.stdout:0/752: stat d6/dd/f3f 0 2026-03-10T08:55:47.588 INFO:tasks.workunit.client.0.vm05.stdout:1/669: truncate dd/d10/d18/f8a 2777304 0 2026-03-10T08:55:47.591 INFO:tasks.workunit.client.1.vm08.stdout:9/784: dwrite d2/fb [0,4194304] 0 2026-03-10T08:55:47.593 INFO:tasks.workunit.client.0.vm05.stdout:2/543: write d0/d9/d1e/f34 [1884915,11605] 0 2026-03-10T08:55:47.594 INFO:tasks.workunit.client.1.vm08.stdout:4/848: fdatasync d5/f7e 0 2026-03-10T08:55:47.596 INFO:tasks.workunit.client.1.vm08.stdout:4/849: chown d5/d23/d36/d99/dc6/df1 29713 1 2026-03-10T08:55:47.598 INFO:tasks.workunit.client.0.vm05.stdout:8/618: creat d2/dd/d2c/d2e/d31/d3e/d5d/d9d/fdd x:0 0 0 2026-03-10T08:55:47.616 INFO:tasks.workunit.client.1.vm08.stdout:1/815: 
rename d1/da/d20/d91/d83/df4/d4e/ff1 to d1/da/d20/d91/d83/f11a 0 2026-03-10T08:55:47.616 INFO:tasks.workunit.client.0.vm05.stdout:8/619: stat d2/db/d47/fd1 0 2026-03-10T08:55:47.616 INFO:tasks.workunit.client.0.vm05.stdout:0/609: creat df/d1f/d85/d19/d5b/fb0 x:0 0 0 2026-03-10T08:55:47.616 INFO:tasks.workunit.client.0.vm05.stdout:0/610: chown df/f1a 7490 1 2026-03-10T08:55:47.616 INFO:tasks.workunit.client.1.vm08.stdout:1/816: stat d1/da/de/d24/d3d/d40/d8e 0 2026-03-10T08:55:47.617 INFO:tasks.workunit.client.1.vm08.stdout:0/753: read d6/dd/d13/d17/d1f/d20/f21 [1589404,5288] 0 2026-03-10T08:55:47.621 INFO:tasks.workunit.client.1.vm08.stdout:8/849: dread - d1/d10/d9/dd/d18/d34/ff4 zero size 2026-03-10T08:55:47.622 INFO:tasks.workunit.client.1.vm08.stdout:8/850: dread - d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f12f zero size 2026-03-10T08:55:47.622 INFO:tasks.workunit.client.1.vm08.stdout:6/818: fsync d9/dc/d11/f73 0 2026-03-10T08:55:47.624 INFO:tasks.workunit.client.0.vm05.stdout:3/699: link d9/d2b/d2f/cae d9/d2b/d2f/d57/dd0/cd2 0 2026-03-10T08:55:47.627 INFO:tasks.workunit.client.1.vm08.stdout:2/885: fdatasync d1/da/d10/d42/f79 0 2026-03-10T08:55:47.627 INFO:tasks.workunit.client.1.vm08.stdout:2/886: chown d1/da/d10/c110 16465496 1 2026-03-10T08:55:47.628 INFO:tasks.workunit.client.1.vm08.stdout:2/887: stat d1/d5b 0 2026-03-10T08:55:47.629 INFO:tasks.workunit.client.1.vm08.stdout:2/888: chown d1/da/d78/df5/d11e/fe3 56 1 2026-03-10T08:55:47.630 INFO:tasks.workunit.client.0.vm05.stdout:7/580: symlink d18/d66/lb4 0 2026-03-10T08:55:47.652 INFO:tasks.workunit.client.0.vm05.stdout:6/639: mknod d4/d2d/d51/cd4 0 2026-03-10T08:55:47.656 INFO:tasks.workunit.client.0.vm05.stdout:2/544: unlink d0/d9/d7f/d8f/d6d/l76 0 2026-03-10T08:55:47.659 INFO:tasks.workunit.client.0.vm05.stdout:2/545: stat d0/d9/d89/f96 0 2026-03-10T08:55:47.659 INFO:tasks.workunit.client.0.vm05.stdout:8/620: unlink d2/dd/d2c/f34 0 2026-03-10T08:55:47.661 INFO:tasks.workunit.client.0.vm05.stdout:8/621: 
dread d2/db/d1f/f44 [0,4194304] 0 2026-03-10T08:55:47.676 INFO:tasks.workunit.client.1.vm08.stdout:8/851: chown d1/dd9/f116 275186 1 2026-03-10T08:55:47.681 INFO:tasks.workunit.client.0.vm05.stdout:2/546: rmdir d0/d9/d1e/d20/d21/d8a 39 2026-03-10T08:55:47.682 INFO:tasks.workunit.client.0.vm05.stdout:0/611: mknod df/d1f/d85/d2b/d27/d32/d4e/d87/cb1 0 2026-03-10T08:55:47.682 INFO:tasks.workunit.client.0.vm05.stdout:0/612: stat c8 0 2026-03-10T08:55:47.683 INFO:tasks.workunit.client.0.vm05.stdout:0/613: write df/d1f/d85/d19/f99 [834367,52294] 0 2026-03-10T08:55:47.688 INFO:tasks.workunit.client.0.vm05.stdout:4/628: creat d0/d2e/d42/d45/d4a/d36/dbe/fc8 x:0 0 0 2026-03-10T08:55:47.688 INFO:tasks.workunit.client.0.vm05.stdout:1/670: link dd/d10/d18/d2d/c4f dd/d21/d37/d45/d8d/cf4 0 2026-03-10T08:55:47.694 INFO:tasks.workunit.client.1.vm08.stdout:2/889: mknod d1/d97/d11f/c129 0 2026-03-10T08:55:47.700 INFO:tasks.workunit.client.0.vm05.stdout:9/584: write d6/d15/d3c/d4b/f76 [414668,114459] 0 2026-03-10T08:55:47.705 INFO:tasks.workunit.client.0.vm05.stdout:7/581: sync 2026-03-10T08:55:47.712 INFO:tasks.workunit.client.0.vm05.stdout:2/547: chown d0/d9/d1e/l49 43440 1 2026-03-10T08:55:47.720 INFO:tasks.workunit.client.1.vm08.stdout:5/734: dread d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:55:47.720 INFO:tasks.workunit.client.1.vm08.stdout:3/780: link d4/d6f/d85/fed d4/d15/d8/d2c/d6d/dfa/d100/f10b 0 2026-03-10T08:55:47.721 INFO:tasks.workunit.client.0.vm05.stdout:4/629: mkdir d0/d2c/d6a/dc9 0 2026-03-10T08:55:47.734 INFO:tasks.workunit.client.0.vm05.stdout:1/671: unlink dd/d10/d18/d2d/f84 0 2026-03-10T08:55:47.734 INFO:tasks.workunit.client.0.vm05.stdout:1/672: dread - dd/d21/fe4 zero size 2026-03-10T08:55:47.735 INFO:tasks.workunit.client.0.vm05.stdout:9/585: symlink d6/d19/d2a/dbc/lc5 0 2026-03-10T08:55:47.736 INFO:tasks.workunit.client.0.vm05.stdout:9/586: readlink d6/d15/l68 0 2026-03-10T08:55:47.737 INFO:tasks.workunit.client.0.vm05.stdout:5/558: truncate d5/df/f53 2448510 0 
2026-03-10T08:55:47.744 INFO:tasks.workunit.client.1.vm08.stdout:9/785: getdents d2/d54/d8e 0 2026-03-10T08:55:47.746 INFO:tasks.workunit.client.1.vm08.stdout:2/890: dwrite d1/da/d10/d42/f79 [0,4194304] 0 2026-03-10T08:55:47.748 INFO:tasks.workunit.client.1.vm08.stdout:7/831: dwrite d0/d14/d43/d62/fb5 [4194304,4194304] 0 2026-03-10T08:55:47.752 INFO:tasks.workunit.client.0.vm05.stdout:2/548: symlink d0/d9/d7f/d8f/d7e/l99 0 2026-03-10T08:55:47.756 INFO:tasks.workunit.client.0.vm05.stdout:2/549: chown d0/d9/d1e/d20/d21/d45/c52 728 1 2026-03-10T08:55:47.756 INFO:tasks.workunit.client.1.vm08.stdout:1/817: getdents d1/da/de/d24/d35/d6d 0 2026-03-10T08:55:47.756 INFO:tasks.workunit.client.1.vm08.stdout:0/754: write d6/dd/d13/d17/fc6 [997042,50547] 0 2026-03-10T08:55:47.757 INFO:tasks.workunit.client.1.vm08.stdout:1/818: readlink d1/da/de/d24/d3d/d40/l58 0 2026-03-10T08:55:47.758 INFO:tasks.workunit.client.1.vm08.stdout:1/819: chown d1/da/d20/d9e/la4 28169981 1 2026-03-10T08:55:47.761 INFO:tasks.workunit.client.1.vm08.stdout:4/850: dwrite d5/d23/d49/ff9 [0,4194304] 0 2026-03-10T08:55:47.774 INFO:tasks.workunit.client.1.vm08.stdout:8/852: write d1/d10/d9/dd/d25/d27/f3a [1193339,51075] 0 2026-03-10T08:55:47.775 INFO:tasks.workunit.client.1.vm08.stdout:6/819: dwrite d9/d50/fa3 [0,4194304] 0 2026-03-10T08:55:47.779 INFO:tasks.workunit.client.1.vm08.stdout:9/786: mkdir d2/dd/d15/d1e/d21/d108 0 2026-03-10T08:55:47.779 INFO:tasks.workunit.client.1.vm08.stdout:6/820: chown d9/dc/d11/d23/d2c/d81/d63/cbb 21 1 2026-03-10T08:55:47.780 INFO:tasks.workunit.client.1.vm08.stdout:6/821: dread - d9/dc/d11/ff8 zero size 2026-03-10T08:55:47.784 INFO:tasks.workunit.client.0.vm05.stdout:3/700: dwrite d9/d2b/f2c [0,4194304] 0 2026-03-10T08:55:47.801 INFO:tasks.workunit.client.0.vm05.stdout:6/640: dwrite d4/d2d/d5f/f6d [0,4194304] 0 2026-03-10T08:55:47.806 INFO:tasks.workunit.client.1.vm08.stdout:7/832: truncate d0/d11/d1f/d29/d3b/d80/feb 287569 0 2026-03-10T08:55:47.808 
INFO:tasks.workunit.client.1.vm08.stdout:3/781: symlink d4/d15/d8/d2c/d9b/l10c 0 2026-03-10T08:55:47.816 INFO:tasks.workunit.client.1.vm08.stdout:1/820: mknod d1/da/de/d24/d3d/d40/d8e/dd2/c11b 0 2026-03-10T08:55:47.822 INFO:tasks.workunit.client.0.vm05.stdout:3/701: mknod d9/d2b/d2f/d96/cd3 0 2026-03-10T08:55:47.823 INFO:tasks.workunit.client.0.vm05.stdout:3/702: write d9/d2b/f2d [154708,123602] 0 2026-03-10T08:55:47.823 INFO:tasks.workunit.client.1.vm08.stdout:2/891: dread d1/da/d10/d42/f89 [0,4194304] 0 2026-03-10T08:55:47.825 INFO:tasks.workunit.client.1.vm08.stdout:7/833: dread d0/d11/d1f/d29/d3d/d89/fee [0,4194304] 0 2026-03-10T08:55:47.825 INFO:tasks.workunit.client.1.vm08.stdout:7/834: readlink d0/d11/db2/d8e/ld9 0 2026-03-10T08:55:47.825 INFO:tasks.workunit.client.1.vm08.stdout:8/853: chown d1/d4f/d60/dbf/lf1 492 1 2026-03-10T08:55:47.826 INFO:tasks.workunit.client.0.vm05.stdout:6/641: truncate d4/d2d/fa2 4803213 0 2026-03-10T08:55:47.826 INFO:tasks.workunit.client.0.vm05.stdout:6/642: write d4/f6c [71392,39783] 0 2026-03-10T08:55:47.827 INFO:tasks.workunit.client.1.vm08.stdout:8/854: write d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10f [4698718,105307] 0 2026-03-10T08:55:47.829 INFO:tasks.workunit.client.1.vm08.stdout:9/787: creat d2/d41/f109 x:0 0 0 2026-03-10T08:55:47.832 INFO:tasks.workunit.client.1.vm08.stdout:6/822: truncate d9/d10/d1e/f58 2431508 0 2026-03-10T08:55:47.834 INFO:tasks.workunit.client.1.vm08.stdout:6/823: stat d9/dc/d11/d23/lc7 0 2026-03-10T08:55:47.835 INFO:tasks.workunit.client.0.vm05.stdout:9/587: link d6/d19/d2a/f53 d6/d19/d2a/d8d/fc6 0 2026-03-10T08:55:47.837 INFO:tasks.workunit.client.1.vm08.stdout:3/782: dread d4/d15/fe3 [0,4194304] 0 2026-03-10T08:55:47.838 INFO:tasks.workunit.client.0.vm05.stdout:0/614: getdents df/d1f/d85/d2b/d27 0 2026-03-10T08:55:47.850 INFO:tasks.workunit.client.0.vm05.stdout:4/630: dread d0/d2c/f2f [0,4194304] 0 2026-03-10T08:55:47.859 INFO:tasks.workunit.client.1.vm08.stdout:7/835: sync 2026-03-10T08:55:47.878 
INFO:tasks.workunit.client.1.vm08.stdout:9/788: dread - d2/d41/d4c/d66/d82/fa8 zero size 2026-03-10T08:55:47.880 INFO:tasks.workunit.client.0.vm05.stdout:6/643: unlink d4/d7/d10/dc3/ld1 0 2026-03-10T08:55:47.887 INFO:tasks.workunit.client.0.vm05.stdout:9/588: write d6/d19/d2a/f53 [2513368,30184] 0 2026-03-10T08:55:47.900 INFO:tasks.workunit.client.1.vm08.stdout:5/735: write d0/f7f [3178331,5322] 0 2026-03-10T08:55:47.904 INFO:tasks.workunit.client.0.vm05.stdout:5/559: dwrite d5/df/dbb/f4a [4194304,4194304] 0 2026-03-10T08:55:47.907 INFO:tasks.workunit.client.0.vm05.stdout:2/550: write d0/d9/d1e/d20/d21/d45/d4b/f58 [612186,20382] 0 2026-03-10T08:55:47.918 INFO:tasks.workunit.client.1.vm08.stdout:4/851: dwrite d5/d23/d36/d99/db2/f45 [0,4194304] 0 2026-03-10T08:55:47.920 INFO:tasks.workunit.client.0.vm05.stdout:1/673: truncate dd/d21/d37/f39 3883412 0 2026-03-10T08:55:47.940 INFO:tasks.workunit.client.1.vm08.stdout:1/821: write d1/da/de/d24/d3d/d40/d8e/f107 [147816,34991] 0 2026-03-10T08:55:47.942 INFO:tasks.workunit.client.1.vm08.stdout:0/755: creat d6/dd/d13/d17/ff8 x:0 0 0 2026-03-10T08:55:47.943 INFO:tasks.workunit.client.1.vm08.stdout:2/892: dwrite d1/da/d10/d42/d93/d1e/fb2 [0,4194304] 0 2026-03-10T08:55:47.944 INFO:tasks.workunit.client.0.vm05.stdout:4/631: truncate d0/d1d/f89 140577 0 2026-03-10T08:55:47.944 INFO:tasks.workunit.client.1.vm08.stdout:8/855: write d1/d10/d9/dd/d25/f118 [896611,55574] 0 2026-03-10T08:55:47.956 INFO:tasks.workunit.client.0.vm05.stdout:7/582: mknod d18/d66/d25/d2e/cb5 0 2026-03-10T08:55:47.957 INFO:tasks.workunit.client.1.vm08.stdout:6/824: write d9/dc/d11/d23/d2c/fca [1251153,97167] 0 2026-03-10T08:55:47.958 INFO:tasks.workunit.client.1.vm08.stdout:7/836: creat d0/d11/d1f/d29/d36/d75/f10b x:0 0 0 2026-03-10T08:55:47.961 INFO:tasks.workunit.client.0.vm05.stdout:3/703: creat d9/d2b/d3a/d43/d6e/dba/fd4 x:0 0 0 2026-03-10T08:55:47.970 INFO:tasks.workunit.client.1.vm08.stdout:4/852: mknod d5/d23/d36/d99/db2/d5a/d69/d11b/def/df2/c132 0 
2026-03-10T08:55:47.992 INFO:tasks.workunit.client.1.vm08.stdout:2/893: chown d1/da/d10/d1b/f28 1 1 2026-03-10T08:55:48.007 INFO:tasks.workunit.client.1.vm08.stdout:8/856: creat d1/d10/d9/dd/d13/d40/f13c x:0 0 0 2026-03-10T08:55:48.007 INFO:tasks.workunit.client.1.vm08.stdout:8/857: write d1/d10/d9/d4d/db2/f103 [614890,89069] 0 2026-03-10T08:55:48.019 INFO:tasks.workunit.client.1.vm08.stdout:7/837: mkdir d0/d11/d1f/d29/d36/d10c 0 2026-03-10T08:55:48.020 INFO:tasks.workunit.client.1.vm08.stdout:7/838: write d0/d51/f78 [270801,111060] 0 2026-03-10T08:55:48.022 INFO:tasks.workunit.client.1.vm08.stdout:9/789: rename d2/dd/d15/d1e/d25/l6b to d2/dd/d15/d1e/d39/d69/l10a 0 2026-03-10T08:55:48.022 INFO:tasks.workunit.client.1.vm08.stdout:9/790: readlink d2/dd/d15/d1e/d25/d32/d5c/lb6 0 2026-03-10T08:55:48.035 INFO:tasks.workunit.client.1.vm08.stdout:4/853: creat d5/d23/d36/f133 x:0 0 0 2026-03-10T08:55:48.040 INFO:tasks.workunit.client.1.vm08.stdout:1/822: creat d1/da/d18/d3a/d77/f11c x:0 0 0 2026-03-10T08:55:48.043 INFO:tasks.workunit.client.1.vm08.stdout:0/756: unlink d6/dd/d13/d17/d1f/d20/d2f/c42 0 2026-03-10T08:55:48.045 INFO:tasks.workunit.client.1.vm08.stdout:2/894: truncate d1/da/d10/d1b/d6a/fd7 1028079 0 2026-03-10T08:55:48.051 INFO:tasks.workunit.client.1.vm08.stdout:4/854: dread d5/d23/d36/d99/db2/d5d/f60 [0,4194304] 0 2026-03-10T08:55:48.052 INFO:tasks.workunit.client.1.vm08.stdout:4/855: readlink d5/d23/d36/d99/db2/d5a/d69/d11b/dea/l10b 0 2026-03-10T08:55:48.053 INFO:tasks.workunit.client.1.vm08.stdout:6/825: unlink d9/d50/d95/cec 0 2026-03-10T08:55:48.054 INFO:tasks.workunit.client.1.vm08.stdout:4/856: chown d5/d23/d36/d99/db2/d5a/ddb/le2 1780 1 2026-03-10T08:55:48.059 INFO:tasks.workunit.client.1.vm08.stdout:7/839: mknod d0/d11/d4a/d5e/dc3/c10d 0 2026-03-10T08:55:48.061 INFO:tasks.workunit.client.1.vm08.stdout:5/736: rename d0/c6 to d0/d11/d27/d68/d7c/d4b/d4e/d84/cdf 0 2026-03-10T08:55:48.070 INFO:tasks.workunit.client.1.vm08.stdout:3/783: truncate 
d4/d15/d8/d1d/d4f/fa2 1476650 0 2026-03-10T08:55:48.071 INFO:tasks.workunit.client.1.vm08.stdout:1/823: mkdir d1/da/de/d24/d81/d11d 0 2026-03-10T08:55:48.072 INFO:tasks.workunit.client.1.vm08.stdout:3/784: write d4/d15/d8/d2c/d9b/d79/d20/f99 [520141,73447] 0 2026-03-10T08:55:48.072 INFO:tasks.workunit.client.1.vm08.stdout:0/757: fdatasync d6/dd/d13/d17/d1f/d2d/d85/d93/f7e 0 2026-03-10T08:55:48.073 INFO:tasks.workunit.client.1.vm08.stdout:3/785: write d4/d15/d8/d2c/f108 [509187,18429] 0 2026-03-10T08:55:48.073 INFO:tasks.workunit.client.1.vm08.stdout:0/758: chown d6/dd 15180 1 2026-03-10T08:55:48.083 INFO:tasks.workunit.client.1.vm08.stdout:9/791: write d2/d41/d4c/f62 [782826,87408] 0 2026-03-10T08:55:48.084 INFO:tasks.workunit.client.1.vm08.stdout:9/792: dread - d2/dd/d15/d1e/d39/d69/de4/f104 zero size 2026-03-10T08:55:48.085 INFO:tasks.workunit.client.1.vm08.stdout:2/895: chown d1/da/d10/d42/d93/d1e/l109 2150694 1 2026-03-10T08:55:48.111 INFO:tasks.workunit.client.1.vm08.stdout:7/840: mkdir d0/d11/d4a/d5e/dc3/d10e 0 2026-03-10T08:55:48.120 INFO:tasks.workunit.client.1.vm08.stdout:5/737: dread d0/d11/d18/f4f [8388608,4194304] 0 2026-03-10T08:55:48.121 INFO:tasks.workunit.client.1.vm08.stdout:6/826: write d9/dc/d11/f73 [1517802,109934] 0 2026-03-10T08:55:48.125 INFO:tasks.workunit.client.1.vm08.stdout:3/786: mkdir d4/d6f/d85/dd3/d10d 0 2026-03-10T08:55:48.127 INFO:tasks.workunit.client.1.vm08.stdout:4/857: dwrite d5/d23/d36/d99/db2/d5a/d69/d11b/f5e [0,4194304] 0 2026-03-10T08:55:48.128 INFO:tasks.workunit.client.1.vm08.stdout:4/858: read - d5/d23/d49/f94 zero size 2026-03-10T08:55:48.129 INFO:tasks.workunit.client.1.vm08.stdout:4/859: dread - d5/d23/d36/d76/faa zero size 2026-03-10T08:55:48.130 INFO:tasks.workunit.client.0.vm05.stdout:6/644: mkdir d4/d2c/d84/d4a/dd5 0 2026-03-10T08:55:48.132 INFO:tasks.workunit.client.1.vm08.stdout:4/860: chown d5/d23/d36/d99/db2/d5a/ddb/fe9 14168 1 2026-03-10T08:55:48.132 INFO:tasks.workunit.client.1.vm08.stdout:0/759: truncate 
d6/dd/d13/d17/d1f/d20/d2f/d24/f68 1131629 0 2026-03-10T08:55:48.133 INFO:tasks.workunit.client.1.vm08.stdout:9/793: rmdir d2/d54/d8e/da6 39 2026-03-10T08:55:48.134 INFO:tasks.workunit.client.0.vm05.stdout:9/589: readlink d6/d19/d2a/l2d 0 2026-03-10T08:55:48.135 INFO:tasks.workunit.client.1.vm08.stdout:9/794: fsync d2/dd/d15/d1e/d39/d4e/ff4 0 2026-03-10T08:55:48.147 INFO:tasks.workunit.client.1.vm08.stdout:3/787: dread d4/d15/d8/d2c/f42 [4194304,4194304] 0 2026-03-10T08:55:48.150 INFO:tasks.workunit.client.0.vm05.stdout:8/622: truncate d2/dd/d2c/d2e/d31/d4c/d63/f6c 2196161 0 2026-03-10T08:55:48.151 INFO:tasks.workunit.client.0.vm05.stdout:8/623: stat d2/db/d47/f51 0 2026-03-10T08:55:48.159 INFO:tasks.workunit.client.0.vm05.stdout:5/560: dwrite d5/d48/d64/f83 [0,4194304] 0 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr fail", "who": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "mgr fail", "who": "vm05.rxwgjc"}]': finished 2026-03-10T08:55:48.172 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:48 vm08.local ceph-mon[57559]: mgrmap e19: vm08.rpongu(active, starting, since 0.0194546s) 2026-03-10T08:55:48.172 INFO:tasks.workunit.client.0.vm05.stdout:2/551: creat d0/d9/d7f/d8f/d6d/f9a x:0 0 0 2026-03-10T08:55:48.172 INFO:tasks.workunit.client.0.vm05.stdout:1/674: chown dd/d10/d18/d2d/d51/f6b 887925 1 2026-03-10T08:55:48.172 INFO:tasks.workunit.client.0.vm05.stdout:4/632: rmdir d0/d2e/d71/d7c 39 2026-03-10T08:55:48.185 INFO:tasks.workunit.client.0.vm05.stdout:9/590: creat d6/d19/d2a/d4a/d8c/fc7 x:0 0 0 2026-03-10T08:55:48.204 INFO:tasks.workunit.client.0.vm05.stdout:6/645: write d4/d2c/d84/d4a/fbd [250681,31326] 0 2026-03-10T08:55:48.205 INFO:tasks.workunit.client.0.vm05.stdout:6/646: chown d4/d7/d10/d1a/d89/caf 1813357 1 2026-03-10T08:55:48.209 INFO:tasks.workunit.client.0.vm05.stdout:8/624: write d2/d45/f61 [164510,120182] 0 2026-03-10T08:55:48.216 INFO:tasks.workunit.client.0.vm05.stdout:5/561: creat d5/d86/d39/fce x:0 0 0 2026-03-10T08:55:48.221 INFO:tasks.workunit.client.0.vm05.stdout:0/615: creat df/d1f/d85/d2b/d27/d32/d4e/fb2 x:0 0 0 2026-03-10T08:55:48.222 INFO:tasks.workunit.client.0.vm05.stdout:0/616: chown df/d1f/d85/d19/d5b/f6c 26 1 2026-03-10T08:55:48.226 INFO:tasks.workunit.client.0.vm05.stdout:7/583: creat d18/d38/d43/d5c/daf/fb6 x:0 0 0 2026-03-10T08:55:48.242 
INFO:tasks.workunit.client.0.vm05.stdout:3/704: dwrite d9/d2b/d3a/d6c/dbf/f7f [0,4194304] 0 2026-03-10T08:55:48.246 INFO:tasks.workunit.client.0.vm05.stdout:9/591: write d6/d12/d3a/fa9 [296359,108611] 0 2026-03-10T08:55:48.247 INFO:tasks.workunit.client.0.vm05.stdout:9/592: write d6/d12/db2/fba [2027886,119625] 0 2026-03-10T08:55:48.258 INFO:tasks.workunit.client.0.vm05.stdout:6/647: creat d4/d2d/d51/d62/da9/fd6 x:0 0 0 2026-03-10T08:55:48.286 INFO:tasks.workunit.client.0.vm05.stdout:2/552: unlink d0/d9/d1e/d20/d21/f31 0 2026-03-10T08:55:48.287 INFO:tasks.workunit.client.0.vm05.stdout:2/553: chown d0/d9/f4e 103 1 2026-03-10T08:55:48.291 INFO:tasks.workunit.client.0.vm05.stdout:5/562: dwrite d5/d86/d24/d2c/d41/d74/fb1 [0,4194304] 0 2026-03-10T08:55:48.309 INFO:tasks.workunit.client.0.vm05.stdout:5/563: write d5/fc1 [160122,73774] 0 2026-03-10T08:55:48.309 INFO:tasks.workunit.client.0.vm05.stdout:5/564: dread - d5/d86/d24/d2c/f88 zero size 2026-03-10T08:55:48.318 INFO:tasks.workunit.client.1.vm08.stdout:4/861: chown d5/d23/c2c 93 1 2026-03-10T08:55:48.322 INFO:tasks.workunit.client.1.vm08.stdout:4/862: stat d5/f6 0 2026-03-10T08:55:48.324 INFO:tasks.workunit.client.0.vm05.stdout:3/705: symlink d9/d2b/d3a/ld5 0 2026-03-10T08:55:48.327 INFO:tasks.workunit.client.1.vm08.stdout:3/788: rmdir d4/d15/d8 39 2026-03-10T08:55:48.331 INFO:tasks.workunit.client.0.vm05.stdout:9/593: sync 2026-03-10T08:55:48.332 INFO:tasks.workunit.client.0.vm05.stdout:2/554: sync 2026-03-10T08:55:48.333 INFO:tasks.workunit.client.0.vm05.stdout:9/594: read d6/f4e [169374,116754] 0 2026-03-10T08:55:48.337 INFO:tasks.workunit.client.0.vm05.stdout:1/675: truncate dd/f1c 62709 0 2026-03-10T08:55:48.339 INFO:tasks.workunit.client.1.vm08.stdout:5/738: write d0/d11/d27/d68/d7c/d4b/d87/db5/fcd [645745,24621] 0 2026-03-10T08:55:48.339 INFO:tasks.workunit.client.0.vm05.stdout:9/595: sync 2026-03-10T08:55:48.340 INFO:tasks.workunit.client.1.vm08.stdout:1/824: dwrite d1/da/f25 [0,4194304] 0 
2026-03-10T08:55:48.341 INFO:tasks.workunit.client.0.vm05.stdout:9/596: truncate d6/d12/d3a/d9c/fb6 702389 0 2026-03-10T08:55:48.343 INFO:tasks.workunit.client.0.vm05.stdout:4/633: dwrite d0/fe [0,4194304] 0 2026-03-10T08:55:48.343 INFO:tasks.workunit.client.0.vm05.stdout:9/597: dread - d6/d15/d3c/d4b/d90/fb1 zero size 2026-03-10T08:55:48.346 INFO:tasks.workunit.client.0.vm05.stdout:9/598: chown d6/d15/d3c/d4b/f76 419817752 1 2026-03-10T08:55:48.366 INFO:tasks.workunit.client.0.vm05.stdout:7/584: dwrite d18/d38/d43/f8c [0,4194304] 0 2026-03-10T08:55:48.368 INFO:tasks.workunit.client.1.vm08.stdout:8/858: link d1/dd9/l104 d1/d10/d9/dd/d25/l13d 0 2026-03-10T08:55:48.371 INFO:tasks.workunit.client.1.vm08.stdout:9/795: dwrite d2/dd/d15/d1e/d25/f5f [0,4194304] 0 2026-03-10T08:55:48.384 INFO:tasks.workunit.client.1.vm08.stdout:9/796: write d2/d41/d4c/f62 [718700,27247] 0 2026-03-10T08:55:48.384 INFO:tasks.workunit.client.1.vm08.stdout:9/797: chown d2/dd/d15 31 1 2026-03-10T08:55:48.384 INFO:tasks.workunit.client.1.vm08.stdout:7/841: unlink d0/d11/d1f/d29/d3d/d40/c63 0 2026-03-10T08:55:48.384 INFO:tasks.workunit.client.1.vm08.stdout:2/896: dwrite d1/da/d10/d42/d93/d1e/f49 [0,4194304] 0 2026-03-10T08:55:48.384 INFO:tasks.workunit.client.1.vm08.stdout:0/760: mknod d6/dd/d13/d17/d1f/d20/d2f/d57/dd5/cf9 0 2026-03-10T08:55:48.384 INFO:tasks.workunit.client.1.vm08.stdout:0/761: fsync d6/dd/d13/d32/ff7 0 2026-03-10T08:55:48.388 INFO:tasks.workunit.client.1.vm08.stdout:2/897: dwrite d1/d5b/dc5/f10f [0,4194304] 0 2026-03-10T08:55:48.401 INFO:tasks.workunit.client.1.vm08.stdout:5/739: unlink d0/d11/d3e/f4d 0 2026-03-10T08:55:48.404 INFO:tasks.workunit.client.1.vm08.stdout:5/740: dwrite d0/d11/d3e/fdd [0,4194304] 0 2026-03-10T08:55:48.409 INFO:tasks.workunit.client.0.vm05.stdout:5/565: read d5/fc [1694986,97259] 0 2026-03-10T08:55:48.411 INFO:tasks.workunit.client.0.vm05.stdout:1/676: dread dd/d10/fb0 [0,4194304] 0 2026-03-10T08:55:48.429 
INFO:tasks.workunit.client.0.vm05.stdout:4/634: truncate d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/f70 4087306 0 2026-03-10T08:55:48.429 INFO:tasks.workunit.client.0.vm05.stdout:6/648: getdents d4/d7/d10/d8f 0 2026-03-10T08:55:48.429 INFO:tasks.workunit.client.1.vm08.stdout:7/842: fsync d0/d11/d4a/fa5 0 2026-03-10T08:55:48.430 INFO:tasks.workunit.client.0.vm05.stdout:1/677: sync 2026-03-10T08:55:48.455 INFO:tasks.workunit.client.1.vm08.stdout:2/898: read d1/da/d10/d42/d93/d22/fbc [3146132,26365] 0 2026-03-10T08:55:48.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:48.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr fail", "who": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:55:48.464 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "mgr fail", "who": "vm05.rxwgjc"}]': finished 2026-03-10T08:55:48.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:48 vm05.local ceph-mon[49713]: mgrmap e19: vm08.rpongu(active, starting, since 0.0194546s) 2026-03-10T08:55:48.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.525+0000 7ff40b9ee700 1 -- 192.168.123.105:0/2594441100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4041086f0 msgr2=0x7ff40410f0d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:48.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.525+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/2594441100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4041086f0 0x7ff40410f0d0 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7ff3f4009b00 tx=0x7ff3f4009e10 comp rx=0 tx=0).stop 2026-03-10T08:55:48.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 -- 192.168.123.105:0/2594441100 shutdown_connections 2026-03-10T08:55:48.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/2594441100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff4041086f0 0x7ff40410f0d0 unknown :-1 s=CLOSED pgs=327 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/2594441100 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff404107d90 0x7ff4041081b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:48.527 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 -- 192.168.123.105:0/2594441100 >> 192.168.123.105:0/2594441100 conn(0x7ff40406dda0 msgr2=0x7ff404070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 -- 192.168.123.105:0/2594441100 shutdown_connections 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 -- 192.168.123.105:0/2594441100 wait complete. 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.526+0000 7ff40b9ee700 1 Processor -- start 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff40b9ee700 1 -- start start 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff40b9ee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff404107d90 0x7ff4041a55d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff40b9ee700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4041086f0 0x7ff4041a5b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff40b9ee700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4041a6130 con 0x7ff404107d90 2026-03-10T08:55:48.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff40b9ee700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4041a6270 con 0x7ff4041086f0 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff408f89700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4041086f0 0x7ff4041a5b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff408f89700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4041086f0 0x7ff4041a5b10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:51558/0 (socket says 192.168.123.105:51558) 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.527+0000 7ff408f89700 1 -- 192.168.123.105:0/4182131182 learned_addr learned my addr 192.168.123.105:0/4182131182 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.528+0000 7ff408f89700 1 -- 192.168.123.105:0/4182131182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff404107d90 msgr2=0x7ff4041a55d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.528+0000 7ff408f89700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff404107d90 0x7ff4041a55d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.528+0000 7ff408f89700 1 -- 192.168.123.105:0/4182131182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3f40097e0 con 0x7ff4041086f0 2026-03-10T08:55:48.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.528+0000 7ff408f89700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7ff4041086f0 0x7ff4041a5b10 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7ff3f400bb70 tx=0x7ff3f4004690 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:48.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.528+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3f401d070 con 0x7ff4041086f0 2026-03-10T08:55:48.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.529+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff4041aacc0 con 0x7ff4041086f0 2026-03-10T08:55:48.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.529+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff4041ab1b0 con 0x7ff4041086f0 2026-03-10T08:55:48.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.529+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff3f4022470 con 0x7ff4041086f0 2026-03-10T08:55:48.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.529+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3f400f740 con 0x7ff4041086f0 2026-03-10T08:55:48.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.530+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 45072+0+0 (secure 0 0 0) 0x7ff3f400f960 con 0x7ff4041086f0 2026-03-10T08:55:48.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.530+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 5 ==== 
osd_map(40..40 src has 1..40) v4 ==== 5534+0+0 (secure 0 0 0) 0x7ff3f404e2e0 con 0x7ff4041086f0 2026-03-10T08:55:48.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.531+0000 7ff3effff700 1 -- 192.168.123.105:0/4182131182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3e8005320 con 0x7ff4041086f0 2026-03-10T08:55:48.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.535+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff3f4027070 con 0x7ff4041086f0 2026-03-10T08:55:48.570 INFO:tasks.workunit.client.0.vm05.stdout:5/566: chown d5/d86/d24/d2c/d41/c97 136653427 1 2026-03-10T08:55:48.599 INFO:tasks.workunit.client.1.vm08.stdout:3/789: dread - d4/d15/d8/d2c/d89/fb4 zero size 2026-03-10T08:55:48.603 INFO:tasks.workunit.client.1.vm08.stdout:6/827: dwrite d9/d10/d1e/d92/faf [0,4194304] 0 2026-03-10T08:55:48.604 INFO:tasks.workunit.client.1.vm08.stdout:6/828: write d9/d10/d1e/f91 [1219624,7984] 0 2026-03-10T08:55:48.609 INFO:tasks.workunit.client.0.vm05.stdout:3/706: dwrite d9/d2b/d2f/f3f [0,4194304] 0 2026-03-10T08:55:48.619 INFO:tasks.workunit.client.1.vm08.stdout:4/863: dwrite d5/d23/d36/d99/db2/d5d/dae/fd1 [0,4194304] 0 2026-03-10T08:55:48.622 INFO:tasks.workunit.client.0.vm05.stdout:8/625: getdents d2/dd/d2c 0 2026-03-10T08:55:48.623 INFO:tasks.workunit.client.0.vm05.stdout:8/626: write d2/dd/d2c/d2e/d31/d3e/f73 [5401834,19130] 0 2026-03-10T08:55:48.639 INFO:tasks.workunit.client.1.vm08.stdout:1/825: dwrite d1/da/de/d5c/f8a [0,4194304] 0 2026-03-10T08:55:48.646 INFO:tasks.workunit.client.1.vm08.stdout:7/843: mknod d0/d11/d1f/d29/d3d/d40/c10f 0 2026-03-10T08:55:48.651 INFO:tasks.workunit.client.1.vm08.stdout:2/899: mknod d1/d5b/d66/df6/c12a 0 2026-03-10T08:55:48.654 INFO:tasks.workunit.client.0.vm05.stdout:9/599: 
link d6/d19/d2a/dbc/cc4 d6/d19/d2a/dbc/cc8 0 2026-03-10T08:55:48.654 INFO:tasks.workunit.client.0.vm05.stdout:5/567: fdatasync d5/d86/d66/f94 0 2026-03-10T08:55:48.654 INFO:tasks.workunit.client.1.vm08.stdout:8/859: dwrite d1/d10/d9/dd/d18/d34/f57 [4194304,4194304] 0 2026-03-10T08:55:48.686 INFO:tasks.workunit.client.1.vm08.stdout:9/798: dwrite d2/dd/d15/d1e/d24/f30 [0,4194304] 0 2026-03-10T08:55:48.703 INFO:tasks.workunit.client.0.vm05.stdout:0/617: write df/d1f/d85/d19/d39/d74/f71 [3111490,70712] 0 2026-03-10T08:55:48.704 INFO:tasks.workunit.client.0.vm05.stdout:0/618: chown df/d1f/c3e 57711480 1 2026-03-10T08:55:48.704 INFO:tasks.workunit.client.0.vm05.stdout:0/619: fdatasync df/d1f/d85/d2b/d65/d6e/d96/f7e 0 2026-03-10T08:55:48.709 INFO:tasks.workunit.client.1.vm08.stdout:6/829: dread - d9/dc/d11/d23/d2c/d7a/dce/fb9 zero size 2026-03-10T08:55:48.709 INFO:tasks.workunit.client.0.vm05.stdout:2/555: getdents d0/d9/d1e/d20/d21/d45/d4b/d70 0 2026-03-10T08:55:48.734 INFO:tasks.workunit.client.0.vm05.stdout:6/649: link d4/d7/d10/d15/d20/f64 d4/d2c/d84/d4a/dd5/fd7 0 2026-03-10T08:55:48.735 INFO:tasks.workunit.client.0.vm05.stdout:6/650: dread - d4/d7/d10/d15/d1b/d22/fcf zero size 2026-03-10T08:55:48.739 INFO:tasks.workunit.client.0.vm05.stdout:0/620: sync 2026-03-10T08:55:48.740 INFO:tasks.workunit.client.0.vm05.stdout:0/621: readlink df/d1f/d85/d2b/d27/d32/l7f 0 2026-03-10T08:55:48.747 INFO:tasks.workunit.client.1.vm08.stdout:1/826: creat d1/da/de/d24/d26/d86/f11e x:0 0 0 2026-03-10T08:55:48.749 INFO:tasks.workunit.client.1.vm08.stdout:1/827: write d1/da/de/d24/d35/d6d/d116/d9c/f110 [1007370,90521] 0 2026-03-10T08:55:48.749 INFO:tasks.workunit.client.0.vm05.stdout:8/627: getdents d2/dd/d2c/da5 0 2026-03-10T08:55:48.752 INFO:tasks.workunit.client.0.vm05.stdout:2/556: creat d0/d9/d1e/d20/d21/d45/d4b/d70/f9b x:0 0 0 2026-03-10T08:55:48.763 INFO:tasks.workunit.client.1.vm08.stdout:8/860: mknod d1/d10/d9/dd/d25/d27/c13e 0 2026-03-10T08:55:48.764 
INFO:tasks.workunit.client.0.vm05.stdout:2/557: dwrite d0/d9/d7f/d8f/f54 [0,4194304] 0 2026-03-10T08:55:48.764 INFO:tasks.workunit.client.0.vm05.stdout:6/651: symlink d4/d7/d10/d15/ld8 0 2026-03-10T08:55:48.781 INFO:tasks.workunit.client.1.vm08.stdout:8/861: dread d1/d10/d9/dd/d25/f125 [0,4194304] 0 2026-03-10T08:55:48.783 INFO:tasks.workunit.client.0.vm05.stdout:9/600: link d6/d19/d2a/d8d/fc6 d6/d19/d2c/d58/fc9 0 2026-03-10T08:55:48.789 INFO:tasks.workunit.client.1.vm08.stdout:6/830: mknod d9/dc/d11/d23/d2c/c10d 0 2026-03-10T08:55:48.796 INFO:tasks.workunit.client.1.vm08.stdout:7/844: mknod d0/d14/d43/d9d/dfd/c110 0 2026-03-10T08:55:48.805 INFO:tasks.workunit.client.0.vm05.stdout:8/628: fsync d2/dd/d2c/d2e/d93/f9b 0 2026-03-10T08:55:48.806 INFO:tasks.workunit.client.1.vm08.stdout:7/845: dread d0/d14/d43/f58 [0,4194304] 0 2026-03-10T08:55:48.806 INFO:tasks.workunit.client.0.vm05.stdout:8/629: dread - d2/dd/d2c/d2e/d31/d4f/d7b/fd6 zero size 2026-03-10T08:55:48.808 INFO:tasks.workunit.client.0.vm05.stdout:8/630: dread d2/d45/f61 [0,4194304] 0 2026-03-10T08:55:48.811 INFO:tasks.workunit.client.0.vm05.stdout:8/631: dread d2/dd/d2c/d2e/d93/f9b [0,4194304] 0 2026-03-10T08:55:48.820 INFO:tasks.workunit.client.0.vm05.stdout:6/652: rename d4/d7/d10/d15/d20/c5b to d4/d2c/dc8/cd9 0 2026-03-10T08:55:48.833 INFO:tasks.workunit.client.1.vm08.stdout:4/864: link d5/d23/d36/d99/db2/d5a/d69/c11c d5/d23/d36/d99/dc6/dc8/d120/c134 0 2026-03-10T08:55:48.834 INFO:tasks.workunit.client.0.vm05.stdout:6/653: mknod d4/d2c/d84/d4a/cda 0 2026-03-10T08:55:48.837 INFO:tasks.workunit.client.1.vm08.stdout:4/865: mkdir d5/d23/d36/d99/db2/d5a/d69/d11b/def/df2/d135 0 2026-03-10T08:55:48.842 INFO:tasks.workunit.client.0.vm05.stdout:2/558: getdents d0/d9 0 2026-03-10T08:55:48.846 INFO:tasks.workunit.client.1.vm08.stdout:4/866: creat d5/d23/d36/d99/db2/d5d/dae/ddf/f136 x:0 0 0 2026-03-10T08:55:48.853 INFO:tasks.workunit.client.0.vm05.stdout:6/654: creat d4/d2d/d51/d87/fdb x:0 0 0 2026-03-10T08:55:48.857 
INFO:tasks.workunit.client.1.vm08.stdout:4/867: fsync d5/d23/f27 0 2026-03-10T08:55:48.857 INFO:tasks.workunit.client.1.vm08.stdout:4/868: chown d5/d23/d49/d8f 52989280 1 2026-03-10T08:55:48.871 INFO:tasks.workunit.client.1.vm08.stdout:4/869: link d5/d23/l30 d5/d23/d36/d99/db2/d5a/d69/d11b/l137 0 2026-03-10T08:55:48.874 INFO:tasks.workunit.client.1.vm08.stdout:4/870: dread - d5/d23/d36/d99/dc6/fee zero size 2026-03-10T08:55:48.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:48.881+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 20) v1 ==== 49900+0+0 (secure 0 0 0) 0x7ff3f4017780 con 0x7ff4041086f0 2026-03-10T08:55:48.987 INFO:tasks.workunit.client.1.vm08.stdout:1/828: creat d1/da/d20/d3f/f11f x:0 0 0 2026-03-10T08:55:49.074 INFO:tasks.workunit.client.0.vm05.stdout:4/635: dwrite d0/d78/f87 [0,4194304] 0 2026-03-10T08:55:49.107 INFO:tasks.workunit.client.1.vm08.stdout:5/741: write d0/d11/d27/d68/d7c/d4b/d4e/d84/fa9 [962131,107361] 0 2026-03-10T08:55:49.111 INFO:tasks.workunit.client.0.vm05.stdout:1/678: write dd/d21/d37/d45/f47 [1952796,90994] 0 2026-03-10T08:55:49.116 INFO:tasks.workunit.client.1.vm08.stdout:5/742: link d0/d1b/d67/d80/c98 d0/d11/d18/d52/ce0 0 2026-03-10T08:55:49.120 INFO:tasks.workunit.client.1.vm08.stdout:5/743: truncate d0/d11/d27/d68/d7c/d4b/d87/db5/fcd 740381 0 2026-03-10T08:55:49.135 INFO:tasks.workunit.client.0.vm05.stdout:3/707: write d9/f4a [4513169,6146] 0 2026-03-10T08:55:49.139 INFO:tasks.workunit.client.0.vm05.stdout:3/708: dwrite d9/d2b/d2f/f3f [4194304,4194304] 0 2026-03-10T08:55:49.140 INFO:tasks.workunit.client.0.vm05.stdout:3/709: fsync d9/d2b/d3a/d6c/dbf/f7f 0 2026-03-10T08:55:49.143 INFO:tasks.workunit.client.0.vm05.stdout:1/679: sync 2026-03-10T08:55:49.166 INFO:tasks.workunit.client.0.vm05.stdout:1/680: creat dd/d21/d37/ff5 x:0 0 0 2026-03-10T08:55:49.181 INFO:tasks.workunit.client.0.vm05.stdout:2/559: rename d0/d9/d1e/f34 to d0/d9/d1e/d20/d21/d45/d4b/f9c 0 
2026-03-10T08:55:49.185 INFO:tasks.workunit.client.0.vm05.stdout:6/655: rename d4/d2d/d51/l82 to d4/d2d/d51/d62/ldc 0 2026-03-10T08:55:49.187 INFO:tasks.workunit.client.0.vm05.stdout:6/656: read d4/d7/f34 [7725829,130511] 0 2026-03-10T08:55:49.208 INFO:tasks.workunit.client.1.vm08.stdout:3/790: rename d4/d15/d8/d1d/lcf to d4/l10e 0 2026-03-10T08:55:49.216 INFO:tasks.workunit.client.0.vm05.stdout:2/560: creat d0/d9/d1e/d20/d21/d45/d4b/d70/f9d x:0 0 0 2026-03-10T08:55:49.221 INFO:tasks.workunit.client.1.vm08.stdout:5/744: creat d0/d11/d27/fe1 x:0 0 0 2026-03-10T08:55:49.226 INFO:tasks.workunit.client.0.vm05.stdout:2/561: symlink d0/d9/d7f/d8f/d7a/l9e 0 2026-03-10T08:55:49.228 INFO:tasks.workunit.client.1.vm08.stdout:2/900: write d1/d5b/d66/f9a [853485,90369] 0 2026-03-10T08:55:49.234 INFO:tasks.workunit.client.0.vm05.stdout:2/562: mkdir d0/d55/d9f 0 2026-03-10T08:55:49.235 INFO:tasks.workunit.client.0.vm05.stdout:2/563: dread - d0/d9/d89/f93 zero size 2026-03-10T08:55:49.236 INFO:tasks.workunit.client.0.vm05.stdout:2/564: read d0/f36 [1320739,28647] 0 2026-03-10T08:55:49.242 INFO:tasks.workunit.client.1.vm08.stdout:3/791: fsync d4/f44 0 2026-03-10T08:55:49.244 INFO:tasks.workunit.client.0.vm05.stdout:0/622: dwrite df/d59/f3f [0,4194304] 0 2026-03-10T08:55:49.250 INFO:tasks.workunit.client.1.vm08.stdout:5/745: mkdir d0/d11/d18/d52/de2 0 2026-03-10T08:55:49.258 INFO:tasks.workunit.client.1.vm08.stdout:9/799: dwrite d2/dd/d15/d1e/d25/d32/d5c/f7f [4194304,4194304] 0 2026-03-10T08:55:49.259 INFO:tasks.workunit.client.0.vm05.stdout:0/623: mkdir df/d1f/d85/d19/d47/d84/dae/db3 0 2026-03-10T08:55:49.264 INFO:tasks.workunit.client.1.vm08.stdout:0/762: link d6/dd/d13/d17/d1f/f48 d6/dd/ffa 0 2026-03-10T08:55:49.271 INFO:tasks.workunit.client.0.vm05.stdout:0/624: unlink df/f1d 0 2026-03-10T08:55:49.274 INFO:tasks.workunit.client.1.vm08.stdout:6/831: dwrite d9/d50/d95/f99 [0,4194304] 0 2026-03-10T08:55:49.276 INFO:tasks.workunit.client.1.vm08.stdout:3/792: creat 
d4/d15/d8/d1d/d4f/f10f x:0 0 0 2026-03-10T08:55:49.298 INFO:tasks.workunit.client.0.vm05.stdout:0/625: readlink df/l6d 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.0.vm05.stdout:0/626: creat df/d1f/d85/d19/d47/d84/d8a/fb4 x:0 0 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.0.vm05.stdout:4/636: mkdir d0/d2e/dca 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.1.vm08.stdout:6/832: dwrite d9/dc/d11/d23/d2c/f49 [0,4194304] 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.1.vm08.stdout:2/901: fdatasync d1/da/d78/df5/f126 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.1.vm08.stdout:0/763: symlink d6/dd/d13/d17/d1f/d20/d2f/d57/lfb 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.1.vm08.stdout:9/800: truncate d2/d41/ff3 261297 0 2026-03-10T08:55:49.299 INFO:tasks.workunit.client.1.vm08.stdout:0/764: dwrite d6/dd/d13/d32/ff7 [0,4194304] 0 2026-03-10T08:55:49.302 INFO:tasks.workunit.client.0.vm05.stdout:8/632: write d2/dd/d2c/d2e/d31/d4c/d63/f6c [2641993,70068] 0 2026-03-10T08:55:49.309 INFO:tasks.workunit.client.1.vm08.stdout:8/862: write d1/d10/d9/dd/d25/d27/d44/fb0 [806096,46791] 0 2026-03-10T08:55:49.309 INFO:tasks.workunit.client.0.vm05.stdout:8/633: write d2/db/f9a [1591609,18971] 0 2026-03-10T08:55:49.312 INFO:tasks.workunit.client.0.vm05.stdout:2/565: dread d0/d9/d1e/d20/d21/d45/f68 [0,4194304] 0 2026-03-10T08:55:49.314 INFO:tasks.workunit.client.1.vm08.stdout:3/793: rmdir d4/d15/d8/d2c/d6d/dfa/d100 39 2026-03-10T08:55:49.314 INFO:tasks.workunit.client.1.vm08.stdout:5/746: sync 2026-03-10T08:55:49.319 INFO:tasks.workunit.client.0.vm05.stdout:0/627: rename df/d1f/d85/d2b/d27/d32/d4e/fb2 to df/d1f/d85/fb5 0 2026-03-10T08:55:49.323 INFO:tasks.workunit.client.1.vm08.stdout:6/833: symlink d9/dc/d11/d23/d2c/dc0/l10e 0 2026-03-10T08:55:49.346 INFO:tasks.workunit.client.1.vm08.stdout:0/765: mkdir d6/dd/d13/d17/d1f/d2d/d85/dfc 0 2026-03-10T08:55:49.346 INFO:tasks.workunit.client.0.vm05.stdout:4/637: rename d0/l11 to d0/d2e/dca/lcb 0 
2026-03-10T08:55:49.346 INFO:tasks.workunit.client.0.vm05.stdout:4/638: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f6b [4778378,18233] 0 2026-03-10T08:55:49.347 INFO:tasks.workunit.client.0.vm05.stdout:2/566: symlink d0/d9/d1e/d20/d21/d45/d4b/d8d/la0 0 2026-03-10T08:55:49.350 INFO:tasks.workunit.client.1.vm08.stdout:5/747: sync 2026-03-10T08:55:49.356 INFO:tasks.workunit.client.1.vm08.stdout:5/748: dwrite d0/d11/d27/d68/d7c/fd6 [0,4194304] 0 2026-03-10T08:55:49.358 INFO:tasks.workunit.client.1.vm08.stdout:3/794: creat d4/d6f/d85/f110 x:0 0 0 2026-03-10T08:55:49.369 INFO:tasks.workunit.client.1.vm08.stdout:3/795: chown d4/d15/d8/d2c/d6d/dfa 12618 1 2026-03-10T08:55:49.375 INFO:tasks.workunit.client.0.vm05.stdout:0/628: mknod df/d1f/d85/d19/d47/d84/dae/db3/cb6 0 2026-03-10T08:55:49.375 INFO:tasks.workunit.client.0.vm05.stdout:0/629: dread - df/d1f/d85/d19/d39/f42 zero size 2026-03-10T08:55:49.377 INFO:tasks.workunit.client.1.vm08.stdout:0/766: creat d6/dd/d13/d17/d1f/d2d/d85/d93/ffd x:0 0 0 2026-03-10T08:55:49.378 INFO:tasks.workunit.client.1.vm08.stdout:0/767: chown d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 29 1 2026-03-10T08:55:49.392 INFO:tasks.workunit.client.1.vm08.stdout:3/796: fdatasync d4/f97 0 2026-03-10T08:55:49.394 INFO:tasks.workunit.client.1.vm08.stdout:0/768: rename d6/dd/d13/d61/f86 to d6/dd/d13/d8f/ffe 0 2026-03-10T08:55:49.401 INFO:tasks.workunit.client.1.vm08.stdout:0/769: dread d6/f5f [0,4194304] 0 2026-03-10T08:55:49.415 INFO:tasks.workunit.client.1.vm08.stdout:3/797: dread - d4/d15/d8/d2c/d9b/d79/d20/fe5 zero size 2026-03-10T08:55:49.415 INFO:tasks.workunit.client.1.vm08.stdout:9/801: link d2/dd/d15/d1e/c7d d2/d54/d8e/da6/dd0/dc8/c10b 0 2026-03-10T08:55:49.419 INFO:tasks.workunit.client.1.vm08.stdout:9/802: mkdir d2/d54/d8e/da6/dd0/dc8/de1/d10c 0 2026-03-10T08:55:49.425 INFO:tasks.workunit.client.1.vm08.stdout:0/770: link d6/dd/d13/d17/d1f/f48 d6/dd/d13/d17/d1f/d20/fff 0 2026-03-10T08:55:49.428 INFO:tasks.workunit.client.1.vm08.stdout:1/829: write 
d1/da/de/f118 [283117,27522] 0 2026-03-10T08:55:49.430 INFO:tasks.workunit.client.0.vm05.stdout:5/568: mkdir d5/dcf 0 2026-03-10T08:55:49.431 INFO:tasks.workunit.client.1.vm08.stdout:0/771: creat d6/dd/d13/d17/d1f/d20/f100 x:0 0 0 2026-03-10T08:55:49.432 INFO:tasks.workunit.client.0.vm05.stdout:7/585: mknod d18/cb7 0 2026-03-10T08:55:49.434 INFO:tasks.workunit.client.0.vm05.stdout:7/586: truncate d18/d66/d25/d2e/d42/f52 4649880 0 2026-03-10T08:55:49.438 INFO:tasks.workunit.client.0.vm05.stdout:7/587: dwrite d18/fb1 [0,4194304] 0 2026-03-10T08:55:49.443 INFO:tasks.workunit.client.0.vm05.stdout:8/634: rename d2/dd/d2c/d2e/d31/d4c to d2/dd/d2c/d2e/d31/d3e/dde 0 2026-03-10T08:55:49.444 INFO:tasks.workunit.client.1.vm08.stdout:0/772: dread - d6/d8b/f94 zero size 2026-03-10T08:55:49.450 INFO:tasks.workunit.client.1.vm08.stdout:0/773: mknod d6/dd/d13/d17/d1f/d20/d2f/d57/dd5/c101 0 2026-03-10T08:55:49.467 INFO:tasks.workunit.client.1.vm08.stdout:1/830: rename d1/da/de/d24/d3d/d40/d56/d7a/cb7 to d1/da/d20/c120 0 2026-03-10T08:55:49.479 INFO:tasks.workunit.client.0.vm05.stdout:8/635: creat d2/dd/d2c/d2e/d31/db4/fdf x:0 0 0 2026-03-10T08:55:49.486 INFO:tasks.workunit.client.0.vm05.stdout:8/636: dread - d2/dd/d2c/d2e/d31/d4f/d7b/fd8 zero size 2026-03-10T08:55:49.487 INFO:tasks.workunit.client.0.vm05.stdout:9/601: rmdir d6/d19/d2c/db3 0 2026-03-10T08:55:49.487 INFO:tasks.workunit.client.1.vm08.stdout:0/774: link d6/dd/d13/d17/d1f/d20/d2f/d24/fab d6/dd/d13/d61/d6f/f102 0 2026-03-10T08:55:49.487 INFO:tasks.workunit.client.1.vm08.stdout:0/775: mknod d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/c103 0 2026-03-10T08:55:49.487 INFO:tasks.workunit.client.1.vm08.stdout:0/776: write d6/dd/d13/d17/d1f/d20/f100 [9100,116180] 0 2026-03-10T08:55:49.487 INFO:tasks.workunit.client.0.vm05.stdout:8/637: creat d2/dd/d2c/d2e/d31/d3e/dde/d63/fe0 x:0 0 0 2026-03-10T08:55:49.489 INFO:tasks.workunit.client.0.vm05.stdout:3/710: write d9/d4d/d51/d64/f9e [996041,26699] 0 2026-03-10T08:55:49.489 
INFO:tasks.workunit.client.0.vm05.stdout:5/569: sync 2026-03-10T08:55:49.492 INFO:tasks.workunit.client.1.vm08.stdout:1/831: sync 2026-03-10T08:55:49.496 INFO:tasks.workunit.client.1.vm08.stdout:0/777: mknod d6/dd/d13/d17/d1f/d20/d2f/d26/d56/c104 0 2026-03-10T08:55:49.497 INFO:tasks.workunit.client.1.vm08.stdout:0/778: fdatasync d6/dd/d13/d17/d1f/d20/d2f/d24/fed 0 2026-03-10T08:55:49.501 INFO:tasks.workunit.client.1.vm08.stdout:1/832: rename d1/da/de/d24/d3d/d40/de2 to d1/da/de/d24/d81/d121 0 2026-03-10T08:55:49.504 INFO:tasks.workunit.client.0.vm05.stdout:3/711: mkdir d9/d2b/d3a/dd6 0 2026-03-10T08:55:49.506 INFO:tasks.workunit.client.1.vm08.stdout:2/902: unlink d1/da/d10/d42/d93/f8f 0 2026-03-10T08:55:49.507 INFO:tasks.workunit.client.1.vm08.stdout:0/779: truncate d6/dd/d13/d17/d1f/d20/f3e 3497865 0 2026-03-10T08:55:49.509 INFO:tasks.workunit.client.0.vm05.stdout:8/638: truncate d2/db/d1f/d67/f75 157276 0 2026-03-10T08:55:49.513 INFO:tasks.workunit.client.0.vm05.stdout:1/681: dwrite dd/d10/d18/d2d/d51/f6b [0,4194304] 0 2026-03-10T08:55:49.526 INFO:tasks.workunit.client.1.vm08.stdout:2/903: fsync d1/da/d10/d2d/fa2 0 2026-03-10T08:55:49.528 INFO:tasks.workunit.client.0.vm05.stdout:8/639: read d2/dd/d2c/d2e/f3b [6741770,95350] 0 2026-03-10T08:55:49.529 INFO:tasks.workunit.client.0.vm05.stdout:8/640: stat d2/dd/d2c/d2e/d31/d3e/dde/lc6 0 2026-03-10T08:55:49.533 INFO:tasks.workunit.client.1.vm08.stdout:0/780: mknod d6/dd/d13/d61/dc7/dc8/dde/c105 0 2026-03-10T08:55:49.538 INFO:tasks.workunit.client.0.vm05.stdout:6/657: dwrite d4/d7/f5d [0,4194304] 0 2026-03-10T08:55:49.554 INFO:tasks.workunit.client.0.vm05.stdout:8/641: sync 2026-03-10T08:55:49.560 INFO:tasks.workunit.client.0.vm05.stdout:4/639: rename d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fa5 to d0/d2e/d42/d45/fcc 0 2026-03-10T08:55:49.564 INFO:tasks.workunit.client.0.vm05.stdout:4/640: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/f76 [0,4194304] 0 2026-03-10T08:55:49.566 INFO:tasks.workunit.client.0.vm05.stdout:1/682: 
symlink dd/d10/d18/dd5/da9/lf6 0 2026-03-10T08:55:49.566 INFO:tasks.workunit.client.0.vm05.stdout:1/683: chown dd/d10/d19/d27/l2b 102559 1 2026-03-10T08:55:49.578 INFO:tasks.workunit.client.0.vm05.stdout:6/658: readlink d4/d8d/lbb 0 2026-03-10T08:55:49.581 INFO:tasks.workunit.client.1.vm08.stdout:0/781: creat d6/dd/d13/d61/dc7/dc8/dde/f106 x:0 0 0 2026-03-10T08:55:49.581 INFO:tasks.workunit.client.1.vm08.stdout:2/904: dread d1/da/d10/d1b/fac [0,4194304] 0 2026-03-10T08:55:49.585 INFO:tasks.workunit.client.0.vm05.stdout:8/642: mknod d2/dd/d2c/d2e/d31/d4f/ce1 0 2026-03-10T08:55:49.586 INFO:tasks.workunit.client.0.vm05.stdout:8/643: chown d2/db/d1f/c25 8 1 2026-03-10T08:55:49.586 INFO:tasks.workunit.client.1.vm08.stdout:7/846: dwrite d0/d11/d1f/d29/d3d/d40/f38 [4194304,4194304] 0 2026-03-10T08:55:49.593 INFO:tasks.workunit.client.1.vm08.stdout:7/847: dread d0/d11/d4a/d5e/fed [0,4194304] 0 2026-03-10T08:55:49.595 INFO:tasks.workunit.client.0.vm05.stdout:2/567: rename d0/d9/d7f/d8f/f67 to d0/d9/d7f/d8f/d7a/fa1 0 2026-03-10T08:55:49.599 INFO:tasks.workunit.client.0.vm05.stdout:2/568: dwrite d0/d9/d1e/d20/d21/f35 [0,4194304] 0 2026-03-10T08:55:49.601 INFO:tasks.workunit.client.0.vm05.stdout:2/569: write d0/d9/d1e/d20/d21/d45/d4b/d70/f9d [936020,8697] 0 2026-03-10T08:55:49.605 INFO:tasks.workunit.client.0.vm05.stdout:2/570: dwrite d0/d9/d1e/d20/d21/d45/d4b/d70/f9d [0,4194304] 0 2026-03-10T08:55:49.608 INFO:tasks.workunit.client.0.vm05.stdout:2/571: chown d0/d9/d7f/d8f/l3c 0 1 2026-03-10T08:55:49.611 INFO:tasks.workunit.client.0.vm05.stdout:2/572: chown d0/l28 19550226 1 2026-03-10T08:55:49.633 INFO:tasks.workunit.client.0.vm05.stdout:4/641: symlink d0/d1d/lcd 0 2026-03-10T08:55:49.635 INFO:tasks.workunit.client.1.vm08.stdout:2/905: mknod d1/d5b/dc5/c12b 0 2026-03-10T08:55:49.636 INFO:tasks.workunit.client.1.vm08.stdout:2/906: read d1/da/d10/d42/f89 [1011382,22916] 0 2026-03-10T08:55:49.636 INFO:tasks.workunit.client.1.vm08.stdout:2/907: fdatasync d1/da/d10/d42/f79 0 
2026-03-10T08:55:49.648 INFO:tasks.workunit.client.1.vm08.stdout:7/848: unlink d0/f7a 0 2026-03-10T08:55:49.649 INFO:tasks.workunit.client.1.vm08.stdout:7/849: readlink d0/d14/d43/d62/ldf 0 2026-03-10T08:55:49.649 INFO:tasks.workunit.client.0.vm05.stdout:6/659: mknod d4/d7/dc4/cdd 0 2026-03-10T08:55:49.652 INFO:tasks.workunit.client.1.vm08.stdout:0/782: symlink d6/dd/d13/d17/d1f/d20/l107 0 2026-03-10T08:55:49.654 INFO:tasks.workunit.client.0.vm05.stdout:8/644: mkdir d2/dd/d2c/d2e/d31/d4f/d80/de2 0 2026-03-10T08:55:49.655 INFO:tasks.workunit.client.1.vm08.stdout:8/863: dwrite d1/d10/d9/dd/d25/f6e [0,4194304] 0 2026-03-10T08:55:49.663 INFO:tasks.workunit.client.1.vm08.stdout:6/834: dwrite d9/dc/d11/d106/fa8 [0,4194304] 0 2026-03-10T08:55:49.676 INFO:tasks.workunit.client.0.vm05.stdout:9/602: rename d6/d19/d2a/d4a/l79 to d6/d12/d3a/d48/lca 0 2026-03-10T08:55:49.677 INFO:tasks.workunit.client.0.vm05.stdout:3/712: rename d9 to d9/d2b/d3a/d6c/dbe/dd7 22 2026-03-10T08:55:49.678 INFO:tasks.workunit.client.0.vm05.stdout:0/630: write df/d1f/d85/d19/d39/f63 [1022550,127574] 0 2026-03-10T08:55:49.684 INFO:tasks.workunit.client.1.vm08.stdout:4/871: write d5/d23/d36/d99/db2/d5a/d69/d11b/f50 [1997932,23236] 0 2026-03-10T08:55:49.687 INFO:tasks.workunit.client.1.vm08.stdout:5/749: dwrite d0/fc7 [0,4194304] 0 2026-03-10T08:55:49.694 INFO:tasks.workunit.client.1.vm08.stdout:3/798: dwrite d4/d15/d8/d2c/d9b/d79/d8f/fd7 [0,4194304] 0 2026-03-10T08:55:49.698 INFO:tasks.workunit.client.1.vm08.stdout:9/803: dwrite d2/dd/d15/d1e/d25/d32/f45 [0,4194304] 0 2026-03-10T08:55:49.712 INFO:tasks.workunit.client.0.vm05.stdout:7/588: write d18/d66/d25/d2e/d42/f46 [4457259,123664] 0 2026-03-10T08:55:49.713 INFO:tasks.workunit.client.1.vm08.stdout:0/783: mknod d6/dd/d13/d17/d1f/d2d/d39/c108 0 2026-03-10T08:55:49.713 INFO:tasks.workunit.client.1.vm08.stdout:9/804: dwrite d2/dd/d15/d1e/d39/d69/de4/f104 [0,4194304] 0 2026-03-10T08:55:49.721 INFO:tasks.workunit.client.1.vm08.stdout:9/805: dread d2/f77 
[0,4194304] 0 2026-03-10T08:55:49.737 INFO:tasks.workunit.client.0.vm05.stdout:4/642: fsync d0/d2e/d42/d45/d4a/d36/f3d 0 2026-03-10T08:55:49.753 INFO:tasks.workunit.client.0.vm05.stdout:8/645: unlink d2/dd/d2c/d2e/f6a 0 2026-03-10T08:55:49.754 INFO:tasks.workunit.client.0.vm05.stdout:8/646: stat d2/dd/d2c/d2e/d31/d4f/d80/f9f 0 2026-03-10T08:55:49.762 INFO:tasks.workunit.client.0.vm05.stdout:6/660: rename d4/d7/f52 to d4/d7/d10/d1a/d1f/fde 0 2026-03-10T08:55:49.763 INFO:tasks.workunit.client.0.vm05.stdout:6/661: truncate d4/d7/dc4/fca 844955 0 2026-03-10T08:55:49.763 INFO:tasks.workunit.client.0.vm05.stdout:6/662: readlink d4/d7/d10/d15/d20/la3 0 2026-03-10T08:55:49.773 INFO:tasks.workunit.client.0.vm05.stdout:3/713: mkdir d9/d8f/d50/d5f/dd8 0 2026-03-10T08:55:49.780 INFO:tasks.workunit.client.1.vm08.stdout:1/833: dwrite d1/fed [0,4194304] 0 2026-03-10T08:55:49.781 INFO:tasks.workunit.client.1.vm08.stdout:1/834: read - d1/da/de/d24/d26/d86/f11e zero size 2026-03-10T08:55:49.784 INFO:tasks.workunit.client.0.vm05.stdout:5/570: dwrite d5/d86/d21/d89/f90 [0,4194304] 0 2026-03-10T08:55:49.795 INFO:tasks.workunit.client.0.vm05.stdout:9/603: mknod d6/d12/d3a/da2/ccb 0 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: Active manager daemon vm08.rpongu restarted 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: Activating manager daemon vm08.rpongu 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: mgrmap e20: vm08.rpongu(active, starting, since 0.012904s) 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.? 
192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: 
from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": 
"mds metadata"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:55:49.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:49 vm08.local ceph-mon[57559]: Manager daemon vm08.rpongu is now available 2026-03-10T08:55:49.807 INFO:tasks.workunit.client.0.vm05.stdout:1/684: write dd/d21/d37/d45/d8d/fae [693554,2232] 0 2026-03-10T08:55:49.814 INFO:tasks.workunit.client.1.vm08.stdout:3/799: symlink d4/d15/d8/d2c/d9b/d79/d8f/l111 0 2026-03-10T08:55:49.829 INFO:tasks.workunit.client.1.vm08.stdout:0/784: fsync d6/dd/d13/d17/d1f/d20/d2f/f59 0 2026-03-10T08:55:49.834 INFO:tasks.workunit.client.0.vm05.stdout:2/573: dwrite d0/f2f [0,4194304] 0 2026-03-10T08:55:49.860 INFO:tasks.workunit.client.0.vm05.stdout:2/574: chown d0/d9/d7f/d8f/f63 1758 1 2026-03-10T08:55:49.860 INFO:tasks.workunit.client.0.vm05.stdout:6/663: unlink d4/d7/d10/d1a/d1f/fde 0 2026-03-10T08:55:49.860 INFO:tasks.workunit.client.1.vm08.stdout:9/806: rename d2/dd/d15/d1e/d24/f30 to d2/dd/d15/d1e/d24/f10d 0 2026-03-10T08:55:49.860 INFO:tasks.workunit.client.1.vm08.stdout:6/835: write 
d9/dc/d11/d23/d2c/f4f [3066743,2498] 0 2026-03-10T08:55:49.860 INFO:tasks.workunit.client.1.vm08.stdout:6/836: dwrite d9/d10/d1e/f91 [0,4194304] 0 2026-03-10T08:55:49.863 INFO:tasks.workunit.client.0.vm05.stdout:9/604: readlink d6/d27/l40 0 2026-03-10T08:55:49.863 INFO:tasks.workunit.client.0.vm05.stdout:9/605: chown d6/d19/d2a/d8d/fc6 876 1 2026-03-10T08:55:49.868 INFO:tasks.workunit.client.1.vm08.stdout:1/835: dwrite d1/da/de/d24/d35/d6d/d116/f71 [0,4194304] 0 2026-03-10T08:55:49.907 INFO:tasks.workunit.client.1.vm08.stdout:0/785: mkdir d6/dd/d13/d17/d1f/d20/d2f/d57/d109 0 2026-03-10T08:55:49.907 INFO:tasks.workunit.client.1.vm08.stdout:6/837: symlink d9/d13/l10f 0 2026-03-10T08:55:49.907 INFO:tasks.workunit.client.0.vm05.stdout:7/589: creat d18/d66/d78/fb8 x:0 0 0 2026-03-10T08:55:49.907 INFO:tasks.workunit.client.0.vm05.stdout:4/643: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/l56 to d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/lce 0 2026-03-10T08:55:49.907 INFO:tasks.workunit.client.0.vm05.stdout:3/714: mkdir d9/d8f/d50/d5f/dd8/dd9 0 2026-03-10T08:55:49.907 INFO:tasks.workunit.client.0.vm05.stdout:5/571: link d5/d86/d21/d89/fbd d5/df/dbb/fd0 0 2026-03-10T08:55:49.911 INFO:tasks.workunit.client.1.vm08.stdout:2/908: getdents d1/d97 0 2026-03-10T08:55:49.912 INFO:tasks.workunit.client.0.vm05.stdout:0/631: link df/d1f/d85/d2b/d27/lac df/d1f/d85/d19/d62/lb7 0 2026-03-10T08:55:49.914 INFO:tasks.workunit.client.1.vm08.stdout:4/872: link d5/d23/d36/f7d d5/d23/d36/d99/db2/d5a/d69/f138 0 2026-03-10T08:55:49.917 INFO:tasks.workunit.client.0.vm05.stdout:1/685: mknod dd/d10/d18/cf7 0 2026-03-10T08:55:49.918 INFO:tasks.workunit.client.0.vm05.stdout:7/590: fsync d18/d66/fae 0 2026-03-10T08:55:49.920 INFO:tasks.workunit.client.1.vm08.stdout:1/836: chown d1/da/de/d5c/fcc 31 1 2026-03-10T08:55:49.927 INFO:tasks.workunit.client.1.vm08.stdout:0/786: symlink d6/dd/d13/d8f/l10a 0 2026-03-10T08:55:49.927 INFO:tasks.workunit.client.1.vm08.stdout:0/787: chown d6/dd/d13/d8f 15404 1 
2026-03-10T08:55:49.931 INFO:tasks.workunit.client.1.vm08.stdout:6/838: truncate d9/d50/de9/ff7 641807 0 2026-03-10T08:55:49.948 INFO:tasks.workunit.client.1.vm08.stdout:2/909: creat d1/da/d78/df5/d11e/f12c x:0 0 0 2026-03-10T08:55:49.948 INFO:tasks.workunit.client.1.vm08.stdout:5/750: getdents d0/d11/d27/d50 0 2026-03-10T08:55:49.948 INFO:tasks.workunit.client.1.vm08.stdout:1/837: read - d1/da/d20/d91/d83/f100 zero size 2026-03-10T08:55:49.948 INFO:tasks.workunit.client.1.vm08.stdout:0/788: mknod d6/dd/d13/d17/d1f/d20/d2f/d24/c10b 0 2026-03-10T08:55:49.948 INFO:tasks.workunit.client.1.vm08.stdout:8/864: getdents d1/d10/d9/dd/d25/d27/d44/d21/d5f 0 2026-03-10T08:55:49.957 INFO:tasks.workunit.client.1.vm08.stdout:3/800: getdents d4/d15/d8/d2c/d9b 0 2026-03-10T08:55:49.959 INFO:tasks.workunit.client.1.vm08.stdout:0/789: symlink d6/dd/d13/d17/d1f/d20/d2f/d57/dd5/l10c 0 2026-03-10T08:55:49.971 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: Active manager daemon vm08.rpongu restarted 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: Activating manager daemon vm08.rpongu 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: mgrmap e20: vm08.rpongu(active, starting, since 0.012904s) 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.? 
192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: 
from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": 
"mds metadata"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:55:49.972 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:49 vm05.local ceph-mon[49713]: Manager daemon vm08.rpongu is now available 2026-03-10T08:55:49.972 INFO:tasks.workunit.client.1.vm08.stdout:8/865: mkdir d1/d10/d9/dd/d25/dca/dc6/d13f 0 2026-03-10T08:55:49.972 INFO:tasks.workunit.client.1.vm08.stdout:6/839: mknod d9/d50/de9/dea/dfc/c110 0 2026-03-10T08:55:49.972 INFO:tasks.workunit.client.1.vm08.stdout:2/910: fdatasync d1/da/d10/d2d/f4d 0 2026-03-10T08:55:49.972 INFO:tasks.workunit.client.1.vm08.stdout:5/751: mknod d0/ce3 0 2026-03-10T08:55:49.972 INFO:tasks.workunit.client.1.vm08.stdout:1/838: mknod d1/da/de/d24/d3d/d40/c122 0 2026-03-10T08:55:49.974 INFO:tasks.workunit.client.1.vm08.stdout:3/801: fsync d4/d15/fa 0 2026-03-10T08:55:49.997 INFO:tasks.workunit.client.1.vm08.stdout:0/790: unlink d6/dd/d13/d17/d1f/d20/d2f/c2a 0 2026-03-10T08:55:49.997 INFO:tasks.workunit.client.1.vm08.stdout:8/866: symlink d1/d10/d9/dd/d25/l140 0 2026-03-10T08:55:49.997 
INFO:tasks.workunit.client.1.vm08.stdout:5/752: rename d0/fc7 to d0/d11/d3e/d45/fe4 0 2026-03-10T08:55:49.997 INFO:tasks.workunit.client.1.vm08.stdout:3/802: creat d4/d15/d8/d2c/d6d/dfa/d100/f112 x:0 0 0 2026-03-10T08:55:50.000 INFO:tasks.workunit.client.1.vm08.stdout:0/791: creat d6/dd/d13/d17/d1f/da3/f10d x:0 0 0 2026-03-10T08:55:50.001 INFO:tasks.workunit.client.1.vm08.stdout:1/839: dread d1/da/de/f79 [0,4194304] 0 2026-03-10T08:55:50.002 INFO:tasks.workunit.client.1.vm08.stdout:3/803: dwrite d4/d6f/fcb [0,4194304] 0 2026-03-10T08:55:50.003 INFO:tasks.workunit.client.1.vm08.stdout:8/867: mkdir d1/d10/d9/dd/d13/d40/d141 0 2026-03-10T08:55:50.005 INFO:tasks.workunit.client.0.vm05.stdout:3/715: mknod d9/d2b/d3a/d43/d71/d86/cda 0 2026-03-10T08:55:50.006 INFO:tasks.workunit.client.1.vm08.stdout:6/840: symlink d9/d10/d1e/d104/l111 0 2026-03-10T08:55:50.007 INFO:tasks.workunit.client.1.vm08.stdout:6/841: write d9/dc/d11/d23/f40 [2769193,1523] 0 2026-03-10T08:55:50.021 INFO:tasks.workunit.client.0.vm05.stdout:7/591: rmdir d18/d66/d25/d2e 39 2026-03-10T08:55:50.032 INFO:tasks.workunit.client.0.vm05.stdout:3/716: symlink d9/d4d/d51/d64/ldb 0 2026-03-10T08:55:50.032 INFO:tasks.workunit.client.0.vm05.stdout:0/632: symlink df/d1f/d85/d19/d62/lb8 0 2026-03-10T08:55:50.032 INFO:tasks.workunit.client.1.vm08.stdout:2/911: dread d1/d43/f4b [0,4194304] 0 2026-03-10T08:55:50.032 INFO:tasks.workunit.client.1.vm08.stdout:1/840: creat d1/da/de/d24/d35/d6d/f123 x:0 0 0 2026-03-10T08:55:50.033 INFO:tasks.workunit.client.1.vm08.stdout:2/912: chown d1/da/d10/d42/d93/d1e/dce/fc8 273160 1 2026-03-10T08:55:50.033 INFO:tasks.workunit.client.1.vm08.stdout:1/841: fsync d1/da/de/d24/d26/d5d/fe1 0 2026-03-10T08:55:50.036 INFO:tasks.workunit.client.1.vm08.stdout:8/868: rmdir d1/d10/d9/d4d/db2 39 2026-03-10T08:55:50.040 INFO:tasks.workunit.client.1.vm08.stdout:8/869: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/f12c [0,4194304] 0 2026-03-10T08:55:50.041 
INFO:tasks.workunit.client.1.vm08.stdout:3/804: rename d4/d15/d8/d2c/d9b/d79/d8f/fd7 to d4/d15/d8/d2c/d9b/d79/d8f/de2/f113 0 2026-03-10T08:55:50.043 INFO:tasks.workunit.client.1.vm08.stdout:8/870: readlink d1/d10/d9/dd/d25/l140 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.0.vm05.stdout:7/592: dread d18/f95 [0,4194304] 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.0.vm05.stdout:3/717: getdents d9/d2b/d2f/d57/dd0 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.0.vm05.stdout:3/718: rmdir d9/d2b/d3a/d6c/dbe 39 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.0.vm05.stdout:7/593: creat d18/d66/d25/fb9 x:0 0 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.0.vm05.stdout:7/594: write d18/d38/d43/d5c/fa7 [541423,37815] 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.1.vm08.stdout:2/913: creat d1/d5b/dc5/f12d x:0 0 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.1.vm08.stdout:2/914: dwrite d1/da/d10/d2d/db6/ff3 [0,4194304] 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.1.vm08.stdout:8/871: truncate d1/d10/d9/dd/d9a/da6/f11d 71466 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.1.vm08.stdout:2/915: creat d1/da/d10/d42/d93/d23/f12e x:0 0 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.1.vm08.stdout:6/842: creat d9/dc/d11/d23/f112 x:0 0 0 2026-03-10T08:55:50.085 INFO:tasks.workunit.client.1.vm08.stdout:2/916: dwrite d1/d97/d11f/de7/f107 [0,4194304] 0 2026-03-10T08:55:50.086 INFO:tasks.workunit.client.1.vm08.stdout:8/872: symlink d1/d10/d9/dd/d25/d27/d44/l142 0 2026-03-10T08:55:50.090 INFO:tasks.workunit.client.0.vm05.stdout:7/595: fsync d18/d66/d25/d2e/f6f 0 2026-03-10T08:55:50.095 INFO:tasks.workunit.client.1.vm08.stdout:6/843: creat d9/dc/d11/d23/f113 x:0 0 0 2026-03-10T08:55:50.112 INFO:tasks.workunit.client.0.vm05.stdout:7/596: dwrite d18/d66/d25/d2e/d42/fa3 [0,4194304] 0 2026-03-10T08:55:50.113 INFO:tasks.workunit.client.1.vm08.stdout:2/917: fdatasync d1/da/d10/d42/d93/d1e/dce/fc8 0 2026-03-10T08:55:50.113 
INFO:tasks.workunit.client.1.vm08.stdout:8/873: readlink d1/d10/d9/d4d/l69 0 2026-03-10T08:55:50.113 INFO:tasks.workunit.client.1.vm08.stdout:8/874: dwrite d1/d10/d9/dd/d25/dca/d128/f131 [0,4194304] 0 2026-03-10T08:55:50.121 INFO:tasks.workunit.client.1.vm08.stdout:2/918: creat d1/da/d10/dca/f12f x:0 0 0 2026-03-10T08:55:50.121 INFO:tasks.workunit.client.0.vm05.stdout:7/597: rename d18/d66/d25/d2e/f6f to d18/d66/d25/d2e/d2f/d6d/fba 0 2026-03-10T08:55:50.125 INFO:tasks.workunit.client.0.vm05.stdout:7/598: fdatasync d18/d66/d25/d2e/d42/f71 0 2026-03-10T08:55:50.128 INFO:tasks.workunit.client.1.vm08.stdout:8/875: creat d1/d10/d9/dd/d25/d27/d44/d97/d7d/f143 x:0 0 0 2026-03-10T08:55:50.139 INFO:tasks.workunit.client.0.vm05.stdout:7/599: creat d18/d66/d25/d2e/d2f/da0/fbb x:0 0 0 2026-03-10T08:55:50.153 INFO:tasks.workunit.client.1.vm08.stdout:0/792: sync 2026-03-10T08:55:50.153 INFO:tasks.workunit.client.1.vm08.stdout:1/842: sync 2026-03-10T08:55:50.153 INFO:tasks.workunit.client.1.vm08.stdout:1/843: chown d1/da/d18/d3a/f57 472317 1 2026-03-10T08:55:50.154 INFO:tasks.workunit.client.1.vm08.stdout:0/793: chown d6/dd/d13/d17/d1f/d2d/d39/f3b 770096721 1 2026-03-10T08:55:50.158 INFO:tasks.workunit.client.1.vm08.stdout:8/876: rename d1/d10/d9/dd/d18/d3c to d1/d10/d9/dd/d25/d27/d144 0 2026-03-10T08:55:50.172 INFO:tasks.workunit.client.1.vm08.stdout:2/919: truncate d1/da/d10/d42/f79 3445238 0 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:8/877: chown d1/d10/d9/dd/d25/d27/d44/d21/d51/dd6/ffa 33247695 1 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:1/844: truncate d1/da/d20/d91/d83/f100 421683 0 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:2/920: unlink d1/da/d10/d42/d93/d23/f12e 0 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:0/794: truncate d6/d8b/faa 2019704 0 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:2/921: symlink d1/da/d10/d1b/d6a/l130 0 2026-03-10T08:55:50.173 
INFO:tasks.workunit.client.1.vm08.stdout:8/878: rename d1/d10/d9/c136 to d1/d10/d9/dd/d25/d27/d44/d21/c145 0 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:2/922: mknod d1/da/d10/d42/d93/d1e/d7b/c131 0 2026-03-10T08:55:50.173 INFO:tasks.workunit.client.1.vm08.stdout:0/795: dread d6/dd/d13/d17/fc6 [0,4194304] 0 2026-03-10T08:55:50.174 INFO:tasks.workunit.client.1.vm08.stdout:8/879: creat d1/d10/d9/dd/d25/d27/d44/d97/d7d/f146 x:0 0 0 2026-03-10T08:55:50.178 INFO:tasks.workunit.client.1.vm08.stdout:0/796: getdents d6/dd/d13/d17/d1f/d2d/d85/d93 0 2026-03-10T08:55:50.179 INFO:tasks.workunit.client.1.vm08.stdout:0/797: chown d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 788 1 2026-03-10T08:55:50.179 INFO:tasks.workunit.client.0.vm05.stdout:0/633: sync 2026-03-10T08:55:50.180 INFO:tasks.workunit.client.0.vm05.stdout:7/600: sync 2026-03-10T08:55:50.180 INFO:tasks.workunit.client.0.vm05.stdout:0/634: stat df/d1f/d85/d19/d47/d84/dae/db3 0 2026-03-10T08:55:50.181 INFO:tasks.workunit.client.0.vm05.stdout:0/635: chown df/d1f/d85/d19/d62/lb8 593 1 2026-03-10T08:55:50.194 INFO:tasks.workunit.client.0.vm05.stdout:7/601: creat d18/d66/d25/d2e/d2f/d6d/fbc x:0 0 0 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.0.vm05.stdout:0/636: readlink df/d59/l69 0 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.0.vm05.stdout:0/637: dread - df/d1f/d85/d2b/f9a zero size 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.0.vm05.stdout:0/638: getdents df/d1f/d85/d19/d5b 0 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.1.vm08.stdout:0/798: creat d6/dd/d13/d17/d1f/d2d/d85/f10e x:0 0 0 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.1.vm08.stdout:0/799: mknod d6/d8b/c10f 0 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.1.vm08.stdout:0/800: chown d6/dd/d13/d17/d50/c8e 48 1 2026-03-10T08:55:50.205 INFO:tasks.workunit.client.1.vm08.stdout:0/801: symlink d6/dd/d13/d17/d1f/d2d/d85/d95/l110 0 2026-03-10T08:55:50.214 INFO:tasks.workunit.client.1.vm08.stdout:0/802: rmdir 
d6/dd/d13/d17/d1f/d20/d2f/d26 39 2026-03-10T08:55:50.215 INFO:tasks.workunit.client.0.vm05.stdout:0/639: symlink df/d1f/d85/d19/d39/d74/d67/d7b/lb9 0 2026-03-10T08:55:50.218 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.217+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mon.1 v2:192.168.123.108:3300/0 8 ==== mgrmap(e 21) v1 ==== 50027+0+0 (secure 0 0 0) 0x7ff3f4028cf0 con 0x7ff4041086f0 2026-03-10T08:55:50.218 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.218+0000 7ff3fa7fc700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff3f0041cc0 0x7ff3f00440b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.218 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.218+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 --> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff3f00445f0 con 0x7ff3f0041cc0 2026-03-10T08:55:50.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.219+0000 7ff40978a700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff3f0041cc0 0x7ff3f00440b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.221+0000 7ff40978a700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff3f0041cc0 0x7ff3f00440b0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7ff400005950 tx=0x7ff4000058e0 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:50.230 INFO:tasks.workunit.client.0.vm05.stdout:5/572: dread 
d5/d86/d24/d2c/d41/f4c [0,4194304] 0 2026-03-10T08:55:50.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.232+0000 7ff3fa7fc700 1 -- 192.168.123.105:0/4182131182 <== mgr.24459 v2:192.168.123.108:6828/865080403 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7ff3f00445f0 con 0x7ff3f0041cc0 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.236+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff3f0041cc0 msgr2=0x7ff3f00440b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.236+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff3f0041cc0 0x7ff3f00440b0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7ff400005950 tx=0x7ff4000058e0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.236+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4041086f0 msgr2=0x7ff4041a5b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.236+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4041086f0 0x7ff4041a5b10 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7ff3f400bb70 tx=0x7ff3f4004690 comp rx=0 tx=0).stop 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 shutdown_connections 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/4182131182 >> 
[v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff3f0041cc0 0x7ff3f00440b0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff404107d90 0x7ff4041a55d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 --2- 192.168.123.105:0/4182131182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff4041086f0 0x7ff4041a5b10 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 >> 192.168.123.105:0/4182131182 conn(0x7ff40406dda0 msgr2=0x7ff40410b940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 shutdown_connections 2026-03-10T08:55:50.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.237+0000 7ff40b9ee700 1 -- 192.168.123.105:0/4182131182 wait complete. 
2026-03-10T08:55:50.240 INFO:tasks.workunit.client.0.vm05.stdout:5/573: dwrite d5/df/dbb/f4a [0,4194304] 0 2026-03-10T08:55:50.245 INFO:tasks.workunit.client.0.vm05.stdout:7/602: sync 2026-03-10T08:55:50.249 INFO:tasks.workunit.client.0.vm05.stdout:5/574: creat d5/d48/d64/d95/dac/fd1 x:0 0 0 2026-03-10T08:55:50.252 INFO:tasks.workunit.client.1.vm08.stdout:0/803: sync 2026-03-10T08:55:50.266 INFO:tasks.workunit.client.1.vm08.stdout:1/845: dread d1/fc [0,4194304] 0 2026-03-10T08:55:50.272 INFO:tasks.workunit.client.1.vm08.stdout:0/804: fsync d6/dd/d13/d17/d1f/d2d/ff4 0 2026-03-10T08:55:50.281 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:55:50.282 INFO:tasks.workunit.client.1.vm08.stdout:0/805: creat d6/dd/d13/d17/d1f/d2d/d85/f111 x:0 0 0 2026-03-10T08:55:50.282 INFO:tasks.workunit.client.0.vm05.stdout:8/647: write d2/dd/d2c/d2e/d31/fc4 [578583,112909] 0 2026-03-10T08:55:50.282 INFO:tasks.workunit.client.1.vm08.stdout:0/806: stat d6/dd/d13/d17/d1f/d2d/l9a 0 2026-03-10T08:55:50.292 INFO:tasks.workunit.client.0.vm05.stdout:0/640: dread f5 [0,4194304] 0 2026-03-10T08:55:50.294 INFO:tasks.workunit.client.1.vm08.stdout:0/807: write d6/dd/d13/d17/d1f/d20/d2f/f59 [4255406,93117] 0 2026-03-10T08:55:50.303 INFO:tasks.workunit.client.0.vm05.stdout:8/648: creat d2/dd/d2c/d2e/d31/d3e/fe3 x:0 0 0 2026-03-10T08:55:50.303 INFO:tasks.workunit.client.0.vm05.stdout:8/649: chown d2/dd/f3f 15121034 1 2026-03-10T08:55:50.310 INFO:tasks.workunit.client.0.vm05.stdout:0/641: dread df/d1f/d85/d19/d5b/f72 [0,4194304] 0 2026-03-10T08:55:50.312 INFO:tasks.workunit.client.1.vm08.stdout:0/808: sync 2026-03-10T08:55:50.328 INFO:tasks.workunit.client.0.vm05.stdout:8/650: mknod d2/dd/d2c/d2e/d31/d4f/d7b/ce4 0 2026-03-10T08:55:50.336 INFO:tasks.workunit.client.1.vm08.stdout:9/807: dwrite d2/dd/d15/f17 [0,4194304] 0 2026-03-10T08:55:50.338 INFO:tasks.workunit.client.1.vm08.stdout:9/808: stat d2/dd/d15/d1e/d25/d32/d5c/d7b/c83 0 2026-03-10T08:55:50.342 
INFO:tasks.workunit.client.0.vm05.stdout:2/575: dwrite d0/d9/d7f/d8f/f66 [0,4194304] 0 2026-03-10T08:55:50.353 INFO:tasks.workunit.client.1.vm08.stdout:9/809: unlink d2/fb 0 2026-03-10T08:55:50.354 INFO:tasks.workunit.client.1.vm08.stdout:0/809: symlink d6/dd/d13/d17/d1f/d2d/l112 0 2026-03-10T08:55:50.356 INFO:tasks.workunit.client.0.vm05.stdout:0/642: creat df/d1f/d48/fba x:0 0 0 2026-03-10T08:55:50.356 INFO:tasks.workunit.client.0.vm05.stdout:8/651: dread - d2/dd/d2c/d2e/fbb zero size 2026-03-10T08:55:50.370 INFO:tasks.workunit.client.1.vm08.stdout:9/810: mknod d2/dd/d15/d1e/d25/dae/c10e 0 2026-03-10T08:55:50.372 INFO:tasks.workunit.client.1.vm08.stdout:0/810: mknod d6/dd/d13/d17/d1f/d2d/d38/d98/c113 0 2026-03-10T08:55:50.373 INFO:tasks.workunit.client.1.vm08.stdout:0/811: write d6/dd/d13/d8f/fbb [497196,73143] 0 2026-03-10T08:55:50.374 INFO:tasks.workunit.client.1.vm08.stdout:0/812: write d6/dd/d13/d61/dc7/dc8/dde/ff2 [65252,86785] 0 2026-03-10T08:55:50.390 INFO:tasks.workunit.client.0.vm05.stdout:2/576: mkdir d0/d55/da2 0 2026-03-10T08:55:50.391 INFO:tasks.workunit.client.0.vm05.stdout:2/577: readlink d0/d9/l13 0 2026-03-10T08:55:50.391 INFO:tasks.workunit.client.0.vm05.stdout:8/652: rename d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/ld4 to d2/db/d28/d99/le5 0 2026-03-10T08:55:50.393 INFO:tasks.workunit.client.1.vm08.stdout:9/811: fsync d2/d41/d4c/d66/d82/fa8 0 2026-03-10T08:55:50.397 INFO:tasks.workunit.client.0.vm05.stdout:6/664: write d4/d7/ff [884417,93734] 0 2026-03-10T08:55:50.399 INFO:tasks.workunit.client.0.vm05.stdout:9/606: dwrite d6/f16 [4194304,4194304] 0 2026-03-10T08:55:50.405 INFO:tasks.workunit.client.0.vm05.stdout:8/653: sync 2026-03-10T08:55:50.410 INFO:tasks.workunit.client.0.vm05.stdout:8/654: dwrite d2/dd/d2c/d2e/d31/d3e/f73 [4194304,4194304] 0 2026-03-10T08:55:50.413 INFO:tasks.workunit.client.1.vm08.stdout:7/850: truncate d0/d11/d1f/d29/d36/fb4 1625281 0 2026-03-10T08:55:50.425 INFO:tasks.workunit.client.1.vm08.stdout:8/880: dread 
d1/d10/d9/dd/d13/fa4 [0,4194304] 0 2026-03-10T08:55:50.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.426+0000 7f6e51b74700 1 -- 192.168.123.105:0/4168657999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c072b50 msgr2=0x7f6e4c072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.426+0000 7f6e51b74700 1 --2- 192.168.123.105:0/4168657999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c072b50 0x7f6e4c072f70 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c00bc70 tx=0x7f6e3c00bf80 comp rx=0 tx=0).stop 2026-03-10T08:55:50.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.428+0000 7f6e51b74700 1 -- 192.168.123.105:0/4168657999 shutdown_connections 2026-03-10T08:55:50.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.428+0000 7f6e51b74700 1 --2- 192.168.123.105:0/4168657999 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c075a40 0x7f6e4c077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.428+0000 7f6e51b74700 1 --2- 192.168.123.105:0/4168657999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c072b50 0x7f6e4c072f70 unknown :-1 s=CLOSED pgs=328 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.428+0000 7f6e51b74700 1 -- 192.168.123.105:0/4168657999 >> 192.168.123.105:0/4168657999 conn(0x7f6e4c06dae0 msgr2=0x7f6e4c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:50.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.428+0000 7f6e51b74700 1 -- 192.168.123.105:0/4168657999 shutdown_connections 2026-03-10T08:55:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.428+0000 7f6e51b74700 
1 -- 192.168.123.105:0/4168657999 wait complete. 2026-03-10T08:55:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.429+0000 7f6e51b74700 1 Processor -- start 2026-03-10T08:55:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.429+0000 7f6e51b74700 1 -- start start 2026-03-10T08:55:50.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.429+0000 7f6e51b74700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c075a40 0x7f6e4c083100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.429+0000 7f6e51b74700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 0x7f6e4c12e400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.429+0000 7f6e51b74700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e4c083b80 con 0x7f6e4c083640 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.429+0000 7f6e51b74700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e4c083cf0 con 0x7f6e4c075a40 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 0x7f6e4c12e400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 0x7f6e4c12e400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello 
peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36800/0 (socket says 192.168.123.105:36800) 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4affd700 1 -- 192.168.123.105:0/2552244674 learned_addr learned my addr 192.168.123.105:0/2552244674 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4b7fe700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c075a40 0x7f6e4c083100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4affd700 1 -- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c075a40 msgr2=0x7f6e4c083100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4affd700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c075a40 0x7f6e4c083100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.430+0000 7f6e4affd700 1 -- 192.168.123.105:0/2552244674 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e3c00b920 con 0x7f6e4c083640 2026-03-10T08:55:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.433+0000 7f6e4affd700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 0x7f6e4c12e400 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f6e440060b0 tx=0x7f6e4400b6e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:50.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.434+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e44004bb0 con 0x7f6e4c083640 2026-03-10T08:55:50.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.434+0000 7f6e51b74700 1 -- 192.168.123.105:0/2552244674 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e4c12ea60 con 0x7f6e4c083640 2026-03-10T08:55:50.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.434+0000 7f6e51b74700 1 -- 192.168.123.105:0/2552244674 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e4c12ef60 con 0x7f6e4c083640 2026-03-10T08:55:50.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.435+0000 7f6e51b74700 1 -- 192.168.123.105:0/2552244674 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e4c04ea90 con 0x7f6e4c083640 2026-03-10T08:55:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.435+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6e4400fb40 con 0x7f6e4c083640 2026-03-10T08:55:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.436+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e4400ecb0 con 0x7f6e4c083640 2026-03-10T08:55:50.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.437+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 21) v1 ==== 50027+0+0 (secure 0 0 0) 0x7f6e4400e470 con 0x7f6e4c083640 2026-03-10T08:55:50.438 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.437+0000 7f6e48ff9700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f6e3403db20 0x7f6e3403ffe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.437+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6e44013070 con 0x7f6e4c083640 2026-03-10T08:55:50.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.437+0000 7f6e4b7fe700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f6e3403db20 0x7f6e3403ffe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.438+0000 7f6e4b7fe700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f6e3403db20 0x7f6e3403ffe0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c004800 tx=0x7f6e3c004790 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:50.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.442+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f6e4401f390 con 0x7f6e4c083640 2026-03-10T08:55:50.456 INFO:tasks.workunit.client.0.vm05.stdout:2/578: mkdir d0/d9/d89/da3 0 2026-03-10T08:55:50.476 INFO:tasks.workunit.client.1.vm08.stdout:4/873: truncate d5/d23/d36/d99/db2/d5a/d69/d11b/f50 2392847 0 2026-03-10T08:55:50.483 
INFO:tasks.workunit.client.1.vm08.stdout:7/851: mkdir d0/d11/d1f/d2c/d111 0 2026-03-10T08:55:50.483 INFO:tasks.workunit.client.1.vm08.stdout:4/874: chown d5/df5 59002 1 2026-03-10T08:55:50.509 INFO:tasks.workunit.client.0.vm05.stdout:2/579: read - d0/d9/d7f/d8f/d6d/f7b zero size 2026-03-10T08:55:50.516 INFO:tasks.workunit.client.1.vm08.stdout:9/812: unlink d2/d54/d8e/da6/lc9 0 2026-03-10T08:55:50.550 INFO:tasks.workunit.client.0.vm05.stdout:4/644: write d0/f10 [4441307,113753] 0 2026-03-10T08:55:50.584 INFO:tasks.workunit.client.1.vm08.stdout:4/875: fdatasync d5/d23/d49/f123 0 2026-03-10T08:55:50.601 INFO:tasks.workunit.client.0.vm05.stdout:9/607: fsync d6/f7 0 2026-03-10T08:55:50.601 INFO:tasks.workunit.client.0.vm05.stdout:9/608: chown d6/d19/d2c/f61 1621501 1 2026-03-10T08:55:50.625 INFO:tasks.workunit.client.0.vm05.stdout:2/580: creat d0/d9/d7f/d8f/d7e/fa4 x:0 0 0 2026-03-10T08:55:50.660 INFO:tasks.workunit.client.0.vm05.stdout:0/643: link df/d1f/d85/d19/d47/d84/dae/db3/cb6 df/cbb 0 2026-03-10T08:55:50.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.691+0000 7f6e51b74700 1 -- 192.168.123.105:0/2552244674 --> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6e4c077760 con 0x7f6e3403db20 2026-03-10T08:55:50.694 INFO:tasks.workunit.client.0.vm05.stdout:4/645: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/l7f 29702258 1 2026-03-10T08:55:50.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.694+0000 7f6e48ff9700 1 -- 192.168.123.105:0/2552244674 <== mgr.24459 v2:192.168.123.108:6828/865080403 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7f6e4c077760 con 0x7f6e3403db20 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 -- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] 
conn(0x7f6e3403db20 msgr2=0x7f6e3403ffe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f6e3403db20 0x7f6e3403ffe0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c004800 tx=0x7f6e3c004790 comp rx=0 tx=0).stop 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 -- 192.168.123.105:0/2552244674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 msgr2=0x7f6e4c12e400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 0x7f6e4c12e400 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f6e440060b0 tx=0x7f6e4400b6e0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 -- 192.168.123.105:0/2552244674 shutdown_connections 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f6e3403db20 0x7f6e3403ffe0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c075a40 0x7f6e4c083100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.700 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 --2- 192.168.123.105:0/2552244674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c083640 0x7f6e4c12e400 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.699+0000 7f6e327fc700 1 -- 192.168.123.105:0/2552244674 >> 192.168.123.105:0/2552244674 conn(0x7f6e4c06dae0 msgr2=0x7f6e4c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.700+0000 7f6e327fc700 1 -- 192.168.123.105:0/2552244674 shutdown_connections 2026-03-10T08:55:50.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.700+0000 7f6e327fc700 1 -- 192.168.123.105:0/2552244674 wait complete. 2026-03-10T08:55:50.700 INFO:tasks.workunit.client.0.vm05.stdout:4/646: truncate d0/d78/fbc 842979 0 2026-03-10T08:55:50.731 INFO:tasks.workunit.client.1.vm08.stdout:5/753: write d0/d11/d27/f64 [140514,103323] 0 2026-03-10T08:55:50.747 INFO:tasks.workunit.client.0.vm05.stdout:1/686: dwrite dd/d10/d19/d4d/fc4 [0,4194304] 0 2026-03-10T08:55:50.794 INFO:tasks.workunit.client.1.vm08.stdout:0/813: rename d6/dd/d13/d17/d1f/d20/c60 to d6/dd/c114 0 2026-03-10T08:55:50.797 INFO:tasks.workunit.client.1.vm08.stdout:3/805: dwrite d4/d6f/d85/f87 [0,4194304] 0 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: Migrating agent root cert to cert store 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: Migrating agent root key to cert store 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: 
from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: Checking for cert/key for grafana.vm05 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: Migrating grafana.vm05 cert to cert store 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: Migrating grafana.vm05 key to cert store 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/mirror_snapshot_schedule"}]: dispatch 
2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: mgrmap e21: vm08.rpongu(active, since 1.32214s) 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/trash_purge_schedule"}]: dispatch 2026-03-10T08:55:50.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:50 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/trash_purge_schedule"}]: dispatch 2026-03-10T08:55:50.813 INFO:tasks.workunit.client.0.vm05.stdout:3/719: dwrite d9/d4d/d51/f59 [0,4194304] 0 2026-03-10T08:55:50.823 INFO:tasks.workunit.client.1.vm08.stdout:9/813: creat d2/dd/d15/d1e/d25/d32/d5c/de5/f10f x:0 0 0 2026-03-10T08:55:50.829 INFO:tasks.workunit.client.1.vm08.stdout:6/844: write d9/d50/fb8 [2098790,87449] 0 2026-03-10T08:55:50.843 INFO:tasks.workunit.client.0.vm05.stdout:7/603: getdents d18/d66/d25/d2e 0 2026-03-10T08:55:50.851 INFO:tasks.workunit.client.0.vm05.stdout:4/647: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/fcf x:0 0 0 2026-03-10T08:55:50.870 INFO:tasks.workunit.client.0.vm05.stdout:4/648: sync 2026-03-10T08:55:50.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.870+0000 7f080757a700 1 -- 192.168.123.105:0/1585586604 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800072b50 msgr2=0x7f0800072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.870+0000 7f080757a700 1 --2- 192.168.123.105:0/1585586604 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800072b50 0x7f0800072f70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f07fc007780 tx=0x7f07fc00c050 comp rx=0 tx=0).stop 
2026-03-10T08:55:50.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.870+0000 7f080757a700 1 -- 192.168.123.105:0/1585586604 shutdown_connections 2026-03-10T08:55:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.870+0000 7f080757a700 1 --2- 192.168.123.105:0/1585586604 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0800075a40 0x7f0800077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.870+0000 7f080757a700 1 --2- 192.168.123.105:0/1585586604 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800072b50 0x7f0800072f70 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.870+0000 7f080757a700 1 -- 192.168.123.105:0/1585586604 >> 192.168.123.105:0/1585586604 conn(0x7f080006dae0 msgr2=0x7f080006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.871+0000 7f080757a700 1 -- 192.168.123.105:0/1585586604 shutdown_connections 2026-03-10T08:55:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.871+0000 7f080757a700 1 -- 192.168.123.105:0/1585586604 wait complete. 
2026-03-10T08:55:50.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.871+0000 7f080757a700 1 Processor -- start 2026-03-10T08:55:50.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.871+0000 7f080757a700 1 -- start start 2026-03-10T08:55:50.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f080757a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0800075a40 0x7f0800083050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f080757a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 0x7f08001b30b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f080757a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0800083ad0 con 0x7f0800075a40 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f080757a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0800083c40 con 0x7f0800083590 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f0804b15700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 0x7f08001b30b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f0804b15700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 0x7f08001b30b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:49316/0 (socket says 192.168.123.105:49316) 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f0804b15700 1 -- 192.168.123.105:0/3160859751 learned_addr learned my addr 192.168.123.105:0/3160859751 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.872+0000 7f0805316700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0800075a40 0x7f0800083050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.873+0000 7f0804b15700 1 -- 192.168.123.105:0/3160859751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0800075a40 msgr2=0x7f0800083050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.873+0000 7f0804b15700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0800075a40 0x7f0800083050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.873+0000 7f0804b15700 1 -- 192.168.123.105:0/3160859751 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f07fc007430 con 0x7f0800083590 2026-03-10T08:55:50.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.873+0000 7f0804b15700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 0x7f08001b30b0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f07f800c390 tx=0x7f07f800c6a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:55:50.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.874+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07f800e030 con 0x7f0800083590 2026-03-10T08:55:50.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.875+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f08001b36b0 con 0x7f0800083590 2026-03-10T08:55:50.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.875+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f08001b3bb0 con 0x7f0800083590 2026-03-10T08:55:50.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.876+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f07f800f040 con 0x7f0800083590 2026-03-10T08:55:50.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.877+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07f8014650 con 0x7f0800083590 2026-03-10T08:55:50.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.879+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 21) v1 ==== 50027+0+0 (secure 0 0 0) 0x7f07f8014870 con 0x7f0800083590 2026-03-10T08:55:50.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.879+0000 7f07f67fc700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f07ec03dbb0 0x7f07ec040070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:50.879 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.879+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f07f8053cb0 con 0x7f0800083590 2026-03-10T08:55:50.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.879+0000 7f0805316700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f07ec03dbb0 0x7f07ec040070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:50.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.880+0000 7f0805316700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f07ec03dbb0 0x7f07ec040070 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f07fc00c420 tx=0x7f07fc0058e0 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:50.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.880+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f07e4005320 con 0x7f0800083590 2026-03-10T08:55:50.883 INFO:tasks.workunit.client.0.vm05.stdout:1/687: symlink dd/d21/d37/d45/d8d/lf8 0 2026-03-10T08:55:50.887 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:50.885+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f07f8014bc0 con 0x7f0800083590 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: Migrating agent root cert to cert store 2026-03-10T08:55:50.887 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: Migrating agent root key to cert store 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: Checking for cert/key for grafana.vm05 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: Migrating grafana.vm05 cert to cert store 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: Migrating grafana.vm05 key to cert store 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' 
entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: mgrmap e21: vm08.rpongu(active, since 1.32214s) 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/trash_purge_schedule"}]: dispatch 2026-03-10T08:55:50.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:50 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.rpongu/trash_purge_schedule"}]: dispatch 2026-03-10T08:55:50.887 INFO:tasks.workunit.client.0.vm05.stdout:1/688: dwrite dd/d21/d37/d45/f47 [0,4194304] 0 2026-03-10T08:55:50.894 INFO:tasks.workunit.client.1.vm08.stdout:0/814: symlink d6/dd/d13/d61/l115 0 2026-03-10T08:55:50.901 INFO:tasks.workunit.client.0.vm05.stdout:6/665: rename d4/d2c/dc8/cd9 to d4/d2d/d51/cdf 0 2026-03-10T08:55:50.901 INFO:tasks.workunit.client.0.vm05.stdout:6/666: chown d4/d7/f14 28517 1 2026-03-10T08:55:50.939 INFO:tasks.workunit.client.0.vm05.stdout:2/581: creat d0/d55/da2/fa5 x:0 0 0 2026-03-10T08:55:50.951 INFO:tasks.workunit.client.1.vm08.stdout:9/814: creat d2/dd/d61/f110 x:0 0 0 2026-03-10T08:55:50.965 INFO:tasks.workunit.client.1.vm08.stdout:9/815: sync 2026-03-10T08:55:50.969 INFO:tasks.workunit.client.1.vm08.stdout:0/815: read - d6/fe1 zero size 2026-03-10T08:55:50.982 INFO:tasks.workunit.client.0.vm05.stdout:4/649: mkdir 
d0/d2c/d6a/dd0 0 2026-03-10T08:55:50.983 INFO:tasks.workunit.client.1.vm08.stdout:6/845: symlink d9/dc/d11/d23/d2c/d81/d63/l114 0 2026-03-10T08:55:50.992 INFO:tasks.workunit.client.1.vm08.stdout:9/816: mknod d2/dd/d15/d1e/d39/d69/de4/df2/c111 0 2026-03-10T08:55:50.996 INFO:tasks.workunit.client.1.vm08.stdout:0/816: truncate d6/dd/d13/d17/d1f/d20/d2f/d24/fa8 881774 0 2026-03-10T08:55:51.011 INFO:tasks.workunit.client.1.vm08.stdout:9/817: creat d2/d54/d8e/db7/f112 x:0 0 0 2026-03-10T08:55:51.036 INFO:tasks.workunit.client.1.vm08.stdout:6/846: symlink d9/dc/d11/d23/d2c/d81/l115 0 2026-03-10T08:55:51.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.042+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 --> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f07e4000bf0 con 0x7f07ec03dbb0 2026-03-10T08:55:51.048 INFO:tasks.workunit.client.1.vm08.stdout:2/923: dwrite d1/da/d10/d42/d93/d1e/d7b/fb9 [0,4194304] 0 2026-03-10T08:55:51.050 INFO:tasks.workunit.client.1.vm08.stdout:9/818: creat d2/dd/d15/d1e/d25/d32/d79/f113 x:0 0 0 2026-03-10T08:55:51.050 INFO:tasks.workunit.client.1.vm08.stdout:9/819: stat d2/dd/d15/d1e/d25/d32/f8c 0 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (4m) 2m ago 4m 21.4M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (5m) 2m ago 5m 8032k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (4m) 8s ago 4m 8522k - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (5m) 2m ago 5m 7407k - 18.2.1 
5be31c24972a f9c585addcea 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (4m) 8s ago 4m 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (4m) 2m ago 4m 80.8M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (2m) 2m ago 2m 16.7M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (2m) 2m ago 2m 13.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (2m) 8s ago 2m 18.1M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (2m) 8s ago 2m 14.4M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:9283,8765,8443 running (5m) 2m ago 5m 501M - 18.2.1 5be31c24972a 6ec0cdb38171 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (11s) 8s ago 4m 63.1M - 19.2.3-678-ge911bdeb 654f31e6858e d0749942e44d 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 2m ago 5m 50.0M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 8s ago 4m 53.1M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (5m) 2m ago 5m 12.3M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (4m) 8s ago 4m 15.5M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 
vm05 running (4m) 2m ago 4m 48.5M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (3m) 2m ago 3m 46.9M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (3m) 2m ago 3m 48.1M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (3m) 8s ago 3m 395M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (3m) 8s ago 3m 371M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:55:51.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (3m) 8s ago 3m 341M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (4m) 2m ago 4m 36.9M - 2.43.0 a07b618ecd1d e84b76e5c1c0 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.062+0000 7f07f67fc700 1 -- 192.168.123.105:0/3160859751 <== mgr.24459 v2:192.168.123.108:6828/865080403 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f07e4000bf0 con 0x7f07ec03dbb0 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f07ec03dbb0 msgr2=0x7f07ec040070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f07ec03dbb0 0x7f07ec040070 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f07fc00c420 tx=0x7f07fc0058e0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.066 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 msgr2=0x7f08001b30b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 0x7f08001b30b0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f07f800c390 tx=0x7f07f800c6a0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 shutdown_connections 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f07ec03dbb0 0x7f07ec040070 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0800075a40 0x7f0800083050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 --2- 192.168.123.105:0/3160859751 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0800083590 0x7f08001b30b0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 >> 192.168.123.105:0/3160859751 conn(0x7f080006dae0 msgr2=0x7f080006ff40 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T08:55:51.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 shutdown_connections 2026-03-10T08:55:51.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.066+0000 7f080757a700 1 -- 192.168.123.105:0/3160859751 wait complete. 2026-03-10T08:55:51.075 INFO:tasks.workunit.client.1.vm08.stdout:0/817: symlink d6/dd/d13/d17/l116 0 2026-03-10T08:55:51.075 INFO:tasks.workunit.client.1.vm08.stdout:0/818: chown d6/dd/d13/d17/d1f/d2d/fb2 299253397 1 2026-03-10T08:55:51.076 INFO:tasks.workunit.client.1.vm08.stdout:0/819: write d6/dd/d13/d17/d1f/d2d/d85/d95/ff5 [94284,6609] 0 2026-03-10T08:55:51.096 INFO:tasks.workunit.client.0.vm05.stdout:5/575: dwrite d5/d86/d66/f7b [0,4194304] 0 2026-03-10T08:55:51.110 INFO:tasks.workunit.client.1.vm08.stdout:2/924: fdatasync d1/da/d10/d42/d93/d22/fbc 0 2026-03-10T08:55:51.113 INFO:tasks.workunit.client.1.vm08.stdout:1/846: dwrite d1/da/d18/d3b/d62/fd9 [0,4194304] 0 2026-03-10T08:55:51.136 INFO:tasks.workunit.client.1.vm08.stdout:9/820: mkdir d2/dd/d15/d1e/d39/d4e/d114 0 2026-03-10T08:55:51.148 INFO:tasks.workunit.client.1.vm08.stdout:0/820: mknod d6/dd/d13/d61/dc7/dc8/c117 0 2026-03-10T08:55:51.168 INFO:tasks.workunit.client.0.vm05.stdout:8/655: rmdir d2/dd/d2c/d2e 39 2026-03-10T08:55:51.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.170+0000 7fe66425d700 1 -- 192.168.123.105:0/2259463057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c075a40 msgr2=0x7fe65c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.170+0000 7fe66425d700 1 --2- 192.168.123.105:0/2259463057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c075a40 0x7fe65c077ed0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fe654009230 tx=0x7fe654009260 comp rx=0 tx=0).stop 2026-03-10T08:55:51.170 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.170+0000 7fe66425d700 1 -- 192.168.123.105:0/2259463057 shutdown_connections 2026-03-10T08:55:51.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.170+0000 7fe66425d700 1 --2- 192.168.123.105:0/2259463057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c075a40 0x7fe65c077ed0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.170+0000 7fe66425d700 1 --2- 192.168.123.105:0/2259463057 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe65c072b50 0x7fe65c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.170+0000 7fe66425d700 1 -- 192.168.123.105:0/2259463057 >> 192.168.123.105:0/2259463057 conn(0x7fe65c06dae0 msgr2=0x7fe65c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:51.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 -- 192.168.123.105:0/2259463057 shutdown_connections 2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 -- 192.168.123.105:0/2259463057 wait complete. 
2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 Processor -- start 2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 -- start start 2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 0x7fe65c083090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe65c0835d0 0x7fe65c1b3120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe65c083ae0 con 0x7fe65c0835d0 2026-03-10T08:55:51.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.171+0000 7fe66425d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe65c083c50 con 0x7fe65c072b50 2026-03-10T08:55:51.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe661ff9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 0x7fe65c083090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe661ff9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 0x7fe65c083090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:49330/0 (socket says 192.168.123.105:49330) 2026-03-10T08:55:51.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe661ff9700 1 -- 192.168.123.105:0/1896276321 learned_addr learned my addr 192.168.123.105:0/1896276321 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:51.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe6617f8700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe65c0835d0 0x7fe65c1b3120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe661ff9700 1 -- 192.168.123.105:0/1896276321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe65c0835d0 msgr2=0x7fe65c1b3120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe661ff9700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe65c0835d0 0x7fe65c1b3120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.172+0000 7fe661ff9700 1 -- 192.168.123.105:0/1896276321 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe654008ee0 con 0x7fe65c072b50 2026-03-10T08:55:51.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.173+0000 7fe661ff9700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 0x7fe65c083090 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fe658013f80 tx=0x7fe65800e450 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:55:51.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.173+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe658010040 con 0x7fe65c072b50 2026-03-10T08:55:51.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.173+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe65c1b36c0 con 0x7fe65c072b50 2026-03-10T08:55:51.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.173+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe65c1b3c10 con 0x7fe65c072b50 2026-03-10T08:55:51.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.173+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe65801cd30 con 0x7fe65c072b50 2026-03-10T08:55:51.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.173+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe658009a20 con 0x7fe65c072b50 2026-03-10T08:55:51.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.174+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 21) v1 ==== 50027+0+0 (secure 0 0 0) 0x7fe658009c60 con 0x7fe65c072b50 2026-03-10T08:55:51.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.174+0000 7fe650ff9700 1 -- 192.168.123.105:0/1896276321 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe640005320 con 0x7fe65c072b50 2026-03-10T08:55:51.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.175+0000 
7fe652ffd700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fe64803dad0 0x7fe64803ff90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.175+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fe65805af30 con 0x7fe65c072b50 2026-03-10T08:55:51.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.180+0000 7fe6617f8700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fe64803dad0 0x7fe64803ff90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.181+0000 7fe6617f8700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fe64803dad0 0x7fe64803ff90 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe654009200 tx=0x7fe65400c920 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:51.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.182+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fe65800ed50 con 0x7fe65c072b50 2026-03-10T08:55:51.202 INFO:tasks.workunit.client.1.vm08.stdout:1/847: rmdir d1/da/d20/d91 39 2026-03-10T08:55:51.216 INFO:tasks.workunit.client.1.vm08.stdout:8/881: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d5f/fd4 [0,4194304] 0 2026-03-10T08:55:51.229 INFO:tasks.workunit.client.1.vm08.stdout:4/876: write d5/d23/d49/f94 [852756,89721] 0 
2026-03-10T08:55:51.231 INFO:tasks.workunit.client.1.vm08.stdout:7/852: dwrite d0/d11/d4a/f87 [4194304,4194304] 0 2026-03-10T08:55:51.237 INFO:tasks.workunit.client.1.vm08.stdout:4/877: dwrite d5/d23/d36/d99/db2/d5d/f129 [0,4194304] 0 2026-03-10T08:55:51.241 INFO:tasks.workunit.client.1.vm08.stdout:5/754: dwrite d0/d46/f51 [4194304,4194304] 0 2026-03-10T08:55:51.269 INFO:tasks.workunit.client.0.vm05.stdout:6/667: mknod d4/d8d/ce0 0 2026-03-10T08:55:51.270 INFO:tasks.workunit.client.0.vm05.stdout:6/668: chown d4/d7/d10/d15/d1b/c1d 0 1 2026-03-10T08:55:51.272 INFO:tasks.workunit.client.0.vm05.stdout:6/669: dread d4/d7/dc4/fca [0,4194304] 0 2026-03-10T08:55:51.272 INFO:tasks.workunit.client.0.vm05.stdout:6/670: fsync d4/d7/d10/d1a/d1f/fce 0 2026-03-10T08:55:51.273 INFO:tasks.workunit.client.1.vm08.stdout:3/806: write d4/d15/d8/d2c/d9b/d79/d20/fe5 [929735,7611] 0 2026-03-10T08:55:51.273 INFO:tasks.workunit.client.0.vm05.stdout:6/671: write d4/d7/d10/d1a/d89/fd2 [164053,74599] 0 2026-03-10T08:55:51.273 INFO:tasks.workunit.client.1.vm08.stdout:3/807: fsync d4/d15/d8/d1d/fff 0 2026-03-10T08:55:51.274 INFO:tasks.workunit.client.0.vm05.stdout:9/609: creat d6/d27/fcc x:0 0 0 2026-03-10T08:55:51.274 INFO:tasks.workunit.client.1.vm08.stdout:3/808: chown d4/d15/d8/d2c/fd2 1310 1 2026-03-10T08:55:51.277 INFO:tasks.workunit.client.0.vm05.stdout:9/610: dwrite d6/d12/d3a/fa9 [0,4194304] 0 2026-03-10T08:55:51.285 INFO:tasks.workunit.client.0.vm05.stdout:2/582: dwrite d0/d9/d1e/d20/f47 [0,4194304] 0 2026-03-10T08:55:51.311 INFO:tasks.workunit.client.0.vm05.stdout:7/604: dwrite d18/d66/d25/f47 [0,4194304] 0 2026-03-10T08:55:51.344 INFO:tasks.workunit.client.1.vm08.stdout:5/755: dread d0/d11/d27/d50/fa1 [0,4194304] 0 2026-03-10T08:55:51.345 INFO:tasks.workunit.client.1.vm08.stdout:5/756: write d0/d11/d27/f61 [758188,46770] 0 2026-03-10T08:55:51.354 INFO:tasks.workunit.client.0.vm05.stdout:5/576: rename d5/d86/d66 to d5/df/d37/dd2 0 2026-03-10T08:55:51.383 
INFO:tasks.workunit.client.0.vm05.stdout:6/672: unlink d4/d2c/l7b 0 2026-03-10T08:55:51.402 INFO:tasks.workunit.client.0.vm05.stdout:3/720: creat d9/d8f/fdc x:0 0 0 2026-03-10T08:55:51.411 INFO:tasks.workunit.client.0.vm05.stdout:7/605: mknod d18/d66/d25/d2e/cbd 0 2026-03-10T08:55:51.412 INFO:tasks.workunit.client.0.vm05.stdout:7/606: chown d18/d38/d43/d6e/l89 478417570 1 2026-03-10T08:55:51.420 INFO:tasks.workunit.client.1.vm08.stdout:6/847: dwrite d9/d10/d1e/d32/f3a [0,4194304] 0 2026-03-10T08:55:51.447 INFO:tasks.workunit.client.0.vm05.stdout:5/577: creat d5/d86/d39/fd3 x:0 0 0 2026-03-10T08:55:51.449 INFO:tasks.workunit.client.1.vm08.stdout:2/925: dwrite d1/da/d10/d1b/d6a/fe0 [0,4194304] 0 2026-03-10T08:55:51.453 INFO:tasks.workunit.client.0.vm05.stdout:8/656: truncate d2/dd/d74/d78/fcf 47263 0 2026-03-10T08:55:51.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.486+0000 7fe650ff9700 1 -- 192.168.123.105:0/1896276321 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe640006200 con 0x7fe65c072b50 2026-03-10T08:55:51.495 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 
2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12, 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:55:51.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.491+0000 7fe652ffd700 1 -- 192.168.123.105:0/1896276321 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7fe6580122e0 con 0x7fe65c072b50 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fe64803dad0 msgr2=0x7fe64803ff90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fe64803dad0 0x7fe64803ff90 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe654009200 tx=0x7fe65400c920 comp rx=0 tx=0).stop 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 -- 
192.168.123.105:0/1896276321 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 msgr2=0x7fe65c083090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 0x7fe65c083090 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fe658013f80 tx=0x7fe65800e450 comp rx=0 tx=0).stop 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 shutdown_connections 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fe64803dad0 0x7fe64803ff90 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe65c072b50 0x7fe65c083090 secure :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fe658013f80 tx=0x7fe65800e450 comp rx=0 tx=0).stop 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 --2- 192.168.123.105:0/1896276321 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe65c0835d0 0x7fe65c1b3120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 >> 192.168.123.105:0/1896276321 conn(0x7fe65c06dae0 msgr2=0x7fe65c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:51.502 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.500+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 shutdown_connections 2026-03-10T08:55:51.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.502+0000 7fe66425d700 1 -- 192.168.123.105:0/1896276321 wait complete. 2026-03-10T08:55:51.520 INFO:tasks.workunit.client.0.vm05.stdout:7/607: rmdir d18/d38/d43/d5c/daf 39 2026-03-10T08:55:51.523 INFO:tasks.workunit.client.0.vm05.stdout:0/644: link df/d1f/d85/d2b/d65/d6e/d96/l5f df/d59/lbc 0 2026-03-10T08:55:51.533 INFO:tasks.workunit.client.0.vm05.stdout:8/657: mkdir d2/dd/de6 0 2026-03-10T08:55:51.556 INFO:tasks.workunit.client.1.vm08.stdout:9/821: creat d2/d54/d8e/da6/dd0/dc8/f115 x:0 0 0 2026-03-10T08:55:51.558 INFO:tasks.workunit.client.0.vm05.stdout:1/689: write dd/d21/d37/d45/d8d/f99 [673858,47772] 0 2026-03-10T08:55:51.561 INFO:tasks.workunit.client.0.vm05.stdout:1/690: chown dd/d10/d18/d20/d52/d80 2 1 2026-03-10T08:55:51.568 INFO:tasks.workunit.client.0.vm05.stdout:4/650: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/f3e [0,4194304] 0 2026-03-10T08:55:51.575 INFO:tasks.workunit.client.0.vm05.stdout:3/721: creat d9/d2b/d3a/dd6/fdd x:0 0 0 2026-03-10T08:55:51.588 INFO:tasks.workunit.client.1.vm08.stdout:1/848: creat d1/da/de/d24/d81/d121/f124 x:0 0 0 2026-03-10T08:55:51.593 INFO:tasks.workunit.client.1.vm08.stdout:8/882: creat d1/d10/d9/dd/d13/d40/f147 x:0 0 0 2026-03-10T08:55:51.598 INFO:tasks.workunit.client.0.vm05.stdout:8/658: fdatasync d2/db/d1f/d67/f94 0 2026-03-10T08:55:51.603 INFO:tasks.workunit.client.1.vm08.stdout:0/821: dread d6/dd/d13/d17/d1f/d20/d2f/d24/fab [0,4194304] 0 2026-03-10T08:55:51.610 INFO:tasks.workunit.client.0.vm05.stdout:9/611: write d6/d15/fb4 [2666888,29414] 0 2026-03-10T08:55:51.625 INFO:tasks.workunit.client.0.vm05.stdout:6/673: dwrite d4/d2d/d5f/f88 [0,4194304] 0 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 -- 192.168.123.105:0/2430677899 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc10a700 msgr2=0x7ff9dc10cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 --2- 192.168.123.105:0/2430677899 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc10a700 0x7ff9dc10cb90 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7ff9d0009b00 tx=0x7ff9d0009e10 comp rx=0 tx=0).stop 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 -- 192.168.123.105:0/2430677899 shutdown_connections 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 --2- 192.168.123.105:0/2430677899 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc10a700 0x7ff9dc10cb90 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 --2- 192.168.123.105:0/2430677899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9dc107d90 0x7ff9dc10a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 -- 192.168.123.105:0/2430677899 >> 192.168.123.105:0/2430677899 conn(0x7ff9dc06daa0 msgr2=0x7ff9dc06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 -- 192.168.123.105:0/2430677899 shutdown_connections 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.616+0000 7ff9db59e700 1 -- 192.168.123.105:0/2430677899 wait complete. 
2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9db59e700 1 Processor -- start 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9db59e700 1 -- start start 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9db59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 0x7ff9dc116cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9db59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9dc10a700 0x7ff9dc117210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9db59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9dc1177e0 con 0x7ff9dc10a700 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9db59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9dc117950 con 0x7ff9dc107d90 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9da59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 0x7ff9dc116cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9da59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 0x7ff9dc116cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:49352/0 (socket says 192.168.123.105:49352) 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.617+0000 7ff9da59c700 1 -- 192.168.123.105:0/961572153 learned_addr learned my addr 192.168.123.105:0/961572153 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.618+0000 7ff9da59c700 1 -- 192.168.123.105:0/961572153 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9dc10a700 msgr2=0x7ff9dc117210 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.618+0000 7ff9da59c700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9dc10a700 0x7ff9dc117210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.618+0000 7ff9da59c700 1 -- 192.168.123.105:0/961572153 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9d00097e0 con 0x7ff9dc107d90 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.618+0000 7ff9da59c700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 0x7ff9dc116cd0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7ff9cc00d900 tx=0x7ff9cc00dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.619+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9cc0049e0 con 0x7ff9dc107d90 2026-03-10T08:55:51.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.619+0000 7ff9db59e700 1 -- 
192.168.123.105:0/961572153 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9dc1b3520 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.619+0000 7ff9db59e700 1 -- 192.168.123.105:0/961572153 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9dc1b3a70 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.619+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff9cc005500 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.619+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9cc009de0 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.620+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 21) v1 ==== 50027+0+0 (secure 0 0 0) 0x7ff9cc010460 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.620+0000 7ff9cb7fe700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff9c403db10 0x7ff9c403ffd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.621+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7ff9cc053980 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.621+0000 7ff9db59e700 1 -- 192.168.123.105:0/961572153 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff9bc005320 con 0x7ff9dc107d90 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.624+0000 7ff9d9d9b700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff9c403db10 0x7ff9c403ffd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.625+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7ff9cc023d70 con 0x7ff9dc107d90 2026-03-10T08:55:51.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.628+0000 7ff9d9d9b700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff9c403db10 0x7ff9c403ffd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff9d000b5c0 tx=0x7ff9d0009f90 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:51.642 INFO:tasks.workunit.client.1.vm08.stdout:3/809: dwrite d4/d15/d8/d2c/d55/feb [8388608,4194304] 0 2026-03-10T08:55:51.652 INFO:tasks.workunit.client.0.vm05.stdout:2/583: getdents d0/d9/d7f/d8f/d6d 0 2026-03-10T08:55:51.652 INFO:tasks.workunit.client.0.vm05.stdout:3/722: dread - d9/d2b/d3a/d6c/fb2 zero size 2026-03-10T08:55:51.660 INFO:tasks.workunit.client.0.vm05.stdout:1/691: dread dd/d10/d19/d27/fc8 [0,4194304] 0 2026-03-10T08:55:51.671 INFO:tasks.workunit.client.0.vm05.stdout:8/659: rename d2/db/d28/f96 to d2/db/d28/d99/fe7 0 2026-03-10T08:55:51.682 INFO:tasks.workunit.client.0.vm05.stdout:9/612: symlink d6/d19/d2a/dbc/lcd 0 2026-03-10T08:55:51.683 
INFO:tasks.workunit.client.0.vm05.stdout:9/613: readlink d6/d19/d2c/d84/lbe 0 2026-03-10T08:55:51.699 INFO:tasks.workunit.client.1.vm08.stdout:1/849: truncate d1/da/de/d24/d26/f94 680963 0 2026-03-10T08:55:51.704 INFO:tasks.workunit.client.1.vm08.stdout:1/850: chown d1/da/fd6 0 1 2026-03-10T08:55:51.705 INFO:tasks.workunit.client.0.vm05.stdout:3/723: sync 2026-03-10T08:55:51.706 INFO:tasks.workunit.client.0.vm05.stdout:3/724: dread - d9/d4d/d51/d64/d89/dc2/fcd zero size 2026-03-10T08:55:51.711 INFO:tasks.workunit.client.1.vm08.stdout:1/851: chown d1/da/de/d24/d3d/d40/d8e/dd2/ffd 128620 1 2026-03-10T08:55:51.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.722+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 22) v1 ==== 50225+0+0 (secure 0 0 0) 0x7ff9cc0107b0 con 0x7ff9dc107d90 2026-03-10T08:55:51.723 INFO:tasks.workunit.client.1.vm08.stdout:1/852: dwrite d1/da/de/d24/d35/d6d/f123 [0,4194304] 0 2026-03-10T08:55:51.723 INFO:tasks.workunit.client.0.vm05.stdout:6/674: mkdir d4/d7/dc4/de1 0 2026-03-10T08:55:51.723 INFO:tasks.workunit.client.0.vm05.stdout:4/651: symlink d0/d2c/d6a/dd0/ld1 0 2026-03-10T08:55:51.723 INFO:tasks.workunit.client.0.vm05.stdout:0/645: link df/d1f/d85/f24 df/d1f/d85/d19/d5b/fbd 0 2026-03-10T08:55:51.724 INFO:tasks.workunit.client.0.vm05.stdout:1/692: rmdir dd/d21/d37/d7c/dc9 39 2026-03-10T08:55:51.725 INFO:tasks.workunit.client.0.vm05.stdout:6/675: readlink d4/d7/d10/d15/d1b/l73 0 2026-03-10T08:55:51.725 INFO:tasks.workunit.client.0.vm05.stdout:0/646: chown df/d1f/d85/d19/d62/lb8 0 1 2026-03-10T08:55:51.727 INFO:tasks.workunit.client.1.vm08.stdout:8/883: mknod d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/c148 0 2026-03-10T08:55:51.728 INFO:tasks.workunit.client.0.vm05.stdout:1/693: sync 2026-03-10T08:55:51.728 INFO:tasks.workunit.client.1.vm08.stdout:0/822: creat d6/dd/d13/d8f/f118 x:0 0 0 2026-03-10T08:55:51.738 INFO:tasks.workunit.client.1.vm08.stdout:3/810: creat d4/d6f/dca/f114 
x:0 0 0 2026-03-10T08:55:51.743 INFO:tasks.workunit.client.0.vm05.stdout:9/614: mknod d6/d12/d3a/da2/cce 0 2026-03-10T08:55:51.761 INFO:tasks.workunit.client.0.vm05.stdout:5/578: write d5/d86/f9d [899009,130051] 0 2026-03-10T08:55:51.795 INFO:tasks.workunit.client.1.vm08.stdout:2/926: fdatasync d1/da/d10/d42/d93/d1e/f90 0 2026-03-10T08:55:51.807 INFO:tasks.workunit.client.1.vm08.stdout:7/853: write d0/d11/d1f/d29/d3d/d40/fb0 [6500985,19710] 0 2026-03-10T08:55:51.819 INFO:tasks.workunit.client.1.vm08.stdout:4/878: write d5/d23/d36/d99/db2/ff7 [2186173,78375] 0 2026-03-10T08:55:51.822 INFO:tasks.workunit.client.0.vm05.stdout:7/608: dwrite d18/d66/d25/d2e/d42/f5a [0,4194304] 0 2026-03-10T08:55:51.823 INFO:tasks.workunit.client.0.vm05.stdout:7/609: dread - d18/d66/d25/fb9 zero size 2026-03-10T08:55:51.823 INFO:tasks.workunit.client.0.vm05.stdout:7/610: stat d18/cb7 0 2026-03-10T08:55:51.845 INFO:tasks.workunit.client.1.vm08.stdout:8/884: creat d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/f149 x:0 0 0 2026-03-10T08:55:51.845 INFO:tasks.workunit.client.0.vm05.stdout:6/676: creat d4/d2c/d84/fe2 x:0 0 0 2026-03-10T08:55:51.845 INFO:tasks.workunit.client.1.vm08.stdout:8/885: dread - d1/d4f/f138 zero size 2026-03-10T08:55:51.847 INFO:tasks.workunit.client.0.vm05.stdout:4/652: dread d0/d2e/d42/d45/f5f [0,4194304] 0 2026-03-10T08:55:51.853 INFO:tasks.workunit.client.1.vm08.stdout:0/823: symlink d6/dd/d13/d17/d1f/d2d/d38/d98/l119 0 2026-03-10T08:55:51.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.864+0000 7ff9db59e700 1 -- 192.168.123.105:0/961572153 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ff9bc006200 con 0x7ff9dc107d90 2026-03-10T08:55:51.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.866+0000 7ff9cb7fe700 1 -- 192.168.123.105:0/961572153 <== mon.1 v2:192.168.123.108:3300/0 8 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 
0) 0x7ff9cc0238c0 con 0x7ff9dc107d90 2026-03-10T08:55:51.866 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:55:51.866 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:55:51.867 
INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] 
compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:51.867 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.869+0000 7ff9c97fa700 1 -- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff9c403db10 msgr2=0x7ff9c403ffd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.869+0000 7ff9c97fa700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff9c403db10 0x7ff9c403ffd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff9d000b5c0 tx=0x7ff9d0009f90 comp rx=0 tx=0).stop 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.869+0000 7ff9c97fa700 1 -- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 msgr2=0x7ff9dc116cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.869+0000 7ff9c97fa700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 0x7ff9dc116cd0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7ff9cc00d900 tx=0x7ff9cc00dcc0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 -- 192.168.123.105:0/961572153 shutdown_connections 
2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7ff9c403db10 0x7ff9c403ffd0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9dc107d90 0x7ff9dc116cd0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 --2- 192.168.123.105:0/961572153 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9dc10a700 0x7ff9dc117210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 -- 192.168.123.105:0/961572153 >> 192.168.123.105:0/961572153 conn(0x7ff9dc06daa0 msgr2=0x7ff9dc06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 -- 192.168.123.105:0/961572153 shutdown_connections 2026-03-10T08:55:51.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.870+0000 7ff9c97fa700 1 -- 192.168.123.105:0/961572153 wait complete. 
2026-03-10T08:55:51.875 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:55:51.887 INFO:tasks.workunit.client.1.vm08.stdout:5/757: write d0/d11/d18/fc0 [2162978,29860] 0 2026-03-10T08:55:51.891 INFO:tasks.workunit.client.0.vm05.stdout:9/615: dread - d6/fb0 zero size 2026-03-10T08:55:51.892 INFO:tasks.workunit.client.0.vm05.stdout:9/616: chown d6/d19/d2a/f87 73694 1 2026-03-10T08:55:51.895 INFO:tasks.workunit.client.0.vm05.stdout:9/617: dwrite d6/d15/fb4 [0,4194304] 0 2026-03-10T08:55:51.905 INFO:tasks.workunit.client.0.vm05.stdout:2/584: truncate d0/d9/d1e/d20/d21/d45/d4b/f58 3043810 0 2026-03-10T08:55:51.907 INFO:tasks.workunit.client.1.vm08.stdout:6/848: link d9/dc/d11/d23/d2c/d41/l38 d9/d10/d1e/d7e/l116 0 2026-03-10T08:55:51.908 INFO:tasks.workunit.client.1.vm08.stdout:6/849: write d9/d10/f25 [3099847,105867] 0 2026-03-10T08:55:51.923 INFO:tasks.workunit.client.1.vm08.stdout:2/927: truncate d1/da/d78/df5/d11e/fe3 913492 0 2026-03-10T08:55:51.955 INFO:tasks.workunit.client.1.vm08.stdout:4/879: rename d5/d23/d36/f57 to d5/d23/d36/d99/db2/d5a/ddb/f139 0 2026-03-10T08:55:51.962 INFO:tasks.workunit.client.1.vm08.stdout:0/824: mknod d6/dd/d13/d17/d1f/d2d/d39/c11a 0 2026-03-10T08:55:51.963 INFO:tasks.workunit.client.1.vm08.stdout:0/825: chown d6/dd/d13/d61/dc7/fdf 1867 1 2026-03-10T08:55:51.963 INFO:tasks.workunit.client.0.vm05.stdout:6/677: dread d4/d2c/d84/f3a [0,4194304] 0 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.964+0000 7fa962497700 1 -- 192.168.123.105:0/3577286645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c072b50 msgr2=0x7fa95c072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.964+0000 7fa962497700 1 --2- 192.168.123.105:0/3577286645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c072b50 0x7fa95c072f70 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 
crypto rx=0x7fa94c009a60 tx=0x7fa94c009d70 comp rx=0 tx=0).stop 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.964+0000 7fa962497700 1 -- 192.168.123.105:0/3577286645 shutdown_connections 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.964+0000 7fa962497700 1 --2- 192.168.123.105:0/3577286645 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c075a40 0x7fa95c077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.964+0000 7fa962497700 1 --2- 192.168.123.105:0/3577286645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c072b50 0x7fa95c072f70 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.964+0000 7fa962497700 1 -- 192.168.123.105:0/3577286645 >> 192.168.123.105:0/3577286645 conn(0x7fa95c06dae0 msgr2=0x7fa95c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 -- 192.168.123.105:0/3577286645 shutdown_connections 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 -- 192.168.123.105:0/3577286645 wait complete. 
2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 Processor -- start 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 -- start start 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c075a40 0x7fa95c083180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c0836c0 0x7fa95c1b3240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa95c083b40 con 0x7fa95c0836c0 2026-03-10T08:55:51.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.965+0000 7fa962497700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa95c083cb0 con 0x7fa95c075a40 2026-03-10T08:55:51.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.967+0000 7fa95bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c075a40 0x7fa95c083180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.967+0000 7fa95bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c075a40 0x7fa95c083180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:49380/0 (socket says 192.168.123.105:49380) 2026-03-10T08:55:51.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.967+0000 7fa95bfff700 1 -- 192.168.123.105:0/2091475906 learned_addr learned my addr 192.168.123.105:0/2091475906 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:51.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.967+0000 7fa95b7fe700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c0836c0 0x7fa95c1b3240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:51.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa95b7fe700 1 -- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c075a40 msgr2=0x7fa95c083180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:51.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa95b7fe700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c075a40 0x7fa95c083180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:51.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa95b7fe700 1 -- 192.168.123.105:0/2091475906 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa94c009710 con 0x7fa95c0836c0 2026-03-10T08:55:51.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa95b7fe700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c0836c0 0x7fa95c1b3240 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7fa95400e7f0 tx=0x7fa95400eb00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:55:51.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa95400c7f0 con 0x7fa95c0836c0 2026-03-10T08:55:51.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa95400ce30 con 0x7fa95c0836c0 2026-03-10T08:55:51.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa95400f660 con 0x7fa95c0836c0 2026-03-10T08:55:51.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa962497700 1 -- 192.168.123.105:0/2091475906 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa95c1b7bf0 con 0x7fa95c0836c0 2026-03-10T08:55:51.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.968+0000 7fa962497700 1 -- 192.168.123.105:0/2091475906 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa95c1b80b0 con 0x7fa95c0836c0 2026-03-10T08:55:51.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.969+0000 7fa962497700 1 -- 192.168.123.105:0/2091475906 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa95c07d240 con 0x7fa95c0836c0 2026-03-10T08:55:51.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.970+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 22) v1 ==== 50225+0+0 (secure 0 0 0) 0x7fa954015070 con 0x7fa95c0836c0 2026-03-10T08:55:51.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.970+0000 
7fa9597fa700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fa944046590 0x7fa944048a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:51.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.970+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fa9540534a0 con 0x7fa95c0836c0 2026-03-10T08:55:51.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.973+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fa9540187a0 con 0x7fa95c0836c0 2026-03-10T08:55:51.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:51.985+0000 7fa95bfff700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fa944046590 0x7fa944048a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:52.000 INFO:tasks.workunit.client.0.vm05.stdout:5/579: dwrite d5/df/d37/f73 [0,4194304] 0 2026-03-10T08:55:52.000 INFO:tasks.workunit.client.0.vm05.stdout:3/725: dwrite f2 [0,4194304] 0 2026-03-10T08:55:52.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.002+0000 7fa95bfff700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fa944046590 0x7fa944048a50 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa94c009a30 tx=0x7fa94c019040 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:52.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:51 vm05.local ceph-mon[49713]: from='client.24465 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:52.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:51 vm05.local ceph-mon[49713]: pgmap v3: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T08:55:52.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:51 vm05.local ceph-mon[49713]: Deploying cephadm binary to vm08 2026-03-10T08:55:52.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:51 vm05.local ceph-mon[49713]: from='client.14702 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:52.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:51 vm05.local ceph-mon[49713]: pgmap v4: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T08:55:52.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:51 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/1896276321' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:55:52.009 INFO:tasks.workunit.client.1.vm08.stdout:6/850: unlink d9/dc/d11/d23/l4b 0 2026-03-10T08:55:52.010 INFO:tasks.workunit.client.1.vm08.stdout:6/851: chown d9/dc/d11/d23/d2c/d41/f56 2278815 1 2026-03-10T08:55:52.016 INFO:tasks.workunit.client.0.vm05.stdout:0/647: dwrite df/d1f/d85/d19/d47/fa5 [0,4194304] 0 2026-03-10T08:55:52.035 INFO:tasks.workunit.client.1.vm08.stdout:2/928: creat d1/d97/d11f/de7/f132 x:0 0 0 2026-03-10T08:55:52.044 INFO:tasks.workunit.client.1.vm08.stdout:7/854: creat d0/d11/d1f/d2c/d111/f112 x:0 0 0 2026-03-10T08:55:52.046 INFO:tasks.workunit.client.1.vm08.stdout:4/880: creat d5/d23/d36/d99/dc6/f13a x:0 0 0 2026-03-10T08:55:52.048 INFO:tasks.workunit.client.1.vm08.stdout:5/758: dwrite d0/d11/d27/f2a [4194304,4194304] 0 2026-03-10T08:55:52.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:51 vm08.local ceph-mon[57559]: from='client.24465 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:52.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:51 vm08.local ceph-mon[57559]: pgmap v3: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T08:55:52.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:51 vm08.local ceph-mon[57559]: Deploying cephadm binary to vm08 2026-03-10T08:55:52.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:51 vm08.local ceph-mon[57559]: from='client.14702 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:52.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:51 vm08.local ceph-mon[57559]: pgmap v4: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T08:55:52.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:51 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/1896276321' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:55:52.063 INFO:tasks.workunit.client.1.vm08.stdout:8/886: readlink d1/d10/d9/dd/d25/d27/d44/d21/dce/l101 0 2026-03-10T08:55:52.069 INFO:tasks.workunit.client.1.vm08.stdout:0/826: creat d6/dd/d13/d17/d1f/d2d/d39/f11b x:0 0 0 2026-03-10T08:55:52.093 INFO:tasks.workunit.client.1.vm08.stdout:6/852: rmdir d9/dc/d11/d23/d2c 39 2026-03-10T08:55:52.093 INFO:tasks.workunit.client.1.vm08.stdout:1/853: rename d1/da/de/d24/d3d/d40/ld3 to d1/da/de/d24/d3d/d40/l125 0 2026-03-10T08:55:52.093 INFO:tasks.workunit.client.1.vm08.stdout:2/929: rename d1/da/d10/d1b/d6a/f114 to d1/d97/f133 0 2026-03-10T08:55:52.093 INFO:tasks.workunit.client.1.vm08.stdout:6/853: dread d9/d10/d1e/f91 [0,4194304] 0 2026-03-10T08:55:52.116 INFO:tasks.workunit.client.1.vm08.stdout:9/822: dwrite d2/dd/d15/d1e/d25/d32/d5c/dc2/fcb [0,4194304] 0 2026-03-10T08:55:52.131 INFO:tasks.workunit.client.1.vm08.stdout:3/811: getdents 
d4/d15/d8/d1d/d107 0 2026-03-10T08:55:52.132 INFO:tasks.workunit.client.1.vm08.stdout:3/812: stat d4/d15/d8/d2c/d9b/d79/d20/c7b 0 2026-03-10T08:55:52.148 INFO:tasks.workunit.client.1.vm08.stdout:0/827: truncate d6/dd/d13/d17/d1f/d2d/d39/f8a 498623 0 2026-03-10T08:55:52.149 INFO:tasks.workunit.client.1.vm08.stdout:0/828: chown d6/dd/d13/d61/fb1 10309 1 2026-03-10T08:55:52.166 INFO:tasks.workunit.client.1.vm08.stdout:1/854: mknod d1/da/de/d24/d35/d6d/d82/da2/dbb/c126 0 2026-03-10T08:55:52.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.166+0000 7fa962497700 1 -- 192.168.123.105:0/2091475906 --> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa95c0611d0 con 0x7fa944046590 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.170+0000 7fa9597fa700 1 -- 192.168.123.105:0/2091475906 <== mgr.24459 v2:192.168.123.108:6828/865080403 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7fa95c0611d0 con 0x7fa944046590 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "1/23 daemons upgraded", 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "message": "", 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:55:52.171 INFO:teuthology.orchestra.run.vm05.stdout:} 
2026-03-10T08:55:52.172 INFO:tasks.workunit.client.1.vm08.stdout:1/855: dwrite d1/da/d18/d3b/d62/fd9 [0,4194304] 0 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 -- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fa944046590 msgr2=0x7fa944048a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fa944046590 0x7fa944048a50 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa94c009a30 tx=0x7fa94c019040 comp rx=0 tx=0).stop 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 -- 192.168.123.105:0/2091475906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c0836c0 msgr2=0x7fa95c1b3240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c0836c0 0x7fa95c1b3240 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7fa95400e7f0 tx=0x7fa95400eb00 comp rx=0 tx=0).stop 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 -- 192.168.123.105:0/2091475906 shutdown_connections 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7fa944046590 0x7fa944048a50 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.178 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa95c075a40 0x7fa95c083180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 --2- 192.168.123.105:0/2091475906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa95c0836c0 0x7fa95c1b3240 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 -- 192.168.123.105:0/2091475906 >> 192.168.123.105:0/2091475906 conn(0x7fa95c06dae0 msgr2=0x7fa95c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.177+0000 7fa942ffd700 1 -- 192.168.123.105:0/2091475906 shutdown_connections 2026-03-10T08:55:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.178+0000 7fa942ffd700 1 -- 192.168.123.105:0/2091475906 wait complete. 
2026-03-10T08:55:52.197 INFO:tasks.workunit.client.1.vm08.stdout:5/759: rename d0/d11/d18/d52 to d0/d11/d27/d68/d7c/de5 0 2026-03-10T08:55:52.213 INFO:tasks.workunit.client.1.vm08.stdout:2/930: dread d1/da/f9c [0,4194304] 0 2026-03-10T08:55:52.217 INFO:tasks.workunit.client.1.vm08.stdout:2/931: chown d1/da/d10/d2d/db6/ff3 49 1 2026-03-10T08:55:52.218 INFO:tasks.workunit.client.1.vm08.stdout:2/932: write d1/da/d10/d42/d93/d1e/f49 [2138455,18012] 0 2026-03-10T08:55:52.219 INFO:tasks.workunit.client.1.vm08.stdout:2/933: stat d1/d97/d11f/de7/f132 0 2026-03-10T08:55:52.231 INFO:tasks.workunit.client.1.vm08.stdout:7/855: dwrite d0/d14/d43/fbc [4194304,4194304] 0 2026-03-10T08:55:52.232 INFO:tasks.workunit.client.1.vm08.stdout:4/881: dwrite d5/d23/f10d [0,4194304] 0 2026-03-10T08:55:52.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.288+0000 7f9cc73bd700 1 -- 192.168.123.105:0/2515800833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 msgr2=0x7f9cc010a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:52.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.288+0000 7f9cc73bd700 1 --2- 192.168.123.105:0/2515800833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc010a1c0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f9cbc009a60 tx=0x7f9cbc009d70 comp rx=0 tx=0).stop 2026-03-10T08:55:52.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.289+0000 7f9cc73bd700 1 -- 192.168.123.105:0/2515800833 shutdown_connections 2026-03-10T08:55:52.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.289+0000 7f9cc73bd700 1 --2- 192.168.123.105:0/2515800833 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cc010a700 0x7f9cc010cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.289+0000 7f9cc73bd700 1 
--2- 192.168.123.105:0/2515800833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc010a1c0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.289+0000 7f9cc73bd700 1 -- 192.168.123.105:0/2515800833 >> 192.168.123.105:0/2515800833 conn(0x7f9cc006daa0 msgr2=0x7f9cc006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:52.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.290+0000 7f9cc73bd700 1 -- 192.168.123.105:0/2515800833 shutdown_connections 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.290+0000 7f9cc73bd700 1 -- 192.168.123.105:0/2515800833 wait complete. 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.290+0000 7f9cc73bd700 1 Processor -- start 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.290+0000 7f9cc73bd700 1 -- start start 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.291+0000 7f9cc73bd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc0116a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.291+0000 7f9cc73bd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cc010a700 0x7f9cc0116fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.291+0000 7f9cc73bd700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cc01175e0 con 0x7f9cc010a700 2026-03-10T08:55:52.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.291+0000 7f9cc73bd700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cc01b30a0 con 0x7f9cc0107d90 2026-03-10T08:55:52.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.292+0000 7f9cc63bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc0116a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:52.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.292+0000 7f9cc63bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc0116a80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:49408/0 (socket says 192.168.123.105:49408) 2026-03-10T08:55:52.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.292+0000 7f9cc63bb700 1 -- 192.168.123.105:0/3143628410 learned_addr learned my addr 192.168.123.105:0/3143628410 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:55:52.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.294+0000 7f9cc63bb700 1 -- 192.168.123.105:0/3143628410 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cc010a700 msgr2=0x7f9cc0116fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:52.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.294+0000 7f9cc63bb700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cc010a700 0x7f9cc0116fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.294+0000 7f9cc63bb700 1 -- 192.168.123.105:0/3143628410 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f9cbc009710 con 0x7f9cc0107d90 2026-03-10T08:55:52.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.294+0000 7f9cc63bb700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc0116a80 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9cbc009a30 tx=0x7f9cbc00f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:52.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.295+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cbc01d070 con 0x7f9cc0107d90 2026-03-10T08:55:52.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.295+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9cbc00fd20 con 0x7f9cc0107d90 2026-03-10T08:55:52.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.295+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cbc017800 con 0x7f9cc0107d90 2026-03-10T08:55:52.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.296+0000 7f9cc73bd700 1 -- 192.168.123.105:0/3143628410 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cc01b3240 con 0x7f9cc0107d90 2026-03-10T08:55:52.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.296+0000 7f9cc73bd700 1 -- 192.168.123.105:0/3143628410 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cc01b3660 con 0x7f9cc0107d90 2026-03-10T08:55:52.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.297+0000 7f9cc73bd700 1 -- 192.168.123.105:0/3143628410 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cc0110c60 con 0x7f9cc0107d90 2026-03-10T08:55:52.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.298+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 22) v1 ==== 50225+0+0 (secure 0 0 0) 0x7f9cbc017960 con 0x7f9cc0107d90 2026-03-10T08:55:52.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.298+0000 7f9cb37fe700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f9cac03dac0 0x7f9cac03ff80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:55:52.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.299+0000 7f9cc5bba700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f9cac03dac0 0x7f9cac03ff80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:55:52.300 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.299+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f9cbc022990 con 0x7f9cc0107d90 2026-03-10T08:55:52.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.300+0000 7f9cc5bba700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f9cac03dac0 0x7f9cac03ff80 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f9cb4005950 tx=0x7f9cb40058e0 comp rx=0 tx=0).ready entity=mgr.24459 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:55:52.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.306+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f9cbc016da0 con 0x7f9cc0107d90 2026-03-10T08:55:52.343 INFO:tasks.workunit.client.0.vm05.stdout:1/694: fdatasync fc 0 2026-03-10T08:55:52.344 INFO:tasks.workunit.client.0.vm05.stdout:8/660: rmdir d2/dd/de6 0 2026-03-10T08:55:52.360 INFO:tasks.workunit.client.0.vm05.stdout:9/618: unlink d6/d15/d35/l57 0 2026-03-10T08:55:52.375 INFO:tasks.workunit.client.0.vm05.stdout:7/611: mknod d18/d66/d25/d2e/d2f/da0/cbe 0 2026-03-10T08:55:52.408 INFO:tasks.workunit.client.0.vm05.stdout:4/653: rename d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/lc3 to d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/ld2 0 2026-03-10T08:55:52.431 INFO:tasks.workunit.client.0.vm05.stdout:7/612: dread d18/f24 [0,4194304] 0 2026-03-10T08:55:52.457 INFO:tasks.workunit.client.0.vm05.stdout:3/726: dread - d9/d4d/dca/f87 zero size 2026-03-10T08:55:52.524 INFO:tasks.workunit.client.0.vm05.stdout:5/580: write d5/d48/f69 [1492399,64007] 0 2026-03-10T08:55:52.541 INFO:tasks.workunit.client.0.vm05.stdout:0/648: write df/d1f/d85/d2b/d27/d32/d4e/d87/f8d [2834996,117316] 0 2026-03-10T08:55:52.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.570+0000 7f9cc73bd700 1 -- 192.168.123.105:0/3143628410 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9cc004ea90 con 0x7f9cc0107d90 2026-03-10T08:55:52.572 INFO:tasks.workunit.client.0.vm05.stdout:7/613: write d18/d38/d43/d6e/f9b [5622407,48816] 0 2026-03-10T08:55:52.579 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:55:52.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.576+0000 7f9cb37fe700 1 -- 192.168.123.105:0/3143628410 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f9cbc027030 con 0x7f9cc0107d90 2026-03-10T08:55:52.580 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 -- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f9cac03dac0 msgr2=0x7f9cac03ff80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:52.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f9cac03dac0 0x7f9cac03ff80 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f9cb4005950 tx=0x7f9cb40058e0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 -- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 msgr2=0x7f9cc0116a80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cc0107d90 0x7f9cc0116a80 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9cbc009a30 tx=0x7f9cbc00f740 comp rx=0 tx=0).stop 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 -- 192.168.123.105:0/3143628410 shutdown_connections 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:6828/865080403,v1:192.168.123.108:6829/865080403] conn(0x7f9cac03dac0 0x7f9cac03ff80 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f9cc0107d90 0x7f9cc0116a80 secure :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9cbc009a30 tx=0x7f9cbc00f740 comp rx=0 tx=0).stop 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 --2- 192.168.123.105:0/3143628410 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cc010a700 0x7f9cc0116fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.579+0000 7f9cb17fa700 1 -- 192.168.123.105:0/3143628410 >> 192.168.123.105:0/3143628410 conn(0x7f9cc006daa0 msgr2=0x7f9cc006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:55:52.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.581+0000 7f9cb17fa700 1 -- 192.168.123.105:0/3143628410 shutdown_connections 2026-03-10T08:55:52.583 INFO:tasks.workunit.client.0.vm05.stdout:8/661: mknod d2/ce8 0 2026-03-10T08:55:52.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:55:52.582+0000 7f9cb17fa700 1 -- 192.168.123.105:0/3143628410 wait complete. 
2026-03-10T08:55:52.601 INFO:tasks.workunit.client.0.vm05.stdout:2/585: symlink d0/d9/d1e/d20/d24/la6 0 2026-03-10T08:55:52.606 INFO:tasks.workunit.client.0.vm05.stdout:5/581: creat d5/df/dbb/fd4 x:0 0 0 2026-03-10T08:55:52.610 INFO:tasks.workunit.client.0.vm05.stdout:6/678: rmdir d4/d7/dc4/de1 0 2026-03-10T08:55:52.612 INFO:tasks.workunit.client.0.vm05.stdout:1/695: rename dd/d10/d18/f36 to dd/d10/d18/dd1/ff9 0 2026-03-10T08:55:52.620 INFO:tasks.workunit.client.0.vm05.stdout:4/654: fsync d0/d2e/d71/d7c/fb4 0 2026-03-10T08:55:52.624 INFO:tasks.workunit.client.0.vm05.stdout:9/619: dread d6/d15/d37/f4c [0,4194304] 0 2026-03-10T08:55:52.632 INFO:tasks.workunit.client.1.vm08.stdout:8/887: mkdir d1/d10/d9/dd/d25/d27/d44/d21/d14a 0 2026-03-10T08:55:52.639 INFO:tasks.workunit.client.0.vm05.stdout:3/727: unlink d9/l4e 0 2026-03-10T08:55:52.650 INFO:tasks.workunit.client.1.vm08.stdout:3/813: fdatasync d4/d15/d8/d2c/d9b/d79/d20/f84 0 2026-03-10T08:55:52.651 INFO:tasks.workunit.client.1.vm08.stdout:3/814: write d4/d15/d8/d2c/d9b/d79/d20/fe5 [1683947,83628] 0 2026-03-10T08:55:52.653 INFO:tasks.workunit.client.0.vm05.stdout:7/614: dread d18/f31 [0,4194304] 0 2026-03-10T08:55:52.656 INFO:tasks.workunit.client.1.vm08.stdout:0/829: read d6/dd/d13/d17/d1f/d20/fff [3081885,108180] 0 2026-03-10T08:55:52.657 INFO:tasks.workunit.client.1.vm08.stdout:1/856: read d1/da/de/d24/d35/d6d/d116/fb6 [105569,125532] 0 2026-03-10T08:55:52.663 INFO:tasks.workunit.client.1.vm08.stdout:5/760: fsync d0/d46/f5f 0 2026-03-10T08:55:52.663 INFO:tasks.workunit.client.0.vm05.stdout:5/582: creat d5/d86/d24/d2c/d41/d74/fd5 x:0 0 0 2026-03-10T08:55:52.678 INFO:tasks.workunit.client.0.vm05.stdout:0/649: rename df/d1f/d85/d19/d39/d74 to df/d1f/d85/d19/d47/d84/dbe 0 2026-03-10T08:55:52.679 INFO:tasks.workunit.client.0.vm05.stdout:5/583: dwrite d5/d86/d24/d2c/d41/d74/fd5 [0,4194304] 0 2026-03-10T08:55:52.683 INFO:tasks.workunit.client.0.vm05.stdout:1/696: creat dd/d10/d19/d27/ffa x:0 0 0 2026-03-10T08:55:52.706 
INFO:tasks.workunit.client.0.vm05.stdout:9/620: truncate d6/d19/d2c/f78 267468 0 2026-03-10T08:55:52.707 INFO:tasks.workunit.client.1.vm08.stdout:7/856: unlink d0/d11/d1f/d29/d3d/d40/ff 0 2026-03-10T08:55:52.708 INFO:tasks.workunit.client.1.vm08.stdout:7/857: write d0/d14/f72 [3778765,79064] 0 2026-03-10T08:55:52.718 INFO:tasks.workunit.client.0.vm05.stdout:8/662: fdatasync d2/dd/d2c/d2e/f3b 0 2026-03-10T08:55:52.731 INFO:tasks.workunit.client.1.vm08.stdout:6/854: symlink d9/dc/d11/d23/d2c/l117 0 2026-03-10T08:55:52.735 INFO:tasks.workunit.client.1.vm08.stdout:8/888: read d1/d10/d9/d8a/f99 [1568561,107068] 0 2026-03-10T08:55:52.755 INFO:tasks.workunit.client.1.vm08.stdout:0/830: unlink d6/dd/d13/d17/d1f/d2d/d39/f91 0 2026-03-10T08:55:52.759 INFO:tasks.workunit.client.0.vm05.stdout:6/679: write d4/d7/d10/d1a/db1/fb3 [637134,63119] 0 2026-03-10T08:55:52.759 INFO:tasks.workunit.client.1.vm08.stdout:2/934: dwrite d1/da/d10/d42/d93/d1e/dce/f74 [0,4194304] 0 2026-03-10T08:55:52.766 INFO:tasks.workunit.client.0.vm05.stdout:4/655: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 [0,4194304] 0 2026-03-10T08:55:52.774 INFO:tasks.workunit.client.0.vm05.stdout:7/615: dwrite d18/d38/f55 [4194304,4194304] 0 2026-03-10T08:55:52.791 INFO:tasks.workunit.client.0.vm05.stdout:9/621: fsync d6/d19/d2a/d4a/f88 0 2026-03-10T08:55:52.793 INFO:tasks.workunit.client.0.vm05.stdout:9/622: chown d6/d15 38017873 1 2026-03-10T08:55:52.802 INFO:tasks.workunit.client.0.vm05.stdout:3/728: mkdir d9/d8f/dde 0 2026-03-10T08:55:52.805 INFO:tasks.workunit.client.0.vm05.stdout:3/729: stat d9/d8f/d50/d5f/dd8 0 2026-03-10T08:55:52.812 INFO:tasks.workunit.client.0.vm05.stdout:2/586: creat d0/d9/d1e/d20/d21/d45/d4b/fa7 x:0 0 0 2026-03-10T08:55:52.816 INFO:tasks.workunit.client.1.vm08.stdout:7/858: truncate d0/d11/d1f/d29/d3d/d89/f8b 1694323 0 2026-03-10T08:55:52.824 INFO:tasks.workunit.client.1.vm08.stdout:9/823: rename d2/d41/d4c/d66/c92 to d2/dd/d15/d1e/d39/d4e/c116 0 2026-03-10T08:55:52.832 
INFO:tasks.workunit.client.0.vm05.stdout:0/650: creat df/d1f/d85/d2b/d27/d32/d4e/d6a/fbf x:0 0 0 2026-03-10T08:55:52.833 INFO:tasks.workunit.client.0.vm05.stdout:1/697: mkdir dd/dfb 0 2026-03-10T08:55:52.833 INFO:tasks.workunit.client.1.vm08.stdout:8/889: symlink d1/d10/d9/dd/d25/dca/dc6/l14b 0 2026-03-10T08:55:52.833 INFO:tasks.workunit.client.1.vm08.stdout:8/890: dwrite d1/d10/d9/dd/d13/d40/f13c [0,4194304] 0 2026-03-10T08:55:52.841 INFO:tasks.workunit.client.1.vm08.stdout:3/815: creat d4/d6f/dca/df9/f115 x:0 0 0 2026-03-10T08:55:52.841 INFO:tasks.workunit.client.1.vm08.stdout:3/816: stat d4/d15/d8/ld0 0 2026-03-10T08:55:52.842 INFO:tasks.workunit.client.0.vm05.stdout:6/680: symlink d4/d7/d10/d15/d1b/d22/le3 0 2026-03-10T08:55:52.848 INFO:tasks.workunit.client.1.vm08.stdout:2/935: truncate d1/da/d10/d42/d93/d1e/dce/d52/f112 1801618 0 2026-03-10T08:55:52.855 INFO:tasks.workunit.client.0.vm05.stdout:7/616: symlink d18/d66/d25/d2e/d42/d53/lbf 0 2026-03-10T08:55:52.861 INFO:tasks.workunit.client.1.vm08.stdout:7/859: rmdir d0/d11/d4a/da3 39 2026-03-10T08:55:52.875 INFO:tasks.workunit.client.0.vm05.stdout:9/623: fdatasync d6/d27/fa6 0 2026-03-10T08:55:52.888 INFO:tasks.workunit.client.1.vm08.stdout:9/824: mknod d2/dd/d15/d1e/d39/d69/c117 0 2026-03-10T08:55:52.893 INFO:tasks.workunit.client.1.vm08.stdout:0/831: mknod d6/dd/d13/d17/d1f/d2d/c11c 0 2026-03-10T08:55:52.894 INFO:tasks.workunit.client.1.vm08.stdout:8/891: dread d1/d10/d9/dd/d25/d27/fd3 [0,4194304] 0 2026-03-10T08:55:52.895 INFO:tasks.workunit.client.1.vm08.stdout:8/892: chown d1/d10/d9/d4d/c100 27320530 1 2026-03-10T08:55:52.896 INFO:tasks.workunit.client.1.vm08.stdout:8/893: dread - d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f12f zero size 2026-03-10T08:55:52.898 INFO:tasks.workunit.client.1.vm08.stdout:1/857: unlink d1/da/d20/d91/d83/df4/fe3 0 2026-03-10T08:55:52.899 INFO:tasks.workunit.client.1.vm08.stdout:1/858: chown d1/da/de/d24/d3d/d40/d8e/f107 197879 1 2026-03-10T08:55:52.908 
INFO:tasks.workunit.client.1.vm08.stdout:5/761: creat d0/d11/d18/fe6 x:0 0 0 2026-03-10T08:55:52.915 INFO:tasks.workunit.client.1.vm08.stdout:7/860: mknod d0/d51/c113 0 2026-03-10T08:55:52.916 INFO:tasks.workunit.client.1.vm08.stdout:7/861: chown d0/d11/d1f/d2c/f33 1227853960 1 2026-03-10T08:55:52.917 INFO:tasks.workunit.client.1.vm08.stdout:9/825: mkdir d2/dd/d15/d1e/d21/d118 0 2026-03-10T08:55:52.918 INFO:tasks.workunit.client.1.vm08.stdout:4/882: truncate d5/d23/d36/d99/db2/d5d/dae/fd1 2343359 0 2026-03-10T08:55:52.921 INFO:tasks.workunit.client.1.vm08.stdout:8/894: sync 2026-03-10T08:55:52.921 INFO:tasks.workunit.client.1.vm08.stdout:8/895: readlink d1/l87 0 2026-03-10T08:55:52.923 INFO:tasks.workunit.client.1.vm08.stdout:8/896: rename d1/d10 to d1/d10/d9/dd/d13/d40/d14c 22 2026-03-10T08:55:52.926 INFO:tasks.workunit.client.1.vm08.stdout:1/859: dread d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fb9 [0,4194304] 0 2026-03-10T08:55:52.929 INFO:tasks.workunit.client.0.vm05.stdout:9/624: chown d6/d15/f86 614575 1 2026-03-10T08:55:52.934 INFO:tasks.workunit.client.1.vm08.stdout:6/855: write d9/dc/d84/f76 [1028519,56726] 0 2026-03-10T08:55:52.938 INFO:tasks.workunit.client.0.vm05.stdout:8/663: dwrite d2/dd/d2c/d2e/d31/d4f/da3/faa [0,4194304] 0 2026-03-10T08:55:52.941 INFO:tasks.workunit.client.0.vm05.stdout:8/664: readlink d2/db/d1f/d67/l70 0 2026-03-10T08:55:52.949 INFO:tasks.workunit.client.1.vm08.stdout:5/762: truncate d0/d11/d18/f5a 1313812 0 2026-03-10T08:55:52.953 INFO:tasks.workunit.client.1.vm08.stdout:5/763: dread d0/d11/d27/d50/fa1 [0,4194304] 0 2026-03-10T08:55:52.960 INFO:tasks.workunit.client.1.vm08.stdout:1/860: mknod d1/da/d18/d3a/da7/c127 0 2026-03-10T08:55:52.961 INFO:tasks.workunit.client.1.vm08.stdout:6/856: dread - d9/dc/d11/d23/d2c/d7a/dce/d69/ffb zero size 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: from='client.24491 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", 
""]}]: dispatch 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: [10/Mar/2026:08:55:51] ENGINE Bus STARTING 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: [10/Mar/2026:08:55:51] ENGINE Serving on http://192.168.123.108:8765 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: Deploying cephadm binary to vm05 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: [10/Mar/2026:08:55:51] ENGINE Serving on https://192.168.123.108:7150 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: [10/Mar/2026:08:55:51] ENGINE Bus STARTED 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: [10/Mar/2026:08:55:51] ENGINE Client ('192.168.123.108', 47342) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: mgrmap e22: vm08.rpongu(active, since 2s) 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/961572153' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:55:52.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:52 vm05.local ceph-mon[49713]: from='client.? 
192.168.123.105:0/3143628410' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:55:52.969 INFO:tasks.workunit.client.1.vm08.stdout:6/857: dread d9/dc/d11/d23/f8b [0,4194304] 0 2026-03-10T08:55:52.981 INFO:tasks.workunit.client.0.vm05.stdout:5/584: truncate d5/d86/d21/d89/f90 2730286 0 2026-03-10T08:55:52.983 INFO:tasks.workunit.client.1.vm08.stdout:3/817: write d4/d15/fda [1487853,13698] 0 2026-03-10T08:55:52.984 INFO:tasks.workunit.client.0.vm05.stdout:5/585: chown d5/fc 3543 1 2026-03-10T08:55:52.993 INFO:tasks.workunit.client.0.vm05.stdout:4/656: dwrite d0/d2e/d42/d45/d4a/d36/d37/f97 [0,4194304] 0 2026-03-10T08:55:53.007 INFO:tasks.workunit.client.1.vm08.stdout:7/862: creat d0/d11/d1f/d29/d3b/da1/f114 x:0 0 0 2026-03-10T08:55:53.020 INFO:tasks.workunit.client.1.vm08.stdout:5/764: mknod d0/d11/ce7 0 2026-03-10T08:55:53.025 INFO:tasks.workunit.client.1.vm08.stdout:2/936: write d1/da/f9c [1357923,109539] 0 2026-03-10T08:55:53.030 INFO:tasks.workunit.client.1.vm08.stdout:1/861: dread - d1/da/de/d24/d35/d6d/d116/fc3 zero size 2026-03-10T08:55:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: from='client.24491 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: [10/Mar/2026:08:55:51] ENGINE Bus STARTING 2026-03-10T08:55:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: [10/Mar/2026:08:55:51] ENGINE Serving on http://192.168.123.108:8765 2026-03-10T08:55:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: Deploying cephadm binary to vm05 2026-03-10T08:55:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: [10/Mar/2026:08:55:51] ENGINE Serving on https://192.168.123.108:7150 2026-03-10T08:55:53.053 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: [10/Mar/2026:08:55:51] ENGINE Bus STARTED 2026-03-10T08:55:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: [10/Mar/2026:08:55:51] ENGINE Client ('192.168.123.108', 47342) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T08:55:53.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: mgrmap e22: vm08.rpongu(active, since 2s) 2026-03-10T08:55:53.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/961572153' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:55:53.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:52 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/3143628410' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:55:53.054 INFO:tasks.workunit.client.1.vm08.stdout:0/832: dwrite d6/dd/d13/d61/fba [0,4194304] 0 2026-03-10T08:55:53.066 INFO:tasks.workunit.client.1.vm08.stdout:8/897: write d1/d10/d9/dd/d18/d34/fd1 [462779,101357] 0 2026-03-10T08:55:53.069 INFO:tasks.workunit.client.1.vm08.stdout:4/883: dwrite d5/d23/d36/d99/ffd [0,4194304] 0 2026-03-10T08:55:53.073 INFO:tasks.workunit.client.1.vm08.stdout:3/818: rmdir d4/d15/d8/d2c 39 2026-03-10T08:55:53.080 INFO:tasks.workunit.client.1.vm08.stdout:9/826: creat d2/dd/d15/d1e/d25/f119 x:0 0 0 2026-03-10T08:55:53.091 INFO:tasks.workunit.client.1.vm08.stdout:5/765: readlink d0/d11/d3e/la6 0 2026-03-10T08:55:53.091 INFO:tasks.workunit.client.1.vm08.stdout:7/863: chown d0/d11/d4a/da3/fa9 286 1 2026-03-10T08:55:53.094 INFO:tasks.workunit.client.1.vm08.stdout:2/937: creat d1/da/d10/d42/d93/daa/f134 x:0 0 0 2026-03-10T08:55:53.096 INFO:tasks.workunit.client.1.vm08.stdout:6/858: symlink 
d9/dc/d11/d23/d2c/d81/d63/dcf/l118 0 2026-03-10T08:55:53.102 INFO:tasks.workunit.client.0.vm05.stdout:3/730: dread - d9/d4d/f95 zero size 2026-03-10T08:55:53.106 INFO:tasks.workunit.client.1.vm08.stdout:0/833: creat d6/dd/d13/d61/f11d x:0 0 0 2026-03-10T08:55:53.111 INFO:tasks.workunit.client.1.vm08.stdout:9/827: dread d2/dd/d15/d1e/d39/f57 [0,4194304] 0 2026-03-10T08:55:53.115 INFO:tasks.workunit.client.1.vm08.stdout:8/898: truncate d1/d10/d9/dd/d13/d40/fdd 644135 0 2026-03-10T08:55:53.122 INFO:tasks.workunit.client.0.vm05.stdout:6/681: link d4/d7/f5d d4/d2d/d51/d62/da9/fe4 0 2026-03-10T08:55:53.124 INFO:tasks.workunit.client.1.vm08.stdout:3/819: dread - d4/d15/d8/d1d/d4f/f105 zero size 2026-03-10T08:55:53.144 INFO:tasks.workunit.client.0.vm05.stdout:9/625: dwrite d6/d12/d3a/d48/fa8 [0,4194304] 0 2026-03-10T08:55:53.157 INFO:tasks.workunit.client.0.vm05.stdout:8/665: unlink d2/dd/d2c/d2e/fbb 0 2026-03-10T08:55:53.160 INFO:tasks.workunit.client.1.vm08.stdout:5/766: creat d0/d11/d27/d68/d7c/d4b/d4e/fe8 x:0 0 0 2026-03-10T08:55:53.167 INFO:tasks.workunit.client.1.vm08.stdout:7/864: dread d0/d14/f68 [0,4194304] 0 2026-03-10T08:55:53.170 INFO:tasks.workunit.client.1.vm08.stdout:5/767: dread d0/d11/d27/d68/d7c/d4b/d4e/d84/f90 [0,4194304] 0 2026-03-10T08:55:53.171 INFO:tasks.workunit.client.1.vm08.stdout:1/862: truncate d1/da/de/d24/d3d/d40/d56/d7a/ff2 4598277 0 2026-03-10T08:55:53.174 INFO:tasks.workunit.client.0.vm05.stdout:5/586: unlink d5/d86/d24/f25 0 2026-03-10T08:55:53.174 INFO:tasks.workunit.client.0.vm05.stdout:5/587: readlink d5/df/dbb/d43/l8d 0 2026-03-10T08:55:53.190 INFO:tasks.workunit.client.0.vm05.stdout:3/731: dwrite d9/d4d/d51/d64/f9e [0,4194304] 0 2026-03-10T08:55:53.191 INFO:tasks.workunit.client.1.vm08.stdout:9/828: chown d2/d41/c65 52553220 1 2026-03-10T08:55:53.192 INFO:tasks.workunit.client.0.vm05.stdout:2/587: mknod d0/d9/d1e/d20/d21/d8a/d92/ca8 0 2026-03-10T08:55:53.192 INFO:tasks.workunit.client.0.vm05.stdout:1/698: link dd/d10/d18/d20/d69/fb1 
dd/d10/d18/d20/d52/d80/ffc 0 2026-03-10T08:55:53.193 INFO:tasks.workunit.client.0.vm05.stdout:0/651: creat df/d1f/d85/fc0 x:0 0 0 2026-03-10T08:55:53.193 INFO:tasks.workunit.client.0.vm05.stdout:0/652: stat df/d1f/l7d 0 2026-03-10T08:55:53.200 INFO:tasks.workunit.client.0.vm05.stdout:1/699: dread dd/d10/d19/d4d/f74 [0,4194304] 0 2026-03-10T08:55:53.231 INFO:tasks.workunit.client.0.vm05.stdout:5/588: mknod d5/d48/d64/cd6 0 2026-03-10T08:55:53.236 INFO:tasks.workunit.client.0.vm05.stdout:2/588: dread - d0/d9/d7f/d8f/f7d zero size 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:2/589: write d0/d9/d89/f93 [677014,15343] 0 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:7/617: getdents d18/d66/d25/d2e/d42/d53 0 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:5/589: creat d5/df/d37/d68/fd7 x:0 0 0 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:2/590: mknod d0/d9/d1e/d20/d21/d8a/d92/ca9 0 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:2/591: write d0/d9/d1e/d20/f47 [5878722,86046] 0 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:9/626: creat d6/fcf x:0 0 0 2026-03-10T08:55:53.278 INFO:tasks.workunit.client.0.vm05.stdout:4/657: getdents d0/d2e/d42/d45/d4a/d36/d37 0 2026-03-10T08:55:53.282 INFO:tasks.workunit.client.1.vm08.stdout:3/820: mknod d4/d6f/d85/dd3/c116 0 2026-03-10T08:55:53.285 INFO:tasks.workunit.client.0.vm05.stdout:9/627: creat d6/d19/d2a/d4a/d8c/fd0 x:0 0 0 2026-03-10T08:55:53.291 INFO:tasks.workunit.client.0.vm05.stdout:1/700: sync 2026-03-10T08:55:53.305 INFO:tasks.workunit.client.1.vm08.stdout:5/768: creat d0/d11/d3e/d45/fe9 x:0 0 0 2026-03-10T08:55:53.314 INFO:tasks.workunit.client.1.vm08.stdout:5/769: sync 2026-03-10T08:55:53.322 INFO:tasks.workunit.client.0.vm05.stdout:6/682: dwrite d4/d2d/d51/d62/da9/fe4 [0,4194304] 0 2026-03-10T08:55:53.325 INFO:tasks.workunit.client.0.vm05.stdout:8/666: dwrite d2/db/d1f/d67/f79 [0,4194304] 0 2026-03-10T08:55:53.332 
INFO:tasks.workunit.client.0.vm05.stdout:3/732: dwrite d9/d4d/f52 [0,4194304] 0 2026-03-10T08:55:53.358 INFO:tasks.workunit.client.1.vm08.stdout:1/863: truncate d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fb9 787318 0 2026-03-10T08:55:53.360 INFO:tasks.workunit.client.0.vm05.stdout:2/592: write d0/f16 [1237704,124221] 0 2026-03-10T08:55:53.365 INFO:tasks.workunit.client.0.vm05.stdout:7/618: dwrite d18/f31 [0,4194304] 0 2026-03-10T08:55:53.381 INFO:tasks.workunit.client.0.vm05.stdout:0/653: rename df/d1f/d85/d19/d47/d84/dae/db3/cb6 to df/d1f/d85/cc1 0 2026-03-10T08:55:53.381 INFO:tasks.workunit.client.0.vm05.stdout:4/658: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/faa [999724,74993] 0 2026-03-10T08:55:53.384 INFO:tasks.workunit.client.1.vm08.stdout:7/865: dwrite d0/d11/d4a/d95/fa7 [4194304,4194304] 0 2026-03-10T08:55:53.386 INFO:tasks.workunit.client.1.vm08.stdout:0/834: mknod d6/dd/d13/d17/d1f/d20/c11e 0 2026-03-10T08:55:53.386 INFO:tasks.workunit.client.1.vm08.stdout:4/884: link d5/d23/d36/d99/db2/d5d/de3/df8/f11e d5/d23/d36/d76/f13b 0 2026-03-10T08:55:53.387 INFO:tasks.workunit.client.0.vm05.stdout:9/628: creat d6/d19/d2a/d4a/fd1 x:0 0 0 2026-03-10T08:55:53.395 INFO:tasks.workunit.client.1.vm08.stdout:4/885: chown d5/d23/d36/d76/fa5 19609 1 2026-03-10T08:55:53.399 INFO:tasks.workunit.client.1.vm08.stdout:7/866: dread d0/d11/d4a/f87 [4194304,4194304] 0 2026-03-10T08:55:53.402 INFO:tasks.workunit.client.1.vm08.stdout:8/899: rename d1/d10/d9/dd/d9a/da6/f11d to d1/d4f/d60/d88/f14d 0 2026-03-10T08:55:53.409 INFO:tasks.workunit.client.0.vm05.stdout:5/590: creat d5/d86/d24/d2c/fd8 x:0 0 0 2026-03-10T08:55:53.415 INFO:tasks.workunit.client.1.vm08.stdout:9/829: symlink d2/dd/d15/d1e/d39/d4e/d87/l11a 0 2026-03-10T08:55:53.445 INFO:tasks.workunit.client.1.vm08.stdout:5/770: mknod d0/d11/d27/d68/d7c/d4b/d4e/da5/cea 0 2026-03-10T08:55:53.452 INFO:tasks.workunit.client.1.vm08.stdout:3/821: dread d4/d15/d8/d1d/f2d [0,4194304] 0 2026-03-10T08:55:53.456 
INFO:tasks.workunit.client.0.vm05.stdout:1/701: write dd/fe5 [1350531,14591] 0 2026-03-10T08:55:53.462 INFO:tasks.workunit.client.0.vm05.stdout:6/683: write d4/f6c [462351,32534] 0 2026-03-10T08:55:53.462 INFO:tasks.workunit.client.1.vm08.stdout:2/938: write d1/da/d10/d42/d93/d23/f31 [312679,119134] 0 2026-03-10T08:55:53.464 INFO:tasks.workunit.client.1.vm08.stdout:2/939: write d1/da/d10/d2d/db6/ff3 [3456363,63991] 0 2026-03-10T08:55:53.465 INFO:tasks.workunit.client.1.vm08.stdout:2/940: chown d1/d5b/f121 81 1 2026-03-10T08:55:53.465 INFO:tasks.workunit.client.0.vm05.stdout:2/593: creat d0/d9/d1e/d20/d21/d8a/d92/faa x:0 0 0 2026-03-10T08:55:53.465 INFO:tasks.workunit.client.1.vm08.stdout:6/859: dwrite d9/dc/d11/d23/f5f [0,4194304] 0 2026-03-10T08:55:53.474 INFO:tasks.workunit.client.1.vm08.stdout:6/860: chown d9/d13 12326545 1 2026-03-10T08:55:53.478 INFO:tasks.workunit.client.1.vm08.stdout:6/861: write d9/dc/d11/d23/d2c/f49 [794668,87346] 0 2026-03-10T08:55:53.727 INFO:tasks.workunit.client.0.vm05.stdout:3/733: dwrite d9/d2b/d3a/d43/d71/f91 [0,4194304] 0 2026-03-10T08:55:53.728 INFO:tasks.workunit.client.0.vm05.stdout:3/734: write d9/d2b/d3a/d6c/dbf/f7f [2182595,6173] 0 2026-03-10T08:55:53.802 INFO:tasks.workunit.client.0.vm05.stdout:4/659: mkdir d0/d2e/dca/dd3 0 2026-03-10T08:55:53.804 INFO:tasks.workunit.client.0.vm05.stdout:0/654: symlink df/d1f/d48/lc2 0 2026-03-10T08:55:53.834 INFO:tasks.workunit.client.1.vm08.stdout:4/886: mknod d5/d23/d36/c13c 0 2026-03-10T08:55:53.837 INFO:tasks.workunit.client.1.vm08.stdout:7/867: mkdir d0/d11/d4a/d95/dc5/d100/d115 0 2026-03-10T08:55:53.840 INFO:tasks.workunit.client.1.vm08.stdout:0/835: dread d6/dd/d13/d17/d1f/d20/d2f/d26/d56/f6c [0,4194304] 0 2026-03-10T08:55:53.844 INFO:tasks.workunit.client.0.vm05.stdout:1/702: symlink dd/d10/d19/d9b/dc3/lfd 0 2026-03-10T08:55:53.869 INFO:tasks.workunit.client.0.vm05.stdout:6/684: dread d4/d7/f4d [0,4194304] 0 2026-03-10T08:55:53.896 INFO:tasks.workunit.client.0.vm05.stdout:5/591: 
chown d5/df/dbb/d43/f6d 16 1 2026-03-10T08:55:53.897 INFO:tasks.workunit.client.1.vm08.stdout:6/862: creat d9/d10/d1e/d7e/f119 x:0 0 0 2026-03-10T08:55:53.923 INFO:tasks.workunit.client.1.vm08.stdout:7/868: creat d0/d11/d1f/d29/d3d/d89/f116 x:0 0 0 2026-03-10T08:55:53.930 INFO:tasks.workunit.client.0.vm05.stdout:2/594: creat d0/d9/d7f/d8f/fab x:0 0 0 2026-03-10T08:55:53.937 INFO:tasks.workunit.client.1.vm08.stdout:0/836: rmdir d6/dd/d13/d17/d1f/d20 39 2026-03-10T08:55:53.937 INFO:tasks.workunit.client.1.vm08.stdout:8/900: creat d1/d10/d9/dd/d9a/d11f/f14e x:0 0 0 2026-03-10T08:55:53.945 INFO:tasks.workunit.client.1.vm08.stdout:2/941: creat d1/d97/d10d/f135 x:0 0 0 2026-03-10T08:55:53.951 INFO:tasks.workunit.client.1.vm08.stdout:3/822: read d4/d15/d8/d2c/d9b/d79/d20/f84 [2117790,26760] 0 2026-03-10T08:55:53.963 INFO:tasks.workunit.client.1.vm08.stdout:3/823: chown d4/d15/d8/d2c/c48 2127895609 1 2026-03-10T08:55:53.964 INFO:tasks.workunit.client.1.vm08.stdout:1/864: dwrite d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fb9 [0,4194304] 0 2026-03-10T08:55:53.969 INFO:tasks.workunit.client.1.vm08.stdout:6/863: creat d9/dc/d11/d23/d2c/d7a/dce/d69/da2/f11a x:0 0 0 2026-03-10T08:55:53.970 INFO:tasks.workunit.client.1.vm08.stdout:3/824: write d4/d6f/fcb [1438971,75888] 0 2026-03-10T08:55:53.976 INFO:tasks.workunit.client.1.vm08.stdout:0/837: mkdir d6/dd/d13/d17/d1f/da3/d11f 0 2026-03-10T08:55:53.976 INFO:tasks.workunit.client.0.vm05.stdout:2/595: sync 2026-03-10T08:55:53.980 INFO:tasks.workunit.client.0.vm05.stdout:2/596: write d0/d9/d1e/d20/d21/d8a/d92/faa [489248,57875] 0 2026-03-10T08:55:53.981 INFO:tasks.workunit.client.1.vm08.stdout:5/771: creat d0/d11/d27/d68/d7c/de5/feb x:0 0 0 2026-03-10T08:55:54.005 INFO:tasks.workunit.client.1.vm08.stdout:2/942: unlink d1/da/d10/d42/d93/f55 0 2026-03-10T08:55:54.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:53 vm05.local ceph-mon[49713]: from='client.14716 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:54.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:53 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:53 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:53 vm05.local ceph-mon[49713]: pgmap v5: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T08:55:54.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:53 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:53 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.043 INFO:tasks.workunit.client.1.vm08.stdout:3/825: mkdir d4/d15/d8/d1d/d117 0 2026-03-10T08:55:54.104 INFO:tasks.workunit.client.1.vm08.stdout:2/943: mknod d1/d5b/da7/d11c/c136 0 2026-03-10T08:55:54.106 INFO:tasks.workunit.client.0.vm05.stdout:1/703: getdents dd/d13/dd2 0 2026-03-10T08:55:54.108 INFO:tasks.workunit.client.1.vm08.stdout:7/869: getdents d0/d11/d4a 0 2026-03-10T08:55:54.113 INFO:tasks.workunit.client.1.vm08.stdout:0/838: creat d6/dd/d13/d17/d1f/d20/d2f/d57/d109/f120 x:0 0 0 2026-03-10T08:55:54.119 INFO:tasks.workunit.client.0.vm05.stdout:8/667: rename d2/db/d1f/d67/d8d/fad to d2/db/d1f/d67/fe9 0 2026-03-10T08:55:54.128 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:53 vm08.local ceph-mon[57559]: from='client.14716 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:55:54.128 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:53 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.128 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:53 vm08.local ceph-mon[57559]: from='mgr.24459 ' 
entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.128 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:53 vm08.local ceph-mon[57559]: pgmap v5: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T08:55:54.128 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:53 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.128 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:53 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:54.134 INFO:tasks.workunit.client.1.vm08.stdout:2/944: mknod d1/d5b/d66/c137 0 2026-03-10T08:55:54.135 INFO:tasks.workunit.client.0.vm05.stdout:0/655: creat df/d1f/d85/d19/d47/d84/dbe/d90/fc3 x:0 0 0 2026-03-10T08:55:54.135 INFO:tasks.workunit.client.0.vm05.stdout:5/592: fsync d5/f40 0 2026-03-10T08:55:54.138 INFO:tasks.workunit.client.0.vm05.stdout:0/656: chown df/d1f/d85/d2b/d27/d32/d4e/d87 0 1 2026-03-10T08:55:54.141 INFO:tasks.workunit.client.1.vm08.stdout:7/870: rmdir d0/d11/d4a/d95/dc5/d100 39 2026-03-10T08:55:54.151 INFO:tasks.workunit.client.0.vm05.stdout:2/597: mkdir d0/d9/d89/da3/dac 0 2026-03-10T08:55:54.153 INFO:tasks.workunit.client.0.vm05.stdout:2/598: chown d0/d9/d7f/d8f/d6d/f9a 15 1 2026-03-10T08:55:54.162 INFO:tasks.workunit.client.0.vm05.stdout:8/668: dread d2/db/d1f/f84 [0,4194304] 0 2026-03-10T08:55:54.204 INFO:tasks.workunit.client.0.vm05.stdout:5/593: creat d5/d86/d21/fd9 x:0 0 0 2026-03-10T08:55:54.240 INFO:tasks.workunit.client.0.vm05.stdout:9/629: write d6/d19/d2c/d58/f6c [770329,11511] 0 2026-03-10T08:55:54.245 INFO:tasks.workunit.client.0.vm05.stdout:5/594: symlink d5/df/dbb/lda 0 2026-03-10T08:55:54.274 INFO:tasks.workunit.client.1.vm08.stdout:2/945: mknod d1/d5b/d66/c138 0 2026-03-10T08:55:54.275 INFO:tasks.workunit.client.1.vm08.stdout:4/887: link d5/d23/d49/cd0 d5/d23/d36/d99/dc6/c13d 0 2026-03-10T08:55:54.280 INFO:tasks.workunit.client.1.vm08.stdout:7/871: mknod d0/d11/d1f/df0/df4/c117 0 
2026-03-10T08:55:54.290 INFO:tasks.workunit.client.1.vm08.stdout:7/872: sync 2026-03-10T08:55:54.295 INFO:tasks.workunit.client.1.vm08.stdout:7/873: dread d0/d11/d1f/d29/d3d/f76 [0,4194304] 0 2026-03-10T08:55:54.301 INFO:tasks.workunit.client.1.vm08.stdout:7/874: mknod d0/d11/d1f/d29/d3d/dd1/c118 0 2026-03-10T08:55:54.301 INFO:tasks.workunit.client.1.vm08.stdout:7/875: chown d0/d14/f98 234353 1 2026-03-10T08:55:54.303 INFO:tasks.workunit.client.1.vm08.stdout:7/876: symlink d0/d11/d4a/d5e/l119 0 2026-03-10T08:55:54.307 INFO:tasks.workunit.client.1.vm08.stdout:4/888: creat d5/d23/d36/d99/f13e x:0 0 0 2026-03-10T08:55:54.310 INFO:tasks.workunit.client.1.vm08.stdout:9/830: dwrite d2/d41/d53/f6d [0,4194304] 0 2026-03-10T08:55:54.321 INFO:tasks.workunit.client.1.vm08.stdout:9/831: dread d2/dd/d15/d1e/d24/f34 [0,4194304] 0 2026-03-10T08:55:54.322 INFO:tasks.workunit.client.1.vm08.stdout:9/832: write d2/dd/d61/f110 [726226,125558] 0 2026-03-10T08:55:54.344 INFO:tasks.workunit.client.0.vm05.stdout:7/619: dwrite d18/d66/d25/d2e/f49 [0,4194304] 0 2026-03-10T08:55:54.355 INFO:tasks.workunit.client.1.vm08.stdout:1/865: rename d1/da/d20/d9e to d1/da/de/d24/d3d/d40/d56/d128 0 2026-03-10T08:55:54.368 INFO:tasks.workunit.client.0.vm05.stdout:7/620: fdatasync d18/d66/d25/d2e/d42/d9c/dac/f72 0 2026-03-10T08:55:54.378 INFO:tasks.workunit.client.1.vm08.stdout:0/839: rename d6/l7b to d6/dd/d13/d17/d1f/d2d/d85/df6/l121 0 2026-03-10T08:55:54.382 INFO:tasks.workunit.client.0.vm05.stdout:3/735: dwrite d9/d4d/dca/f87 [0,4194304] 0 2026-03-10T08:55:54.396 INFO:tasks.workunit.client.1.vm08.stdout:1/866: mknod d1/da/de/d5c/c129 0 2026-03-10T08:55:54.397 INFO:tasks.workunit.client.0.vm05.stdout:7/621: truncate d18/d1b/f50 2038907 0 2026-03-10T08:55:54.401 INFO:tasks.workunit.client.1.vm08.stdout:8/901: dwrite d1/d10/d9/dd/f41 [0,4194304] 0 2026-03-10T08:55:54.415 INFO:tasks.workunit.client.0.vm05.stdout:3/736: mkdir d9/d4d/d51/d64/d89/ddf 0 2026-03-10T08:55:54.422 
INFO:tasks.workunit.client.0.vm05.stdout:3/737: symlink d9/d2b/d3a/d43/d71/le0 0 2026-03-10T08:55:54.424 INFO:tasks.workunit.client.1.vm08.stdout:0/840: creat d6/dd/d13/d17/d1f/d20/d2f/d26/f122 x:0 0 0 2026-03-10T08:55:54.425 INFO:tasks.workunit.client.1.vm08.stdout:0/841: stat d6/dd/d13/d17/d1f/d2d/d39 0 2026-03-10T08:55:54.425 INFO:tasks.workunit.client.0.vm05.stdout:7/622: symlink d18/d38/lc0 0 2026-03-10T08:55:54.434 INFO:tasks.workunit.client.1.vm08.stdout:8/902: dread - d1/d10/d9/dd/d25/d27/d44/fc9 zero size 2026-03-10T08:55:54.434 INFO:tasks.workunit.client.0.vm05.stdout:7/623: fdatasync f3 0 2026-03-10T08:55:54.435 INFO:tasks.workunit.client.1.vm08.stdout:8/903: chown d1/d10/d9/dd/d25/dca/c55 503343151 1 2026-03-10T08:55:54.436 INFO:tasks.workunit.client.1.vm08.stdout:8/904: read d1/d10/d9/dd/f91 [377619,42096] 0 2026-03-10T08:55:54.440 INFO:tasks.workunit.client.1.vm08.stdout:0/842: mknod d6/dd/d13/d17/d1f/d20/d2f/d57/dd5/c123 0 2026-03-10T08:55:54.443 INFO:tasks.workunit.client.0.vm05.stdout:7/624: unlink d18/d38/d43/c90 0 2026-03-10T08:55:54.451 INFO:tasks.workunit.client.1.vm08.stdout:4/889: rmdir d5/d23/d36/d99/db2/d5a/d69/d11b/d114/d131 0 2026-03-10T08:55:54.456 INFO:tasks.workunit.client.0.vm05.stdout:7/625: getdents d18/d38/d43/d5c/daf 0 2026-03-10T08:55:54.463 INFO:tasks.workunit.client.1.vm08.stdout:8/905: creat d1/d10/d9/dd/d25/dca/dc6/d13f/f14f x:0 0 0 2026-03-10T08:55:54.463 INFO:tasks.workunit.client.1.vm08.stdout:0/843: dread d6/dd/d13/d17/d1f/d20/d2f/fe2 [0,4194304] 0 2026-03-10T08:55:54.465 INFO:tasks.workunit.client.0.vm05.stdout:7/626: rmdir d18/d66/d25 39 2026-03-10T08:55:54.470 INFO:tasks.workunit.client.0.vm05.stdout:7/627: dwrite d18/fb1 [0,4194304] 0 2026-03-10T08:55:54.476 INFO:tasks.workunit.client.1.vm08.stdout:4/890: rmdir d5/d23/d36 39 2026-03-10T08:55:54.486 INFO:tasks.workunit.client.1.vm08.stdout:8/906: creat d1/d4f/d60/d88/f150 x:0 0 0 2026-03-10T08:55:54.486 INFO:tasks.workunit.client.1.vm08.stdout:8/907: chown d1/laa 
5410648 1 2026-03-10T08:55:54.493 INFO:tasks.workunit.client.1.vm08.stdout:4/891: stat d5/d23/d36/d99/db2/f84 0 2026-03-10T08:55:54.498 INFO:tasks.workunit.client.1.vm08.stdout:6/864: dwrite d9/dc/d11/d23/fdf [0,4194304] 0 2026-03-10T08:55:54.498 INFO:tasks.workunit.client.1.vm08.stdout:4/892: stat d5/d23/d49/d8f/lc0 0 2026-03-10T08:55:54.511 INFO:tasks.workunit.client.1.vm08.stdout:5/772: write d0/f7f [3601237,98551] 0 2026-03-10T08:55:54.512 INFO:tasks.workunit.client.1.vm08.stdout:0/844: dread d6/dd/d13/d17/d1f/d2d/d85/d95/fb9 [0,4194304] 0 2026-03-10T08:55:54.517 INFO:tasks.workunit.client.0.vm05.stdout:4/660: write d0/d2e/d42/d45/d4a/d36/dbe/d49/fb7 [3164709,41251] 0 2026-03-10T08:55:54.517 INFO:tasks.workunit.client.1.vm08.stdout:3/826: write d4/d15/d8/d71/faf [288403,73782] 0 2026-03-10T08:55:54.528 INFO:tasks.workunit.client.0.vm05.stdout:8/669: getdents d2/db/d1f/d67/d8d 0 2026-03-10T08:55:54.529 INFO:tasks.workunit.client.0.vm05.stdout:8/670: chown d2/dd/d2c/d2e/d93/f9b 6976615 1 2026-03-10T08:55:54.534 INFO:tasks.workunit.client.0.vm05.stdout:7/628: truncate d18/d66/d25/d2e/f9e 963171 0 2026-03-10T08:55:54.541 INFO:tasks.workunit.client.0.vm05.stdout:4/661: truncate d0/d2e/d42/d45/d4a/d36/f88 2107392 0 2026-03-10T08:55:54.546 INFO:tasks.workunit.client.0.vm05.stdout:1/704: dwrite dd/d10/d18/d20/f34 [0,4194304] 0 2026-03-10T08:55:54.557 INFO:tasks.workunit.client.0.vm05.stdout:7/629: mkdir d18/d66/d25/d2e/d2f/d6d/dc1 0 2026-03-10T08:55:54.559 INFO:tasks.workunit.client.0.vm05.stdout:0/657: write df/d1f/d85/d19/d5b/f78 [344193,4226] 0 2026-03-10T08:55:54.568 INFO:tasks.workunit.client.0.vm05.stdout:6/685: unlink d4/d7/l8b 0 2026-03-10T08:55:54.572 INFO:tasks.workunit.client.0.vm05.stdout:2/599: write d0/d9/d1e/d20/d21/f77 [681261,112795] 0 2026-03-10T08:55:54.581 INFO:tasks.workunit.client.0.vm05.stdout:4/662: dread - d0/d2e/d42/d45/fb1 zero size 2026-03-10T08:55:54.582 INFO:tasks.workunit.client.0.vm05.stdout:4/663: write d0/f10 [3563004,99969] 0 
2026-03-10T08:55:54.594 INFO:tasks.workunit.client.0.vm05.stdout:1/705: rmdir dd/d10/d18/dd5/da9 39
2026-03-10T08:55:54.597 INFO:tasks.workunit.client.1.vm08.stdout:3/827: unlink d4/d15/d8/ld0 0
2026-03-10T08:55:54.615 INFO:tasks.workunit.client.0.vm05.stdout:9/630: truncate d6/d12/d3a/d48/fa8 1347026 0
2026-03-10T08:55:54.619 INFO:tasks.workunit.client.1.vm08.stdout:4/893: symlink d5/d23/d36/d99/dc6/dc8/d120/l13f 0
2026-03-10T08:55:54.629 INFO:tasks.workunit.client.0.vm05.stdout:5/595: dwrite d5/d86/d24/d2c/d41/d74/f9f [4194304,4194304] 0
2026-03-10T08:55:54.632 INFO:tasks.workunit.client.0.vm05.stdout:5/596: dread - d5/d86/d24/d2c/fd8 zero size
2026-03-10T08:55:54.638 INFO:tasks.workunit.client.1.vm08.stdout:2/946: dwrite d1/d43/f10b [0,4194304] 0
2026-03-10T08:55:54.638 INFO:tasks.workunit.client.1.vm08.stdout:4/894: dread d5/d23/d49/d8f/fb1 [0,4194304] 0
2026-03-10T08:55:54.639 INFO:tasks.workunit.client.0.vm05.stdout:0/658: rmdir df/d1f 39
2026-03-10T08:55:54.639 INFO:tasks.workunit.client.1.vm08.stdout:2/947: stat d1/da/d10/d1b/d6a/fe0 0
2026-03-10T08:55:54.646 INFO:tasks.workunit.client.1.vm08.stdout:3/828: truncate d4/d15/d8/f41 102743 0
2026-03-10T08:55:54.650 INFO:tasks.workunit.client.1.vm08.stdout:8/908: getdents d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e 0
2026-03-10T08:55:54.653 INFO:tasks.workunit.client.1.vm08.stdout:8/909: dread d1/d10/d9/dd/d18/d34/fd1 [0,4194304] 0
2026-03-10T08:55:54.653 INFO:tasks.workunit.client.1.vm08.stdout:8/910: stat d1/d10/d9/dd/d18 0
2026-03-10T08:55:54.664 INFO:tasks.workunit.client.1.vm08.stdout:7/877: write d0/d11/d1f/fb7 [477817,97690] 0
2026-03-10T08:55:54.674 INFO:tasks.workunit.client.0.vm05.stdout:9/631: symlink d6/d19/d2a/d8d/ld2 0
2026-03-10T08:55:54.678 INFO:tasks.workunit.client.1.vm08.stdout:2/948: rename d1/da/d10/d42/d93/daa to d1/da/d10/d42/d93/de2/d139 0
2026-03-10T08:55:54.679 INFO:tasks.workunit.client.0.vm05.stdout:6/686: creat d4/d2d/d51/d87/da5/fe5 x:0 0 0
2026-03-10T08:55:54.679 INFO:tasks.workunit.client.0.vm05.stdout:6/687: stat d4/d2c/d84 0
2026-03-10T08:55:54.683 INFO:tasks.workunit.client.1.vm08.stdout:9/833: write d2/dd/d15/d1e/d21/fc7 [3356360,17608] 0
2026-03-10T08:55:54.686 INFO:tasks.workunit.client.1.vm08.stdout:3/829: truncate d4/d15/d8/d2c/d9b/d79/d20/f8b 261629 0
2026-03-10T08:55:54.712 INFO:tasks.workunit.client.1.vm08.stdout:1/867: write d1/da/de/d5c/fb5 [3389900,96717] 0
2026-03-10T08:55:54.713 INFO:tasks.workunit.client.0.vm05.stdout:3/738: write d9/d2b/d53/f60 [1790679,46298] 0
2026-03-10T08:55:54.713 INFO:tasks.workunit.client.1.vm08.stdout:7/878: read - d0/d11/d4a/da3/f104 zero size
2026-03-10T08:55:54.716 INFO:tasks.workunit.client.0.vm05.stdout:4/664: symlink d0/d2e/ld4 0
2026-03-10T08:55:54.727 INFO:tasks.workunit.client.1.vm08.stdout:4/895: truncate d5/d23/d36/d99/db2/d5a/d69/f138 28 0
2026-03-10T08:55:54.729 INFO:tasks.workunit.client.1.vm08.stdout:2/949: dread - d1/da/d10/d1b/f122 zero size
2026-03-10T08:55:54.736 INFO:tasks.workunit.client.0.vm05.stdout:9/632: unlink d6/d27/f2b 0
2026-03-10T08:55:54.736 INFO:tasks.workunit.client.0.vm05.stdout:6/688: mknod d4/d2d/d51/d62/da9/ce6 0
2026-03-10T08:55:54.756 INFO:tasks.workunit.client.1.vm08.stdout:6/865: write d9/dc/d11/d23/d2c/d41/f56 [2080486,87489] 0
2026-03-10T08:55:54.757 INFO:tasks.workunit.client.0.vm05.stdout:5/597: getdents d5/d86/d24/d2c/d41/dca 0
2026-03-10T08:55:54.757 INFO:tasks.workunit.client.0.vm05.stdout:8/671: write d2/db/d1f/f44 [1055319,85134] 0
2026-03-10T08:55:54.757 INFO:tasks.workunit.client.0.vm05.stdout:5/598: readlink d5/d86/d21/l57 0
2026-03-10T08:55:54.770 INFO:tasks.workunit.client.1.vm08.stdout:5/773: write d0/d1b/f69 [996322,91740] 0
2026-03-10T08:55:54.771 INFO:tasks.workunit.client.1.vm08.stdout:7/879: fsync d0/d14/d43/d62/fb5 0
2026-03-10T08:55:54.778 INFO:tasks.workunit.client.1.vm08.stdout:0/845: write d6/dd/d13/d17/fbf [217299,125080] 0
2026-03-10T08:55:54.780 INFO:tasks.workunit.client.1.vm08.stdout:4/896: creat d5/d23/d36/d99/db2/d5a/ddb/f140 x:0 0 0
2026-03-10T08:55:54.792 INFO:tasks.workunit.client.0.vm05.stdout:4/665: creat d0/d2e/d42/d45/d4a/d36/fd5 x:0 0 0
2026-03-10T08:55:54.797 INFO:tasks.workunit.client.0.vm05.stdout:6/689: sync
2026-03-10T08:55:54.798 INFO:tasks.workunit.client.0.vm05.stdout:6/690: readlink d4/d7/l2b 0
2026-03-10T08:55:54.799 INFO:tasks.workunit.client.0.vm05.stdout:4/666: sync
2026-03-10T08:55:54.800 INFO:tasks.workunit.client.1.vm08.stdout:2/950: dread d1/d43/f56 [0,4194304] 0
2026-03-10T08:55:54.804 INFO:tasks.workunit.client.1.vm08.stdout:3/830: rename d4/ld8 to d4/d6f/l118 0
2026-03-10T08:55:54.805 INFO:tasks.workunit.client.0.vm05.stdout:4/667: truncate d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 4801772 0
2026-03-10T08:55:54.810 INFO:tasks.workunit.client.1.vm08.stdout:6/866: symlink d9/dc/d11/d23/d2c/l11b 0
2026-03-10T08:55:54.811 INFO:tasks.workunit.client.0.vm05.stdout:2/600: getdents d0/d9/d7f/d8f 0
2026-03-10T08:55:54.812 INFO:tasks.workunit.client.0.vm05.stdout:1/706: write dd/d21/d37/d45/fce [890714,2472] 0
2026-03-10T08:55:54.815 INFO:tasks.workunit.client.0.vm05.stdout:8/672: read d2/fa [2473669,109896] 0
2026-03-10T08:55:54.816 INFO:tasks.workunit.client.0.vm05.stdout:1/707: write dd/d10/d19/d4d/fc4 [274494,69476] 0
2026-03-10T08:55:54.818 INFO:tasks.workunit.client.0.vm05.stdout:7/630: dwrite d18/d66/d79/f85 [0,4194304] 0
2026-03-10T08:55:54.825 INFO:tasks.workunit.client.0.vm05.stdout:5/599: truncate d5/d86/d39/fce 134907 0
2026-03-10T08:55:54.847 INFO:tasks.workunit.client.0.vm05.stdout:0/659: mknod df/d1f/d85/cc4 0
2026-03-10T08:55:54.850 INFO:tasks.workunit.client.1.vm08.stdout:8/911: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d5f/f113 [0,4194304] 0
2026-03-10T08:55:54.850 INFO:tasks.workunit.client.0.vm05.stdout:0/660: chown df/d1f/d85/d19/d47/fa5 8583 1
2026-03-10T08:55:54.855 INFO:tasks.workunit.client.0.vm05.stdout:0/661: stat df/d1f/d85/d19/d47/d84/la2 0
2026-03-10T08:55:54.858 INFO:tasks.workunit.client.0.vm05.stdout:0/662: stat df/d1f/d85/d2b/f9a 0
2026-03-10T08:55:54.867 INFO:tasks.workunit.client.1.vm08.stdout:5/774: unlink d0/d11/d3e/l41 0
2026-03-10T08:55:54.868 INFO:tasks.workunit.client.1.vm08.stdout:5/775: chown d0/d46/f5f 454516 1
2026-03-10T08:55:54.871 INFO:tasks.workunit.client.1.vm08.stdout:9/834: write d2/dd/d15/d1e/d94/f106 [3119419,34733] 0
2026-03-10T08:55:54.876 INFO:tasks.workunit.client.0.vm05.stdout:6/691: read - d4/fb4 zero size
2026-03-10T08:55:54.886 INFO:tasks.workunit.client.1.vm08.stdout:1/868: write d1/da/de/d24/d3d/d40/d8e/f107 [274531,15740] 0
2026-03-10T08:55:54.887 INFO:tasks.workunit.client.1.vm08.stdout:1/869: chown d1/da/de/d24/d3d/d40/d92 92891267 1
2026-03-10T08:55:54.890 INFO:tasks.workunit.client.1.vm08.stdout:4/897: creat d5/d23/d49/f141 x:0 0 0
2026-03-10T08:55:54.892 INFO:tasks.workunit.client.0.vm05.stdout:3/739: write d9/d2b/f2d [2331765,98260] 0
2026-03-10T08:55:54.899 INFO:tasks.workunit.client.1.vm08.stdout:2/951: dread - d1/d97/f133 zero size
2026-03-10T08:55:54.904 INFO:tasks.workunit.client.0.vm05.stdout:2/601: mknod d0/d9/d89/cad 0
2026-03-10T08:55:54.913 INFO:tasks.workunit.client.1.vm08.stdout:6/867: creat d9/dc/d11/d23/d2c/d7a/f11c x:0 0 0
2026-03-10T08:55:54.938 INFO:tasks.workunit.client.1.vm08.stdout:8/912: rmdir d1/d10/d9/dd/d9a/d11f 39
2026-03-10T08:55:54.939 INFO:tasks.workunit.client.1.vm08.stdout:8/913: dread - d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/ff2 zero size
2026-03-10T08:55:54.940 INFO:tasks.workunit.client.0.vm05.stdout:8/673: unlink d2/dd/d74/fcc 0
2026-03-10T08:55:54.944 INFO:tasks.workunit.client.1.vm08.stdout:5/776: creat d0/d11/d27/d68/d7c/d4b/d87/db5/fec x:0 0 0
2026-03-10T08:55:54.948 INFO:tasks.workunit.client.1.vm08.stdout:9/835: unlink d2/dd/d15/d4f/fd3 0
2026-03-10T08:55:54.949 INFO:tasks.workunit.client.1.vm08.stdout:7/880: dwrite d0/d11/d1f/d29/d3b/f9f [0,4194304] 0
2026-03-10T08:55:54.959 INFO:tasks.workunit.client.1.vm08.stdout:0/846: truncate d6/dd/d13/d17/d1f/d2d/d39/f8a 1546996 0
2026-03-10T08:55:54.962 INFO:tasks.workunit.client.0.vm05.stdout:5/600: rmdir d5/d86/d21 39
2026-03-10T08:55:54.963 INFO:tasks.workunit.client.1.vm08.stdout:1/870: unlink d1/da/de/d24/d3d/d40/d8e/dd2/ffd 0
2026-03-10T08:55:54.963 INFO:tasks.workunit.client.0.vm05.stdout:0/663: symlink df/d1f/d85/d19/d47/d84/dbe/d67/lc5 0
2026-03-10T08:55:54.965 INFO:tasks.workunit.client.1.vm08.stdout:2/952: creat d1/da/d10/d42/d93/d23/d9e/f13a x:0 0 0
2026-03-10T08:55:54.978 INFO:tasks.workunit.client.1.vm08.stdout:9/836: creat d2/d41/d4c/de2/f11b x:0 0 0
2026-03-10T08:55:54.979 INFO:tasks.workunit.client.0.vm05.stdout:6/692: dread d4/d7/d10/d1a/db1/fb3 [4194304,4194304] 0
2026-03-10T08:55:54.979 INFO:tasks.workunit.client.1.vm08.stdout:7/881: rename d0/d11/d1f/d29/d3d/l50 to d0/d14/d43/d9d/l11a 0
2026-03-10T08:55:54.984 INFO:tasks.workunit.client.1.vm08.stdout:0/847: dread d6/dd/d13/d17/d1f/d20/d2f/d24/f37 [0,4194304] 0
2026-03-10T08:55:54.988 INFO:tasks.workunit.client.1.vm08.stdout:9/837: rename d2/dd/d15/d1e/d39/d69 to d2/dd/d11c 0
2026-03-10T08:55:54.999 INFO:tasks.workunit.client.1.vm08.stdout:7/882: unlink d0/d11/d1f/d29/d36/d75/f85 0
2026-03-10T08:55:54.999 INFO:tasks.workunit.client.1.vm08.stdout:2/953: mknod d1/da/d10/d42/d93/c13b 0
2026-03-10T08:55:54.999 INFO:tasks.workunit.client.1.vm08.stdout:7/883: chown d0/d11/d1f/d29/l6f 423951 1
2026-03-10T08:55:54.999 INFO:tasks.workunit.client.1.vm08.stdout:7/884: chown d0/d11/d1f/d29/d3d/d40/cc 181461740 1
2026-03-10T08:55:54.999 INFO:tasks.workunit.client.1.vm08.stdout:2/954: mknod d1/da/d10/d42/dd0/c13c 0
2026-03-10T08:55:55.000 INFO:tasks.workunit.client.0.vm05.stdout:1/708: unlink dd/d21/d37/d45/c87 0
2026-03-10T08:55:55.001 INFO:tasks.workunit.client.0.vm05.stdout:1/709: chown dd/d21/f4c 10767469 1
2026-03-10T08:55:55.004 INFO:tasks.workunit.client.1.vm08.stdout:9/838: creat d2/d54/d8e/da6/dd0/dc8/de1/f11d x:0 0 0
2026-03-10T08:55:55.005 INFO:tasks.workunit.client.1.vm08.stdout:1/871: getdents d1/da/de 0
2026-03-10T08:55:55.005 INFO:tasks.workunit.client.0.vm05.stdout:7/631: mknod d18/d38/d43/cc2 0
2026-03-10T08:55:55.006 INFO:tasks.workunit.client.1.vm08.stdout:7/885: readlink d0/d11/ld2 0
2026-03-10T08:55:55.007 INFO:tasks.workunit.client.1.vm08.stdout:7/886: readlink d0/d11/d1f/d29/d3d/d40/l48 0
2026-03-10T08:55:55.009 INFO:tasks.workunit.client.1.vm08.stdout:7/887: readlink d0/d11/d1f/d29/d3d/d40/lca 0
2026-03-10T08:55:55.011 INFO:tasks.workunit.client.0.vm05.stdout:0/664: fsync df/d1f/d85/fb5 0
2026-03-10T08:55:55.019 INFO:tasks.workunit.client.1.vm08.stdout:1/872: unlink d1/fac 0
2026-03-10T08:55:55.019 INFO:tasks.workunit.client.0.vm05.stdout:6/693: dread d4/d7/d10/f12 [0,4194304] 0
2026-03-10T08:55:55.025 INFO:tasks.workunit.client.0.vm05.stdout:8/674: mkdir d2/dd/d2c/d2e/d31/d4f/d80/de2/dea 0
2026-03-10T08:55:55.029 INFO:tasks.workunit.client.0.vm05.stdout:6/694: dread d4/d7/d10/d15/d1b/d22/f36 [0,4194304] 0
2026-03-10T08:55:55.047 INFO:tasks.workunit.client.0.vm05.stdout:9/633: write d6/d15/f96 [653782,9600] 0
2026-03-10T08:55:55.047 INFO:tasks.workunit.client.1.vm08.stdout:3/831: write d4/d15/d8/d2c/d9b/d79/f3c [3546124,27752] 0
2026-03-10T08:55:55.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:54 vm05.local ceph-mon[49713]: mgrmap e23: vm08.rpongu(active, since 5s)
2026-03-10T08:55:55.064 INFO:tasks.workunit.client.1.vm08.stdout:6/868: dwrite d9/d10/d1e/f91 [0,4194304] 0
2026-03-10T08:55:55.065 INFO:tasks.workunit.client.1.vm08.stdout:8/914: dwrite d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10f [0,4194304] 0
2026-03-10T08:55:55.065 INFO:tasks.workunit.client.0.vm05.stdout:1/710: dread dd/d21/d37/f39 [0,4194304] 0
2026-03-10T08:55:55.067 INFO:tasks.workunit.client.0.vm05.stdout:0/665: unlink df/d1f/d85/d19/d39/d4d/l98 0
2026-03-10T08:55:55.069 INFO:tasks.workunit.client.0.vm05.stdout:0/666: chown df/d1f/d85/d19/d39/f61 981531 1
2026-03-10T08:55:55.071 INFO:tasks.workunit.client.1.vm08.stdout:8/915: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/f149 [0,4194304] 0
2026-03-10T08:55:55.078 INFO:tasks.workunit.client.0.vm05.stdout:8/675: rmdir d2/dd/d74 39
2026-03-10T08:55:55.094 INFO:tasks.workunit.client.1.vm08.stdout:4/898: dwrite d5/d23/d36/d99/db2/d5a/d69/d11b/d96/ff6 [0,4194304] 0
2026-03-10T08:55:55.094 INFO:tasks.workunit.client.1.vm08.stdout:4/899: stat d5/df5 0
2026-03-10T08:55:55.099 INFO:tasks.workunit.client.1.vm08.stdout:5/777: write d0/d11/d27/d68/d7c/de5/f91 [630626,12577] 0
2026-03-10T08:55:55.102 INFO:tasks.workunit.client.0.vm05.stdout:7/632: mkdir d18/d66/d78/dc3 0
2026-03-10T08:55:55.112 INFO:tasks.workunit.client.0.vm05.stdout:6/695: rename d4/d7/d10/d15/d20/f47 to d4/d2d/d51/d87/da5/fe7 0
2026-03-10T08:55:55.123 INFO:tasks.workunit.client.1.vm08.stdout:8/916: creat d1/d2c/f151 x:0 0 0
2026-03-10T08:55:55.126 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:54 vm08.local ceph-mon[57559]: mgrmap e23: vm08.rpongu(active, since 5s)
2026-03-10T08:55:55.130 INFO:tasks.workunit.client.0.vm05.stdout:0/667: mkdir df/d1f/dc6 0
2026-03-10T08:55:55.133 INFO:tasks.workunit.client.0.vm05.stdout:3/740: write d9/d8f/d55/f6b [8114377,17856] 0
2026-03-10T08:55:55.142 INFO:tasks.workunit.client.0.vm05.stdout:2/602: dwrite d0/d9/d7f/d8f/d6d/f7b [0,4194304] 0
2026-03-10T08:55:55.142 INFO:tasks.workunit.client.0.vm05.stdout:2/603: readlink d0/d55/l84 0
2026-03-10T08:55:55.147 INFO:tasks.workunit.client.1.vm08.stdout:3/832: unlink d4/d15/d8/f41 0
2026-03-10T08:55:55.154 INFO:tasks.workunit.client.1.vm08.stdout:0/848: dwrite d6/d8b/faa [0,4194304] 0
2026-03-10T08:55:55.166 INFO:tasks.workunit.client.1.vm08.stdout:2/955: dwrite d1/da/d10/d42/d93/d23/feb [0,4194304] 0
2026-03-10T08:55:55.172 INFO:tasks.workunit.client.1.vm08.stdout:2/956: chown d1/d97/d11f/de7/f107 149 1
2026-03-10T08:55:55.172 INFO:tasks.workunit.client.1.vm08.stdout:9/839: write d2/dd/d61/fbb [641441,72652] 0
2026-03-10T08:55:55.172 INFO:tasks.workunit.client.1.vm08.stdout:7/888: write d0/d11/d1f/d29/d3d/fda [351938,61021] 0
2026-03-10T08:55:55.177 INFO:tasks.workunit.client.0.vm05.stdout:6/696: fdatasync d4/d7/d10/d15/f94 0
2026-03-10T08:55:55.177 INFO:tasks.workunit.client.0.vm05.stdout:7/633: creat d18/d66/d25/d2e/d2f/fc4 x:0 0 0
2026-03-10T08:55:55.180 INFO:tasks.workunit.client.0.vm05.stdout:1/711: mkdir dd/d10/d18/d20/df3/dfe 0
2026-03-10T08:55:55.189 INFO:tasks.workunit.client.0.vm05.stdout:0/668: sync
2026-03-10T08:55:55.206 INFO:tasks.workunit.client.0.vm05.stdout:2/604: creat d0/d9/d1e/d20/d21/d8a/d92/fae x:0 0 0
2026-03-10T08:55:55.213 INFO:tasks.workunit.client.0.vm05.stdout:8/676: creat d2/dd/feb x:0 0 0
2026-03-10T08:55:55.218 INFO:tasks.workunit.client.0.vm05.stdout:1/712: symlink dd/d10/d18/d2d/d51/d58/d71/d73/lff 0
2026-03-10T08:55:55.219 INFO:tasks.workunit.client.0.vm05.stdout:1/713: chown dd/d10/d18/d2d/d5c 1048 1
2026-03-10T08:55:55.221 INFO:tasks.workunit.client.0.vm05.stdout:3/741: dread d9/d2b/f2c [0,4194304] 0
2026-03-10T08:55:55.238 INFO:tasks.workunit.client.0.vm05.stdout:7/634: symlink d18/d66/d25/d2e/d2f/d6d/dc1/lc5 0
2026-03-10T08:55:55.240 INFO:tasks.workunit.client.0.vm05.stdout:3/742: sync
2026-03-10T08:55:55.243 INFO:tasks.workunit.client.0.vm05.stdout:5/601: write d5/d48/f93 [3774258,47300] 0
2026-03-10T08:55:55.244 INFO:tasks.workunit.client.0.vm05.stdout:5/602: chown d5/d86/d24/d2c/d41/d74/da9 21635082 1
2026-03-10T08:55:55.251 INFO:tasks.workunit.client.0.vm05.stdout:8/677: dread - d2/dd/d2c/d2e/d31/db4/fdf zero size
2026-03-10T08:55:55.260 INFO:tasks.workunit.client.0.vm05.stdout:1/714: creat dd/d10/d18/d2d/d5c/f100 x:0 0 0
2026-03-10T08:55:55.261 INFO:tasks.workunit.client.0.vm05.stdout:2/605: creat d0/d9/d89/da3/dac/faf x:0 0 0
2026-03-10T08:55:55.263 INFO:tasks.workunit.client.0.vm05.stdout:6/697: link d4/d7/d10/d15/d1b/d22/c77 d4/d8d/ce8 0
2026-03-10T08:55:55.272 INFO:tasks.workunit.client.0.vm05.stdout:6/698: chown d4/d2d/d51/d62/da9/fe4 13634 1
2026-03-10T08:55:55.287 INFO:tasks.workunit.client.0.vm05.stdout:9/634: write d6/d19/d2c/f61 [1454284,36240] 0
2026-03-10T08:55:55.292 INFO:tasks.workunit.client.0.vm05.stdout:4/668: write d0/d2e/d42/d45/d4a/d36/f88 [1358443,48963] 0
2026-03-10T08:55:55.298 INFO:tasks.workunit.client.1.vm08.stdout:4/900: dwrite d5/d23/d36/d99/db2/d5a/d69/f6e [0,4194304] 0
2026-03-10T08:55:55.304 INFO:tasks.workunit.client.1.vm08.stdout:5/778: dwrite d0/d11/d27/f3d [0,4194304] 0
2026-03-10T08:55:55.313 INFO:tasks.workunit.client.1.vm08.stdout:8/917: write d1/d10/d9/dd/d25/f125 [3910514,69887] 0
2026-03-10T08:55:55.354 INFO:tasks.workunit.client.0.vm05.stdout:2/606: symlink d0/d9/d89/da3/lb0 0
2026-03-10T08:55:55.359 INFO:tasks.workunit.client.0.vm05.stdout:4/669: mknod d0/d1d/cd6 0
2026-03-10T08:55:55.362 INFO:tasks.workunit.client.0.vm05.stdout:2/607: dwrite d0/d9/d7f/d8f/d7e/fa4 [0,4194304] 0
2026-03-10T08:55:55.367 INFO:tasks.workunit.client.0.vm05.stdout:4/670: sync
2026-03-10T08:55:55.367 INFO:tasks.workunit.client.0.vm05.stdout:0/669: getdents df/d1f/d85/d19/d47/d84/dae 0
2026-03-10T08:55:55.381 INFO:tasks.workunit.client.0.vm05.stdout:9/635: mknod d6/d15/cd3 0
2026-03-10T08:55:55.396 INFO:tasks.workunit.client.0.vm05.stdout:2/608: symlink d0/d55/da2/lb1 0
2026-03-10T08:55:55.398 INFO:tasks.workunit.client.0.vm05.stdout:2/609: stat d0/d9/d7f/d8f/d7e/l87 0
2026-03-10T08:55:55.399 INFO:tasks.workunit.client.0.vm05.stdout:0/670: creat df/d1f/d85/d19/d39/d4d/fc7 x:0 0 0
2026-03-10T08:55:55.399 INFO:tasks.workunit.client.1.vm08.stdout:3/833: mkdir d4/d15/d8/d2c/d9b/d119 0
2026-03-10T08:55:55.407 INFO:tasks.workunit.client.0.vm05.stdout:4/671: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7 0
2026-03-10T08:55:55.407 INFO:tasks.workunit.client.0.vm05.stdout:4/672: chown d0/d1d/l9a 1 1
2026-03-10T08:55:55.414 INFO:tasks.workunit.client.1.vm08.stdout:0/849: dwrite d6/dd/d13/d17/d1f/d20/f100 [0,4194304] 0
2026-03-10T08:55:55.420 INFO:tasks.workunit.client.0.vm05.stdout:1/715: dread dd/d21/d37/f72 [0,4194304] 0
2026-03-10T08:55:55.430 INFO:tasks.workunit.client.0.vm05.stdout:3/743: getdents d9/d2b/d3a/d43/d6e/dba 0
2026-03-10T08:55:55.446 INFO:tasks.workunit.client.1.vm08.stdout:2/957: unlink d1/da/d10/d42/d93/de2/d139/f134 0
2026-03-10T08:55:55.446 INFO:tasks.workunit.client.0.vm05.stdout:9/636: mkdir d6/d12/d43/dd4 0
2026-03-10T08:55:55.461 INFO:tasks.workunit.client.0.vm05.stdout:2/610: rename d0/d9/d1e/d20/d21/c6f to d0/d9/d7f/d8f/d7a/cb2 0
2026-03-10T08:55:55.467 INFO:tasks.workunit.client.1.vm08.stdout:1/873: getdents d1/da/d20/d91/d83/df4 0
2026-03-10T08:55:55.467 INFO:tasks.workunit.client.0.vm05.stdout:4/673: truncate d0/d2e/d71/d7c/fb4 1650237 0
2026-03-10T08:55:55.467 INFO:tasks.workunit.client.0.vm05.stdout:4/674: fsync d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 0
2026-03-10T08:55:55.467 INFO:tasks.workunit.client.0.vm05.stdout:0/671: dread df/d1f/d85/d2b/d27/f4f [0,4194304] 0
2026-03-10T08:55:55.469 INFO:tasks.workunit.client.0.vm05.stdout:5/603: link d5/df/d37/d68/c7c d5/d86/d24/d2c/d41/cdb 0
2026-03-10T08:55:55.475 INFO:tasks.workunit.client.1.vm08.stdout:5/779: creat d0/d11/d27/d68/dc1/fed x:0 0 0
2026-03-10T08:55:55.479 INFO:tasks.workunit.client.1.vm08.stdout:8/918: truncate d1/d10/d9/dd/d25/d27/d144/fa9 349273 0
2026-03-10T08:55:55.486 INFO:tasks.workunit.client.0.vm05.stdout:8/678: dread d2/db/d47/fd1 [0,4194304] 0
2026-03-10T08:55:55.486 INFO:tasks.workunit.client.0.vm05.stdout:0/672: chown df/c4c 873 1
2026-03-10T08:55:55.505 INFO:tasks.workunit.client.0.vm05.stdout:5/604: creat d5/d86/d24/d84/db8/fdc x:0 0 0
2026-03-10T08:55:55.514 INFO:tasks.workunit.client.1.vm08.stdout:2/958: truncate d1/da/f50 2371171 0
2026-03-10T08:55:55.514 INFO:tasks.workunit.client.1.vm08.stdout:2/959: read - d1/d97/f133 zero size
2026-03-10T08:55:55.517 INFO:tasks.workunit.client.0.vm05.stdout:2/611: creat d0/d9/d1e/d20/d24/fb3 x:0 0 0
2026-03-10T08:55:55.519 INFO:tasks.workunit.client.0.vm05.stdout:2/612: write d0/d9/d1e/d20/d21/f77 [176674,123728] 0
2026-03-10T08:55:55.526 INFO:tasks.workunit.client.0.vm05.stdout:1/716: sync
2026-03-10T08:55:55.530 INFO:tasks.workunit.client.0.vm05.stdout:9/637: sync
2026-03-10T08:55:55.531 INFO:tasks.workunit.client.1.vm08.stdout:6/869: rename d9/d10/d1e/l7c to d9/dc/d11/d23/l11d 0
2026-03-10T08:55:55.533 INFO:tasks.workunit.client.0.vm05.stdout:2/613: dwrite d0/d55/da2/fa5 [0,4194304] 0
2026-03-10T08:55:55.539 INFO:tasks.workunit.client.0.vm05.stdout:4/675: sync
2026-03-10T08:55:55.543 INFO:tasks.workunit.client.0.vm05.stdout:8/679: dread d2/dd/d2c/d2e/d31/d3e/d5d/f92 [0,4194304] 0
2026-03-10T08:55:55.543 INFO:tasks.workunit.client.1.vm08.stdout:1/874: dread d1/da/f1e [0,4194304] 0
2026-03-10T08:55:55.554 INFO:tasks.workunit.client.1.vm08.stdout:5/780: truncate d0/f92 2510166 0
2026-03-10T08:55:55.574 INFO:tasks.workunit.client.1.vm08.stdout:8/919: dread d1/d10/d9/fb [0,4194304] 0
2026-03-10T08:55:55.575 INFO:tasks.workunit.client.0.vm05.stdout:1/717: rmdir dd/d21/d37/d7c/d60 39
2026-03-10T08:55:55.575 INFO:tasks.workunit.client.0.vm05.stdout:1/718: readlink dd/d10/d19/d9b/la6 0
2026-03-10T08:55:55.575 INFO:tasks.workunit.client.0.vm05.stdout:5/605: dread d5/f23 [0,4194304] 0
2026-03-10T08:55:55.586 INFO:tasks.workunit.client.1.vm08.stdout:6/870: fsync d9/d10/d1e/d92/f10c 0
2026-03-10T08:55:55.586 INFO:tasks.workunit.client.1.vm08.stdout:6/871: dread - d9/d10/d1e/d7e/f119 zero size
2026-03-10T08:55:55.593 INFO:tasks.workunit.client.0.vm05.stdout:0/673: link df/d1f/d85/d2b/d27/d32/d4e/d87/cb1 df/d1f/d95/cc8 0
2026-03-10T08:55:55.616 INFO:tasks.workunit.client.0.vm05.stdout:6/699: dwrite d4/d92/f9d [0,4194304] 0
2026-03-10T08:55:55.616 INFO:tasks.workunit.client.0.vm05.stdout:7/635: dwrite d18/d66/d25/d2e/d42/d9c/dac/f3d [0,4194304] 0
2026-03-10T08:55:55.616 INFO:tasks.workunit.client.0.vm05.stdout:7/636: readlink d18/d66/d25/d2e/d42/d53/l5e 0
2026-03-10T08:55:55.616 INFO:tasks.workunit.client.0.vm05.stdout:5/606: dread d5/d48/f93 [0,4194304] 0
2026-03-10T08:55:55.616 INFO:tasks.workunit.client.0.vm05.stdout:5/607: fdatasync d5/d48/f69 0
2026-03-10T08:55:55.631 INFO:tasks.workunit.client.1.vm08.stdout:5/781: mknod d0/d11/d27/d68/cee 0
2026-03-10T08:55:55.635 INFO:tasks.workunit.client.1.vm08.stdout:7/889: dwrite d0/d11/d1f/df0/df4/ffc [0,4194304] 0
2026-03-10T08:55:55.635 INFO:tasks.workunit.client.0.vm05.stdout:1/719: fsync dd/d10/d18/d20/d52/d80/ffc 0
2026-03-10T08:55:55.638 INFO:tasks.workunit.client.0.vm05.stdout:1/720: chown dd/d21/cd0 677448 1
2026-03-10T08:55:55.639 INFO:tasks.workunit.client.0.vm05.stdout:1/721: chown dd/d10/d19 12875039 1
2026-03-10T08:55:55.641 INFO:tasks.workunit.client.0.vm05.stdout:9/638: sync
2026-03-10T08:55:55.644 INFO:tasks.workunit.client.0.vm05.stdout:1/722: truncate dd/d10/d18/d2d/d5c/f100 56066 0
2026-03-10T08:55:55.651 INFO:tasks.workunit.client.1.vm08.stdout:8/920: fsync d1/d10/d9/dd/d18/dff/f119 0
2026-03-10T08:55:55.651 INFO:tasks.workunit.client.0.vm05.stdout:1/723: chown dd/fe5 13973275 1
2026-03-10T08:55:55.653 INFO:tasks.workunit.client.0.vm05.stdout:3/744: dwrite d9/d2b/d2f/fb9 [0,4194304] 0
2026-03-10T08:55:55.657 INFO:tasks.workunit.client.1.vm08.stdout:4/901: dwrite d5/d23/d36/d99/db2/d5d/f60 [0,4194304] 0
2026-03-10T08:55:55.668 INFO:tasks.workunit.client.0.vm05.stdout:0/674: dread f5 [0,4194304] 0
2026-03-10T08:55:55.692 INFO:tasks.workunit.client.1.vm08.stdout:0/850: creat d6/f124 x:0 0 0
2026-03-10T08:55:55.693 INFO:tasks.workunit.client.1.vm08.stdout:3/834: dwrite d4/d15/d8/d2c/d89/fb4 [0,4194304] 0
2026-03-10T08:55:55.693 INFO:tasks.workunit.client.1.vm08.stdout:6/872: creat d9/dc/de0/f11e x:0 0 0
2026-03-10T08:55:55.702 INFO:tasks.workunit.client.0.vm05.stdout:6/700: mkdir d4/d2d/d51/d87/da5/de9 0
2026-03-10T08:55:55.710 INFO:tasks.workunit.client.0.vm05.stdout:7/637: mkdir d18/d66/d25/d2e/d42/dc6 0
2026-03-10T08:55:55.722 INFO:tasks.workunit.client.1.vm08.stdout:7/890: unlink d0/d11/d4a/fa5 0
2026-03-10T08:55:55.730 INFO:tasks.workunit.client.0.vm05.stdout:8/680: mknod d2/dd/d2c/d2e/cec 0
2026-03-10T08:55:55.767 INFO:tasks.workunit.client.1.vm08.stdout:4/902: symlink d5/d23/d49/d8f/da4/l142 0
2026-03-10T08:55:55.768 INFO:tasks.workunit.client.1.vm08.stdout:2/960: rename d1/db1 to d1/da/d10/d42/d93/d13d 0
2026-03-10T08:55:55.768 INFO:tasks.workunit.client.0.vm05.stdout:8/681: dread - d2/db/d1f/d67/f94 zero size
2026-03-10T08:55:55.768 INFO:tasks.workunit.client.0.vm05.stdout:5/608: creat d5/d86/d24/d84/db8/fdd x:0 0 0
2026-03-10T08:55:55.768 INFO:tasks.workunit.client.0.vm05.stdout:9/639: mknod d6/d15/d3c/cd5 0
2026-03-10T08:55:55.768 INFO:tasks.workunit.client.0.vm05.stdout:3/745: creat d9/d4d/d51/fe1 x:0 0 0
2026-03-10T08:55:55.768 INFO:tasks.workunit.client.0.vm05.stdout:0/675: creat df/d1f/d85/d19/d47/d84/dae/fc9 x:0 0 0
2026-03-10T08:55:55.769 INFO:tasks.workunit.client.0.vm05.stdout:7/638: rmdir d18/d66/d78 39
2026-03-10T08:55:55.776 INFO:tasks.workunit.client.0.vm05.stdout:0/676: dwrite df/d1f/d85/d2b/d27/d32/d4e/d87/f8d [0,4194304] 0
2026-03-10T08:55:55.779 INFO:tasks.workunit.client.0.vm05.stdout:3/746: dwrite d9/d2b/d53/f60 [0,4194304] 0
2026-03-10T08:55:55.781 INFO:tasks.workunit.client.0.vm05.stdout:1/724: sync
2026-03-10T08:55:55.786 INFO:tasks.workunit.client.0.vm05.stdout:1/725: stat dd/d21/d37/d45/fce 0
2026-03-10T08:55:55.791 INFO:tasks.workunit.client.1.vm08.stdout:4/903: mknod d5/d23/d36/d99/db2/d5a/c143 0
2026-03-10T08:55:55.797 INFO:tasks.workunit.client.0.vm05.stdout:8/682: mknod d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/ced 0
2026-03-10T08:55:55.803 INFO:tasks.workunit.client.1.vm08.stdout:6/873: sync
2026-03-10T08:55:55.804 INFO:tasks.workunit.client.0.vm05.stdout:6/701: dread d4/d7/d10/d1a/f25 [0,4194304] 0
2026-03-10T08:55:55.810 INFO:tasks.workunit.client.0.vm05.stdout:5/609: mkdir d5/df/d37/dd2/d76/dde 0
2026-03-10T08:55:55.820 INFO:tasks.workunit.client.1.vm08.stdout:2/961: symlink d1/da/d10/d42/d93/d23/d9e/l13e 0
2026-03-10T08:55:55.820 INFO:tasks.workunit.client.1.vm08.stdout:0/851: creat d6/f125 x:0 0 0
2026-03-10T08:55:55.820 INFO:tasks.workunit.client.1.vm08.stdout:6/874: creat d9/d10/dd0/f11f x:0 0 0
2026-03-10T08:55:55.820 INFO:tasks.workunit.client.0.vm05.stdout:5/610: write d5/df/dbb/fd4 [80089,88790] 0
2026-03-10T08:55:55.820 INFO:tasks.workunit.client.0.vm05.stdout:2/614: getdents d0/d9/d7f 0
2026-03-10T08:55:55.826 INFO:tasks.workunit.client.0.vm05.stdout:9/640: mknod d6/d12/d3a/cd6 0
2026-03-10T08:55:55.833 INFO:tasks.workunit.client.1.vm08.stdout:6/875: truncate d9/dc/d84/f89 2332231 0
2026-03-10T08:55:55.839 INFO:tasks.workunit.client.0.vm05.stdout:7/639: mkdir d18/d38/dc7 0
2026-03-10T08:55:55.839 INFO:tasks.workunit.client.0.vm05.stdout:7/640: dread - d18/d66/d25/f56 zero size
2026-03-10T08:55:55.840 INFO:tasks.workunit.client.0.vm05.stdout:0/677: creat df/d1f/d85/d19/d47/da3/fca x:0 0 0
2026-03-10T08:55:55.840 INFO:tasks.workunit.client.0.vm05.stdout:0/678: truncate df/d1f/d85/f2a 6104487 0
2026-03-10T08:55:55.842 INFO:tasks.workunit.client.1.vm08.stdout:2/962: rename d1/da/d10/d42/d93/de2/f123 to d1/da/d10/d42/d93/d1e/dce/d52/db3/def/f13f 0
2026-03-10T08:55:55.846 INFO:tasks.workunit.client.0.vm05.stdout:3/747: mkdir d9/d8f/d50/d5f/dd8/dd9/de2 0
2026-03-10T08:55:55.853 INFO:tasks.workunit.client.1.vm08.stdout:6/876: symlink d9/d10/d1e/d104/l120 0
2026-03-10T08:55:55.856 INFO:tasks.workunit.client.1.vm08.stdout:9/840: dwrite d2/f4 [0,4194304] 0
2026-03-10T08:55:55.858 INFO:tasks.workunit.client.1.vm08.stdout:4/904: getdents d5/d23/d36/d99/dc6/df1 0
2026-03-10T08:55:55.859 INFO:tasks.workunit.client.1.vm08.stdout:4/905: chown d5/d23/d36/d99/db2/d5d 1408 1
2026-03-10T08:55:55.859 INFO:tasks.workunit.client.1.vm08.stdout:2/963: fdatasync d1/da/d10/d42/d93/d23/f37 0
2026-03-10T08:55:55.860 INFO:tasks.workunit.client.1.vm08.stdout:4/906: chown d5/d5f/c98 6744 1
2026-03-10T08:55:55.868 INFO:tasks.workunit.client.0.vm05.stdout:5/611: rename d5/d86/d24/d2c/l34 to d5/dcf/ldf 0
2026-03-10T08:55:55.872 INFO:tasks.workunit.client.1.vm08.stdout:6/877: rename d9/d50/d95/fef to d9/dc/d11/d23/d2c/d81/d63/dcf/f121 0
2026-03-10T08:55:55.873 INFO:tasks.workunit.client.1.vm08.stdout:6/878: chown d9/dc/d11/d23/d2c/d7a/dce/d69/da2/c107 154477 1
2026-03-10T08:55:55.873 INFO:tasks.workunit.client.0.vm05.stdout:9/641: write d6/d15/f4f [6116163,24908] 0
2026-03-10T08:55:55.881 INFO:tasks.workunit.client.1.vm08.stdout:2/964: dread d1/da/d78/f95 [0,4194304] 0
2026-03-10T08:55:55.884 INFO:tasks.workunit.client.0.vm05.stdout:7/641: fdatasync d18/d66/d25/d2e/d42/f46 0
2026-03-10T08:55:55.894 INFO:tasks.workunit.client.1.vm08.stdout:1/875: dwrite d1/da/de/d5c/fa1 [0,4194304] 0
2026-03-10T08:55:55.895 INFO:tasks.workunit.client.1.vm08.stdout:9/841: rename d2/d41/d4c/de2/cce to d2/d41/d53/d103/c11e 0
2026-03-10T08:55:55.895 INFO:tasks.workunit.client.1.vm08.stdout:9/842: write d2/dd/d15/f17 [5154091,34266] 0
2026-03-10T08:55:55.897 INFO:tasks.workunit.client.0.vm05.stdout:3/748: symlink d9/d8f/d50/d5f/le3 0
2026-03-10T08:55:55.898 INFO:tasks.workunit.client.0.vm05.stdout:3/749: stat d9/d2b/d3a/d43/d71/d86/fc7 0
2026-03-10T08:55:55.917 INFO:tasks.workunit.client.0.vm05.stdout:8/683: rename d2/dd/d2c/d2e/d31/f89 to d2/dd/d2c/d2e/d31/fee 0
2026-03-10T08:55:55.928 INFO:tasks.workunit.client.1.vm08.stdout:1/876: creat d1/da/de/d24/d3d/d40/d56/d7a/f12a x:0 0 0
2026-03-10T08:55:55.941 INFO:tasks.workunit.client.0.vm05.stdout:9/642: truncate d6/d15/d3c/d4b/f67 2073654 0
2026-03-10T08:55:55.946 INFO:tasks.workunit.client.1.vm08.stdout:9/843: dread d2/d54/d8e/fba [0,4194304] 0
2026-03-10T08:55:55.961 INFO:tasks.workunit.client.0.vm05.stdout:7/642: dread d18/d38/f5d [0,4194304] 0
2026-03-10T08:55:55.963 INFO:tasks.workunit.client.0.vm05.stdout:4/676: dwrite d0/d2e/d71/d7c/fb4 [0,4194304] 0
2026-03-10T08:55:55.988 INFO:tasks.workunit.client.1.vm08.stdout:5/782: write d0/d46/f5f [5131240,23723] 0
2026-03-10T08:55:55.989 INFO:tasks.workunit.client.1.vm08.stdout:5/783: truncate d0/d11/d27/d68/dc1/fed 1033907 0
2026-03-10T08:55:56.016 INFO:tasks.workunit.client.1.vm08.stdout:8/921: write d1/dd9/f129 [232502,49163] 0
2026-03-10T08:55:56.020 INFO:tasks.workunit.client.1.vm08.stdout:7/891: rmdir d0/d11/d4a/d5e 39
2026-03-10T08:55:56.022 INFO:tasks.workunit.client.1.vm08.stdout:8/922: sync
2026-03-10T08:55:56.033 INFO:tasks.workunit.client.1.vm08.stdout:7/892: readlink d0/d14/d43/d9d/dbb/ld0 0
2026-03-10T08:55:56.036 INFO:tasks.workunit.client.0.vm05.stdout:6/702: write d4/d7/d10/d15/d1b/d22/f56 [223315,49589] 0
2026-03-10T08:55:56.036 INFO:tasks.workunit.client.1.vm08.stdout:3/835: dwrite d4/d15/d8/d1d/f98 [0,4194304] 0
2026-03-10T08:55:56.040 INFO:tasks.workunit.client.1.vm08.stdout:2/965: read d1/da/d10/d2d/fb7 [2159680,49427] 0
2026-03-10T08:55:56.040 INFO:tasks.workunit.client.1.vm08.stdout:2/966: stat d1/d43/f4b 0
2026-03-10T08:55:56.051 INFO:tasks.workunit.client.1.vm08.stdout:0/852: dwrite d6/f2c [0,4194304] 0
2026-03-10T08:55:56.052 INFO:tasks.workunit.client.0.vm05.stdout:2/615: dwrite d0/d9/d1e/d20/f7c [0,4194304] 0
2026-03-10T08:55:56.089 INFO:tasks.workunit.client.0.vm05.stdout:1/726: getdents dd/d10/d19/d9b 0
2026-03-10T08:55:56.090 INFO:tasks.workunit.client.0.vm05.stdout:1/727: chown dd/d10/d19/d9b 491832 1
2026-03-10T08:55:56.097 INFO:tasks.workunit.client.1.vm08.stdout:3/836: rmdir d4/d15/d8/d2c 39
2026-03-10T08:55:56.097 INFO:tasks.workunit.client.1.vm08.stdout:4/907: write d5/d23/d36/d99/dc6/dc8/fe7 [733604,94468] 0
2026-03-10T08:55:56.112 INFO:tasks.workunit.client.1.vm08.stdout:6/879: write d9/d50/d95/f99 [3343200,79772] 0
2026-03-10T08:55:56.112 INFO:tasks.workunit.client.0.vm05.stdout:0/679: write f5 [1324667,44377] 0
2026-03-10T08:55:56.115 INFO:tasks.workunit.client.1.vm08.stdout:4/908: dread d5/f6 [0,4194304] 0
2026-03-10T08:55:56.124 INFO:tasks.workunit.client.0.vm05.stdout:9/643: chown d6/d19/d21/f2f 2 1
2026-03-10T08:55:56.132 INFO:tasks.workunit.client.1.vm08.stdout:8/923: creat d1/d10/d9/dd/d18/d34/dd0/d124/f152 x:0 0 0
2026-03-10T08:55:56.132 INFO:tasks.workunit.client.1.vm08.stdout:8/924: readlink d1/d4f/d60/l61 0
2026-03-10T08:55:56.133 INFO:tasks.workunit.client.1.vm08.stdout:8/925: stat d1/d10/d9/dd/d25/f118 0
2026-03-10T08:55:56.152 INFO:tasks.workunit.client.1.vm08.stdout:1/877: write d1/da/de/d24/d3d/ff0 [276473,57005] 0
2026-03-10T08:55:56.155 INFO:tasks.workunit.client.0.vm05.stdout:7/643: readlink d18/d66/d25/d2e/d42/l92 0
2026-03-10T08:55:56.159 INFO:tasks.workunit.client.0.vm05.stdout:7/644: chown d18/f24 115 1
2026-03-10T08:55:56.165 INFO:tasks.workunit.client.0.vm05.stdout:4/677: fsync d0/fe 0
2026-03-10T08:55:56.171 INFO:tasks.workunit.client.1.vm08.stdout:9/844: write d2/dd/d15/d1e/d24/f3f [5140096,21072] 0
2026-03-10T08:55:56.181 INFO:tasks.workunit.client.0.vm05.stdout:6/703: dread d4/d2c/d84/d4a/f76 [0,4194304] 0
2026-03-10T08:55:56.181 INFO:tasks.workunit.client.1.vm08.stdout:5/784: dwrite d0/fe [0,4194304] 0
2026-03-10T08:55:56.199 INFO:tasks.workunit.client.1.vm08.stdout:7/893: write d0/d11/f39 [3349402,5402] 0
2026-03-10T08:55:56.204 INFO:tasks.workunit.client.0.vm05.stdout:2/616: mkdir d0/d9/d7f/db4 0
2026-03-10T08:55:56.204 INFO:tasks.workunit.client.1.vm08.stdout:0/853: write d6/dd/d13/d17/d1f/f67 [981451,38852] 0
2026-03-10T08:55:56.204 INFO:tasks.workunit.client.0.vm05.stdout:3/750: truncate d9/d4d/d51/f67 5116193 0
2026-03-10T08:55:56.205 INFO:tasks.workunit.client.0.vm05.stdout:5/612: write d5/f9c [151954,98509] 0
2026-03-10T08:55:56.206 INFO:tasks.workunit.client.0.vm05.stdout:2/617: dread - d0/d9/d7f/d8f/fab zero size
2026-03-10T08:55:56.208 INFO:tasks.workunit.client.0.vm05.stdout:8/684: dwrite d2/db/d28/fa6 [0,4194304] 0
2026-03-10T08:55:56.212 INFO:tasks.workunit.client.0.vm05.stdout:8/685: chown d2/db/c11 295557895 1
2026-03-10T08:55:56.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:55 vm05.local ceph-mon[49713]: pgmap v6: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-10T08:55:56.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:55 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:56.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:55 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:56.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:55 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:56.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:55 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:56.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:55 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:56.221 INFO:tasks.workunit.client.0.vm05.stdout:1/728: read - dd/d10/d18/d2d/d51/d58/d71/d62/fd4 zero size
2026-03-10T08:55:56.227 INFO:tasks.workunit.client.0.vm05.stdout:0/680: rmdir df/d1f/d85/d19/d47/d84/d8a 39
2026-03-10T08:55:56.231 INFO:tasks.workunit.client.0.vm05.stdout:0/681: dread f5 [0,4194304] 0
2026-03-10T08:55:56.231 INFO:tasks.workunit.client.0.vm05.stdout:0/682: chown df/d1f/c3e 328578094 1
2026-03-10T08:55:56.240 INFO:tasks.workunit.client.0.vm05.stdout:7/645: mknod d18/d66/d25/d2e/d2f/cc8 0
2026-03-10T08:55:56.244 INFO:tasks.workunit.client.0.vm05.stdout:4/678: creat d0/d2c/d6a/fd8 x:0 0 0
2026-03-10T08:55:56.248 INFO:tasks.workunit.client.0.vm05.stdout:6/704: creat d4/d2d/d51/d62/fea x:0 0 0
2026-03-10T08:55:56.248 INFO:tasks.workunit.client.0.vm05.stdout:6/705: readlink d4/d7/l2b 0
2026-03-10T08:55:56.249 INFO:tasks.workunit.client.0.vm05.stdout:6/706: stat d4/d7/d10/d15/d1b/l73 0
2026-03-10T08:55:56.257 INFO:tasks.workunit.client.0.vm05.stdout:5/613: dread d5/f23 [4194304,4194304] 0
2026-03-10T08:55:56.281 INFO:tasks.workunit.client.0.vm05.stdout:3/751: mkdir d9/d2b/d3a/d43/d71/de4 0
2026-03-10T08:55:56.282 INFO:tasks.workunit.client.0.vm05.stdout:2/618: symlink d0/d9/d7f/d8f/d7e/lb5 0
2026-03-10T08:55:56.284 INFO:tasks.workunit.client.0.vm05.stdout:9/644: dwrite d6/d19/d21/fb7 [4194304,4194304] 0
2026-03-10T08:55:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:55 vm08.local ceph-mon[57559]: pgmap v6: 65 pgs: 65 active+clean; 3.3 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-10T08:55:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:55 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:55 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:55 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:55 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:55 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:56.318 INFO:tasks.workunit.client.1.vm08.stdout:1/878: creat d1/da/d18/d3a/da7/f12b x:0 0 0
2026-03-10T08:55:56.332 INFO:tasks.workunit.client.0.vm05.stdout:6/707: rename d4/d7/d10/d1a/db1 to d4/d7/d10/d1a/d89/deb 0
2026-03-10T08:55:56.336 INFO:tasks.workunit.client.1.vm08.stdout:9/845: unlink d2/dd/d15/d1e/ce9 0
2026-03-10T08:55:56.343 INFO:tasks.workunit.client.0.vm05.stdout:8/686: write d2/db/d47/fd1 [4300586,7691] 0
2026-03-10T08:55:56.343 INFO:tasks.workunit.client.0.vm05.stdout:1/729: write dd/d10/d18/d20/fd6 [4556931,3954] 0
2026-03-10T08:55:56.344 INFO:tasks.workunit.client.1.vm08.stdout:8/926: dwrite d1/d10/d9/dd/d25/d27/d44/d89/fed [0,4194304] 0
2026-03-10T08:55:56.350 INFO:tasks.workunit.client.0.vm05.stdout:5/614: read d5/d86/f1a [4172029,116003] 0
2026-03-10T08:55:56.353 INFO:tasks.workunit.client.0.vm05.stdout:1/730: truncate dd/d21/d37/d45/d8d/f99 956830 0
2026-03-10T08:55:56.354 INFO:tasks.workunit.client.0.vm05.stdout:0/683: dwrite df/d1f/d85/d2b/f7a [0,4194304] 0
2026-03-10T08:55:56.369 INFO:tasks.workunit.client.1.vm08.stdout:5/785: stat d0/d11/d27/d68/d7c/de5/db9/fc9 0
2026-03-10T08:55:56.376 INFO:tasks.workunit.client.1.vm08.stdout:2/967: creat d1/da/f140 x:0 0 0
2026-03-10T08:55:56.381 INFO:tasks.workunit.client.1.vm08.stdout:7/894: truncate d0/d14/f12 4165087 0
2026-03-10T08:55:56.395 INFO:tasks.workunit.client.1.vm08.stdout:1/879: rmdir d1/da/de/d24/d3d/d40/d56/d7a 39
2026-03-10T08:55:56.399 INFO:tasks.workunit.client.0.vm05.stdout:6/708: write d4/d7/d10/d1a/d89/fc1 [1395837,5702] 0
2026-03-10T08:55:56.399 INFO:tasks.workunit.client.1.vm08.stdout:9/846: rmdir d2/d41/d4c/d66 39
2026-03-10T08:55:56.399 INFO:tasks.workunit.client.1.vm08.stdout:1/880: dread - d1/da/de/d24/d35/d6d/d82/da2/dbb/fd8 zero size
2026-03-10T08:55:56.400 INFO:tasks.workunit.client.1.vm08.stdout:3/837: dwrite d4/d15/d8/d2c/d9b/d79/f3c [4194304,4194304] 0
2026-03-10T08:55:56.401 INFO:tasks.workunit.client.1.vm08.stdout:9/847: dwrite d2/dd/d15/f17 [4194304,4194304] 0
2026-03-10T08:55:56.426 INFO:tasks.workunit.client.0.vm05.stdout:8/687: dread d2/dd/d2c/d2e/d31/fc4 [0,4194304] 0
2026-03-10T08:55:56.428 INFO:tasks.workunit.client.1.vm08.stdout:5/786: creat d0/d11/d27/d68/d7c/d4b/d4e/da5/fef x:0 0 0
2026-03-10T08:55:56.429 INFO:tasks.workunit.client.0.vm05.stdout:0/684: sync
2026-03-10T08:55:56.429 INFO:tasks.workunit.client.0.vm05.stdout:5/615: sync
2026-03-10T08:55:56.431 INFO:tasks.workunit.client.0.vm05.stdout:0/685: chown df/d1f/d85/d19/d47/d84/dbe/d67/l88 520499 1
2026-03-10T08:55:56.435 INFO:tasks.workunit.client.1.vm08.stdout:2/968: read d1/da/d10/d42/d93/d22/f8a [1652298,73741] 0
2026-03-10T08:55:56.436 INFO:tasks.workunit.client.1.vm08.stdout:6/880: creat d9/d10/d1e/d32/f122 x:0 0 0
2026-03-10T08:55:56.436 INFO:tasks.workunit.client.0.vm05.stdout:7/646: creat d18/d38/fc9 x:0 0 0
2026-03-10T08:55:56.437 INFO:tasks.workunit.client.0.vm05.stdout:4/679: creat d0/d2e/d71/fd9 x:0 0 0
2026-03-10T08:55:56.437 INFO:tasks.workunit.client.0.vm05.stdout:5/616: dwrite d5/df/dbb/fd4 [0,4194304] 0
2026-03-10T08:55:56.449 INFO:tasks.workunit.client.1.vm08.stdout:4/909: rename d5/d23/d36/l4c to d5/d23/d36/d99/db2/l144 0
2026-03-10T08:55:56.462 INFO:tasks.workunit.client.0.vm05.stdout:8/688: mknod d2/db/d1f/d67/d8d/cef 0
2026-03-10T08:55:56.467 INFO:tasks.workunit.client.1.vm08.stdout:3/838: read - d4/d15/d8/d71/fb6 zero size
2026-03-10T08:55:56.468 INFO:tasks.workunit.client.1.vm08.stdout:3/839: read - d4/d15/d8/d1d/d4f/f105 zero size
2026-03-10T08:55:56.469 INFO:tasks.workunit.client.0.vm05.stdout:0/686: symlink df/d1f/d85/d19/d39/d4d/d9f/lcb 0
2026-03-10T08:55:56.472 INFO:tasks.workunit.client.1.vm08.stdout:7/895: read d0/d11/d4a/d5e/fed [36058,72211] 0
2026-03-10T08:55:56.475 INFO:tasks.workunit.client.0.vm05.stdout:2/619: dread d0/d9/d1e/d20/d21/f3d [0,4194304] 0
2026-03-10T08:55:56.476 INFO:tasks.workunit.client.1.vm08.stdout:2/969: fsync d1/da/d78/df5/d11e/ffd 0
2026-03-10T08:55:56.480 INFO:tasks.workunit.client.1.vm08.stdout:0/854: link d6/dd/d13/d32/cc5 d6/dd/d13/d17/d1f/d20/d2f/d26/c126 0
2026-03-10T08:55:56.480 INFO:tasks.workunit.client.1.vm08.stdout:0/855: chown d6/l9c 523295085 1
2026-03-10T08:55:56.485 INFO:tasks.workunit.client.1.vm08.stdout:8/927: rename d1/d10/d9/dd/d13/d40/f147 to d1/d10/d9/dd/d25/dca/dc6/f153 0
2026-03-10T08:55:56.485 INFO:tasks.workunit.client.0.vm05.stdout:1/731: link dd/d10/d19/d4d/f70 dd/d21/d37/d7c/dab/f101 0
2026-03-10T08:55:56.490 INFO:tasks.workunit.client.1.vm08.stdout:3/840: mknod d4/d15/d8/d1d/da8/c11a 0
2026-03-10T08:55:56.496 INFO:tasks.workunit.client.1.vm08.stdout:2/970: dread - d1/da/d78/df5/d11e/f12c zero size
2026-03-10T08:55:56.505 INFO:tasks.workunit.client.1.vm08.stdout:7/896: dread d0/d14/d2f/fe3 [0,4194304] 0
2026-03-10T08:55:56.511 INFO:tasks.workunit.client.0.vm05.stdout:6/709: creat d4/d7/d10/d15/d20/fec x:0 0 0
2026-03-10T08:55:56.513 INFO:tasks.workunit.client.1.vm08.stdout:0/856: creat d6/dd/d13/d17/d1f/d2d/d39/f127 x:0 0 0
2026-03-10T08:55:56.514 INFO:tasks.workunit.client.0.vm05.stdout:7/647: creat d18/d38/fca x:0 0 0
2026-03-10T08:55:56.515 INFO:tasks.workunit.client.0.vm05.stdout:7/648: read d18/d66/d25/d2e/d42/d9c/dac/f3d [1916452,109891] 0
2026-03-10T08:55:56.516 INFO:tasks.workunit.client.1.vm08.stdout:0/857: dwrite d6/f2c [0,4194304] 0
2026-03-10T08:55:56.524 INFO:tasks.workunit.client.1.vm08.stdout:5/787: rename d0/d11/d27/d68/d7c/d4b/d87 to d0/d11/d27/d68/d7c/d8e/df0 0
2026-03-10T08:55:56.548 INFO:tasks.workunit.client.1.vm08.stdout:2/971: symlink d1/d5b/da7/l141 0
2026-03-10T08:55:56.581 INFO:tasks.workunit.client.1.vm08.stdout:7/897: creat d0/d11/d1f/d29/d3b/d80/dd3/de1/f11b x:0 0 0
2026-03-10T08:55:56.610 INFO:tasks.workunit.client.0.vm05.stdout:7/649: creat d18/d66/d25/d2e/d2f/d6d/fcb x:0 0 0
2026-03-10T08:55:56.623 INFO:tasks.workunit.client.0.vm05.stdout:3/752: write d9/d2b/d53/f93 [1029258,14039] 0
2026-03-10T08:55:56.634 INFO:tasks.workunit.client.0.vm05.stdout:9/645: dwrite d6/f7 [4194304,4194304] 0
2026-03-10T08:55:56.642 INFO:tasks.workunit.client.1.vm08.stdout:1/881: dwrite d1/f65 [0,4194304] 0
2026-03-10T08:55:56.649 INFO:tasks.workunit.client.1.vm08.stdout:9/848: write d2/dd/d15/d1e/d25/d32/f60 [541959,64510] 0
2026-03-10T08:55:56.649 INFO:tasks.workunit.client.1.vm08.stdout:9/849: chown d2/dd/d15/d1e/d21/da4 89185 1
2026-03-10T08:55:56.650 INFO:tasks.workunit.client.1.vm08.stdout:1/882: dread d1/fc [0,4194304] 0
2026-03-10T08:55:56.662 INFO:tasks.workunit.client.0.vm05.stdout:5/617: dwrite d5/d86/d24/d2c/d41/f4d [0,4194304] 0
2026-03-10T08:55:56.673 INFO:tasks.workunit.client.1.vm08.stdout:4/910: dwrite d5/f8a [0,4194304] 0
2026-03-10T08:55:56.674 INFO:tasks.workunit.client.0.vm05.stdout:8/689: dwrite d2/db/f19 [4194304,4194304] 0
2026-03-10T08:55:56.675 INFO:tasks.workunit.client.1.vm08.stdout:3/841: rename d4/d15/d8/d2c/d89/l104 to d4/d15/dfd/l11b 0
2026-03-10T08:55:56.688 INFO:tasks.workunit.client.0.vm05.stdout:4/680: truncate d0/d2e/d71/d7c/fb4 2059904 0
2026-03-10T08:55:56.701 INFO:tasks.workunit.client.1.vm08.stdout:6/881: dwrite d9/dc/d84/d80/f94 [0,4194304] 0
2026-03-10T08:55:56.702 INFO:tasks.workunit.client.1.vm08.stdout:7/898: creat d0/d11/d1f/d29/d3d/d40/f11c x:0 0 0
2026-03-10T08:55:56.702 INFO:tasks.workunit.client.1.vm08.stdout:8/928: write d1/dd9/f126 [591178,86911] 0
2026-03-10T08:55:56.704 INFO:tasks.workunit.client.0.vm05.stdout:3/753: creat d9/d4d/d51/fe5 x:0 0 0
2026-03-10T08:55:56.704 INFO:tasks.workunit.client.0.vm05.stdout:2/620: truncate d0/d55/da2/fa5 2290886 0
2026-03-10T08:55:56.708 INFO:tasks.workunit.client.0.vm05.stdout:9/646: rename d6/d19/d2a/d4a/d8c/fc7 to d6/d19/d2c/d84/fd7 0
2026-03-10T08:55:56.716 INFO:tasks.workunit.client.0.vm05.stdout:0/687: getdents df/d1f/d85/d2b/d27/d32/d4e 0
2026-03-10T08:55:56.716 INFO:tasks.workunit.client.1.vm08.stdout:6/882: dwrite d9/d10/d1e/d32/f3a [0,4194304] 0
2026-03-10T08:55:56.716 INFO:tasks.workunit.client.1.vm08.stdout:9/850: creat d2/dd/d15/d1e/d39/d4e/d87/f11f x:0 0 0
2026-03-10T08:55:56.717 INFO:tasks.workunit.client.1.vm08.stdout:9/851: chown d2/dd/d61/fbb 1007424319 1
2026-03-10T08:55:56.717 INFO:tasks.workunit.client.1.vm08.stdout:9/852: stat d2/d41/d4c/dd2/cec 0
2026-03-10T08:55:56.723 INFO:tasks.workunit.client.1.vm08.stdout:4/911: creat d5/d23/d49/d8f/da4/f145 x:0 0 0
2026-03-10T08:55:56.731 INFO:tasks.workunit.client.0.vm05.stdout:1/732: getdents dd/d10/d18/d20/d69 0
2026-03-10T08:55:56.736 INFO:tasks.workunit.client.1.vm08.stdout:1/883: dread d1/da/f25 [0,4194304] 0
2026-03-10T08:55:56.744 INFO:tasks.workunit.client.1.vm08.stdout:3/842: dread d4/d15/d8/d2c/f32 [0,4194304] 0
2026-03-10T08:55:56.749 INFO:tasks.workunit.client.1.vm08.stdout:8/929: chown d1/d10/d9/dd/d25/f93 160398698 1
2026-03-10T08:55:56.756 INFO:tasks.workunit.client.1.vm08.stdout:8/930: chown d1/d10/d9/dd/fc5 0 1
2026-03-10T08:55:56.763 INFO:tasks.workunit.client.1.vm08.stdout:6/883: creat d9/dc/d11/d23/d2c/f123 x:0 0 0
2026-03-10T08:55:56.764 INFO:tasks.workunit.client.1.vm08.stdout:6/884: readlink d9/dc/d11/d23/d2c/d81/d63/l114 0
2026-03-10T08:55:56.765 INFO:tasks.workunit.client.1.vm08.stdout:0/858: write d6/dd/d13/d17/d1f/d20/f21 [668526,16086] 0
2026-03-10T08:55:56.767 INFO:tasks.workunit.client.1.vm08.stdout:9/853: unlink d2/dd/d15/d1e/d39/d4e/d87/l11a 0
2026-03-10T08:55:56.781 INFO:tasks.workunit.client.1.vm08.stdout:4/912: dread d5/d23/d36/d99/db2/d5d/dae/ddf/fbe [0,4194304] 0
2026-03-10T08:55:56.787 INFO:tasks.workunit.client.0.vm05.stdout:8/690: creat d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/ff0 x:0 0 0
2026-03-10T08:55:56.796 INFO:tasks.workunit.client.1.vm08.stdout:5/788: link d0/d11/d27/d68/d7c/d4b/fa0 d0/d11/d27/d68/ff1 0
2026-03-10T08:55:56.797 INFO:tasks.workunit.client.0.vm05.stdout:0/688: dread df/d1f/d85/d2b/d65/d6e/d96/f66 [0,4194304] 0
2026-03-10T08:55:56.797 INFO:tasks.workunit.client.0.vm05.stdout:6/710: write d4/d7/d10/d15/d1b/fcd [240628,48891] 0
2026-03-10T08:55:56.797 INFO:tasks.workunit.client.0.vm05.stdout:0/689: readlink df/d1f/l7d 0
2026-03-10T08:55:56.800 INFO:tasks.workunit.client.0.vm05.stdout:4/681: mknod d0/d2e/cda 0
2026-03-10T08:55:56.808 INFO:tasks.workunit.client.1.vm08.stdout:1/884: rename d1/da/de/d24/d3d/d40/ffe to d1/da/de/d24/d35/d43/d109/f12c 0
2026-03-10T08:55:56.814 INFO:tasks.workunit.client.1.vm08.stdout:3/843: dread - d4/d6f/dca/ff8 zero size
2026-03-10T08:55:56.838 INFO:tasks.workunit.client.1.vm08.stdout:8/931: truncate d1/d10/d9/dd/d18/d34/f57 8671629 0
2026-03-10T08:55:56.840 INFO:tasks.workunit.client.1.vm08.stdout:0/859: symlink d6/dd/d13/d32/l128 0
2026-03-10T08:55:56.841 INFO:tasks.workunit.client.0.vm05.stdout:6/711: mknod d4/d2d/d51/d62/da9/ced 0
2026-03-10T08:55:56.842 INFO:tasks.workunit.client.1.vm08.stdout:9/854: symlink d2/dd/d15/d1e/d39/d4e/d87/l120 0
2026-03-10T08:55:56.842 INFO:tasks.workunit.client.1.vm08.stdout:4/913: mknod d5/d23/d49/d83/c146 0
2026-03-10T08:55:56.842 INFO:tasks.workunit.client.0.vm05.stdout:4/682: fsync d0/d2e/d71/f90 0
2026-03-10T08:55:56.843 INFO:tasks.workunit.client.1.vm08.stdout:4/914: readlink d5/d23/d49/l4b 0
2026-03-10T08:55:56.845 INFO:tasks.workunit.client.0.vm05.stdout:3/754: dwrite d9/fb4 [0,4194304] 0
2026-03-10T08:55:56.848 INFO:tasks.workunit.client.0.vm05.stdout:2/621: dwrite d0/d9/d1e/d20/d24/f29 [4194304,4194304] 0
2026-03-10T08:55:56.858 INFO:tasks.workunit.client.1.vm08.stdout:5/789: creat d0/d1b/d67/d7a/ff2 x:0 0 0
2026-03-10T08:55:56.862 INFO:tasks.workunit.client.1.vm08.stdout:6/885: dread d9/dc/d11/d23/d2c/d7a/fd1 [0,4194304] 0
2026-03-10T08:55:56.876 INFO:tasks.workunit.client.1.vm08.stdout:2/972: link d1/da/d10/c118 d1/da/d10/d42/d93/de2/c142 0
2026-03-10T08:55:56.879 INFO:tasks.workunit.client.0.vm05.stdout:7/650: getdents d18/d66/d25/d2e/d2f 0
2026-03-10T08:55:56.879 INFO:tasks.workunit.client.0.vm05.stdout:9/647: creat d6/d19/d2a/fd8 x:0 0 0
2026-03-10T08:55:56.882 INFO:tasks.workunit.client.1.vm08.stdout:1/885: symlink d1/da/de/dcf/l12d 0
2026-03-10T08:55:56.884 INFO:tasks.workunit.client.1.vm08.stdout:7/899: write d0/d11/d1f/d29/d3b/d80/f88 [755906,66946] 0
2026-03-10T08:55:56.885 INFO:tasks.workunit.client.1.vm08.stdout:7/900: chown d0/d14/d43/d62/d102 0 1
2026-03-10T08:55:56.886 INFO:tasks.workunit.client.1.vm08.stdout:3/844: dread d4/d15/d8/ff [0,4194304] 0
2026-03-10T08:55:56.886 INFO:tasks.workunit.client.1.vm08.stdout:7/901: write d0/d11/d1f/d2c/d111/f112 [613287,107068] 0
2026-03-10T08:55:56.888 INFO:tasks.workunit.client.0.vm05.stdout:5/618: link d5/l16 d5/d86/d24/d84/db8/le0 0
2026-03-10T08:55:56.895 INFO:tasks.workunit.client.1.vm08.stdout:8/932: truncate d1/da8/fe0 1519297 0
2026-03-10T08:55:56.910 INFO:tasks.workunit.client.0.vm05.stdout:9/648: dread d6/d19/d2c/f61 [0,4194304] 0
2026-03-10T08:55:56.941 INFO:tasks.workunit.client.0.vm05.stdout:0/690: mknod df/d1f/d85/d19/d47/d84/d8a/ccc 0
2026-03-10T08:55:56.946 INFO:tasks.workunit.client.1.vm08.stdout:0/860: dread d6/fe [4194304,4194304] 0
2026-03-10T08:55:56.950 INFO:tasks.workunit.client.0.vm05.stdout:7/651: dread d18/f24 [0,4194304] 0
2026-03-10T08:55:56.965 INFO:tasks.workunit.client.0.vm05.stdout:8/691: mknod d2/cf1 0
2026-03-10T08:55:56.965 INFO:tasks.workunit.client.0.vm05.stdout:8/692: chown d2/dd/d2c/d2e/d31/d4f/d7b/d9e 1008 1
2026-03-10T08:55:56.985 INFO:tasks.workunit.client.1.vm08.stdout:5/790: creat d0/d11/d27/d68/d7c/d8e/ff3 x:0 0 0
2026-03-10T08:55:56.986 INFO:tasks.workunit.client.1.vm08.stdout:2/973: truncate d1/da/d10/d42/fda 204332 0
2026-03-10T08:55:56.991 INFO:tasks.workunit.client.1.vm08.stdout:6/886: dread d9/dc/d11/d23/d2c/d7a/fd3 [0,4194304] 0
2026-03-10T08:55:56.993 INFO:tasks.workunit.client.1.vm08.stdout:6/887: fdatasync d9/dc/d11/d23/d2c/d41/f56 0
2026-03-10T08:55:56.993 INFO:tasks.workunit.client.1.vm08.stdout:9/855: dwrite d2/d54/d8e/fba [0,4194304] 0
2026-03-10T08:55:56.994 INFO:tasks.workunit.client.1.vm08.stdout:4/915: dwrite d5/d23/d49/d8f/fb1 [0,4194304] 0
2026-03-10T08:55:57.006 INFO:tasks.workunit.client.0.vm05.stdout:4/683: fsync d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fa7 0
2026-03-10T08:55:57.019 INFO:tasks.workunit.client.0.vm05.stdout:2/622: rename d0/d9/d89/f93 to d0/d55/fb6 0
2026-03-10T08:55:57.030 INFO:tasks.workunit.client.0.vm05.stdout:1/733: link dd/f16 dd/d10/d18/d2d/d5c/dac/f102 0
2026-03-10T08:55:57.031 INFO:tasks.workunit.client.0.vm05.stdout:1/734: stat dd/d10/d18/d2d/d5c/fa2 0
2026-03-10T08:55:57.036 INFO:tasks.workunit.client.1.vm08.stdout:7/902: creat d0/d11/d1f/df0/df4/f11d x:0 0 0
2026-03-10T08:55:57.047 INFO:tasks.workunit.client.1.vm08.stdout:3/845: rename d4/d15/d8/d1d/da8/fb1 to d4/d15/d8/d2c/d9b/d79/d8f/f11c 0
2026-03-10T08:55:57.065 INFO:tasks.workunit.client.1.vm08.stdout:5/791: creat d0/d11/d27/ff4 x:0 0 0
2026-03-10T08:55:57.070 INFO:tasks.workunit.client.1.vm08.stdout:2/974: fdatasync d1/d5b/da7/f100 0
2026-03-10T08:55:57.112 INFO:tasks.workunit.client.1.vm08.stdout:9/856: rmdir d2/dd/d15/d1e/d25/d32/d5c 39
2026-03-10T08:55:57.135 INFO:tasks.workunit.client.1.vm08.stdout:6/888: dwrite d9/d13/f35 [0,4194304] 0
2026-03-10T08:55:57.147 INFO:tasks.workunit.client.1.vm08.stdout:0/861: unlink d6/c12 0
2026-03-10T08:55:57.177 INFO:tasks.workunit.client.0.vm05.stdout:5/619: creat d5/d48/d64/d95/fe1 x:0 0 0
2026-03-10T08:55:57.180 INFO:tasks.workunit.client.1.vm08.stdout:2/975: mknod d1/d5b/dc5/c143 0
2026-03-10T08:55:57.180 INFO:tasks.workunit.client.0.vm05.stdout:6/712: mknod d4/d7/d10/d8f/cee 0
2026-03-10T08:55:57.181 INFO:tasks.workunit.client.1.vm08.stdout:2/976: chown d1/da/d10/d42/d93/d1e/dce 0 1
2026-03-10T08:55:57.183 INFO:tasks.workunit.client.1.vm08.stdout:3/846: dwrite d4/d15/d8/d2c/d6d/dfa/ffc [0,4194304] 0
2026-03-10T08:55:57.186 INFO:tasks.workunit.client.0.vm05.stdout:9/649: symlink d6/d19/d2a/d4a/ld9 0
2026-03-10T08:55:57.196 INFO:tasks.workunit.client.1.vm08.stdout:9/857: rmdir d2/d41/d4c/dd2 39
2026-03-10T08:55:57.196 INFO:tasks.workunit.client.0.vm05.stdout:0/691: read - df/d1f/d85/d19/d5b/fb0 zero size
2026-03-10T08:55:57.199 INFO:tasks.workunit.client.0.vm05.stdout:8/693: symlink d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/lf2 0
2026-03-10T08:55:57.207 INFO:tasks.workunit.client.0.vm05.stdout:4/684: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/fdb x:0 0 0
2026-03-10T08:55:57.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:57.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:57.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:57.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:57.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:57.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:55:57.219 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:56 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:55:57.219 INFO:tasks.workunit.client.0.vm05.stdout:5/620: mknod d5/d48/ce2 0
2026-03-10T08:55:57.221 INFO:tasks.workunit.client.0.vm05.stdout:6/713: creat d4/d2d/d51/d87/fef x:0 0 0
2026-03-10T08:55:57.223 INFO:tasks.workunit.client.0.vm05.stdout:9/650: read - d6/d15/d3c/d4b/d90/fb1 zero size
2026-03-10T08:55:57.227 INFO:tasks.workunit.client.0.vm05.stdout:0/692: chown df/d1f/d85/d19/d5b/fbd 29698303 1
2026-03-10T08:55:57.242 INFO:tasks.workunit.client.1.vm08.stdout:9/858: sync
2026-03-10T08:55:57.263 INFO:tasks.workunit.client.0.vm05.stdout:0/693: dread fe [0,4194304] 0
2026-03-10T08:55:57.265 INFO:tasks.workunit.client.1.vm08.stdout:4/916: write d5/d23/d36/d99/db2/d5a/d69/d11b/d96/fbb [421391,36122] 0
2026-03-10T08:55:57.266 INFO:tasks.workunit.client.0.vm05.stdout:1/735: write dd/d21/d37/d45/d8d/fae [247816,11463] 0
2026-03-10T08:55:57.269 INFO:tasks.workunit.client.0.vm05.stdout:3/755: dwrite d9/d4d/dca/f99 [0,4194304] 0
2026-03-10T08:55:57.275 INFO:tasks.workunit.client.0.vm05.stdout:8/694: dwrite d2/dd/d2c/d2e/f7d [0,4194304] 0
2026-03-10T08:55:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:55:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T08:55:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:55:57.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:56 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:55:57.306 INFO:tasks.workunit.client.0.vm05.stdout:6/714: rmdir d4/d2c 39
2026-03-10T08:55:57.311 INFO:tasks.workunit.client.0.vm05.stdout:9/651: readlink d6/d27/l39 0
2026-03-10T08:55:57.322 INFO:tasks.workunit.client.1.vm08.stdout:1/886: creat d1/da/d18/f12e x:0 0 0
2026-03-10T08:55:57.325 INFO:tasks.workunit.client.1.vm08.stdout:6/889: read d9/dc/d11/d23/d2c/d7a/dce/ffd [447461,99583] 0
2026-03-10T08:55:57.329 INFO:tasks.workunit.client.0.vm05.stdout:0/694: mkdir df/d1f/dcd 0
2026-03-10T08:55:57.338 INFO:tasks.workunit.client.1.vm08.stdout:7/903: fsync d0/d11/d1f/d29/fcf 0
2026-03-10T08:55:57.345 INFO:tasks.workunit.client.1.vm08.stdout:8/933: link d1/d10/d9/dd/d25/d27/d144/f4e d1/d4f/d60/dbf/f154 0
2026-03-10T08:55:57.346 INFO:tasks.workunit.client.1.vm08.stdout:8/934: readlink d1/d2c/l35 0
2026-03-10T08:55:57.349 INFO:tasks.workunit.client.0.vm05.stdout:5/621: dwrite d5/df/dbb/f4e [0,4194304] 0
2026-03-10T08:55:57.354 INFO:tasks.workunit.client.1.vm08.stdout:0/862: creat d6/dd/d13/d17/d1f/d2d/d85/d93/f129 x:0 0 0
2026-03-10T08:55:57.354 INFO:tasks.workunit.client.0.vm05.stdout:3/756: symlink d9/d2b/d3a/d43/d6e/le6 0
2026-03-10T08:55:57.355 INFO:tasks.workunit.client.1.vm08.stdout:0/863: read d6/dd/d13/d17/d50/f71 [5405825,62882] 0
2026-03-10T08:55:57.356 INFO:tasks.workunit.client.1.vm08.stdout:0/864: fdatasync d6/dd/d13/d17/d1f/d20/d2f/d57/d109/f120 0
2026-03-10T08:55:57.363 INFO:tasks.workunit.client.1.vm08.stdout:5/792: mkdir d0/d11/d18/df5 0
2026-03-10T08:55:57.363 INFO:tasks.workunit.client.0.vm05.stdout:8/695: mkdir d2/db/d28/d99/df3 0
2026-03-10T08:55:57.363 INFO:tasks.workunit.client.0.vm05.stdout:4/685: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/ddc 0
2026-03-10T08:55:57.363 INFO:tasks.workunit.client.1.vm08.stdout:5/793: readlink d0/d11/d27/d68/d7c/de5/lb7 0
2026-03-10T08:55:57.379 INFO:tasks.workunit.client.0.vm05.stdout:4/686: dread d0/d2e/d42/d45/d4a/f26 [0,4194304] 0
2026-03-10T08:55:57.380 INFO:tasks.workunit.client.0.vm05.stdout:4/687: chown d0/d1d/f50 360020 1
2026-03-10T08:55:57.385 INFO:tasks.workunit.client.1.vm08.stdout:2/977: dread - d1/da/d10/d1b/d6a/fbb zero size
2026-03-10T08:55:57.386 INFO:tasks.workunit.client.1.vm08.stdout:2/978: chown d1/da/d10/d2d/f4d 4867792 1
2026-03-10T08:55:57.387 INFO:tasks.workunit.client.0.vm05.stdout:2/623: creat d0/d9/d1e/d20/d21/fb7 x:0 0 0
2026-03-10T08:55:57.387 INFO:tasks.workunit.client.1.vm08.stdout:9/859: chown d2/dd/d15/d1e/d25/d32/d5c/dc2/cf7 169 1
2026-03-10T08:55:57.387 INFO:tasks.workunit.client.0.vm05.stdout:9/652: unlink d6/d15/d35/la4 0
2026-03-10T08:55:57.387 INFO:tasks.workunit.client.1.vm08.stdout:9/860: stat d2/d54/d8e/da6/cb8 0
2026-03-10T08:55:57.387 INFO:tasks.workunit.client.0.vm05.stdout:7/652: link d18/d38/f82 d18/d38/d43/d5c/fcc 0
2026-03-10T08:55:57.397 INFO:tasks.workunit.client.1.vm08.stdout:9/861: dread d2/dd/d15/d1e/d94/f106 [0,4194304] 0
2026-03-10T08:55:57.408 INFO:tasks.workunit.client.0.vm05.stdout:1/736: symlink dd/dfb/l103 0
2026-03-10T08:55:57.408 INFO:tasks.workunit.client.1.vm08.stdout:6/890: unlink d9/d10/d1e/fba 0
2026-03-10T08:55:57.409 INFO:tasks.workunit.client.0.vm05.stdout:4/688: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/ddd 0
2026-03-10T08:55:57.409 INFO:tasks.workunit.client.1.vm08.stdout:7/904: readlink d0/d14/d43/d9d/l11a 0
2026-03-10T08:55:57.411 INFO:tasks.workunit.client.1.vm08.stdout:7/905: write d0/d11/d1f/d29/d3d/d89/f116 [353368,13267] 0
2026-03-10T08:55:57.416 INFO:tasks.workunit.client.0.vm05.stdout:5/622: sync
2026-03-10T08:55:57.416 INFO:tasks.workunit.client.0.vm05.stdout:8/696: sync
2026-03-10T08:55:57.425 INFO:tasks.workunit.client.0.vm05.stdout:6/715: mknod d4/d2c/d84/cf0 0
2026-03-10T08:55:57.425 INFO:tasks.workunit.client.1.vm08.stdout:5/794: mknod d0/d11/d27/d68/d7c/d8e/cf6 0
2026-03-10T08:55:57.431 INFO:tasks.workunit.client.1.vm08.stdout:3/847: mknod d4/d15/d8/d2c/c11d 0
2026-03-10T08:55:57.431 INFO:tasks.workunit.client.0.vm05.stdout:9/653: creat d6/d15/d3c/fda x:0 0 0
2026-03-10T08:55:57.435 INFO:tasks.workunit.client.0.vm05.stdout:2/624: dread - d0/d9/d1e/d20/d21/d45/d4b/d70/f9b zero size
2026-03-10T08:55:57.438 INFO:tasks.workunit.client.0.vm05.stdout:1/737: truncate dd/d10/fb0 2542329 0
2026-03-10T08:55:57.449 INFO:tasks.workunit.client.0.vm05.stdout:3/757: mkdir d9/d2b/de7 0
2026-03-10T08:55:57.457 INFO:tasks.workunit.client.1.vm08.stdout:4/917: write d5/d23/d36/fa9 [476662,120067] 0
2026-03-10T08:55:57.460 INFO:tasks.workunit.client.1.vm08.stdout:0/865: write d6/d8b/f94 [830753,24437] 0
2026-03-10T08:55:57.463 INFO:tasks.workunit.client.1.vm08.stdout:8/935: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/feb [0,4194304] 0
2026-03-10T08:55:57.463 INFO:tasks.workunit.client.0.vm05.stdout:4/689: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/fde x:0 0 0
2026-03-10T08:55:57.466 INFO:tasks.workunit.client.0.vm05.stdout:5/623: creat d5/df/dbb/fe3 x:0 0 0
2026-03-10T08:55:57.469 INFO:tasks.workunit.client.1.vm08.stdout:1/887: dwrite d1/da/d18/d3b/d62/f76 [4194304,4194304] 0
2026-03-10T08:55:57.469 INFO:tasks.workunit.client.1.vm08.stdout:9/862: creat d2/d41/d53/d103/f121 x:0 0 0
2026-03-10T08:55:57.471 INFO:tasks.workunit.client.0.vm05.stdout:0/695: creat df/d1f/d85/d19/d47/d84/dbe/fce x:0 0 0
2026-03-10T08:55:57.474 INFO:tasks.workunit.client.1.vm08.stdout:8/936: stat d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/f149 0
2026-03-10T08:55:57.475 INFO:tasks.workunit.client.1.vm08.stdout:9/863: chown d2/dd/d15/d1e/d21/f75 777877 1
2026-03-10T08:55:57.475 INFO:tasks.workunit.client.1.vm08.stdout:8/937: chown d1/d10/d9/dd/d25/d27/d44/d21/dce/l101 5158 1
2026-03-10T08:55:57.483 INFO:tasks.workunit.client.0.vm05.stdout:6/716: symlink d4/d2d/d51/d62/da9/lf1 0
2026-03-10T08:55:57.486 INFO:tasks.workunit.client.1.vm08.stdout:7/906: fsync d0/d11/d1f/d29/d3b/d80/dd3/ffb 0
2026-03-10T08:55:57.497 INFO:tasks.workunit.client.0.vm05.stdout:2/625: rename d0/d9/d1e/d20/d21/d45/d6c to d0/d55/db8 0
2026-03-10T08:55:57.508 INFO:tasks.workunit.client.0.vm05.stdout:7/653: creat d18/d66/d25/d2e/d42/dc6/fcd x:0 0 0
2026-03-10T08:55:57.517 INFO:tasks.workunit.client.1.vm08.stdout:5/795: creat d0/d11/d3e/d45/ff7 x:0 0 0
2026-03-10T08:55:57.517 INFO:tasks.workunit.client.1.vm08.stdout:5/796: readlink d0/d11/d27/d68/d7c/d4b/d4e/l59 0
2026-03-10T08:55:57.522 INFO:tasks.workunit.client.0.vm05.stdout:1/738: dread dd/d10/d18/dd5/fbf [0,4194304] 0
2026-03-10T08:55:57.524 INFO:tasks.workunit.client.0.vm05.stdout:1/739: stat dd/d10/d18/d2d/d51/d58/d71/d73/l76 0
2026-03-10T08:55:57.534 INFO:tasks.workunit.client.0.vm05.stdout:5/624: unlink d5/d86/d24/d2c/d41/d74/f7d 0
2026-03-10T08:55:57.538 INFO:tasks.workunit.client.0.vm05.stdout:8/697: write d2/dd/d2c/d2e/f3b [6951086,15452] 0
2026-03-10T08:55:57.539 INFO:tasks.workunit.client.0.vm05.stdout:8/698: fsync d2/dd/d2c/d2e/d31/d4f/da3/faa 0
2026-03-10T08:55:57.543 INFO:tasks.workunit.client.0.vm05.stdout:0/696: mknod df/d1f/d85/d19/d47/da3/ccf 0
2026-03-10T08:55:57.544 INFO:tasks.workunit.client.0.vm05.stdout:0/697: dread - df/d1f/d85/fb5 zero size
2026-03-10T08:55:57.557 INFO:tasks.workunit.client.1.vm08.stdout:0/866: truncate d6/dd/d13/d17/d1f/d20/d2f/d26/d56/fdc 343777 0
2026-03-10T08:55:57.557 INFO:tasks.workunit.client.1.vm08.stdout:0/867: chown d6/dd/d13/d17/f6d 6 1
2026-03-10T08:55:57.577 INFO:tasks.workunit.client.0.vm05.stdout:2/626: creat d0/d9/d7f/d8f/d7e/fb9 x:0 0 0
2026-03-10T08:55:57.577 INFO:tasks.workunit.client.0.vm05.stdout:7/654: creat d18/d66/d25/d2e/d42/d9c/fce x:0 0 0
2026-03-10T08:55:57.585 INFO:tasks.workunit.client.0.vm05.stdout:7/655: dread d18/d66/d79/f85 [0,4194304] 0
2026-03-10T08:55:57.599 INFO:tasks.workunit.client.0.vm05.stdout:4/690: getdents d0/d78 0
2026-03-10T08:55:57.599 INFO:tasks.workunit.client.1.vm08.stdout:8/938: mkdir d1/d4f/d60/dbf/d155 0
2026-03-10T08:55:57.601 INFO:tasks.workunit.client.1.vm08.stdout:8/939: chown d1/d10/d9/le7 3796158 1
2026-03-10T08:55:57.603 INFO:tasks.workunit.client.1.vm08.stdout:7/907: creat d0/d11/f11e x:0 0 0
2026-03-10T08:55:57.604 INFO:tasks.workunit.client.1.vm08.stdout:9/864: dread d2/dd/d15/d1e/d94/f106 [0,4194304] 0
2026-03-10T08:55:57.620 INFO:tasks.workunit.client.1.vm08.stdout:4/918: mknod d5/d23/d36/d99/db2/d5d/de3/c147 0
2026-03-10T08:55:57.622 INFO:tasks.workunit.client.0.vm05.stdout:0/698: rename df/d1f/d85/d2b/d27/d32/c49 to df/d1f/d85/d19/d47/d84/dae/cd0 0
2026-03-10T08:55:57.625 INFO:tasks.workunit.client.1.vm08.stdout:0/868: dread - d6/dd/d13/d17/d1f/d2d/d38/fae zero size
2026-03-10T08:55:57.629 INFO:tasks.workunit.client.1.vm08.stdout:6/891: rename d9/d10/d1e/d7e/l116 to d9/dc/l124 0
2026-03-10T08:55:57.642 INFO:tasks.workunit.client.0.vm05.stdout:2/627: fsync d0/d9/d1e/d20/f8b 0
2026-03-10T08:55:57.647 INFO:tasks.workunit.client.0.vm05.stdout:7/656: dread - d18/d66/d25/d2e/d42/d74/f86 zero size
2026-03-10T08:55:57.649 INFO:tasks.workunit.client.0.vm05.stdout:7/657: read d18/d66/d25/d2e/d42/f71 [2640296,38039] 0
2026-03-10T08:55:57.652 INFO:tasks.workunit.client.1.vm08.stdout:7/908: fsync d0/d11/d1f/d29/d36/d75/fb9 0
2026-03-10T08:55:57.654 INFO:tasks.workunit.client.1.vm08.stdout:9/865: mknod d2/d54/d8e/da6/c122 0
2026-03-10T08:55:57.656 INFO:tasks.workunit.client.1.vm08.stdout:8/940: read d1/d10/d9/dd/d25/d27/d44/d89/fcd [4038369,124336] 0
2026-03-10T08:55:57.656 INFO:tasks.workunit.client.1.vm08.stdout:7/909: read d0/d11/f39 [678947,67070] 0
2026-03-10T08:55:57.657 INFO:tasks.workunit.client.1.vm08.stdout:7/910: write d0/d11/d1f/d29/d3b/da1/f114 [830098,9596] 0
2026-03-10T08:55:57.659 INFO:tasks.workunit.client.1.vm08.stdout:7/911: read d0/d11/d1f/d29/d3d/fda [387031,71213] 0
2026-03-10T08:55:57.659 INFO:tasks.workunit.client.1.vm08.stdout:3/848: link d4/d15/d8/d2c/d9b/d79/d20/lf3 d4/d15/d8/d2c/d9b/d79/d20/l11e 0
2026-03-10T08:55:57.660 INFO:tasks.workunit.client.1.vm08.stdout:2/979: getdents d1/da 0
2026-03-10T08:55:57.663 INFO:tasks.workunit.client.0.vm05.stdout:8/699: link d2/dd/d2c/d2e/d31/d4f/da3/faa d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/ff4 0
2026-03-10T08:55:57.668 INFO:tasks.workunit.client.1.vm08.stdout:4/919: creat d5/d23/d36/d99/dc6/dc8/d120/f148 x:0 0 0
2026-03-10T08:55:57.668 INFO:tasks.workunit.client.1.vm08.stdout:4/920: readlink d5/d23/d49/d8f/da4/l142 0
2026-03-10T08:55:57.679 INFO:tasks.workunit.client.0.vm05.stdout:9/654: link d6/d19/d2c/l49 d6/d15/d3c/d4b/ldb 0
2026-03-10T08:55:57.686 INFO:tasks.workunit.client.0.vm05.stdout:1/740: dwrite fb [0,4194304] 0
2026-03-10T08:55:57.689 INFO:tasks.workunit.client.1.vm08.stdout:1/888: dwrite d1/da/f25 [0,4194304] 0
2026-03-10T08:55:57.700 INFO:tasks.workunit.client.1.vm08.stdout:5/797: write d0/d11/d18/fca [323451,126833] 0
2026-03-10T08:55:57.701 INFO:tasks.workunit.client.0.vm05.stdout:5/625: dwrite d5/df/d37/dd2/d76/fb5 [0,4194304] 0
2026-03-10T08:55:57.703 INFO:tasks.workunit.client.0.vm05.stdout:5/626: read d5/df/dbb/f4e [1100772,95056] 0
2026-03-10T08:55:57.710 INFO:tasks.workunit.client.0.vm05.stdout:6/717: link d4/d7/d10/d1a/ca0 d4/d7/d10/d1a/cf2 0
2026-03-10T08:55:57.712 INFO:tasks.workunit.client.0.vm05.stdout:2/628: unlink d0/d9/d1e/l83 0
2026-03-10T08:55:57.727 INFO:tasks.workunit.client.0.vm05.stdout:3/758: getdents d9/d4d/d51/d64/d89 0
2026-03-10T08:55:57.730 INFO:tasks.workunit.client.1.vm08.stdout:8/941: creat d1/d4f/d60/dbf/f156 x:0 0 0
2026-03-10T08:55:57.730 INFO:tasks.workunit.client.1.vm08.stdout:8/942: fdatasync d1/dd9/f129 0
2026-03-10T08:55:57.733 INFO:tasks.workunit.client.1.vm08.stdout:3/849: readlink d4/d15/d8/d2c/d9b/d79/l81 0
2026-03-10T08:55:57.734 INFO:tasks.workunit.client.1.vm08.stdout:3/850: chown d4/d15/d8/d2c/c43 452072317 1
2026-03-10T08:55:57.738 INFO:tasks.workunit.client.1.vm08.stdout:2/980: symlink d1/da/d78/df5/l144 0
2026-03-10T08:55:57.739 INFO:tasks.workunit.client.0.vm05.stdout:0/699: mknod df/d1f/d95/cd1 0
2026-03-10T08:55:57.739 INFO:tasks.workunit.client.1.vm08.stdout:2/981: chown d1/d97/d11f/l65 1472989238 1
2026-03-10T08:55:57.748 INFO:tasks.workunit.client.0.vm05.stdout:6/718: creat d4/d2d/d5f/ff3 x:0 0 0
2026-03-10T08:55:57.752 INFO:tasks.workunit.client.0.vm05.stdout:2/629: creat d0/d9/d89/fba x:0 0 0
2026-03-10T08:55:57.752 INFO:tasks.workunit.client.0.vm05.stdout:7/658: link d18/d38/f55 d18/d66/d79/fcf 0
2026-03-10T08:55:57.752 INFO:tasks.workunit.client.0.vm05.stdout:7/659: readlink d18/d38/d43/laa 0
2026-03-10T08:55:57.753 INFO:tasks.workunit.client.0.vm05.stdout:1/741: dread dd/d21/d37/f8c [0,4194304] 0
2026-03-10T08:55:57.765 INFO:tasks.workunit.client.0.vm05.stdout:4/691: creat d0/d2e/fdf x:0 0 0
2026-03-10T08:55:57.767 INFO:tasks.workunit.client.1.vm08.stdout:3/851: dread - d4/d15/d8/d2c/d9b/d79/ff2 zero size
2026-03-10T08:55:57.768 INFO:tasks.workunit.client.0.vm05.stdout:0/700: creat df/d1f/d85/d2b/d27/d32/d4e/fd2 x:0 0 0
2026-03-10T08:55:57.783 INFO:tasks.workunit.client.1.vm08.stdout:0/869: rename d6/dd/d13/d17/d1f/d20/d2f/d26/f80 to d6/f12a 0
2026-03-10T08:55:57.787 INFO:tasks.workunit.client.0.vm05.stdout:5/627: rename d5/df/dbb/f4e to d5/d86/d24/d2c/d41/fe4 0
2026-03-10T08:55:57.790 INFO:tasks.workunit.client.1.vm08.stdout:6/892: symlink d9/dc/d11/d23/d2c/l125 0
2026-03-10T08:55:57.793 INFO:tasks.workunit.client.1.vm08.stdout:6/893: dwrite f5 [0,4194304] 0
2026-03-10T08:55:57.808 INFO:tasks.workunit.client.0.vm05.stdout:7/660: readlink d18/d66/d79/l8e 0
2026-03-10T08:55:57.809 INFO:tasks.workunit.client.0.vm05.stdout:1/742: symlink dd/d10/d19/d9b/dc3/l104 0
2026-03-10T08:55:57.811 INFO:tasks.workunit.client.1.vm08.stdout:9/866: write d2/dd/d15/d1e/d39/d4e/f55 [2244557,98855] 0
2026-03-10T08:55:57.819 INFO:tasks.workunit.client.0.vm05.stdout:8/700: dwrite d2/dd/d2c/f4d [4194304,4194304] 0
2026-03-10T08:55:57.823 INFO:tasks.workunit.client.1.vm08.stdout:1/889: dwrite d1/da/de/d24/d35/d6d/d116/fc3 [0,4194304] 0
2026-03-10T08:55:57.823 INFO:tasks.workunit.client.1.vm08.stdout:5/798: dwrite d0/d11/d18/fc0 [0,4194304] 0
2026-03-10T08:55:57.826 INFO:tasks.workunit.client.1.vm08.stdout:1/890: fsync d1/da/de/d5c/fb5 0
2026-03-10T08:55:57.834 INFO:tasks.workunit.client.1.vm08.stdout:8/943: write d1/d10/d9/dd/d25/d27/d44/d97/f9c [271471,122538] 0
2026-03-10T08:55:57.840 INFO:tasks.workunit.client.1.vm08.stdout:7/912: dwrite d0/d11/d4a/f4f [0,4194304] 0
2026-03-10T08:55:57.848 INFO:tasks.workunit.client.1.vm08.stdout:2/982: rename d1/da/d10/d42/d93/de2/f113 to d1/d5b/da7/d11c/f145 0
2026-03-10T08:55:57.850 INFO:tasks.workunit.client.0.vm05.stdout:9/655: creat d6/d12/d3a/fdc x:0 0 0 2026-03-10T08:55:57.855 INFO:tasks.workunit.client.0.vm05.stdout:2/630: write d0/d9/d7f/f80 [639593,115575] 0 2026-03-10T08:55:57.857 INFO:tasks.workunit.client.0.vm05.stdout:3/759: rename d9/d2b/d2f/d57/cbb to d9/d2b/d3a/d6c/dbe/ce8 0 2026-03-10T08:55:57.863 INFO:tasks.workunit.client.0.vm05.stdout:5/628: truncate d5/f23 5235442 0 2026-03-10T08:55:57.869 INFO:tasks.workunit.client.1.vm08.stdout:4/921: getdents d5/d23/d36/d99/db2/d5a/d69/d11b/d96 0 2026-03-10T08:55:57.873 INFO:tasks.workunit.client.1.vm08.stdout:4/922: dwrite d5/d23/f10d [4194304,4194304] 0 2026-03-10T08:55:57.882 INFO:tasks.workunit.client.1.vm08.stdout:5/799: mkdir d0/d11/d27/d68/d7c/df8 0 2026-03-10T08:55:57.883 INFO:tasks.workunit.client.1.vm08.stdout:5/800: write d0/d11/d27/d68/d7c/d4b/d4e/da5/fef [997696,83277] 0 2026-03-10T08:55:57.890 INFO:tasks.workunit.client.1.vm08.stdout:8/944: read d1/d4f/d60/d88/f14d [27930,97847] 0 2026-03-10T08:55:57.893 INFO:tasks.workunit.client.1.vm08.stdout:3/852: rename d4/d15/d8/d2c/c43 to d4/d15/d8/d2c/d9b/d79/d20/c11f 0 2026-03-10T08:55:57.895 INFO:tasks.workunit.client.1.vm08.stdout:2/983: chown d1/da/d10/d1b/cc7 437465 1 2026-03-10T08:55:57.896 INFO:tasks.workunit.client.0.vm05.stdout:5/629: creat d5/df/d37/d68/fe5 x:0 0 0 2026-03-10T08:55:57.899 INFO:tasks.workunit.client.1.vm08.stdout:9/867: symlink d2/d54/d8e/l123 0 2026-03-10T08:55:57.905 INFO:tasks.workunit.client.0.vm05.stdout:2/631: dread d0/d9/d7f/d8f/f66 [0,4194304] 0 2026-03-10T08:55:57.905 INFO:tasks.workunit.client.0.vm05.stdout:1/743: dread dd/d21/d3f/f57 [0,4194304] 0 2026-03-10T08:55:57.908 INFO:tasks.workunit.client.1.vm08.stdout:4/923: truncate d5/d23/d36/d99/dc6/dc8/f12c 586660 0 2026-03-10T08:55:57.910 INFO:tasks.workunit.client.1.vm08.stdout:7/913: truncate d0/d11/d4a/da3/f104 300745 0 2026-03-10T08:55:57.912 INFO:tasks.workunit.client.1.vm08.stdout:5/801: dwrite d0/d11/d3e/d45/fe4 
[0,4194304] 0 2026-03-10T08:55:57.947 INFO:tasks.workunit.client.1.vm08.stdout:5/802: rename d0/d11/d27/d68/d7c/de5/db9 to d0/d11/d27/d68/d7c/d4b/d4e/d84/df9 0 2026-03-10T08:55:57.949 INFO:tasks.workunit.client.1.vm08.stdout:4/924: dread d5/d23/fa1 [0,4194304] 0 2026-03-10T08:55:57.949 INFO:tasks.workunit.client.0.vm05.stdout:5/630: creat d5/d48/fe6 x:0 0 0 2026-03-10T08:55:57.950 INFO:tasks.workunit.client.1.vm08.stdout:4/925: chown d5/d23/d49/l4f 515072626 1 2026-03-10T08:55:57.950 INFO:tasks.workunit.client.0.vm05.stdout:5/631: dread - d5/d48/d64/d95/fe1 zero size 2026-03-10T08:55:57.951 INFO:tasks.workunit.client.0.vm05.stdout:5/632: truncate d5/df/d37/d68/fe5 780206 0 2026-03-10T08:55:57.951 INFO:tasks.workunit.client.1.vm08.stdout:7/914: dread d0/d14/d43/fc2 [0,4194304] 0 2026-03-10T08:55:57.956 INFO:tasks.workunit.client.0.vm05.stdout:2/632: creat d0/d9/d1e/d20/d21/d45/d4b/d8d/fbb x:0 0 0 2026-03-10T08:55:57.956 INFO:tasks.workunit.client.1.vm08.stdout:5/803: dread d0/d11/d27/d50/f55 [4194304,4194304] 0 2026-03-10T08:55:57.961 INFO:tasks.workunit.client.1.vm08.stdout:9/868: creat d2/dd/d15/d1e/d21/da4/f124 x:0 0 0 2026-03-10T08:55:57.963 INFO:tasks.workunit.client.1.vm08.stdout:9/869: dread d2/d41/d4c/f7c [0,4194304] 0 2026-03-10T08:55:57.963 INFO:tasks.workunit.client.1.vm08.stdout:9/870: stat d2/f4 0 2026-03-10T08:55:57.964 INFO:tasks.workunit.client.0.vm05.stdout:8/701: fsync d2/dd/d74/d78/fcf 0 2026-03-10T08:55:57.965 INFO:tasks.workunit.client.0.vm05.stdout:8/702: readlink d2/db/d1f/l2b 0 2026-03-10T08:55:57.972 INFO:tasks.workunit.client.0.vm05.stdout:4/692: rename d0/f23 to d0/d2e/d42/d45/d4a/d36/dbe/fe0 0 2026-03-10T08:55:57.974 INFO:tasks.workunit.client.1.vm08.stdout:2/984: creat d1/d97/d11f/f146 x:0 0 0 2026-03-10T08:55:57.976 INFO:tasks.workunit.client.0.vm05.stdout:3/760: creat d9/d2b/fe9 x:0 0 0 2026-03-10T08:55:57.978 INFO:tasks.workunit.client.0.vm05.stdout:5/633: symlink d5/d86/d39/le7 0 2026-03-10T08:55:57.980 
INFO:tasks.workunit.client.0.vm05.stdout:8/703: dread d2/db/d1f/d67/f79 [0,4194304] 0 2026-03-10T08:55:57.981 INFO:tasks.workunit.client.1.vm08.stdout:6/894: link d9/dc/d11/d23/f113 d9/dc/d11/d23/d2c/d7a/dce/d69/da2/f126 0 2026-03-10T08:55:57.983 INFO:tasks.workunit.client.0.vm05.stdout:7/661: link d18/d66/d25/f47 d18/d66/d78/dc3/fd0 0 2026-03-10T08:55:57.983 INFO:tasks.workunit.client.0.vm05.stdout:7/662: chown d18/d66/d25/d2e/d42/f71 247724957 1 2026-03-10T08:55:57.992 INFO:tasks.workunit.client.0.vm05.stdout:7/663: dread d18/d66/d25/d2e/d42/f46 [0,4194304] 0 2026-03-10T08:55:58.010 INFO:tasks.workunit.client.1.vm08.stdout:6/895: unlink d9/dc/l33 0 2026-03-10T08:55:58.010 INFO:tasks.workunit.client.1.vm08.stdout:6/896: stat d9/d13/f6c 0 2026-03-10T08:55:58.010 INFO:tasks.workunit.client.0.vm05.stdout:7/664: read d18/d38/f5d [3687384,83828] 0 2026-03-10T08:55:58.010 INFO:tasks.workunit.client.0.vm05.stdout:2/633: unlink d0/d9/d1e/d20/f47 0 2026-03-10T08:55:58.010 INFO:tasks.workunit.client.0.vm05.stdout:2/634: write d0/d9/d89/da3/dac/faf [937562,62835] 0 2026-03-10T08:55:58.013 INFO:tasks.workunit.client.1.vm08.stdout:4/926: rename d5/d23/d49/d8f/da4/d118 to d5/d23/d36/d99/d149 0 2026-03-10T08:55:58.013 INFO:tasks.workunit.client.1.vm08.stdout:4/927: stat d5/d23/d36/d99/dc6/dc8 0 2026-03-10T08:55:58.022 INFO:tasks.workunit.client.0.vm05.stdout:8/704: creat d2/dd/d2c/d2e/d31/d3e/d5d/d9d/ff5 x:0 0 0 2026-03-10T08:55:58.022 INFO:tasks.workunit.client.0.vm05.stdout:8/705: dread - d2/dd/d2c/d2e/d31/d3e/f95 zero size 2026-03-10T08:55:58.026 INFO:tasks.workunit.client.0.vm05.stdout:0/701: write df/f1a [8837025,35724] 0 2026-03-10T08:55:58.026 INFO:tasks.workunit.client.0.vm05.stdout:0/702: stat df/d1f/d85/d2b/d65/d6e 0 2026-03-10T08:55:58.032 INFO:tasks.workunit.client.1.vm08.stdout:0/870: dwrite d6/dd/d13/d17/d1f/d20/f46 [0,4194304] 0 2026-03-10T08:55:58.033 INFO:tasks.workunit.client.0.vm05.stdout:6/719: dwrite d4/d7/d10/f65 [0,4194304] 0 2026-03-10T08:55:58.035 
INFO:tasks.workunit.client.0.vm05.stdout:6/720: dread - d4/d7/d10/d15/f94 zero size 2026-03-10T08:55:58.035 INFO:tasks.workunit.client.0.vm05.stdout:6/721: readlink d4/d2c/d84/d4a/l8e 0 2026-03-10T08:55:58.036 INFO:tasks.workunit.client.0.vm05.stdout:6/722: readlink d4/d7/d10/d15/l58 0 2026-03-10T08:55:58.045 INFO:tasks.workunit.client.1.vm08.stdout:1/891: write d1/da/de/d24/d81/d121/f124 [955110,14109] 0 2026-03-10T08:55:58.054 INFO:tasks.workunit.client.1.vm08.stdout:8/945: write d1/d10/d9/dd/d9a/f9d [737085,67010] 0 2026-03-10T08:55:58.054 INFO:tasks.workunit.client.1.vm08.stdout:3/853: write d4/f44 [1267454,110748] 0 2026-03-10T08:55:58.055 INFO:tasks.workunit.client.1.vm08.stdout:3/854: stat d4/d15/d8/d2c 0 2026-03-10T08:55:58.056 INFO:tasks.workunit.client.1.vm08.stdout:5/804: creat d0/ffa x:0 0 0 2026-03-10T08:55:58.057 INFO:tasks.workunit.client.0.vm05.stdout:7/665: creat d18/d38/d43/d6e/fd1 x:0 0 0 2026-03-10T08:55:58.071 INFO:tasks.workunit.client.0.vm05.stdout:2/635: dread d0/d9/d1e/f39 [0,4194304] 0 2026-03-10T08:55:58.073 INFO:tasks.workunit.client.0.vm05.stdout:9/656: dwrite d6/d19/d2c/d84/fd7 [0,4194304] 0 2026-03-10T08:55:58.075 INFO:tasks.workunit.client.1.vm08.stdout:9/871: rename d2/dd/d15/d1e/d25/d32/c6f to d2/d41/d4c/d66/c125 0 2026-03-10T08:55:58.077 INFO:tasks.workunit.client.1.vm08.stdout:7/915: getdents d0/d11/d1f/d29/d3d/dd1 0 2026-03-10T08:55:58.082 INFO:tasks.workunit.client.0.vm05.stdout:4/693: mknod d0/d2c/d6a/dc9/ce1 0 2026-03-10T08:55:58.089 INFO:tasks.workunit.client.0.vm05.stdout:1/744: write dd/f1c [858293,37819] 0 2026-03-10T08:55:58.097 INFO:tasks.workunit.client.1.vm08.stdout:0/871: unlink d6/dd/c4b 0 2026-03-10T08:55:58.098 INFO:tasks.workunit.client.0.vm05.stdout:3/761: write d9/d8f/d55/f8c [618058,53991] 0 2026-03-10T08:55:58.100 INFO:tasks.workunit.client.1.vm08.stdout:2/985: dwrite d1/da/d10/d42/d93/d23/d9e/fa1 [0,4194304] 0 2026-03-10T08:55:58.101 INFO:tasks.workunit.client.1.vm08.stdout:1/892: creat d1/da/de/d24/d81/f12f 
x:0 0 0 2026-03-10T08:55:58.104 INFO:tasks.workunit.client.0.vm05.stdout:5/634: write d5/d86/d24/d2c/d41/d74/fa8 [156911,12031] 0 2026-03-10T08:55:58.110 INFO:tasks.workunit.client.0.vm05.stdout:0/703: rmdir df/d1f/d85/d2b/d27/d32 39 2026-03-10T08:55:58.110 INFO:tasks.workunit.client.0.vm05.stdout:0/704: stat df/d1f/d85 0 2026-03-10T08:55:58.111 INFO:tasks.workunit.client.0.vm05.stdout:6/723: truncate d4/d7/d10/d15/d1b/d22/fcf 974354 0 2026-03-10T08:55:58.112 INFO:tasks.workunit.client.0.vm05.stdout:6/724: read - d4/d7/f80 zero size 2026-03-10T08:55:58.123 INFO:tasks.workunit.client.1.vm08.stdout:5/805: creat d0/d11/d27/d50/ffb x:0 0 0 2026-03-10T08:55:58.124 INFO:tasks.workunit.client.0.vm05.stdout:2/636: dread - d0/d9/d1e/d20/d21/d45/d4b/f97 zero size 2026-03-10T08:55:58.125 INFO:tasks.workunit.client.0.vm05.stdout:2/637: chown d0/d55/db8 467345003 1 2026-03-10T08:55:58.132 INFO:tasks.workunit.client.0.vm05.stdout:4/694: read - d0/d2e/d42/d45/d4a/d36/dbe/d32/f9e zero size 2026-03-10T08:55:58.134 INFO:tasks.workunit.client.1.vm08.stdout:9/872: stat d2/c47 0 2026-03-10T08:55:58.134 INFO:tasks.workunit.client.1.vm08.stdout:9/873: chown d2/d54/d8e/db7/cb9 2 1 2026-03-10T08:55:58.134 INFO:tasks.workunit.client.0.vm05.stdout:3/762: rmdir d9/d2b/d53 39 2026-03-10T08:55:58.135 INFO:tasks.workunit.client.1.vm08.stdout:9/874: dread - d2/dd/d15/d1e/d39/d4e/d87/f11f zero size 2026-03-10T08:55:58.135 INFO:tasks.workunit.client.1.vm08.stdout:9/875: chown d2/dd/d61/f9c 1018512 1 2026-03-10T08:55:58.156 INFO:tasks.workunit.client.0.vm05.stdout:0/705: unlink df/c4c 0 2026-03-10T08:55:58.160 INFO:tasks.workunit.client.1.vm08.stdout:3/855: mkdir d4/d15/d8/d1d/d107/d10a/d120 0 2026-03-10T08:55:58.163 INFO:tasks.workunit.client.0.vm05.stdout:6/725: creat d4/d7/d10/d1a/ff4 x:0 0 0 2026-03-10T08:55:58.166 INFO:tasks.workunit.client.1.vm08.stdout:0/872: sync 2026-03-10T08:55:58.167 INFO:tasks.workunit.client.0.vm05.stdout:1/745: sync 2026-03-10T08:55:58.173 
INFO:tasks.workunit.client.1.vm08.stdout:4/928: rename d5/d23/d49/cd0 to d5/d23/d36/d76/c14a 0 2026-03-10T08:55:58.174 INFO:tasks.workunit.client.0.vm05.stdout:4/695: readlink d0/d1d/l92 0 2026-03-10T08:55:58.193 INFO:tasks.workunit.client.0.vm05.stdout:6/726: mknod d4/d92/db0/cf5 0 2026-03-10T08:55:58.211 INFO:tasks.workunit.client.0.vm05.stdout:1/746: dread dd/d21/f3a [0,4194304] 0 2026-03-10T08:55:58.222 INFO:tasks.workunit.client.0.vm05.stdout:8/706: getdents d2/dd/d2c/d2e/d31/d4f/d80 0 2026-03-10T08:55:58.223 INFO:tasks.workunit.client.0.vm05.stdout:8/707: chown d2/dd/l13 61 1 2026-03-10T08:55:58.226 INFO:tasks.workunit.client.0.vm05.stdout:8/708: chown d2/dd/d2c/d2e/d31/d3e/dde/d63/fc8 91697543 1 2026-03-10T08:55:58.228 INFO:tasks.workunit.client.0.vm05.stdout:7/666: dwrite d18/d1b/f84 [4194304,4194304] 0 2026-03-10T08:55:58.250 INFO:tasks.workunit.client.1.vm08.stdout:2/986: write d1/d5b/da7/ff7 [617040,4714] 0 2026-03-10T08:55:58.253 INFO:tasks.workunit.client.0.vm05.stdout:9/657: dwrite d6/d12/d3a/d9c/fb6 [0,4194304] 0 2026-03-10T08:55:58.283 INFO:tasks.workunit.client.0.vm05.stdout:5/635: dwrite d5/df/dbb/fd0 [0,4194304] 0 2026-03-10T08:55:58.295 INFO:tasks.workunit.client.1.vm08.stdout:5/806: write d0/f6c [90234,50316] 0 2026-03-10T08:55:58.302 INFO:tasks.workunit.client.0.vm05.stdout:3/763: dwrite d9/d8f/d50/f72 [0,4194304] 0 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: pgmap v7: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 16 MiB/s rd, 47 MiB/s wr, 144 op/s 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: 
from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:58 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.304 INFO:tasks.workunit.client.0.vm05.stdout:4/696: rename d0/d2e/d9d to d0/d2e/d42/d45/d4a/d36/dbe/dbf/dbd/de2 0 2026-03-10T08:55:58.319 INFO:tasks.workunit.client.1.vm08.stdout:9/876: mkdir d2/d41/d53/d103/d126 0 2026-03-10T08:55:58.319 INFO:tasks.workunit.client.0.vm05.stdout:2/638: write d0/d9/d1e/d20/d21/f35 [2874513,36764] 0 2026-03-10T08:55:58.322 INFO:tasks.workunit.client.1.vm08.stdout:1/893: mknod d1/da/de/d24/d81/d11d/c130 0 2026-03-10T08:55:58.323 INFO:tasks.workunit.client.1.vm08.stdout:1/894: truncate d1/da/de/d24/d81/f12f 314506 0 2026-03-10T08:55:58.324 INFO:tasks.workunit.client.1.vm08.stdout:1/895: chown d1/da/de/d24/d26/d86/d111 53641 1 2026-03-10T08:55:58.324 INFO:tasks.workunit.client.0.vm05.stdout:7/667: dread d18/d38/d43/d5c/fa7 [0,4194304] 0 2026-03-10T08:55:58.327 INFO:tasks.workunit.client.0.vm05.stdout:7/668: readlink d18/d66/d79/l98 0 2026-03-10T08:55:58.343 INFO:tasks.workunit.client.0.vm05.stdout:5/636: symlink d5/d48/d64/d95/le8 0 2026-03-10T08:55:58.359 INFO:tasks.workunit.client.0.vm05.stdout:1/747: mknod dd/d10/c105 0 2026-03-10T08:55:58.359 INFO:tasks.workunit.client.1.vm08.stdout:6/897: creat d9/dc/d11/d23/d2c/d81/d63/f127 x:0 0 0 2026-03-10T08:55:58.376 INFO:tasks.workunit.client.1.vm08.stdout:7/916: rename d0/d11/d1f/d29/d36/d75/f10b to d0/d14/d2f/f11f 0 
2026-03-10T08:55:58.391 INFO:tasks.workunit.client.1.vm08.stdout:4/929: symlink d5/d23/d36/d99/db2/d5d/de3/df8/l14b 0 2026-03-10T08:55:58.407 INFO:tasks.workunit.client.0.vm05.stdout:6/727: rename d4/d7/d10/f65 to d4/d7/d10/d15/ff6 0 2026-03-10T08:55:58.414 INFO:tasks.workunit.client.0.vm05.stdout:4/697: unlink d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/fde 0 2026-03-10T08:55:58.420 INFO:tasks.workunit.client.1.vm08.stdout:3/856: write d4/d15/d8/d2c/d6d/dfa/d100/f112 [618733,2023] 0 2026-03-10T08:55:58.423 INFO:tasks.workunit.client.1.vm08.stdout:0/873: dwrite d6/dd/d13/d17/d1f/da3/fa7 [0,4194304] 0 2026-03-10T08:55:58.437 INFO:tasks.workunit.client.0.vm05.stdout:2/639: fsync d0/d9/d1e/d20/d21/f41 0 2026-03-10T08:55:58.440 INFO:tasks.workunit.client.1.vm08.stdout:8/946: getdents d1/d10/d9/dd/d25/dca/d128 0 2026-03-10T08:55:58.447 INFO:tasks.workunit.client.0.vm05.stdout:7/669: creat d18/d66/d25/d2e/d2f/da0/fd2 x:0 0 0 2026-03-10T08:55:58.454 INFO:tasks.workunit.client.0.vm05.stdout:5/637: chown d5/d86/f1b 253 1 2026-03-10T08:55:58.454 INFO:tasks.workunit.client.0.vm05.stdout:5/638: dread - d5/f40 zero size 2026-03-10T08:55:58.456 INFO:tasks.workunit.client.1.vm08.stdout:7/917: rmdir d0/d11 39 2026-03-10T08:55:58.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: pgmap v7: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 16 MiB/s rd, 47 MiB/s wr, 144 op/s 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: 
from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.465 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:58 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:55:58.465 INFO:tasks.workunit.client.0.vm05.stdout:4/698: dread d0/d2e/d42/d45/d4a/d36/f88 [0,4194304] 0 2026-03-10T08:55:58.476 INFO:tasks.workunit.client.1.vm08.stdout:5/807: dread d0/d46/f51 [0,4194304] 0 2026-03-10T08:55:58.480 INFO:tasks.workunit.client.1.vm08.stdout:5/808: sync 2026-03-10T08:55:58.615 INFO:tasks.workunit.client.0.vm05.stdout:3/764: symlink d9/d4d/lea 0 2026-03-10T08:55:58.650 INFO:tasks.workunit.client.1.vm08.stdout:2/987: write d1/da/d10/d42/d93/d1e/d7b/fea [378912,77372] 0 2026-03-10T08:55:58.655 INFO:tasks.workunit.client.0.vm05.stdout:8/709: rename d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/ff4 to d2/dd/d2c/d2e/d31/d4f/d7b/d9e/ff6 0 2026-03-10T08:55:58.657 INFO:tasks.workunit.client.0.vm05.stdout:8/710: chown d2/db/d1f/d67 0 1 2026-03-10T08:55:58.659 INFO:tasks.workunit.client.1.vm08.stdout:3/857: creat d4/d6f/d85/dd3/f121 x:0 0 0 2026-03-10T08:55:58.664 INFO:tasks.workunit.client.1.vm08.stdout:1/896: dwrite d1/da/de/fe4 [0,4194304] 0 2026-03-10T08:55:58.665 INFO:tasks.workunit.client.0.vm05.stdout:8/711: stat d2/dd/d2c/d2e/d31/d3e/d5d/f92 0 2026-03-10T08:55:58.705 INFO:tasks.workunit.client.1.vm08.stdout:8/947: truncate d1/d10/d9/dd/d25/d27/d44/d21/d5f/fbd 423370 0 2026-03-10T08:55:58.719 INFO:tasks.workunit.client.0.vm05.stdout:1/748: dwrite dd/d10/d19/d4d/f74 [0,4194304] 0 2026-03-10T08:55:58.721 INFO:tasks.workunit.client.0.vm05.stdout:0/706: link df/d1f/d85/d2b/f7a df/d1f/d85/d2b/d27/d32/d4e/fd3 0 2026-03-10T08:55:58.724 
INFO:tasks.workunit.client.0.vm05.stdout:0/707: chown df/d1f/d85/d19/d47/d84/d8a/f9c 0 1 2026-03-10T08:55:58.741 INFO:tasks.workunit.client.1.vm08.stdout:7/918: dwrite d0/d11/d1f/df0/df4/f11d [0,4194304] 0 2026-03-10T08:55:58.747 INFO:tasks.workunit.client.0.vm05.stdout:7/670: creat d18/d66/d25/d2e/d2f/fd3 x:0 0 0 2026-03-10T08:55:58.749 INFO:tasks.workunit.client.0.vm05.stdout:7/671: chown d18/d66/d25/d2e/d42 6145265 1 2026-03-10T08:55:58.752 INFO:tasks.workunit.client.0.vm05.stdout:7/672: write d18/d66/d25/d2e/d42/d9c/fce [762590,30710] 0 2026-03-10T08:55:58.770 INFO:tasks.workunit.client.1.vm08.stdout:0/874: dread f5 [0,4194304] 0 2026-03-10T08:55:58.772 INFO:tasks.workunit.client.1.vm08.stdout:0/875: chown d6/dd/d13/d61/dc7/dc8/dde/ff2 12 1 2026-03-10T08:55:58.779 INFO:tasks.workunit.client.1.vm08.stdout:2/988: creat d1/da/d78/df5/f147 x:0 0 0 2026-03-10T08:55:58.801 INFO:tasks.workunit.client.0.vm05.stdout:3/765: unlink d9/d2b/d3a/d6c/dbf/fce 0 2026-03-10T08:55:58.807 INFO:tasks.workunit.client.0.vm05.stdout:2/640: rename d0/d9/d1e/d20/d21/d8a/d92/faa to d0/d9/d7f/d8f/d7a/fbc 0 2026-03-10T08:55:58.826 INFO:tasks.workunit.client.1.vm08.stdout:8/948: rmdir d1/d4f/d60/d88 39 2026-03-10T08:55:58.827 INFO:tasks.workunit.client.1.vm08.stdout:6/898: creat d9/d10/d1e/d32/f128 x:0 0 0 2026-03-10T08:55:58.827 INFO:tasks.workunit.client.1.vm08.stdout:7/919: mkdir d0/d11/d4a/d5e/d120 0 2026-03-10T08:55:58.827 INFO:tasks.workunit.client.0.vm05.stdout:8/712: truncate d2/dd/d2c/d2e/d31/d3e/dde/d63/f6c 1556088 0 2026-03-10T08:55:58.827 INFO:tasks.workunit.client.0.vm05.stdout:6/728: getdents d4/d2d/d51/d87/da5/de9 0 2026-03-10T08:55:58.827 INFO:tasks.workunit.client.0.vm05.stdout:1/749: creat dd/d10/d18/dd1/f106 x:0 0 0 2026-03-10T08:55:58.827 INFO:tasks.workunit.client.0.vm05.stdout:1/750: readlink dd/d10/d18/d2d/d51/d58/d71/d62/l90 0 2026-03-10T08:55:58.832 INFO:tasks.workunit.client.1.vm08.stdout:8/949: rmdir d1/d10/d9/dd/d18/d34 39 2026-03-10T08:55:58.834 
INFO:tasks.workunit.client.0.vm05.stdout:7/673: sync 2026-03-10T08:55:58.931 INFO:tasks.workunit.client.1.vm08.stdout:7/920: creat d0/d11/d1f/d29/d3b/da1/daa/f121 x:0 0 0 2026-03-10T08:55:58.940 INFO:tasks.workunit.client.0.vm05.stdout:2/641: mknod d0/d9/d7f/d8f/d7a/cbd 0 2026-03-10T08:55:58.940 INFO:tasks.workunit.client.1.vm08.stdout:7/921: dread d0/d14/d43/d62/fb5 [0,4194304] 0 2026-03-10T08:55:58.956 INFO:tasks.workunit.client.0.vm05.stdout:8/713: creat d2/dd/d2c/d2e/d31/d3e/dde/ff7 x:0 0 0 2026-03-10T08:55:58.956 INFO:tasks.workunit.client.0.vm05.stdout:8/714: chown d2/dd/d2c/d2e/f3b 699710 1 2026-03-10T08:55:58.957 INFO:tasks.workunit.client.0.vm05.stdout:8/715: chown d2/cf1 7226343 1 2026-03-10T08:55:58.973 INFO:tasks.workunit.client.0.vm05.stdout:7/674: mkdir d18/d66/d25/d2e/d2f/d6d/dc1/dd4 0 2026-03-10T08:55:59.031 INFO:tasks.workunit.client.1.vm08.stdout:1/897: getdents d1/da/de/d24/d35/d6d/d116 0 2026-03-10T08:55:59.054 INFO:tasks.workunit.client.1.vm08.stdout:8/950: creat d1/d10/d9/dd/d13/d40/d141/f157 x:0 0 0 2026-03-10T08:55:59.069 INFO:tasks.workunit.client.1.vm08.stdout:1/898: rmdir d1/da/de/d24/d3d/d40/d56/d128 39 2026-03-10T08:55:59.091 INFO:tasks.workunit.client.1.vm08.stdout:8/951: truncate d1/d10/f2a 4405280 0 2026-03-10T08:55:59.108 INFO:tasks.workunit.client.1.vm08.stdout:2/989: creat d1/da/d10/d42/d93/d1e/dce/f148 x:0 0 0 2026-03-10T08:55:59.116 INFO:tasks.workunit.client.1.vm08.stdout:8/952: dread d1/d10/d9/d4d/db2/f103 [0,4194304] 0 2026-03-10T08:55:59.117 INFO:tasks.workunit.client.1.vm08.stdout:8/953: chown d1/d10/d9/d4d/le9 1 1 2026-03-10T08:55:59.118 INFO:tasks.workunit.client.1.vm08.stdout:2/990: creat d1/da/d10/d42/d93/de2/d139/f149 x:0 0 0 2026-03-10T08:55:59.120 INFO:tasks.workunit.client.1.vm08.stdout:8/954: dread d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/feb [0,4194304] 0 2026-03-10T08:55:59.141 INFO:tasks.workunit.client.0.vm05.stdout:8/716: mkdir d2/dd/d2c/d2e/d31/d4f/d7b/d9e/df8 0 2026-03-10T08:55:59.150 
INFO:tasks.workunit.client.1.vm08.stdout:8/955: creat d1/d10/d9/d4d/db2/f158 x:0 0 0 2026-03-10T08:55:59.155 INFO:tasks.workunit.client.0.vm05.stdout:9/658: creat d6/d12/d3a/fdd x:0 0 0 2026-03-10T08:55:59.156 INFO:tasks.workunit.client.0.vm05.stdout:9/659: chown d6/d12/d3a/da2/ccb 1692 1 2026-03-10T08:55:59.162 INFO:tasks.workunit.client.0.vm05.stdout:0/708: rename df/d1f/d48/fba to df/d1f/d85/fd4 0 2026-03-10T08:55:59.168 INFO:tasks.workunit.client.0.vm05.stdout:4/699: mkdir d0/d2e/d71/de3 0 2026-03-10T08:55:59.173 INFO:tasks.workunit.client.0.vm05.stdout:3/766: getdents d9/d2b/d3a/d6c/dbe 0 2026-03-10T08:55:59.180 INFO:tasks.workunit.client.0.vm05.stdout:9/660: fsync d6/d12/d3a/fa9 0 2026-03-10T08:55:59.181 INFO:tasks.workunit.client.1.vm08.stdout:9/877: write d2/dd/d15/d1e/d21/fc5 [1562846,112865] 0 2026-03-10T08:55:59.193 INFO:tasks.workunit.client.0.vm05.stdout:0/709: symlink df/d1f/d85/d19/d47/d84/dbe/d90/ld5 0 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Updating 
vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Reconfiguring prometheus.vm05 (dependencies changed)... 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T08:55:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:55:59 vm05.local ceph-mon[49713]: pgmap v8: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 12 MiB/s rd, 36 MiB/s wr, 111 op/s 2026-03-10T08:55:59.217 INFO:tasks.workunit.client.1.vm08.stdout:3/858: symlink d4/l122 0 2026-03-10T08:55:59.223 INFO:tasks.workunit.client.0.vm05.stdout:9/661: creat d6/d15/d3c/d4b/d90/fde x:0 0 0 2026-03-10T08:55:59.241 INFO:tasks.workunit.client.1.vm08.stdout:8/956: mkdir d1/da8/d159 0 2026-03-10T08:55:59.252 INFO:tasks.workunit.client.0.vm05.stdout:9/662: mkdir d6/d15/d35/ddf 0 2026-03-10T08:55:59.257 INFO:tasks.workunit.client.1.vm08.stdout:4/930: write d5/d23/d36/d99/db2/d5a/d69/d11b/f38 [1323830,78733] 0 2026-03-10T08:55:59.262 INFO:tasks.workunit.client.1.vm08.stdout:3/859: mknod d4/d15/d8/d1d/c123 0 2026-03-10T08:55:59.264 INFO:tasks.workunit.client.1.vm08.stdout:4/931: dwrite d5/d23/d36/d99/db2/d5d/f60 [4194304,4194304] 0 2026-03-10T08:55:59.274 INFO:tasks.workunit.client.0.vm05.stdout:9/663: creat d6/d15/d3c/d4b/d90/fe0 x:0 0 0 2026-03-10T08:55:59.290 INFO:tasks.workunit.client.1.vm08.stdout:8/957: dread d1/fdc [0,4194304] 0 2026-03-10T08:55:59.290 INFO:tasks.workunit.client.0.vm05.stdout:9/664: dwrite d6/d19/d2c/d58/f6c [0,4194304] 0 2026-03-10T08:55:59.296 INFO:tasks.workunit.client.0.vm05.stdout:9/665: chown d6/d12/c6f 2048671370 1 2026-03-10T08:55:59.300 INFO:tasks.workunit.client.0.vm05.stdout:9/666: chown d6/la0 80996495 1 2026-03-10T08:55:59.306 INFO:tasks.workunit.client.0.vm05.stdout:9/667: dwrite 
d6/d15/d3c/d4b/d90/fde [0,4194304] 0 2026-03-10T08:55:59.315 INFO:tasks.workunit.client.0.vm05.stdout:9/668: creat d6/d19/d2c/d84/fe1 x:0 0 0 2026-03-10T08:55:59.316 INFO:tasks.workunit.client.1.vm08.stdout:3/860: creat d4/d15/d8/d1d/d117/f124 x:0 0 0 2026-03-10T08:55:59.317 INFO:tasks.workunit.client.1.vm08.stdout:8/958: creat d1/d10/d9/dd/d9a/d11f/f15a x:0 0 0 2026-03-10T08:55:59.321 INFO:tasks.workunit.client.1.vm08.stdout:8/959: symlink d1/d10/d9/dd/d25/d27/d144/l15b 0 2026-03-10T08:55:59.322 INFO:tasks.workunit.client.0.vm05.stdout:5/639: dwrite d5/d86/f1a [0,4194304] 0 2026-03-10T08:55:59.328 INFO:tasks.workunit.client.1.vm08.stdout:8/960: symlink d1/d10/d9/dd/d18/dff/l15c 0 2026-03-10T08:55:59.358 INFO:tasks.workunit.client.0.vm05.stdout:5/640: creat d5/d48/d64/d95/dac/dc6/fe9 x:0 0 0 2026-03-10T08:55:59.358 INFO:tasks.workunit.client.0.vm05.stdout:9/669: getdents d6/d27 0 2026-03-10T08:55:59.359 INFO:tasks.workunit.client.0.vm05.stdout:9/670: write d6/f7 [7793718,94785] 0 2026-03-10T08:55:59.369 INFO:tasks.workunit.client.0.vm05.stdout:9/671: symlink d6/d15/d35/ddf/le2 0 2026-03-10T08:55:59.375 INFO:tasks.workunit.client.0.vm05.stdout:9/672: getdents d6/d19/d2c/d58 0 2026-03-10T08:55:59.427 INFO:tasks.workunit.client.1.vm08.stdout:6/899: write d9/dc/d11/d23/d2c/d7a/dce/ffd [4213090,107356] 0 2026-03-10T08:55:59.428 INFO:tasks.workunit.client.1.vm08.stdout:6/900: symlink d9/d10/d1e/d104/l129 0 2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:55:59.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.client.admin.keyring
2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring
2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring
2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Reconfiguring prometheus.vm05 (dependencies changed)...
2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: Reconfiguring daemon prometheus.vm05 on vm05
2026-03-10T08:55:59.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:55:59 vm08.local ceph-mon[57559]: pgmap v8: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 12 MiB/s rd, 36 MiB/s wr, 111 op/s
2026-03-10T08:55:59.553 INFO:tasks.workunit.client.1.vm08.stdout:7/922: dwrite d0/d11/d1f/d29/d3d/d89/f96 [0,4194304] 0
2026-03-10T08:55:59.557 INFO:tasks.workunit.client.1.vm08.stdout:7/923: mknod d0/d11/d4a/d95/dc5/d100/d115/c122 0
2026-03-10T08:55:59.597 INFO:tasks.workunit.client.1.vm08.stdout:1/899: dwrite d1/da/d18/fb1 [0,4194304] 0
2026-03-10T08:55:59.599 INFO:tasks.workunit.client.1.vm08.stdout:1/900: rmdir d1/da/de/d24/d35/d6d/d82 39
2026-03-10T08:55:59.601 INFO:tasks.workunit.client.1.vm08.stdout:1/901: creat d1/da/de/d24/d35/d6d/d82/f131 x:0 0 0
2026-03-10T08:55:59.603 INFO:tasks.workunit.client.1.vm08.stdout:1/902: symlink d1/da/de/d24/d81/d121/l132 0
2026-03-10T08:55:59.614 INFO:tasks.workunit.client.1.vm08.stdout:0/876: creat d6/f12b x:0 0 0
2026-03-10T08:55:59.614 INFO:tasks.workunit.client.0.vm05.stdout:2/642: write d0/d9/d1e/d20/f3a [4660519,4055] 0
2026-03-10T08:55:59.616 INFO:tasks.workunit.client.1.vm08.stdout:0/877: mkdir d6/dd/d13/d61/dc7/d12c 0
2026-03-10T08:55:59.617 INFO:tasks.workunit.client.0.vm05.stdout:2/643: truncate d0/d9/d7f/d8f/f37 2270300 0
2026-03-10T08:55:59.619 INFO:tasks.workunit.client.0.vm05.stdout:2/644: dread d0/d9/d1e/d20/f7c [0,4194304] 0
2026-03-10T08:55:59.623 INFO:tasks.workunit.client.0.vm05.stdout:7/675: dwrite d18/f4a [0,4194304] 0
2026-03-10T08:55:59.626 INFO:tasks.workunit.client.0.vm05.stdout:2/645: creat d0/d9/d1e/d20/d24/fbe x:0 0 0
2026-03-10T08:55:59.627 INFO:tasks.workunit.client.0.vm05.stdout:7/676: fdatasync d18/d66/fae 0
2026-03-10T08:55:59.638 INFO:tasks.workunit.client.0.vm05.stdout:7/677: stat d18/d66/d25/d2e/l81 0
2026-03-10T08:55:59.644 INFO:tasks.workunit.client.0.vm05.stdout:7/678: creat d18/d66/d78/dc3/fd5 x:0 0 0
2026-03-10T08:55:59.650 INFO:tasks.workunit.client.0.vm05.stdout:4/700: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d49/f7a [0,4194304] 0
2026-03-10T08:55:59.653 INFO:tasks.workunit.client.0.vm05.stdout:6/729: rename d4/fa8 to d4/d7/d10/d1a/ff7 0
2026-03-10T08:55:59.657 INFO:tasks.workunit.client.0.vm05.stdout:1/751: rename dd/d10/d18/d2d to dd/d10/d18/d2d/d51/d58/d71/d107 22
2026-03-10T08:55:59.657 INFO:tasks.workunit.client.0.vm05.stdout:7/679: dread d18/d38/d43/d5c/f5f [0,4194304] 0
2026-03-10T08:55:59.659 INFO:tasks.workunit.client.0.vm05.stdout:4/701: fsync d0/d2e/d42/f59 0
2026-03-10T08:55:59.664 INFO:tasks.workunit.client.0.vm05.stdout:7/680: dread d18/d66/d25/d2e/d42/d9c/fce [0,4194304] 0
2026-03-10T08:55:59.676 INFO:tasks.workunit.client.0.vm05.stdout:3/767: rename d9/f4a to d9/d2b/d3a/d43/d6e/feb 0
2026-03-10T08:55:59.677 INFO:tasks.workunit.client.0.vm05.stdout:3/768: write d9/d2b/fe9 [1004733,117797] 0
2026-03-10T08:55:59.686 INFO:tasks.workunit.client.0.vm05.stdout:8/717: dwrite d2/dd/d2c/d2e/d31/d4f/d7b/f8a [0,4194304] 0
2026-03-10T08:55:59.693 INFO:tasks.workunit.client.0.vm05.stdout:7/681: creat d18/d38/d43/d5c/daf/fd6 x:0 0 0
2026-03-10T08:55:59.701 INFO:tasks.workunit.client.0.vm05.stdout:3/769: mkdir d9/d8f/d50/d5f/dd8/dec 0
2026-03-10T08:55:59.702 INFO:tasks.workunit.client.1.vm08.stdout:2/991: write d1/da/d78/df5/d11e/fe3 [1050059,124350] 0
2026-03-10T08:55:59.704 INFO:tasks.workunit.client.0.vm05.stdout:3/770: dwrite d9/d4d/dca/f99 [0,4194304] 0
2026-03-10T08:55:59.705 INFO:tasks.workunit.client.0.vm05.stdout:6/730: symlink d4/d2d/d51/d87/da5/de9/lf8 0
2026-03-10T08:55:59.709 INFO:tasks.workunit.client.0.vm05.stdout:8/718: creat d2/dd/d74/d78/ff9 x:0 0 0
2026-03-10T08:55:59.712 INFO:tasks.workunit.client.1.vm08.stdout:2/992: dread d1/da/d10/d1b/fac [0,4194304] 0
2026-03-10T08:55:59.724 INFO:tasks.workunit.client.0.vm05.stdout:1/752: rename dd/fe5 to dd/d10/d18/d20/df3/f108 0
2026-03-10T08:55:59.724 INFO:tasks.workunit.client.0.vm05.stdout:1/753: read - dd/d10/f8f zero size
2026-03-10T08:55:59.728 INFO:tasks.workunit.client.0.vm05.stdout:8/719: read - d2/dd/d2c/d2e/d31/d3e/dde/d63/fc8 zero size
2026-03-10T08:55:59.730 INFO:tasks.workunit.client.0.vm05.stdout:7/682: symlink d18/d66/d25/ld7 0
2026-03-10T08:55:59.733 INFO:tasks.workunit.client.0.vm05.stdout:0/710: dwrite df/d1f/d85/d2b/d65/d6e/d96/f8b [0,4194304] 0
2026-03-10T08:55:59.734 INFO:tasks.workunit.client.0.vm05.stdout:4/702: sync
2026-03-10T08:55:59.735 INFO:tasks.workunit.client.0.vm05.stdout:1/754: symlink dd/dfb/l109 0
2026-03-10T08:55:59.747 INFO:tasks.workunit.client.0.vm05.stdout:7/683: dread d18/d66/d25/d2e/f9e [0,4194304] 0
2026-03-10T08:55:59.747 INFO:tasks.workunit.client.0.vm05.stdout:4/703: dread d0/fb [4194304,4194304] 0
2026-03-10T08:55:59.757 INFO:tasks.workunit.client.0.vm05.stdout:8/720: mknod d2/dd/d2c/d2e/d31/d3e/cfa 0
2026-03-10T08:55:59.766 INFO:tasks.workunit.client.0.vm05.stdout:4/704: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/ddc/de4 0
2026-03-10T08:55:59.772 INFO:tasks.workunit.client.0.vm05.stdout:1/755: rename dd/d10/d18/d2d/d5c/fa2 to dd/d21/f10a 0
2026-03-10T08:55:59.788 INFO:tasks.workunit.client.0.vm05.stdout:1/756: sync
2026-03-10T08:55:59.790 INFO:tasks.workunit.client.0.vm05.stdout:4/705: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/le5 0
2026-03-10T08:55:59.795 INFO:tasks.workunit.client.0.vm05.stdout:0/711: link df/d1f/d85/d2b/d65/d6e/d96/l5f df/d1f/d85/d2b/d65/ld6 0
2026-03-10T08:55:59.796 INFO:tasks.workunit.client.0.vm05.stdout:0/712: chown df/f12 6305 1
2026-03-10T08:55:59.811 INFO:tasks.workunit.client.0.vm05.stdout:1/757: fsync fc 0
2026-03-10T08:55:59.814 INFO:tasks.workunit.client.1.vm08.stdout:4/932: dwrite d5/d23/d49/f123 [0,4194304] 0
2026-03-10T08:55:59.815 INFO:tasks.workunit.client.1.vm08.stdout:4/933: dread - d5/d23/d36/d76/fcf zero size
2026-03-10T08:55:59.815 INFO:tasks.workunit.client.1.vm08.stdout:4/934: readlink d5/d23/d49/d8f/la0 0
2026-03-10T08:55:59.819 INFO:tasks.workunit.client.0.vm05.stdout:4/706: creat d0/d2e/d71/d7c/fe6 x:0 0 0
2026-03-10T08:55:59.823 INFO:tasks.workunit.client.1.vm08.stdout:3/861: write d4/d6f/dca/f114 [898355,55587] 0
2026-03-10T08:55:59.832 INFO:tasks.workunit.client.0.vm05.stdout:4/707: dread d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 [0,4194304] 0
2026-03-10T08:55:59.839 INFO:tasks.workunit.client.1.vm08.stdout:8/961: write d1/d10/d9/dd/f91 [1276181,11474] 0
2026-03-10T08:55:59.840 INFO:tasks.workunit.client.1.vm08.stdout:8/962: write d1/dd9/f126 [1183453,94471] 0
2026-03-10T08:55:59.848 INFO:tasks.workunit.client.0.vm05.stdout:3/771: dread d9/d2b/d2f/d57/f90 [0,4194304] 0
2026-03-10T08:55:59.848 INFO:tasks.workunit.client.0.vm05.stdout:5/641: write d5/d86/d21/f36 [392306,16191] 0
2026-03-10T08:55:59.851 INFO:tasks.workunit.client.0.vm05.stdout:9/673: truncate d6/d15/f4f 7818927 0
2026-03-10T08:55:59.856 INFO:tasks.workunit.client.0.vm05.stdout:8/721: rename d2/f5 to d2/dd/d2c/d2e/d31/d4f/da3/ffb 0
2026-03-10T08:55:59.863 INFO:tasks.workunit.client.1.vm08.stdout:3/862: creat d4/d6f/dca/f125 x:0 0 0
2026-03-10T08:55:59.870 INFO:tasks.workunit.client.1.vm08.stdout:8/963: symlink d1/d10/d9/dd/d25/d27/d44/d21/d5f/d9e/l15d 0
2026-03-10T08:55:59.875 INFO:tasks.workunit.client.1.vm08.stdout:6/901: dwrite d9/fc5 [0,4194304] 0
2026-03-10T08:55:59.875 INFO:tasks.workunit.client.0.vm05.stdout:3/772: readlink d9/d4d/l70 0
2026-03-10T08:55:59.882 INFO:tasks.workunit.client.1.vm08.stdout:9/878: rename d2/dd/le to d2/d41/d53/d103/d126/l127 0
2026-03-10T08:55:59.882 INFO:tasks.workunit.client.1.vm08.stdout:5/809: unlink d0/f43 0
2026-03-10T08:55:59.902 INFO:tasks.workunit.client.1.vm08.stdout:8/964: mkdir d1/d10/d9/dd/d25/dca/d128/d15e 0
2026-03-10T08:55:59.915 INFO:tasks.workunit.client.1.vm08.stdout:6/902: readlink d9/d10/d1e/d32/lf0 0
2026-03-10T08:55:59.922 INFO:tasks.workunit.client.1.vm08.stdout:7/924: rename d0/d11/d1f/d29/d3b/d80/fa2 to d0/d11/d1f/d29/d3d/df6/f123 0
2026-03-10T08:55:59.926 INFO:tasks.workunit.client.1.vm08.stdout:1/903: dwrite d1/da/f1e [0,4194304] 0
2026-03-10T08:55:59.936 INFO:tasks.workunit.client.1.vm08.stdout:0/878: write d6/dd/d13/d61/dc7/fdf [212149,20439] 0
2026-03-10T08:55:59.942 INFO:tasks.workunit.client.0.vm05.stdout:2/646: dwrite d0/d9/d7f/d8f/f7d [0,4194304] 0
2026-03-10T08:55:59.944 INFO:tasks.workunit.client.0.vm05.stdout:2/647: chown d0/cc 92 1
2026-03-10T08:55:59.944 INFO:tasks.workunit.client.0.vm05.stdout:1/758: truncate dd/d10/d18/d2d/d51/d58/d71/d73/fbb 1066543 0
2026-03-10T08:55:59.944 INFO:tasks.workunit.client.0.vm05.stdout:7/684: rmdir d18/d66/d78/dc3 39
2026-03-10T08:55:59.946 INFO:tasks.workunit.client.0.vm05.stdout:3/773: creat d9/d2b/d2f/fed x:0 0 0
2026-03-10T08:55:59.958 INFO:tasks.workunit.client.1.vm08.stdout:8/965: dread d1/d10/d9/dd/f8f [0,4194304] 0
2026-03-10T08:55:59.968 INFO:tasks.workunit.client.1.vm08.stdout:6/903: symlink d9/dc/d84/l12a 0
2026-03-10T08:55:59.976 INFO:tasks.workunit.client.1.vm08.stdout:2/993: rename d1/da/d10/d42/d93/d1e/dce/d52/l10c to d1/d5b/da7/d11c/l14a 0
2026-03-10T08:55:59.977 INFO:tasks.workunit.client.1.vm08.stdout:2/994: truncate d1/da/f140 551555 0
2026-03-10T08:55:59.977 INFO:tasks.workunit.client.0.vm05.stdout:6/731: write d4/d7/f14 [762045,42463] 0
2026-03-10T08:55:59.979 INFO:tasks.workunit.client.1.vm08.stdout:7/925: creat d0/d11/d1f/df0/f124 x:0 0 0
2026-03-10T08:55:59.996 INFO:tasks.workunit.client.1.vm08.stdout:0/879: creat d6/dd/d13/d17/d1f/d2d/d85/d95/f12d x:0 0 0
2026-03-10T08:55:59.997 INFO:tasks.workunit.client.1.vm08.stdout:0/880: readlink d6/dd/d13/d17/d1f/d2d/d38/d98/l119 0
2026-03-10T08:56:00.000 INFO:tasks.workunit.client.1.vm08.stdout:5/810: mkdir d0/d11/d18/df5/dfc 0
2026-03-10T08:56:00.008 INFO:tasks.workunit.client.0.vm05.stdout:0/713: dwrite fe [0,4194304] 0
2026-03-10T08:56:00.012 INFO:tasks.workunit.client.0.vm05.stdout:0/714: dread df/d1f/f21 [0,4194304] 0
2026-03-10T08:56:00.026 INFO:tasks.workunit.client.1.vm08.stdout:4/935: dwrite d5/d23/d36/f133 [0,4194304] 0
2026-03-10T08:56:00.028 INFO:tasks.workunit.client.0.vm05.stdout:5/642: dwrite d5/df/d37/f47 [0,4194304] 0
2026-03-10T08:56:00.054 INFO:tasks.workunit.client.0.vm05.stdout:9/674: dwrite d6/d12/d3a/d48/fa5 [0,4194304] 0
2026-03-10T08:56:00.057 INFO:tasks.workunit.client.1.vm08.stdout:6/904: dread d9/d13/f88 [0,4194304] 0
2026-03-10T08:56:00.057 INFO:tasks.workunit.client.1.vm08.stdout:6/905: chown d9/d13/f88 0 1
2026-03-10T08:56:00.068 INFO:tasks.workunit.client.1.vm08.stdout:8/966: rename d1/d10/d9/dd/d25/d27/c13e to d1/d10/d9/dd/d25/d27/d44/d21/dce/c15f 0
2026-03-10T08:56:00.072 INFO:tasks.workunit.client.1.vm08.stdout:2/995: mknod d1/da/d10/d42/d93/d23/c14b 0
2026-03-10T08:56:00.086 INFO:tasks.workunit.client.1.vm08.stdout:7/926: dread d0/d51/f78 [0,4194304] 0
2026-03-10T08:56:00.092 INFO:tasks.workunit.client.1.vm08.stdout:5/811: truncate d0/d11/d27/d68/d7c/d4b/fa2 1824908 0
2026-03-10T08:56:00.095 INFO:tasks.workunit.client.0.vm05.stdout:4/708: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/cbb to d0/d1d/ce7 0
2026-03-10T08:56:00.098 INFO:tasks.workunit.client.1.vm08.stdout:9/879: link d2/dd/d15/d1e/d24/lc1 d2/d54/d8e/da6/dd0/dc8/l128 0
2026-03-10T08:56:00.109 INFO:tasks.workunit.client.0.vm05.stdout:8/722: unlink d2/f2a 0
2026-03-10T08:56:00.111 INFO:tasks.workunit.client.1.vm08.stdout:8/967: sync
2026-03-10T08:56:00.112 INFO:tasks.workunit.client.1.vm08.stdout:3/863: write d4/d15/d8/d2c/d9b/d79/f5c [4963606,88524] 0
2026-03-10T08:56:00.127 INFO:tasks.workunit.client.1.vm08.stdout:6/906: dwrite d9/dc/d11/d23/d2c/f3d [0,4194304] 0
2026-03-10T08:56:00.143 INFO:tasks.workunit.client.0.vm05.stdout:7/685: creat d18/d66/d25/d2e/d2f/fd8 x:0 0 0
2026-03-10T08:56:00.151 INFO:tasks.workunit.client.0.vm05.stdout:3/774: creat d9/d2b/d2f/fee x:0 0 0
2026-03-10T08:56:00.157 INFO:tasks.workunit.client.0.vm05.stdout:6/732: creat d4/d7/d10/d1a/d8c/ff9 x:0 0 0
2026-03-10T08:56:00.159 INFO:tasks.workunit.client.1.vm08.stdout:1/904: link d1/da/de/d24/d3d/d40/lfc d1/da/de/d24/d26/d5d/l133 0
2026-03-10T08:56:00.167 INFO:tasks.workunit.client.1.vm08.stdout:5/812: creat d0/d11/d27/d68/d7c/d8e/df0/ffd x:0 0 0
2026-03-10T08:56:00.178 INFO:tasks.workunit.client.1.vm08.stdout:8/968: fdatasync d1/d10/fad 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.1.vm08.stdout:8/969: chown d1/d10/d9/dd/d25 156 1
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.1.vm08.stdout:4/936: symlink d5/d23/d36/d99/db2/d5a/d69/d11b/def/l14c 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.1.vm08.stdout:6/907: creat d9/d10/d1e/d92/f12b x:0 0 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.1.vm08.stdout:6/908: write d9/d50/d95/f99 [749502,3526] 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.0.vm05.stdout:4/709: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/fe8 x:0 0 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.0.vm05.stdout:8/723: read - d2/dd/d2c/d2e/d31/d3e/d5d/fc0 zero size
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.0.vm05.stdout:2/648: truncate d0/fa 311811 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.0.vm05.stdout:1/759: unlink dd/d21/d37/d7c/d60/c92 0
2026-03-10T08:56:00.218 INFO:tasks.workunit.client.0.vm05.stdout:3/775: chown d9/d4d/d51/f67 15 1
2026-03-10T08:56:00.219 INFO:tasks.workunit.client.0.vm05.stdout:6/733: mknod d4/d7/d10/dc3/cfa 0
2026-03-10T08:56:00.226 INFO:tasks.workunit.client.0.vm05.stdout:4/710: mknod d0/d2e/dca/ce9 0
2026-03-10T08:56:00.226 INFO:tasks.workunit.client.0.vm05.stdout:4/711: chown d0/d2e/d71/de3 46064980 1
2026-03-10T08:56:00.233 INFO:tasks.workunit.client.0.vm05.stdout:1/760: mkdir dd/d13/d10b 0
2026-03-10T08:56:00.238 INFO:tasks.workunit.client.0.vm05.stdout:7/686: mkdir d18/d66/d25/d2e/dd9 0
2026-03-10T08:56:00.241 INFO:tasks.workunit.client.0.vm05.stdout:6/734: creat d4/d2d/d51/d87/ffb x:0 0 0
2026-03-10T08:56:00.244 INFO:tasks.workunit.client.0.vm05.stdout:4/712: fsync d0/fc 0
2026-03-10T08:56:00.249 INFO:tasks.workunit.client.0.vm05.stdout:8/724: mkdir d2/dfc 0
2026-03-10T08:56:00.259 INFO:tasks.workunit.client.0.vm05.stdout:2/649: fsync d0/d9/f1d 0
2026-03-10T08:56:00.279 INFO:tasks.workunit.client.0.vm05.stdout:1/761: unlink dd/d10/ca8 0
2026-03-10T08:56:00.281 INFO:tasks.workunit.client.1.vm08.stdout:0/881: write d6/dd/d13/d17/d1f/d2d/fa0 [1724410,86339] 0
2026-03-10T08:56:00.284 INFO:tasks.workunit.client.1.vm08.stdout:2/996: write d1/da/d10/d42/d93/d23/d128/f108 [457228,30457] 0
2026-03-10T08:56:00.285 INFO:tasks.workunit.client.1.vm08.stdout:0/882: stat d6/dd/d13/d17/d1f/d2d/d85/d95/ff5 0
2026-03-10T08:56:00.286 INFO:tasks.workunit.client.0.vm05.stdout:7/687: creat d18/d66/d25/d2e/d42/dc6/fda x:0 0 0
2026-03-10T08:56:00.290 INFO:tasks.workunit.client.0.vm05.stdout:0/715: write df/d59/f3f [2435910,96401] 0
2026-03-10T08:56:00.294 INFO:tasks.workunit.client.0.vm05.stdout:5/643: write d5/df/d37/dd2/f94 [960957,122127] 0
2026-03-10T08:56:00.303 INFO:tasks.workunit.client.0.vm05.stdout:3/776: link d9/d8f/d55/f6b d9/d2b/d2f/d57/dd0/fef 0
2026-03-10T08:56:00.306 INFO:tasks.workunit.client.0.vm05.stdout:6/735: mkdir d4/d7/d10/d15/d1b/dfc 0
2026-03-10T08:56:00.308 INFO:tasks.workunit.client.1.vm08.stdout:3/864: dwrite d4/d15/d8/d2c/d9b/d79/ff2 [0,4194304] 0
2026-03-10T08:56:00.308 INFO:tasks.workunit.client.0.vm05.stdout:9/675: write d6/d15/d35/f9a [3965830,26503] 0
2026-03-10T08:56:00.337 INFO:tasks.workunit.client.0.vm05.stdout:4/713: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fea x:0 0 0
2026-03-10T08:56:00.343 INFO:tasks.workunit.client.0.vm05.stdout:8/725: creat d2/dd/d2c/d2e/d31/d4f/d80/de2/ffd x:0 0 0
2026-03-10T08:56:00.344 INFO:tasks.workunit.client.0.vm05.stdout:8/726: dread d2/db/d1f/f84 [0,4194304] 0
2026-03-10T08:56:00.348 INFO:tasks.workunit.client.0.vm05.stdout:2/650: mknod d0/d9/d1e/d20/d21/d45/d4b/d8d/cbf 0
2026-03-10T08:56:00.355 INFO:tasks.workunit.client.0.vm05.stdout:1/762: fdatasync dd/d10/d18/dd5/fbf 0
2026-03-10T08:56:00.362 INFO:tasks.workunit.client.0.vm05.stdout:7/688: truncate d18/d66/d25/d2e/d42/f52 5183562 0
2026-03-10T08:56:00.366 INFO:tasks.workunit.client.0.vm05.stdout:0/716: mknod df/d1f/d85/d2b/d65/d6e/cd7 0
2026-03-10T08:56:00.366 INFO:tasks.workunit.client.0.vm05.stdout:9/676: sync
2026-03-10T08:56:00.368 INFO:tasks.workunit.client.0.vm05.stdout:9/677: fdatasync d6/d12/d3a/d9c/fb6 0
2026-03-10T08:56:00.394 INFO:tasks.workunit.client.0.vm05.stdout:5/644: dread d5/d86/f2a [0,4194304] 0
2026-03-10T08:56:00.399 INFO:tasks.workunit.client.0.vm05.stdout:3/777: truncate d9/d2b/d3a/f68 2345980 0
2026-03-10T08:56:00.400 INFO:tasks.workunit.client.0.vm05.stdout:3/778: chown d9/d8f/d50/d5f/d7b/l80 23 1
2026-03-10T08:56:00.401 INFO:tasks.workunit.client.0.vm05.stdout:6/736: creat d4/d2c/dc8/ffd x:0 0 0
2026-03-10T08:56:00.418 INFO:tasks.workunit.client.0.vm05.stdout:4/714: rmdir d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b 39
2026-03-10T08:56:00.438 INFO:tasks.workunit.client.1.vm08.stdout:5/813: fsync d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/fd4 0
2026-03-10T08:56:00.438 INFO:tasks.workunit.client.1.vm08.stdout:9/880: mknod d2/d54/d8e/c129 0
2026-03-10T08:56:00.438 INFO:tasks.workunit.client.1.vm08.stdout:5/814: chown d0/ffa 551958468 1
2026-03-10T08:56:00.452 INFO:tasks.workunit.client.1.vm08.stdout:8/970: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f160 x:0 0 0
2026-03-10T08:56:00.454 INFO:tasks.workunit.client.0.vm05.stdout:8/727: dread d2/dd/f26 [0,4194304] 0
2026-03-10T08:56:00.465 INFO:tasks.workunit.client.1.vm08.stdout:6/909: mkdir d9/dc/d84/d80/d12c 0
2026-03-10T08:56:00.469 INFO:tasks.workunit.client.0.vm05.stdout:0/717: write df/d1f/d85/d19/d47/d84/d8a/f93 [937602,56468] 0
2026-03-10T08:56:00.471 INFO:tasks.workunit.client.0.vm05.stdout:9/678: write d6/d15/f25 [358591,47209] 0
2026-03-10T08:56:00.477 INFO:tasks.workunit.client.1.vm08.stdout:0/883: stat d6/dd/d13/d8f/ffe 0
2026-03-10T08:56:00.477 INFO:tasks.workunit.client.0.vm05.stdout:5/645: read d5/df/f31 [1255478,104486] 0
2026-03-10T08:56:00.478 INFO:tasks.workunit.client.1.vm08.stdout:2/997: unlink d1/da/d10/d42/d93/d23/d128/f108 0
2026-03-10T08:56:00.480 INFO:tasks.workunit.client.0.vm05.stdout:1/763: dread dd/d10/f22 [4194304,4194304] 0
2026-03-10T08:56:00.486 INFO:tasks.workunit.client.0.vm05.stdout:6/737: rename d4/d7/d10/d1a/ca0 to d4/d92/db0/cfe 0
2026-03-10T08:56:00.487 INFO:tasks.workunit.client.1.vm08.stdout:3/865: symlink d4/d15/d8/d1d/d117/l126 0
2026-03-10T08:56:00.488 INFO:tasks.workunit.client.0.vm05.stdout:3/779: write d9/d2b/f3b [576997,4902] 0
2026-03-10T08:56:00.502 INFO:tasks.workunit.client.0.vm05.stdout:2/651: dwrite d0/d9/d1e/d20/d21/d45/d4b/d70/f98 [0,4194304] 0
2026-03-10T08:56:00.512 INFO:tasks.workunit.client.1.vm08.stdout:9/881: readlink d2/lc 0
2026-03-10T08:56:00.513 INFO:tasks.workunit.client.1.vm08.stdout:5/815: mkdir d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/dfe 0
2026-03-10T08:56:00.527 INFO:tasks.workunit.client.0.vm05.stdout:0/718: fsync df/d1f/d85/d2b/f7a 0
2026-03-10T08:56:00.531 INFO:tasks.workunit.client.1.vm08.stdout:0/884: creat d6/dd/d13/d17/d1f/d20/d2f/d57/d109/f12e x:0 0 0
2026-03-10T08:56:00.535 INFO:tasks.workunit.client.0.vm05.stdout:9/679: symlink d6/d15/d3c/d4b/d90/le3 0
2026-03-10T08:56:00.535 INFO:tasks.workunit.client.0.vm05.stdout:5/646: mknod d5/df/d37/dd2/cea 0
2026-03-10T08:56:00.535 INFO:tasks.workunit.client.0.vm05.stdout:5/647: chown d5/d86/l1e 21295 1
2026-03-10T08:56:00.535 INFO:tasks.workunit.client.0.vm05.stdout:9/680: dread - d6/d27/fcc zero size
2026-03-10T08:56:00.535 INFO:tasks.workunit.client.0.vm05.stdout:9/681: readlink d6/d19/d2a/d8d/ld2 0
2026-03-10T08:56:00.535 INFO:tasks.workunit.client.0.vm05.stdout:8/728: dwrite d2/dd/d2c/d2e/d31/d3e/d5d/fc0 [0,4194304] 0
2026-03-10T08:56:00.552 INFO:tasks.workunit.client.0.vm05.stdout:1/764: fdatasync fc 0
2026-03-10T08:56:00.554 INFO:tasks.workunit.client.0.vm05.stdout:1/765: chown dd/d21/d37/d7c/d60/l7a 617035936 1
2026-03-10T08:56:00.556 INFO:tasks.workunit.client.1.vm08.stdout:3/866: creat d4/d15/d8/d71/f127 x:0 0 0
2026-03-10T08:56:00.557 INFO:tasks.workunit.client.1.vm08.stdout:7/927: getdents d0/d11/d1f/d29/d36/d75 0
2026-03-10T08:56:00.560 INFO:tasks.workunit.client.0.vm05.stdout:1/766: dread dd/d21/f3a [4194304,4194304] 0
2026-03-10T08:56:00.568 INFO:tasks.workunit.client.1.vm08.stdout:1/905: mknod d1/da/d18/d3a/c134 0
2026-03-10T08:56:00.574 INFO:tasks.workunit.client.1.vm08.stdout:2/998: dread d1/d43/f10b [0,4194304] 0
2026-03-10T08:56:00.575 INFO:tasks.workunit.client.0.vm05.stdout:6/738: write d4/d7/d10/d15/d1b/d22/fa4 [3275421,114823] 0
2026-03-10T08:56:00.580 INFO:tasks.workunit.client.1.vm08.stdout:5/816: dread d0/d11/d18/faf [0,4194304] 0
2026-03-10T08:56:00.584 INFO:tasks.workunit.client.1.vm08.stdout:8/971: symlink d1/d10/d9/dd/l161 0
2026-03-10T08:56:00.591 INFO:tasks.workunit.client.1.vm08.stdout:6/910: mknod d9/d10/d1e/c12d 0
2026-03-10T08:56:00.598 INFO:tasks.workunit.client.1.vm08.stdout:0/885: rename d6/dd/d13/d17/d1f/d2d/d85/d95 to d6/dd/d13/d17/d1f/d2d/d38/d98/d12f 0
2026-03-10T08:56:00.611 INFO:tasks.workunit.client.1.vm08.stdout:3/867: dwrite d4/d6f/dca/fcc [0,4194304] 0
2026-03-10T08:56:00.621 INFO:tasks.workunit.client.1.vm08.stdout:3/868: sync
2026-03-10T08:56:00.635 INFO:tasks.workunit.client.1.vm08.stdout:2/999: symlink d1/d5b/da7/l14c 0
2026-03-10T08:56:00.636 INFO:tasks.workunit.client.0.vm05.stdout:9/682: stat d6/f16 0
2026-03-10T08:56:00.640 INFO:tasks.workunit.client.1.vm08.stdout:9/882: creat d2/d41/d4c/d66/d82/dfe/f12a x:0 0 0
2026-03-10T08:56:00.645 INFO:tasks.workunit.client.0.vm05.stdout:8/729: fsync d2/db/d1f/f53 0
2026-03-10T08:56:00.656 INFO:tasks.workunit.client.1.vm08.stdout:8/972: rmdir d1/d10/d9/dd/d25/dca/dc6/d13f 39
2026-03-10T08:56:00.661 INFO:tasks.workunit.client.1.vm08.stdout:1/906: write d1/da/de/d24/d35/d6d/d82/da2/ff5 [696956,92086] 0
2026-03-10T08:56:00.665 INFO:tasks.workunit.client.0.vm05.stdout:1/767: creat dd/d10/d18/d20/d52/f10c x:0 0 0
2026-03-10T08:56:00.665 INFO:tasks.workunit.client.0.vm05.stdout:1/768: read dd/d21/d37/f8c [71257,91539] 0
2026-03-10T08:56:00.666 INFO:tasks.workunit.client.1.vm08.stdout:4/937: link d5/d23/d49/d8f/da4/l142 d5/d23/d49/d8f/l14d 0
2026-03-10T08:56:00.669 INFO:tasks.workunit.client.0.vm05.stdout:3/780: mknod d9/d8f/d50/d5f/dd8/dec/cf0 0
2026-03-10T08:56:00.676 INFO:tasks.workunit.client.1.vm08.stdout:6/911: read d9/f77 [1328431,74653] 0
2026-03-10T08:56:00.676 INFO:tasks.workunit.client.1.vm08.stdout:6/912: chown d9/dc 16422045 1
2026-03-10T08:56:00.676 INFO:tasks.workunit.client.0.vm05.stdout:4/715: link d0/d2e/d42/d45/d4a/f26 d0/d2e/d42/d45/d4a/d36/dbe/dbf/dbd/feb 0
2026-03-10T08:56:00.681 INFO:tasks.workunit.client.1.vm08.stdout:7/928: symlink d0/d14/d43/d62/d102/l125 0
2026-03-10T08:56:00.691 INFO:tasks.workunit.client.0.vm05.stdout:7/689: link d18/d66/d25/d2e/d2f/lab d18/d66/d25/d2e/ldb 0
2026-03-10T08:56:00.699 INFO:tasks.workunit.client.0.vm05.stdout:7/690: dwrite d18/f4a [0,4194304] 0
2026-03-10T08:56:00.699 INFO:tasks.workunit.client.1.vm08.stdout:3/869: read d4/d15/d8/d1d/fff [3948212,52520] 0
2026-03-10T08:56:00.706 INFO:tasks.workunit.client.1.vm08.stdout:5/817: creat d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/dfe/fff x:0 0 0
2026-03-10T08:56:00.707 INFO:tasks.workunit.client.1.vm08.stdout:5/818: read d0/d11/d27/d68/d7c/f6f [297356,35065] 0
2026-03-10T08:56:00.721 INFO:tasks.workunit.client.1.vm08.stdout:8/973: symlink d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfe/l162 0
2026-03-10T08:56:00.725 INFO:tasks.workunit.client.0.vm05.stdout:9/683: rmdir d6/d19/d21 39
2026-03-10T08:56:00.727 INFO:tasks.workunit.client.1.vm08.stdout:4/938: truncate d5/d5f/fe1 290947 0
2026-03-10T08:56:00.743 INFO:tasks.workunit.client.1.vm08.stdout:6/913: rmdir d9/dc/d84 39
2026-03-10T08:56:00.745 INFO:tasks.workunit.client.0.vm05.stdout:7/691: sync
2026-03-10T08:56:00.745 INFO:tasks.workunit.client.0.vm05.stdout:8/730: creat d2/dd/d2c/d2e/d31/d4f/da3/ffe x:0 0 0
2026-03-10T08:56:00.748 INFO:tasks.workunit.client.0.vm05.stdout:7/692: chown d18/d66/d25/d2e/d42/d53/f7e 60022 1
2026-03-10T08:56:00.758 INFO:tasks.workunit.client.0.vm05.stdout:1/769: mknod dd/d10/d18/d20/c10d 0
2026-03-10T08:56:00.758 INFO:tasks.workunit.client.0.vm05.stdout:1/770: write dd/d21/d37/fc2 [224729,38447] 0
2026-03-10T08:56:00.780 INFO:tasks.workunit.client.0.vm05.stdout:5/648: truncate d5/f9c 217792 0
2026-03-10T08:56:00.780 INFO:tasks.workunit.client.0.vm05.stdout:6/739: symlink d4/d2c/lff 0
2026-03-10T08:56:00.798 INFO:tasks.workunit.client.0.vm05.stdout:2/652: link d0/d9/d89/cad d0/d55/cc0 0
2026-03-10T08:56:00.800 INFO:tasks.workunit.client.1.vm08.stdout:4/939: chown d5/la 490 1
2026-03-10T08:56:00.803 INFO:tasks.workunit.client.1.vm08.stdout:0/886: rename d6/dd/d13/d17/d1f/d2d/d39/f3b to d6/dd/d13/d17/d1f/d2d/f130 0
2026-03-10T08:56:00.805 INFO:tasks.workunit.client.1.vm08.stdout:7/929: dwrite d0/d11/d1f/d29/d3b/f4c [0,4194304] 0
2026-03-10T08:56:00.805 INFO:tasks.workunit.client.1.vm08.stdout:7/930: readlink d0/d51/l94 0
2026-03-10T08:56:00.807 INFO:tasks.workunit.client.1.vm08.stdout:3/870: dwrite d4/d15/f7 [0,4194304] 0
2026-03-10T08:56:00.808 INFO:tasks.workunit.client.1.vm08.stdout:8/974: dread d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/f12c [0,4194304] 0
2026-03-10T08:56:00.809 INFO:tasks.workunit.client.0.vm05.stdout:0/719: rename df/d1f/d85/d19/d47/d84/dbe to df/dd8 0
2026-03-10T08:56:00.809 INFO:tasks.workunit.client.0.vm05.stdout:9/684: truncate d6/d15/f4f 3523251 0
2026-03-10T08:56:00.810 INFO:tasks.workunit.client.0.vm05.stdout:9/685: read - d6/d19/d2a/d4a/f88 zero size
2026-03-10T08:56:00.810 INFO:tasks.workunit.client.0.vm05.stdout:4/716: write d0/d2e/d42/f59 [4936214,55130] 0
2026-03-10T08:56:00.811 INFO:tasks.workunit.client.0.vm05.stdout:9/686: chown d6/d19/d2c/c42 56 1
2026-03-10T08:56:00.814 INFO:tasks.workunit.client.0.vm05.stdout:9/687: write d6/d19/d2a/fd8 [831271,115265] 0
2026-03-10T08:56:00.822 INFO:tasks.workunit.client.0.vm05.stdout:7/693: symlink d18/d66/d25/d2e/d42/d74/ldc 0
2026-03-10T08:56:00.838 INFO:tasks.workunit.client.1.vm08.stdout:9/883: rename d2/dd/d15/d4f/cdc to d2/dd/d15/d1e/d39/d4e/d87/c12b 0
2026-03-10T08:56:00.838 INFO:tasks.workunit.client.1.vm08.stdout:0/887: fdatasync d6/fe 0
2026-03-10T08:56:00.839 INFO:tasks.workunit.client.0.vm05.stdout:6/740: truncate d4/d2c/f7a 1386208 0
2026-03-10T08:56:00.839 INFO:tasks.workunit.client.0.vm05.stdout:2/653: truncate d0/f8 120638 0
2026-03-10T08:56:00.842 INFO:tasks.workunit.client.0.vm05.stdout:4/717: dread d0/d2e/d42/d45/fcc [0,4194304] 0
2026-03-10T08:56:00.848 INFO:tasks.workunit.client.0.vm05.stdout:3/781: rename d9/d2b/d3a to d9/d2b/de7/df1 0
2026-03-10T08:56:00.849 INFO:tasks.workunit.client.0.vm05.stdout:3/782: chown d9/d4d/d51/faa 43905770 1
2026-03-10T08:56:00.851 INFO:tasks.workunit.client.0.vm05.stdout:0/720: fdatasync df/d1f/d85/d19/d39/f86 0
2026-03-10T08:56:00.853 INFO:tasks.workunit.client.1.vm08.stdout:8/975: fdatasync d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10f 0
2026-03-10T08:56:00.856 INFO:tasks.workunit.client.1.vm08.stdout:1/907: link d1/da/de/d24/d26/d5d/l133 d1/da/de/d24/d26/d86/l135 0
2026-03-10T08:56:00.866 INFO:tasks.workunit.client.1.vm08.stdout:0/888: sync
2026-03-10T08:56:00.876 INFO:tasks.workunit.client.1.vm08.stdout:6/914: dread d9/dc/d84/f89 [0,4194304] 0
2026-03-10T08:56:00.876 INFO:tasks.workunit.client.1.vm08.stdout:5/819: rename d0/d46 to d0/d11/d27/d100 0
2026-03-10T08:56:00.887 INFO:tasks.workunit.client.0.vm05.stdout:8/731: rename d2/db/d47/cb0 to d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/cff 0
2026-03-10T08:56:00.890 INFO:tasks.workunit.client.0.vm05.stdout:2/654: dread d0/d9/f1b [0,4194304] 0
2026-03-10T08:56:00.893 INFO:tasks.workunit.client.0.vm05.stdout:0/721: sync
2026-03-10T08:56:00.948 INFO:tasks.workunit.client.1.vm08.stdout:4/940: truncate d5/d23/d36/f133 231064 0
2026-03-10T08:56:00.949 INFO:tasks.workunit.client.0.vm05.stdout:6/741: dwrite d4/d7/dc4/fca [0,4194304] 0
2026-03-10T08:56:00.957 INFO:tasks.workunit.client.1.vm08.stdout:9/884: dwrite d2/dd/d15/d1e/d39/d4e/fcf [0,4194304] 0
2026-03-10T08:56:00.959 INFO:tasks.workunit.client.0.vm05.stdout:4/718: write d0/d2e/d42/d45/d4a/d36/fd5 [737916,48602] 0
2026-03-10T08:56:00.976 INFO:tasks.workunit.client.1.vm08.stdout:8/976: truncate d1/d10/d9/dd/f62 4425798 0
2026-03-10T08:56:00.980 INFO:tasks.workunit.client.0.vm05.stdout:7/694: mkdir d18/d1b/ddd 0
2026-03-10T08:56:00.982 INFO:tasks.workunit.client.0.vm05.stdout:1/771: creat dd/d10/d18/d2d/f10e x:0 0 0
2026-03-10T08:56:00.983 INFO:tasks.workunit.client.0.vm05.stdout:1/772: dread - dd/d10/d18/dd1/f106 zero size
2026-03-10T08:56:00.984 INFO:tasks.workunit.client.0.vm05.stdout:1/773: truncate dd/f1c 1124514 0
2026-03-10T08:56:00.987 INFO:tasks.workunit.client.1.vm08.stdout:5/820: creat d0/d11/d27/d100/f101 x:0 0 0
2026-03-10T08:56:00.987 INFO:tasks.workunit.client.1.vm08.stdout:5/821: readlink d0/d1b/l2c 0
2026-03-10T08:56:00.990 INFO:tasks.workunit.client.1.vm08.stdout:6/915: rename d9/d10/d1e/d32/f48 to d9/d10/d1e/d7e/f12e 0
2026-03-10T08:56:00.995 INFO:tasks.workunit.client.0.vm05.stdout:8/732: unlink d2/dd/d2c/d2e/d31/d4f/d80/de2/ffd 0
2026-03-10T08:56:00.997 INFO:tasks.workunit.client.0.vm05.stdout:2/655: dread - d0/d9/d89/f96 zero size
2026-03-10T08:56:00.997 INFO:tasks.workunit.client.0.vm05.stdout:8/733: chown d2/db/d47/l8b 8221560 1
2026-03-10T08:56:00.998 INFO:tasks.workunit.client.1.vm08.stdout:3/871: link d4/d15/d8/d1d/d4f/lc8 d4/d15/d8/d2c/d89/l128 0
2026-03-10T08:56:01.007 INFO:tasks.workunit.client.1.vm08.stdout:3/872: sync
2026-03-10T08:56:01.012 INFO:tasks.workunit.client.1.vm08.stdout:0/889: mkdir d6/dd/d13/d17/d1f/d2d/d131 0
2026-03-10T08:56:01.013 INFO:tasks.workunit.client.1.vm08.stdout:5/822: dread - d0/d11/d18/fe6 zero size
2026-03-10T08:56:01.014 INFO:tasks.workunit.client.1.vm08.stdout:5/823: chown d0/d11/d27/l6b 922 1
2026-03-10T08:56:01.023 INFO:tasks.workunit.client.0.vm05.stdout:3/783: write d9/d2b/de7/df1/d6c/f92 [867566,130050] 0
2026-03-10T08:56:01.032 INFO:tasks.workunit.client.0.vm05.stdout:3/784: dwrite d9/d4d/dca/f99 [0,4194304] 0
2026-03-10T08:56:01.035 INFO:tasks.workunit.client.1.vm08.stdout:7/931: getdents d0/d11/d1f/d29/d3d/d89 0
2026-03-10T08:56:01.079 INFO:tasks.workunit.client.1.vm08.stdout:8/977: write d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f132 [5437318,9308] 0
2026-03-10T08:56:01.083 INFO:tasks.workunit.client.1.vm08.stdout:3/873: mknod d4/d15/d8/d71/c129 0
2026-03-10T08:56:01.085 INFO:tasks.workunit.client.1.vm08.stdout:5/824: truncate d0/fa4 626194 0
2026-03-10T08:56:01.098 INFO:tasks.workunit.client.1.vm08.stdout:6/916: mknod d9/dc/d11/d23/d2c/d7a/dce/d69/c12f 0
2026-03-10T08:56:01.100 INFO:tasks.workunit.client.1.vm08.stdout:7/932: truncate d0/d14/f98 18051 0
2026-03-10T08:56:01.101 INFO:tasks.workunit.client.1.vm08.stdout:9/885: creat d2/dd/d15/d1e/d25/f12c x:0 0 0
2026-03-10T08:56:01.102 INFO:tasks.workunit.client.1.vm08.stdout:9/886: readlink d2/dd/d15/d1e/d25/le6 0
2026-03-10T08:56:01.103 INFO:tasks.workunit.client.0.vm05.stdout:0/722: creat df/d1f/fd9 x:0 0 0
2026-03-10T08:56:01.103 INFO:tasks.workunit.client.1.vm08.stdout:1/908: getdents d1/da/de/d24/d35/d6d/d82 0
2026-03-10T08:56:01.104 INFO:tasks.workunit.client.1.vm08.stdout:9/887: truncate d2/dd/d15/d1e/d39/d4e/d87/f11f 553395 0
2026-03-10T08:56:01.105 INFO:tasks.workunit.client.1.vm08.stdout:9/888: write d2/d41/d53/d103/f121 [396259,23878] 0
2026-03-10T08:56:01.109 INFO:tasks.workunit.client.1.vm08.stdout:0/890: symlink d6/dd/d13/d17/d1f/d2d/d85/dfc/l132 0
2026-03-10T08:56:01.110 INFO:tasks.workunit.client.1.vm08.stdout:0/891: write d6/dd/d13/d17/d1f/da3/fa7 [1989086,53950] 0
2026-03-10T08:56:01.122 INFO:tasks.workunit.client.1.vm08.stdout:3/874: stat d4/d15/d8/f83 0
2026-03-10T08:56:01.124 INFO:tasks.workunit.client.0.vm05.stdout:4/719: truncate d0/d2e/d42/d45/d4a/f47 7035196 0
2026-03-10T08:56:01.126 INFO:tasks.workunit.client.1.vm08.stdout:5/825: rmdir d0/d11/d27/d68/d7c/d4b/d4e 39
2026-03-10T08:56:01.128 INFO:tasks.workunit.client.1.vm08.stdout:4/941: getdents d5/d23/d36/d99/dc6 0
2026-03-10T08:56:01.130 INFO:tasks.workunit.client.1.vm08.stdout:6/917: mkdir d9/d10/dd0/d130 0
2026-03-10T08:56:01.132 INFO:tasks.workunit.client.1.vm08.stdout:7/933: rmdir d0/d11/d1f/d29/d3d/dd1 39
2026-03-10T08:56:01.133 INFO:tasks.workunit.client.1.vm08.stdout:7/934: chown d0/d11/d4a/d95/dc5/d100/c10a 51546710 1
2026-03-10T08:56:01.155 INFO:tasks.workunit.client.1.vm08.stdout:1/909: dread d1/f8 [0,4194304] 0
2026-03-10T08:56:01.158 INFO:tasks.workunit.client.1.vm08.stdout:8/978: write d1/d10/d9/dd/d13/d40/f68 [3443718,76649] 0
2026-03-10T08:56:01.159 INFO:tasks.workunit.client.1.vm08.stdout:8/979: chown d1/d2c/l35 61354753 1
2026-03-10T08:56:01.172 INFO:tasks.workunit.client.1.vm08.stdout:0/892: rmdir d6 39
2026-03-10T08:56:01.183 INFO:tasks.workunit.client.1.vm08.stdout:3/875: creat d4/d15/d8/d2c/d9b/d79/d8f/de2/f12a x:0 0 0
2026-03-10T08:56:01.183 INFO:tasks.workunit.client.0.vm05.stdout:6/742: write d4/d7/d10/d1a/d1f/f4b [8100099,33395] 0
2026-03-10T08:56:01.195 INFO:tasks.workunit.client.0.vm05.stdout:5/649: rename d5/d86/d24/d2c/f46 to d5/d86/d24/d2c/d41/d74/da9/feb 0
2026-03-10T08:56:01.195 INFO:tasks.workunit.client.0.vm05.stdout:7/695: write d18/d66/d25/d2e/d2f/da0/fbb [192942,56517] 0
2026-03-10T08:56:01.199 INFO:tasks.workunit.client.0.vm05.stdout:5/650: dwrite d5/df/d37/dd2/d76/fb5 [0,4194304] 0
2026-03-10T08:56:01.221 INFO:tasks.workunit.client.0.vm05.stdout:1/774: dwrite dd/f44 [0,4194304] 0
2026-03-10T08:56:01.224 INFO:tasks.workunit.client.0.vm05.stdout:8/734: unlink d2/dd/d2c/d2e/d31/fc4 0
2026-03-10T08:56:01.227 INFO:tasks.workunit.client.0.vm05.stdout:1/775: write dd/d21/d37/d45/d8d/f99 [412871,10147] 0
2026-03-10T08:56:01.234 INFO:tasks.workunit.client.0.vm05.stdout:2/656: dread d0/d9/d1e/d20/f71 [4194304,4194304] 0
2026-03-10T08:56:01.242 INFO:tasks.workunit.client.1.vm08.stdout:7/935: truncate d0/d11/d1f/d29/d3d/d40/f38 11184427 0
2026-03-10T08:56:01.247 INFO:tasks.workunit.client.0.vm05.stdout:3/785: creat d9/d2b/de7/ff2 x:0 0 0
2026-03-10T08:56:01.249 INFO:tasks.workunit.client.0.vm05.stdout:0/723: creat df/dd8/d67/fda x:0 0 0
2026-03-10T08:56:01.250 INFO:tasks.workunit.client.0.vm05.stdout:0/724: readlink df/d1f/d85/d19/d47/d84/la2 0
2026-03-10T08:56:01.256 INFO:tasks.workunit.client.0.vm05.stdout:9/688: getdents d6/d19 0
2026-03-10T08:56:01.262 INFO:tasks.workunit.client.0.vm05.stdout:6/743: creat d4/d2c/d84/db6/dc6/f100 x:0 0 0
2026-03-10T08:56:01.272 INFO:tasks.workunit.client.0.vm05.stdout:2/657: sync
2026-03-10T08:56:01.274 INFO:tasks.workunit.client.0.vm05.stdout:8/735: mkdir d2/db/d28/d100 0
2026-03-10T08:56:01.276 INFO:tasks.workunit.client.0.vm05.stdout:2/658: sync
2026-03-10T08:56:01.290 INFO:tasks.workunit.client.0.vm05.stdout:1/776: fdatasync dd/d21/d37/d7c/d60/fe9 0
2026-03-10T08:56:01.293 INFO:tasks.workunit.client.0.vm05.stdout:3/786: rmdir d9/d4d/d51 39
2026-03-10T08:56:01.294 INFO:tasks.workunit.client.0.vm05.stdout:0/725: chown df/d1f/d85/f53 254 1
2026-03-10T08:56:01.309 INFO:tasks.workunit.client.0.vm05.stdout:9/689: dread d6/d12/d43/f47 [0,4194304] 0
2026-03-10T08:56:01.313 INFO:tasks.workunit.client.0.vm05.stdout:7/696: unlink d18/d66/c40 0
2026-03-10T08:56:01.313 INFO:tasks.workunit.client.0.vm05.stdout:6/744: read d4/f6c [3778688,57117] 0
2026-03-10T08:56:01.325 INFO:tasks.workunit.client.0.vm05.stdout:5/651: write d5/df/f53 [292665,39490] 0
2026-03-10T08:56:01.325 INFO:tasks.workunit.client.1.vm08.stdout:9/889: rename d2/dd/d15/d1e/d39/d4e/c116 to d2/dd/d15/d1e/d39/d4e/c12d 0
2026-03-10T08:56:01.326 INFO:tasks.workunit.client.0.vm05.stdout:5/652: chown d5/df/d37/d68/l7f 480 1
2026-03-10T08:56:01.329 INFO:tasks.workunit.client.0.vm05.stdout:5/653: read d5/d86/d21/f1f [331678,96165] 0
2026-03-10T08:56:01.331 INFO:tasks.workunit.client.0.vm05.stdout:2/659: rename d0/d9/d1e/d20/c2e to d0/d9/d7f/db4/cc1 0
2026-03-10T08:56:01.335 INFO:tasks.workunit.client.1.vm08.stdout:1/910: chown d1/da/d20/d91/c87 251 1
2026-03-10T08:56:01.339 INFO:tasks.workunit.client.1.vm08.stdout:8/980: rename d1/d10/d9/dd/d25/d27/d44/d97/f9c to d1/d10/d9/dd/d25/d27/d144/f163 0
2026-03-10T08:56:01.367 INFO:tasks.workunit.client.1.vm08.stdout:0/893: chown d6/dd/d13/d17/f1d 82 1
2026-03-10T08:56:01.373 INFO:tasks.workunit.client.1.vm08.stdout:4/942: mkdir d5/d23/d36/d99/db2/d5a/d69/d11b/dea/d14e 0
2026-03-10T08:56:01.386 INFO:tasks.workunit.client.1.vm08.stdout:6/918: symlink d9/dc/d11/d23/d2c/d81/l131 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.1.vm08.stdout:9/890: chown d2/dd/d15/d1e/d25/c64 47296 1
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.1.vm08.stdout:8/981: creat d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfe/f164 x:0 0 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.1.vm08.stdout:4/943: mkdir d5/d23/d36/d99/db2/d5a/d69/d11b/d96/d14f 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.1.vm08.stdout:9/891: chown d2/dd/d15/d1e/f48 86388 1
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.0.vm05.stdout:6/745: truncate d4/d7/f4d 3883398 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.0.vm05.stdout:8/736: creat d2/dfc/f101 x:0 0 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.0.vm05.stdout:5/654: rename d5/df/d37/d68/d85/l8c to d5/dcf/lec 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.0.vm05.stdout:1/777: symlink dd/d13/d10b/l10f 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.0.vm05.stdout:4/720: getdents d0/d2c/d6a/dd0 0
2026-03-10T08:56:01.403 INFO:tasks.workunit.client.1.vm08.stdout:1/911: read d1/da/de/d24/d3d/d40/d8e/dd2/fdc [1426580,78365] 0
2026-03-10T08:56:01.404 INFO:tasks.workunit.client.0.vm05.stdout:6/746: rmdir d4/d7/d10/d15/d20 39
2026-03-10T08:56:01.404 INFO:tasks.workunit.client.0.vm05.stdout:8/737: symlink d2/db/d1f/d67/l102 0
2026-03-10T08:56:01.415 INFO:tasks.workunit.client.1.vm08.stdout:0/894: fdatasync d6/dd/d13/d17/d1f/d20/d2f/d24/fa8 0
2026-03-10T08:56:01.419 INFO:tasks.workunit.client.1.vm08.stdout:6/919: unlink d9/dc/d11/d23/d2c/l93 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.1.vm08.stdout:7/936: creat d0/d11/d1f/f126 x:0 0 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.1.vm08.stdout:3/876: rename d4/d6f/d85/dd3/fdb to d4/d15/d8/d2c/d6d/dfa/f12b 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.1.vm08.stdout:3/877: symlink d4/d15/d8/d2c/d9b/d79/d8f/l12c 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.1.vm08.stdout:0/895: mknod d6/dd/d13/c133 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.1.vm08.stdout:4/944: rename d5/d23/d36/d99/db2/d5a/d69/d11b/cd9 to d5/d23/d36/d99/db2/d5a/d69/d11b/dea/d14e/c150 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.1.vm08.stdout:0/896: rmdir d6/dd/d13/d17 39
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:8/738: creat d2/dfc/f103 x:0 0 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:8/739: readlink d2/dd/d2c/lac 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:7/697: getdents d18/d66/d79 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:2/660: rename d0/d9/d1e/d20/d21/f46 to d0/d9/d1e/d20/d21/d45/d4b/d70/fc2 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:7/698: symlink d18/d38/dc7/lde 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:7/699: dread d18/f4a [0,4194304] 0
2026-03-10T08:56:01.455 INFO:tasks.workunit.client.0.vm05.stdout:7/700: fdatasync d18/d66/d25/d2e/d2f/da0/fbb 0
2026-03-10T08:56:01.457 INFO:tasks.workunit.client.1.vm08.stdout:6/920: rmdir d9/d10/dd0/d130 0
2026-03-10T08:56:01.460 INFO:tasks.workunit.client.0.vm05.stdout:2/661: mkdir d0/d9/d7f/d8f/d7e/dc3 0
2026-03-10T08:56:01.468 INFO:tasks.workunit.client.0.vm05.stdout:4/721: rename d0/l2a to d0/d2e/d42/d45/d4a/d36/dbe/dbf/dbd/de2/lec 0
2026-03-10T08:56:01.468 INFO:tasks.workunit.client.1.vm08.stdout:7/937: dread d0/d11/d1f/d29/fcf [0,4194304] 0
2026-03-10T08:56:01.468 INFO:tasks.workunit.client.1.vm08.stdout:3/878: creat d4/d6f/d85/dd3/d10d/f12d x:0 0 0
2026-03-10T08:56:01.470 INFO:tasks.workunit.client.1.vm08.stdout:6/921: dread - d9/dc/de0/ff2 zero size
2026-03-10T08:56:01.473 INFO:tasks.workunit.client.1.vm08.stdout:1/912: sync
2026-03-10T08:56:01.484 INFO:tasks.workunit.client.1.vm08.stdout:9/892: dread d2/dd/d11c/de4/f104 [0,4194304] 0
2026-03-10T08:56:01.493 INFO:tasks.workunit.client.1.vm08.stdout:5/826: write d0/d11/d27/d68/d7c/d4b/d4e/fe8 [345969,74354] 0
2026-03-10T08:56:01.494 INFO:tasks.workunit.client.0.vm05.stdout:0/726: write df/d1f/f2d [2715750,115188] 0
2026-03-10T08:56:01.498 INFO:tasks.workunit.client.0.vm05.stdout:9/690: dwrite d6/d19/d21/f2f [0,4194304] 0
2026-03-10T08:56:01.498 INFO:tasks.workunit.client.0.vm05.stdout:3/787: dwrite d9/d2b/f2c [0,4194304] 0
2026-03-10T08:56:01.505 INFO:tasks.workunit.client.0.vm05.stdout:3/788: dread - d9/d4d/f95 zero size
2026-03-10T08:56:01.518 INFO:tasks.workunit.client.1.vm08.stdout:8/982: write d1/d10/d9/dd/d25/d27/d44/d97/d7d/f10e [736298,64290] 0
2026-03-10T08:56:01.521
INFO:tasks.workunit.client.0.vm05.stdout:1/778: write dd/d10/d18/d20/d69/fe2 [44633,125955] 0 2026-03-10T08:56:01.528 INFO:tasks.workunit.client.0.vm05.stdout:9/691: dwrite d6/d19/d2c/d58/fc9 [4194304,4194304] 0 2026-03-10T08:56:01.528 INFO:tasks.workunit.client.0.vm05.stdout:5/655: dwrite d5/d86/d24/d2c/d41/d74/fb1 [4194304,4194304] 0 2026-03-10T08:56:01.533 INFO:tasks.workunit.client.1.vm08.stdout:0/897: creat d6/dd/d13/d17/d1f/da3/f134 x:0 0 0 2026-03-10T08:56:01.541 INFO:tasks.workunit.client.0.vm05.stdout:6/747: dwrite d4/d2c/d84/fe2 [0,4194304] 0 2026-03-10T08:56:01.541 INFO:tasks.workunit.client.0.vm05.stdout:8/740: dwrite d2/dd/d2c/d2e/d31/d3e/fe3 [0,4194304] 0 2026-03-10T08:56:01.543 INFO:tasks.workunit.client.1.vm08.stdout:6/922: symlink d9/d50/de9/dea/l132 0 2026-03-10T08:56:01.545 INFO:tasks.workunit.client.0.vm05.stdout:2/662: symlink d0/d9/d1e/lc4 0 2026-03-10T08:56:01.549 INFO:tasks.workunit.client.0.vm05.stdout:0/727: dread - df/dd8/d67/f80 zero size 2026-03-10T08:56:01.573 INFO:tasks.workunit.client.1.vm08.stdout:1/913: creat d1/da/de/d24/d3d/d40/d8e/f136 x:0 0 0 2026-03-10T08:56:01.584 INFO:tasks.workunit.client.1.vm08.stdout:9/893: mknod d2/dd/d15/d1e/d25/d32/d79/c12e 0 2026-03-10T08:56:01.586 INFO:tasks.workunit.client.0.vm05.stdout:9/692: sync 2026-03-10T08:56:01.587 INFO:tasks.workunit.client.1.vm08.stdout:1/914: sync 2026-03-10T08:56:01.604 INFO:tasks.workunit.client.1.vm08.stdout:5/827: rename d0/cbc to d0/d11/d27/d68/d7c/de5/de2/c102 0 2026-03-10T08:56:01.605 INFO:tasks.workunit.client.1.vm08.stdout:5/828: stat d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/dfe 0 2026-03-10T08:56:01.606 INFO:tasks.workunit.client.0.vm05.stdout:1/779: symlink dd/d10/d19/d4d/l110 0 2026-03-10T08:56:01.612 INFO:tasks.workunit.client.1.vm08.stdout:8/983: rmdir d1/d10/d9/dd/d18/dff 39 2026-03-10T08:56:01.613 INFO:tasks.workunit.client.1.vm08.stdout:5/829: sync 2026-03-10T08:56:01.614 INFO:tasks.workunit.client.0.vm05.stdout:4/722: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d32/ded 0 
2026-03-10T08:56:01.624 INFO:tasks.workunit.client.1.vm08.stdout:7/938: write d0/d14/d43/fa4 [1038378,73659] 0 2026-03-10T08:56:01.627 INFO:tasks.workunit.client.1.vm08.stdout:0/898: mknod d6/dd/d13/d17/d1f/da3/c135 0 2026-03-10T08:56:01.628 INFO:tasks.workunit.client.0.vm05.stdout:7/701: dwrite d18/d38/d43/d5c/daf/fb6 [0,4194304] 0 2026-03-10T08:56:01.641 INFO:tasks.workunit.client.1.vm08.stdout:0/899: sync 2026-03-10T08:56:01.666 INFO:tasks.workunit.client.1.vm08.stdout:6/923: creat d9/dc/d84/d80/f133 x:0 0 0 2026-03-10T08:56:01.667 INFO:tasks.workunit.client.1.vm08.stdout:3/879: write d4/d15/d8/d1d/f62 [885750,103214] 0 2026-03-10T08:56:01.737 INFO:tasks.workunit.client.0.vm05.stdout:6/748: unlink d4/d2c/d84/db6/lbe 0 2026-03-10T08:56:01.740 INFO:tasks.workunit.client.0.vm05.stdout:8/741: unlink d2/d45/l4b 0 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: pgmap v9: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 29 MiB/s rd, 64 MiB/s wr, 185 op/s 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:56:01.752 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:01 vm05.local ceph-mon[49713]: from='mgr.24459 
192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:01.753 INFO:tasks.workunit.client.0.vm05.stdout:0/728: mknod df/d1f/d85/d19/d47/da3/cdb 0 2026-03-10T08:56:01.754 INFO:tasks.workunit.client.1.vm08.stdout:5/830: dread d0/d11/d27/d68/d7c/de5/f91 [0,4194304] 0 2026-03-10T08:56:01.757 INFO:tasks.workunit.client.0.vm05.stdout:0/729: chown df/d1f/d85/d2b/d27/f91 31 1 2026-03-10T08:56:01.765 INFO:tasks.workunit.client.0.vm05.stdout:3/789: mknod d9/d4d/d51/d64/cf3 0 2026-03-10T08:56:01.765 INFO:tasks.workunit.client.1.vm08.stdout:7/939: dread - d0/d14/d2f/f11f zero size 2026-03-10T08:56:01.765 INFO:tasks.workunit.client.0.vm05.stdout:0/730: stat df/d1f/d85/d19/d47/fa5 0 2026-03-10T08:56:01.768 INFO:tasks.workunit.client.1.vm08.stdout:7/940: dread d0/d11/d1f/df0/df4/f11d [0,4194304] 0 2026-03-10T08:56:01.785 INFO:tasks.workunit.client.0.vm05.stdout:9/693: mkdir d6/d15/d35/ddf/de4 0 2026-03-10T08:56:01.788 INFO:tasks.workunit.client.1.vm08.stdout:9/894: dwrite d2/dd/d15/f44 [0,4194304] 0 2026-03-10T08:56:01.794 INFO:tasks.workunit.client.0.vm05.stdout:2/663: dwrite d0/d9/d1e/d20/d21/d45/d4b/fa7 [0,4194304] 0 2026-03-10T08:56:01.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: pgmap v9: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 29 MiB/s rd, 64 MiB/s wr, 185 op/s 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:56:01.805 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:01 vm08.local ceph-mon[57559]: from='mgr.24459 
192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:01.864 INFO:tasks.workunit.client.0.vm05.stdout:1/780: rename dd/d21/f6f to dd/d10/d18/d20/df3/dfe/f111 0 2026-03-10T08:56:01.880 INFO:tasks.workunit.client.0.vm05.stdout:7/702: chown d18/c4d 49 1 2026-03-10T08:56:01.892 INFO:tasks.workunit.client.1.vm08.stdout:4/945: rename d5/d23/d36/d99/db2/d5a/d69/d11b/def/df2/c132 to d5/d23/d36/d99/db2/d5d/dae/ddf/d105/c151 0 2026-03-10T08:56:01.894 INFO:tasks.workunit.client.0.vm05.stdout:5/656: symlink d5/df/dbb/d43/d60/led 0 2026-03-10T08:56:01.905 INFO:tasks.workunit.client.1.vm08.stdout:1/915: creat d1/da/de/d24/d3d/d40/d5b/de8/f137 x:0 0 0 2026-03-10T08:56:01.910 INFO:tasks.workunit.client.1.vm08.stdout:8/984: mknod d1/d10/d9/dd/d18/c165 0 2026-03-10T08:56:01.910 INFO:tasks.workunit.client.0.vm05.stdout:8/742: rmdir d2/db/d28/d99 39 2026-03-10T08:56:01.920 INFO:tasks.workunit.client.0.vm05.stdout:3/790: dread - d9/d2b/de7/df1/d43/d71/fac zero size 2026-03-10T08:56:01.921 INFO:tasks.workunit.client.0.vm05.stdout:3/791: chown d9/d4d 16153603 1 2026-03-10T08:56:01.921 INFO:tasks.workunit.client.0.vm05.stdout:2/664: dread - d0/d55/db8/f88 zero size 2026-03-10T08:56:01.922 INFO:tasks.workunit.client.0.vm05.stdout:0/731: read df/d1f/d85/f2a [4108126,100640] 0 2026-03-10T08:56:01.923 INFO:tasks.workunit.client.1.vm08.stdout:7/941: creat d0/d14/d2f/f127 x:0 0 0 2026-03-10T08:56:01.924 INFO:tasks.workunit.client.1.vm08.stdout:9/895: creat d2/dd/d15/d1e/d25/d32/d5c/dc2/f12f x:0 0 0 2026-03-10T08:56:01.934 INFO:tasks.workunit.client.1.vm08.stdout:0/900: dwrite d6/dd/d13/d17/d1f/d2d/d39/f47 [0,4194304] 0 2026-03-10T08:56:01.939 INFO:tasks.workunit.client.0.vm05.stdout:4/723: rename d0/d2c/d6a/dd0 to d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dae/dee 0 2026-03-10T08:56:01.950 INFO:tasks.workunit.client.1.vm08.stdout:4/946: creat d5/d23/d36/d76/f152 x:0 0 0 2026-03-10T08:56:01.952 
INFO:tasks.workunit.client.0.vm05.stdout:7/703: dread d18/d66/d25/d2e/d42/f5a [0,4194304] 0 2026-03-10T08:56:01.953 INFO:tasks.workunit.client.0.vm05.stdout:7/704: chown d18/d66/d25/d2e/d2f/d6d/fba 19 1 2026-03-10T08:56:01.957 INFO:tasks.workunit.client.1.vm08.stdout:1/916: truncate d1/da/de/d24/d35/d6d/d116/f9a 4620332 0 2026-03-10T08:56:01.965 INFO:tasks.workunit.client.1.vm08.stdout:1/917: read d1/da/de/d24/d35/d43/fb2 [1052427,103174] 0 2026-03-10T08:56:01.966 INFO:tasks.workunit.client.1.vm08.stdout:9/896: sync 2026-03-10T08:56:01.966 INFO:tasks.workunit.client.1.vm08.stdout:3/880: write d4/d15/d8/d2c/d55/d93/fa5 [1323353,101352] 0 2026-03-10T08:56:01.974 INFO:tasks.workunit.client.1.vm08.stdout:5/831: fsync d0/d11/d27/d68/d7c/d4b/fa0 0 2026-03-10T08:56:02.010 INFO:tasks.workunit.client.1.vm08.stdout:7/942: write d0/d14/f72 [1619620,44357] 0 2026-03-10T08:56:02.013 INFO:tasks.workunit.client.1.vm08.stdout:8/985: dwrite d1/d10/d9/dd/d18/fe5 [0,4194304] 0 2026-03-10T08:56:02.016 INFO:tasks.workunit.client.1.vm08.stdout:6/924: dwrite d9/dc/d11/d23/d2c/d81/f62 [0,4194304] 0 2026-03-10T08:56:02.017 INFO:tasks.workunit.client.1.vm08.stdout:0/901: dwrite d6/dd/d13/d17/d1f/d20/d2f/d57/fcd [0,4194304] 0 2026-03-10T08:56:02.022 INFO:tasks.workunit.client.1.vm08.stdout:8/986: dread - d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfe/f164 zero size 2026-03-10T08:56:02.022 INFO:tasks.workunit.client.1.vm08.stdout:4/947: dwrite d5/d23/d36/f58 [0,4194304] 0 2026-03-10T08:56:02.022 INFO:tasks.workunit.client.1.vm08.stdout:1/918: symlink d1/da/de/dcf/l138 0 2026-03-10T08:56:02.059 INFO:tasks.workunit.client.1.vm08.stdout:9/897: dread - d2/dd/d15/d1e/d39/fd8 zero size 2026-03-10T08:56:02.091 INFO:tasks.workunit.client.1.vm08.stdout:5/832: rename d0/d11/d27/d68/d7c/d8e/ff3 to d0/d11/d18/df5/dfc/f103 0 2026-03-10T08:56:02.130 INFO:tasks.workunit.client.0.vm05.stdout:9/694: fsync f4 0 2026-03-10T08:56:02.130 INFO:tasks.workunit.client.0.vm05.stdout:3/792: creat d9/d2b/d2f/ff4 x:0 0 0 
2026-03-10T08:56:02.142 INFO:tasks.workunit.client.0.vm05.stdout:4/724: truncate d0/d78/fbc 111391 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.1.vm08.stdout:4/948: dwrite d5/d23/d49/d83/fd4 [0,4194304] 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.1.vm08.stdout:5/833: fsync d0/d11/d3e/d45/fad 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.0.vm05.stdout:4/725: stat d0/d2e/d42/d45/d4a 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.0.vm05.stdout:1/781: mkdir dd/d10/d112 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.0.vm05.stdout:1/782: dread dd/d10/d19/f95 [0,4194304] 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.0.vm05.stdout:3/793: mkdir d9/d4d/d51/d64/d89/df5 0 2026-03-10T08:56:02.153 INFO:tasks.workunit.client.0.vm05.stdout:3/794: stat d9/d2b/de7/df1/dd6 0 2026-03-10T08:56:02.161 INFO:tasks.workunit.client.1.vm08.stdout:1/919: dread d1/f1f [0,4194304] 0 2026-03-10T08:56:02.164 INFO:tasks.workunit.client.0.vm05.stdout:7/705: mknod d18/cdf 0 2026-03-10T08:56:02.173 INFO:tasks.workunit.client.0.vm05.stdout:3/795: creat d9/d2b/de7/df1/d43/d71/ff6 x:0 0 0 2026-03-10T08:56:02.177 INFO:tasks.workunit.client.0.vm05.stdout:6/749: rename d4/d2d/d5f/f6d to d4/d7/d10/f101 0 2026-03-10T08:56:02.184 INFO:tasks.workunit.client.0.vm05.stdout:7/706: symlink d18/d66/d25/d2e/d42/dc6/le0 0 2026-03-10T08:56:02.186 INFO:tasks.workunit.client.1.vm08.stdout:3/881: dwrite d4/d6f/d85/df1/f109 [0,4194304] 0 2026-03-10T08:56:02.192 INFO:tasks.workunit.client.0.vm05.stdout:5/657: rename d5/d86/d24/d2c/d41/f4d to d5/d86/d24/d2c/d41/d74/da9/fee 0 2026-03-10T08:56:02.193 INFO:tasks.workunit.client.0.vm05.stdout:5/658: chown d5/d48/d64/d95/le8 955768 1 2026-03-10T08:56:02.193 INFO:tasks.workunit.client.0.vm05.stdout:5/659: stat d5/d86/f1b 0 2026-03-10T08:56:02.200 INFO:tasks.workunit.client.0.vm05.stdout:6/750: read d4/d7/d10/d15/f17 [2937010,62104] 0 2026-03-10T08:56:02.208 INFO:tasks.workunit.client.0.vm05.stdout:8/743: rename 
d2/dd/d2c/d2e/d31/d3e/dde/d63/f91 to d2/db/d28/d99/df3/f104 0 2026-03-10T08:56:02.212 INFO:tasks.workunit.client.0.vm05.stdout:5/660: truncate d5/d86/d24/d2c/f88 117440 0 2026-03-10T08:56:02.213 INFO:tasks.workunit.client.0.vm05.stdout:6/751: symlink d4/d2d/d51/d87/da5/de9/l102 0 2026-03-10T08:56:02.214 INFO:tasks.workunit.client.0.vm05.stdout:6/752: read - d4/d7/d10/d15/f94 zero size 2026-03-10T08:56:02.219 INFO:tasks.workunit.client.1.vm08.stdout:0/902: dwrite d6/dd/d13/d17/d1f/d2d/d38/d98/fda [0,4194304] 0 2026-03-10T08:56:02.220 INFO:tasks.workunit.client.0.vm05.stdout:0/732: write df/d1f/d85/d19/d5b/fb0 [780570,39926] 0 2026-03-10T08:56:02.221 INFO:tasks.workunit.client.0.vm05.stdout:2/665: dwrite d0/d9/d1e/d20/d21/d45/d4b/d70/f9b [0,4194304] 0 2026-03-10T08:56:02.225 INFO:tasks.workunit.client.0.vm05.stdout:2/666: dread d0/d9/f1b [0,4194304] 0 2026-03-10T08:56:02.239 INFO:tasks.workunit.client.0.vm05.stdout:8/744: mkdir d2/dd/d2c/d2e/d31/d4f/d7b/d9e/d105 0 2026-03-10T08:56:02.242 INFO:tasks.workunit.client.0.vm05.stdout:5/661: truncate d5/d86/fa6 48785 0 2026-03-10T08:56:02.248 INFO:tasks.workunit.client.0.vm05.stdout:0/733: truncate df/d1f/d85/d2b/f3b 577087 0 2026-03-10T08:56:02.251 INFO:tasks.workunit.client.1.vm08.stdout:7/943: dwrite d0/d11/d1f/d29/fcc [0,4194304] 0 2026-03-10T08:56:02.254 INFO:tasks.workunit.client.0.vm05.stdout:6/753: fsync d4/d7/d10/d15/d20/fec 0 2026-03-10T08:56:02.262 INFO:tasks.workunit.client.0.vm05.stdout:9/695: rename d6/d12/d43 to d6/d12/d3a/de5 0 2026-03-10T08:56:02.273 INFO:tasks.workunit.client.0.vm05.stdout:5/662: getdents d5/df/d37/dd2/d76/dde 0 2026-03-10T08:56:02.279 INFO:tasks.workunit.client.1.vm08.stdout:4/949: chown d5/d23/d36/d99/db2/d5a/d69/d11b/f50 1 1 2026-03-10T08:56:02.282 INFO:tasks.workunit.client.1.vm08.stdout:9/898: fsync d2/f77 0 2026-03-10T08:56:02.287 INFO:tasks.workunit.client.0.vm05.stdout:0/734: mknod df/d1f/d85/d2b/d27/d32/cdc 0 2026-03-10T08:56:02.290 INFO:tasks.workunit.client.0.vm05.stdout:4/726: 
write d0/f1 [12200492,77043] 0 2026-03-10T08:56:02.294 INFO:tasks.workunit.client.0.vm05.stdout:9/696: dread d6/d12/d3a/de5/f52 [0,4194304] 0 2026-03-10T08:56:02.298 INFO:tasks.workunit.client.0.vm05.stdout:1/783: dwrite dd/d21/d37/f85 [0,4194304] 0 2026-03-10T08:56:02.302 INFO:tasks.workunit.client.0.vm05.stdout:1/784: readlink dd/d10/d18/dd5/l86 0 2026-03-10T08:56:02.303 INFO:tasks.workunit.client.0.vm05.stdout:3/796: write d9/d4d/d51/d64/f85 [4951887,28168] 0 2026-03-10T08:56:02.309 INFO:tasks.workunit.client.0.vm05.stdout:7/707: write d18/d66/d25/d2e/d2f/d6d/fcb [823975,130725] 0 2026-03-10T08:56:02.310 INFO:tasks.workunit.client.0.vm05.stdout:5/663: read - d5/d86/d24/d2c/fd8 zero size 2026-03-10T08:56:02.321 INFO:tasks.workunit.client.0.vm05.stdout:6/754: unlink d4/d2d/d51/d62/ldc 0 2026-03-10T08:56:02.327 INFO:tasks.workunit.client.1.vm08.stdout:1/920: rmdir d1/da/de/d24/d35/d6d 39 2026-03-10T08:56:02.329 INFO:tasks.workunit.client.1.vm08.stdout:5/834: dread d0/d11/d27/d68/d7c/d4b/d4e/f89 [0,4194304] 0 2026-03-10T08:56:02.329 INFO:tasks.workunit.client.1.vm08.stdout:1/921: readlink d1/da/d18/d3a/l66 0 2026-03-10T08:56:02.334 INFO:tasks.workunit.client.0.vm05.stdout:4/727: fsync d0/d2e/f4e 0 2026-03-10T08:56:02.340 INFO:tasks.workunit.client.1.vm08.stdout:3/882: symlink d4/d15/dfd/l12e 0 2026-03-10T08:56:02.343 INFO:tasks.workunit.client.0.vm05.stdout:9/697: truncate d6/d12/db2/fba 1318454 0 2026-03-10T08:56:02.344 INFO:tasks.workunit.client.0.vm05.stdout:8/745: write d2/dd/d2c/d2e/d31/d3e/d5d/d9d/fdd [103990,115035] 0 2026-03-10T08:56:02.356 INFO:tasks.workunit.client.0.vm05.stdout:7/708: mkdir d18/d66/d25/d2e/d42/d9c/de1 0 2026-03-10T08:56:02.358 INFO:tasks.workunit.client.1.vm08.stdout:8/987: rmdir d1/d4f/d12e 0 2026-03-10T08:56:02.359 INFO:tasks.workunit.client.0.vm05.stdout:2/667: rename d0/d9/d1e/d20/f22 to d0/d9/d1e/d20/fc5 0 2026-03-10T08:56:02.369 INFO:tasks.workunit.client.0.vm05.stdout:3/797: dread d9/f19 [0,4194304] 0 2026-03-10T08:56:02.370 
INFO:tasks.workunit.client.0.vm05.stdout:3/798: readlink d9/d2b/d2f/d96/laf 0 2026-03-10T08:56:02.373 INFO:tasks.workunit.client.0.vm05.stdout:5/664: dread d5/df/d37/dd2/fa5 [0,4194304] 0 2026-03-10T08:56:02.376 INFO:tasks.workunit.client.1.vm08.stdout:9/899: creat d2/dd/d15/d1e/d25/d32/d5c/f130 x:0 0 0 2026-03-10T08:56:02.381 INFO:tasks.workunit.client.1.vm08.stdout:0/903: write d6/dd/d13/d8f/ffe [151632,96576] 0 2026-03-10T08:56:02.384 INFO:tasks.workunit.client.0.vm05.stdout:1/785: dwrite dd/d10/f22 [0,4194304] 0 2026-03-10T08:56:02.385 INFO:tasks.workunit.client.0.vm05.stdout:6/755: rmdir d4/d2d/d51/d87 39 2026-03-10T08:56:02.395 INFO:tasks.workunit.client.1.vm08.stdout:6/925: getdents d9/d50/de9 0 2026-03-10T08:56:02.396 INFO:tasks.workunit.client.1.vm08.stdout:5/835: mknod d0/d11/d27/d68/d7c/d4b/d4e/d84/c104 0 2026-03-10T08:56:02.415 INFO:tasks.workunit.client.0.vm05.stdout:4/728: dread d0/d2e/d71/d7c/fb4 [0,4194304] 0 2026-03-10T08:56:02.416 INFO:tasks.workunit.client.1.vm08.stdout:3/883: creat d4/d15/d8/d2c/d9b/d79/d8f/f12f x:0 0 0 2026-03-10T08:56:02.425 INFO:tasks.workunit.client.1.vm08.stdout:7/944: mkdir d0/d11/d1f/d128 0 2026-03-10T08:56:02.443 INFO:tasks.workunit.client.0.vm05.stdout:3/799: creat d9/d2b/de7/df1/ff7 x:0 0 0 2026-03-10T08:56:02.443 INFO:tasks.workunit.client.0.vm05.stdout:3/800: chown d9/d2b/de7/df1/l49 9 1 2026-03-10T08:56:02.444 INFO:tasks.workunit.client.1.vm08.stdout:0/904: creat d6/dd/d13/d17/d1f/d2d/d38/d98/d12f/f136 x:0 0 0 2026-03-10T08:56:02.450 INFO:tasks.workunit.client.1.vm08.stdout:6/926: rename d9/dc/d11/d23/f40 to d9/d50/f134 0 2026-03-10T08:56:02.453 INFO:tasks.workunit.client.0.vm05.stdout:8/746: dwrite d2/dd/d2c/d2e/f5a [0,4194304] 0 2026-03-10T08:56:02.467 INFO:tasks.workunit.client.0.vm05.stdout:7/709: write d18/d66/d25/d2e/d42/d9c/dac/f72 [69739,22332] 0 2026-03-10T08:56:02.470 INFO:tasks.workunit.client.0.vm05.stdout:2/668: dwrite d0/f2 [4194304,4194304] 0 2026-03-10T08:56:02.474 
INFO:tasks.workunit.client.1.vm08.stdout:1/922: mkdir d1/da/de/d24/d35/d6d/d116/d9c/d139 0 2026-03-10T08:56:02.487 INFO:tasks.workunit.client.1.vm08.stdout:7/945: truncate d0/d11/d1f/d29/d3d/d89/fa6 1372000 0 2026-03-10T08:56:02.491 INFO:tasks.workunit.client.0.vm05.stdout:6/756: read d4/d7/d10/f101 [1212800,99763] 0 2026-03-10T08:56:02.493 INFO:tasks.workunit.client.0.vm05.stdout:5/665: dread d5/d86/d24/d2c/d41/f87 [0,4194304] 0 2026-03-10T08:56:02.493 INFO:tasks.workunit.client.0.vm05.stdout:5/666: write d5/d86/f1a [4980687,22786] 0 2026-03-10T08:56:02.496 INFO:tasks.workunit.client.1.vm08.stdout:8/988: mkdir d1/d166 0 2026-03-10T08:56:02.502 INFO:tasks.workunit.client.1.vm08.stdout:4/950: creat d5/d23/d36/d99/db2/d5a/d69/d11b/d96/d14f/f153 x:0 0 0 2026-03-10T08:56:02.523 INFO:tasks.workunit.client.0.vm05.stdout:1/786: write dd/d10/d18/d20/d52/d80/ffc [76934,71130] 0 2026-03-10T08:56:02.525 INFO:tasks.workunit.client.1.vm08.stdout:1/923: symlink d1/da/de/d24/l13a 0 2026-03-10T08:56:02.530 INFO:tasks.workunit.client.1.vm08.stdout:8/989: fsync d1/d10/d9/dd/d9a/d11f/f14e 0 2026-03-10T08:56:02.531 INFO:tasks.workunit.client.1.vm08.stdout:8/990: chown d1/d10/d9/dd/d25/d27/d44/d21/f133 211859641 1 2026-03-10T08:56:02.533 INFO:tasks.workunit.client.0.vm05.stdout:4/729: write d0/d2e/d42/d45/d4a/d36/d37/fac [643617,10811] 0 2026-03-10T08:56:02.534 INFO:tasks.workunit.client.1.vm08.stdout:0/905: write d6/dd/f92 [965657,98576] 0 2026-03-10T08:56:02.537 INFO:tasks.workunit.client.1.vm08.stdout:7/946: dread d0/d14/d2f/f81 [0,4194304] 0 2026-03-10T08:56:02.540 INFO:tasks.workunit.client.0.vm05.stdout:9/698: dwrite d6/d12/d3a/d48/fa8 [0,4194304] 0 2026-03-10T08:56:02.556 INFO:tasks.workunit.client.0.vm05.stdout:7/710: dread d18/d66/d25/f47 [0,4194304] 0 2026-03-10T08:56:02.560 INFO:tasks.workunit.client.1.vm08.stdout:5/836: rename d0/f92 to d0/d11/d27/d68/d7c/d8e/f105 0 2026-03-10T08:56:02.562 INFO:tasks.workunit.client.1.vm08.stdout:5/837: dread 
d0/d11/d27/d68/d7c/d4b/d4e/d84/fbb [0,4194304] 0 2026-03-10T08:56:02.563 INFO:tasks.workunit.client.1.vm08.stdout:5/838: chown d0/d11/d27/d68/d7c/d4b/l63 69458527 1 2026-03-10T08:56:02.567 INFO:tasks.workunit.client.1.vm08.stdout:1/924: mknod d1/da/d18/d3b/c13b 0 2026-03-10T08:56:02.568 INFO:tasks.workunit.client.1.vm08.stdout:3/884: creat d4/d15/d8/d2c/d9b/f130 x:0 0 0 2026-03-10T08:56:02.569 INFO:tasks.workunit.client.0.vm05.stdout:5/667: symlink d5/d86/d24/d2c/d41/lef 0 2026-03-10T08:56:02.572 INFO:tasks.workunit.client.0.vm05.stdout:2/669: write d0/d9/d1e/d20/d21/d45/d4b/d70/f9d [3436487,19161] 0 2026-03-10T08:56:02.573 INFO:tasks.workunit.client.0.vm05.stdout:6/757: write d4/d7/d10/d1a/f25 [2960275,6268] 0 2026-03-10T08:56:02.577 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:02 vm05.local ceph-mon[49713]: Upgrade: Updating mgr.vm05.rxwgjc 2026-03-10T08:56:02.577 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:02 vm05.local ceph-mon[49713]: Deploying daemon mgr.vm05.rxwgjc on vm05 2026-03-10T08:56:02.582 INFO:tasks.workunit.client.0.vm05.stdout:0/735: link df/d1f/d48/lc2 df/d1f/d85/d2b/d27/d32/d4e/d87/ldd 0 2026-03-10T08:56:02.584 INFO:tasks.workunit.client.1.vm08.stdout:0/906: fsync d6/dd/d13/d17/d1f/da3/f10d 0 2026-03-10T08:56:02.598 INFO:tasks.workunit.client.1.vm08.stdout:7/947: dwrite d0/d14/d43/d9d/fbe [0,4194304] 0 2026-03-10T08:56:02.598 INFO:tasks.workunit.client.0.vm05.stdout:3/801: dwrite d9/d2b/d53/f60 [0,4194304] 0 2026-03-10T08:56:02.598 INFO:tasks.workunit.client.1.vm08.stdout:7/948: chown d0/d11/d1f 87 1 2026-03-10T08:56:02.614 INFO:tasks.workunit.client.0.vm05.stdout:4/730: symlink d0/d2e/d42/d45/d4a/d36/lef 0 2026-03-10T08:56:02.614 INFO:tasks.workunit.client.1.vm08.stdout:9/900: getdents d2/d41/d4c/d66/d82/dfe 0 2026-03-10T08:56:02.617 INFO:tasks.workunit.client.0.vm05.stdout:4/731: dread d0/d2e/d42/d45/d4a/d36/dbe/d49/f7a [0,4194304] 0 2026-03-10T08:56:02.618 INFO:tasks.workunit.client.0.vm05.stdout:4/732: dread - d0/d2c/f74 
zero size 2026-03-10T08:56:02.631 INFO:tasks.workunit.client.1.vm08.stdout:1/925: dread - d1/da/de/d5c/fdb zero size 2026-03-10T08:56:02.631 INFO:tasks.workunit.client.0.vm05.stdout:8/747: truncate d2/dd/d2c/d2e/d31/d3e/dde/d63/f6c 2603954 0 2026-03-10T08:56:02.631 INFO:tasks.workunit.client.0.vm05.stdout:7/711: symlink d18/d66/d25/d2e/d2f/d6d/dc1/le2 0 2026-03-10T08:56:02.633 INFO:tasks.workunit.client.1.vm08.stdout:8/991: truncate d1/d10/d9/dd/d25/d27/d144/fa9 1042929 0 2026-03-10T08:56:02.634 INFO:tasks.workunit.client.0.vm05.stdout:5/668: rmdir d5/d86/d21 39 2026-03-10T08:56:02.636 INFO:tasks.workunit.client.1.vm08.stdout:0/907: fsync d6/dd/f3f 0 2026-03-10T08:56:02.637 INFO:tasks.workunit.client.0.vm05.stdout:2/670: read d0/d55/da2/fa5 [185755,50881] 0 2026-03-10T08:56:02.638 INFO:tasks.workunit.client.0.vm05.stdout:2/671: write d0/d9/d1e/d20/d21/d45/d4b/f58 [2116999,44500] 0 2026-03-10T08:56:02.643 INFO:tasks.workunit.client.1.vm08.stdout:4/951: rmdir d5/d23/d109/d12d 0 2026-03-10T08:56:02.643 INFO:tasks.workunit.client.0.vm05.stdout:1/787: mkdir dd/d10/d112/d113 0 2026-03-10T08:56:02.651 INFO:tasks.workunit.client.1.vm08.stdout:9/901: rmdir d2/dd/d15/d1e/d21 39 2026-03-10T08:56:02.654 INFO:tasks.workunit.client.0.vm05.stdout:4/733: mknod d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/cf0 0 2026-03-10T08:56:02.655 INFO:tasks.workunit.client.0.vm05.stdout:4/734: read - d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/f57 zero size 2026-03-10T08:56:02.658 INFO:tasks.workunit.client.1.vm08.stdout:1/926: rename d1/da/de/d24/d3d/d40/d8e/dd2/c11b to d1/da/de/d24/d81/c13c 0 2026-03-10T08:56:02.665 INFO:tasks.workunit.client.0.vm05.stdout:8/748: mknod d2/dd/d74/c106 0 2026-03-10T08:56:02.669 INFO:tasks.workunit.client.1.vm08.stdout:5/839: dwrite d0/d11/d27/d68/d7c/f6a [0,4194304] 0 2026-03-10T08:56:02.671 INFO:tasks.workunit.client.1.vm08.stdout:8/992: fdatasync d1/d10/fad 0 2026-03-10T08:56:02.676 INFO:tasks.workunit.client.1.vm08.stdout:6/927: dwrite d9/dc/d11/d23/d2c/d81/d63/f108 
[0,4194304] 0 2026-03-10T08:56:02.678 INFO:tasks.workunit.client.0.vm05.stdout:7/712: readlink d18/d66/d25/d2e/d2f/d6d/lb0 0 2026-03-10T08:56:02.679 INFO:tasks.workunit.client.0.vm05.stdout:3/802: write d9/d4d/d51/d64/d89/dc2/fcd [468330,23508] 0 2026-03-10T08:56:02.680 INFO:tasks.workunit.client.1.vm08.stdout:7/949: write d0/d11/d1f/d29/d36/d75/fb9 [342663,125414] 0 2026-03-10T08:56:02.685 INFO:tasks.workunit.client.0.vm05.stdout:5/669: symlink d5/d86/d24/lf0 0 2026-03-10T08:56:02.697 INFO:tasks.workunit.client.1.vm08.stdout:4/952: symlink d5/d23/d36/d99/db2/d5d/dae/l154 0 2026-03-10T08:56:02.703 INFO:tasks.workunit.client.0.vm05.stdout:2/672: dread d0/d9/d1e/d20/f8b [0,4194304] 0 2026-03-10T08:56:02.704 INFO:tasks.workunit.client.0.vm05.stdout:2/673: chown d0/d9/d89/fba 233218 1 2026-03-10T08:56:02.716 INFO:tasks.workunit.client.0.vm05.stdout:4/735: fsync d0/d2e/d42/d45/d4a/d36/dbe/d32/f76 0 2026-03-10T08:56:02.724 INFO:tasks.workunit.client.1.vm08.stdout:5/840: dread d0/d11/d18/f23 [0,4194304] 0 2026-03-10T08:56:02.727 INFO:tasks.workunit.client.1.vm08.stdout:8/993: creat d1/d10/d9/dd/d25/d27/d44/d97/f167 x:0 0 0 2026-03-10T08:56:02.730 INFO:tasks.workunit.client.1.vm08.stdout:3/885: write d4/d6f/d85/fed [5014011,70128] 0 2026-03-10T08:56:02.731 INFO:tasks.workunit.client.0.vm05.stdout:6/758: write d4/d7/d10/d15/d20/f48 [885703,70576] 0 2026-03-10T08:56:02.731 INFO:tasks.workunit.client.0.vm05.stdout:1/788: write dd/d21/d37/d45/fce [444217,68678] 0 2026-03-10T08:56:02.731 INFO:tasks.workunit.client.1.vm08.stdout:8/994: dwrite d1/d10/d9/dd/d25/d27/d44/d21/d51/d64/dfb/f132 [0,4194304] 0 2026-03-10T08:56:02.737 INFO:tasks.workunit.client.1.vm08.stdout:8/995: dread d1/d10/d9/dd/d25/d27/d44/d97/f79 [0,4194304] 0 2026-03-10T08:56:02.740 INFO:tasks.workunit.client.1.vm08.stdout:9/902: dwrite d2/dd/d15/d1e/d25/d32/d5c/fab [0,4194304] 0 2026-03-10T08:56:02.741 INFO:tasks.workunit.client.0.vm05.stdout:9/699: rmdir d6/d15/d35/ddf/de4 0 2026-03-10T08:56:02.749 
INFO:tasks.workunit.client.0.vm05.stdout:8/749: read d2/dd/d2c/d2e/f64 [338356,71488] 0 2026-03-10T08:56:02.750 INFO:tasks.workunit.client.0.vm05.stdout:8/750: stat d2/dd/d2c/d2e/d31/d3e/dde 0 2026-03-10T08:56:02.752 INFO:tasks.workunit.client.0.vm05.stdout:3/803: readlink d9/d2b/l32 0 2026-03-10T08:56:02.759 INFO:tasks.workunit.client.0.vm05.stdout:5/670: mknod d5/d86/d39/cf1 0 2026-03-10T08:56:02.768 INFO:tasks.workunit.client.1.vm08.stdout:1/927: mkdir d1/da/de/d24/d35/d6d/d116/d9c/d139/d13d 0 2026-03-10T08:56:02.770 INFO:tasks.workunit.client.1.vm08.stdout:5/841: mknod d0/d11/d27/d68/d7c/d4b/d4e/d84/c106 0 2026-03-10T08:56:02.771 INFO:tasks.workunit.client.0.vm05.stdout:4/736: truncate d0/d2e/d42/d45/d4a/d36/f88 3000042 0 2026-03-10T08:56:02.777 INFO:tasks.workunit.client.0.vm05.stdout:1/789: fdatasync dd/d10/d18/d2d/d5c/f8e 0 2026-03-10T08:56:02.779 INFO:tasks.workunit.client.0.vm05.stdout:4/737: dread d0/d2e/d42/d45/f5f [0,4194304] 0 2026-03-10T08:56:02.785 INFO:tasks.workunit.client.0.vm05.stdout:6/759: rename d4/d92/l9e to d4/d7/d10/dc3/l103 0 2026-03-10T08:56:02.789 INFO:tasks.workunit.client.0.vm05.stdout:9/700: dread d6/d19/d2c/f78 [0,4194304] 0 2026-03-10T08:56:02.793 INFO:tasks.workunit.client.1.vm08.stdout:8/996: unlink d1/d4f/d60/dbf/l12d 0 2026-03-10T08:56:02.794 INFO:tasks.workunit.client.1.vm08.stdout:4/953: write d5/d23/d36/d99/db2/f45 [3564855,52055] 0 2026-03-10T08:56:02.798 INFO:tasks.workunit.client.0.vm05.stdout:2/674: dwrite d0/d9/d1e/d20/d21/f41 [0,4194304] 0 2026-03-10T08:56:02.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:02 vm08.local ceph-mon[57559]: Upgrade: Updating mgr.vm05.rxwgjc 2026-03-10T08:56:02.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:02 vm08.local ceph-mon[57559]: Deploying daemon mgr.vm05.rxwgjc on vm05 2026-03-10T08:56:02.812 INFO:tasks.workunit.client.0.vm05.stdout:8/751: stat d2/dd/d2c/d2e/d31/d3e/l4e 0 2026-03-10T08:56:02.818 INFO:tasks.workunit.client.1.vm08.stdout:3/886: dwrite 
d4/d15/d8/ff [4194304,4194304] 0 2026-03-10T08:56:02.820 INFO:tasks.workunit.client.1.vm08.stdout:3/887: fsync d4/d15/d8/d2c/d6d/dfa/ffc 0 2026-03-10T08:56:02.823 INFO:tasks.workunit.client.1.vm08.stdout:7/950: rename d0/d11/db2/l31 to d0/d11/d1f/d29/d36/daf/l129 0 2026-03-10T08:56:02.831 INFO:tasks.workunit.client.1.vm08.stdout:0/908: rmdir d6/dd/d13/d61/dc7/d12c 0 2026-03-10T08:56:02.836 INFO:tasks.workunit.client.1.vm08.stdout:5/842: unlink d0/d11/d27/d68/d7c/d4b/d4e/d84/f90 0 2026-03-10T08:56:02.844 INFO:tasks.workunit.client.0.vm05.stdout:4/738: symlink d0/d2e/d42/d45/d4a/lf1 0 2026-03-10T08:56:02.851 INFO:tasks.workunit.client.1.vm08.stdout:4/954: creat d5/df5/f155 x:0 0 0 2026-03-10T08:56:02.858 INFO:tasks.workunit.client.1.vm08.stdout:9/903: creat d2/d54/d8e/da6/dd0/dc8/de1/d10c/f131 x:0 0 0 2026-03-10T08:56:02.859 INFO:tasks.workunit.client.1.vm08.stdout:6/928: creat d9/dc/d11/d23/d2c/f135 x:0 0 0 2026-03-10T08:56:02.866 INFO:tasks.workunit.client.0.vm05.stdout:9/701: stat d6/d19/d2c/l49 0 2026-03-10T08:56:02.870 INFO:tasks.workunit.client.0.vm05.stdout:9/702: stat d6/d15/d37/c3b 0 2026-03-10T08:56:02.874 INFO:tasks.workunit.client.0.vm05.stdout:1/790: write dd/f16 [3076585,1394] 0 2026-03-10T08:56:02.875 INFO:tasks.workunit.client.0.vm05.stdout:6/760: write d4/d2c/d84/d4a/f76 [2166655,33543] 0 2026-03-10T08:56:02.886 INFO:tasks.workunit.client.0.vm05.stdout:2/675: symlink d0/d9/d7f/lc6 0 2026-03-10T08:56:02.886 INFO:tasks.workunit.client.0.vm05.stdout:3/804: dwrite d9/d8f/d55/fab [0,4194304] 0 2026-03-10T08:56:02.903 INFO:tasks.workunit.client.1.vm08.stdout:7/951: symlink d0/d11/d1f/d29/d3b/d80/dd3/de1/l12a 0 2026-03-10T08:56:02.904 INFO:tasks.workunit.client.1.vm08.stdout:9/904: dread d2/dd/d15/d1e/d24/f2b [0,4194304] 0 2026-03-10T08:56:02.907 INFO:tasks.workunit.client.1.vm08.stdout:7/952: dread d0/d11/d1f/d29/d3b/da1/f114 [0,4194304] 0 2026-03-10T08:56:02.908 INFO:tasks.workunit.client.1.vm08.stdout:7/953: readlink d0/d14/l4d 0 2026-03-10T08:56:02.909 
INFO:tasks.workunit.client.0.vm05.stdout:8/752: dread d2/db/d1f/f53 [0,4194304] 0 2026-03-10T08:56:02.911 INFO:tasks.workunit.client.0.vm05.stdout:8/753: dread - d2/dd/d2c/d2e/d31/d3e/d5d/d9d/ff5 zero size 2026-03-10T08:56:02.914 INFO:tasks.workunit.client.1.vm08.stdout:1/928: creat d1/da/de/d24/d35/d6d/d116/d9c/d139/d13d/f13e x:0 0 0 2026-03-10T08:56:02.936 INFO:tasks.workunit.client.0.vm05.stdout:0/736: link df/cbb df/d1f/d85/d2b/d65/cde 0 2026-03-10T08:56:02.946 INFO:tasks.workunit.client.1.vm08.stdout:8/997: creat d1/d10/d9/dd/d25/d27/d44/d21/d14a/f168 x:0 0 0 2026-03-10T08:56:02.961 INFO:tasks.workunit.client.1.vm08.stdout:4/955: fdatasync d5/d23/d36/d99/f13e 0 2026-03-10T08:56:02.962 INFO:tasks.workunit.client.1.vm08.stdout:0/909: write d6/dd/d13/d32/ff7 [781347,78758] 0 2026-03-10T08:56:02.970 INFO:tasks.workunit.client.1.vm08.stdout:6/929: read - d9/dc/d11/fdc zero size 2026-03-10T08:56:02.974 INFO:tasks.workunit.client.1.vm08.stdout:3/888: mkdir d4/d15/d8/d2c/d131 0 2026-03-10T08:56:02.986 INFO:tasks.workunit.client.0.vm05.stdout:7/713: getdents d18/d66 0 2026-03-10T08:56:02.987 INFO:tasks.workunit.client.0.vm05.stdout:9/703: write d6/d12/d3a/de5/f47 [1465702,67399] 0 2026-03-10T08:56:02.993 INFO:tasks.workunit.client.0.vm05.stdout:1/791: dread dd/d10/d18/dd5/fbf [0,4194304] 0 2026-03-10T08:56:02.993 INFO:tasks.workunit.client.1.vm08.stdout:7/954: rmdir d0/d11/db2/d8e 39 2026-03-10T08:56:02.995 INFO:tasks.workunit.client.0.vm05.stdout:6/761: rename d4/d7/c50 to d4/d2c/dc8/c104 0 2026-03-10T08:56:03.001 INFO:tasks.workunit.client.1.vm08.stdout:5/843: rename d0/d11/d27/d68/d7c/d4b/d4e/d84/cdf to d0/d1b/d67/d80/c107 0 2026-03-10T08:56:03.004 INFO:tasks.workunit.client.1.vm08.stdout:0/910: creat d6/dd/d13/d61/dc7/dc8/f137 x:0 0 0 2026-03-10T08:56:03.008 INFO:tasks.workunit.client.1.vm08.stdout:6/930: stat d9/dc/d11/d23/f113 0 2026-03-10T08:56:03.009 INFO:tasks.workunit.client.1.vm08.stdout:3/889: chown d4/d6f/l118 27 1 2026-03-10T08:56:03.013 
INFO:tasks.workunit.client.1.vm08.stdout:9/905: mkdir d2/dd/d15/d4f/df1/d102/d132 0 2026-03-10T08:56:03.014 INFO:tasks.workunit.client.1.vm08.stdout:3/890: dwrite d4/d15/d8/d2c/d6d/dfa/d100/f10b [4194304,4194304] 0 2026-03-10T08:56:03.014 INFO:tasks.workunit.client.1.vm08.stdout:7/955: sync 2026-03-10T08:56:03.022 INFO:tasks.workunit.client.1.vm08.stdout:1/929: dread d1/da/de/d24/d3d/d40/f42 [0,4194304] 0 2026-03-10T08:56:03.027 INFO:tasks.workunit.client.1.vm08.stdout:6/931: truncate d9/d50/fa3 4927023 0 2026-03-10T08:56:03.039 INFO:tasks.workunit.client.0.vm05.stdout:2/676: write d0/d9/d1e/d20/d21/d8a/d92/fae [628574,69049] 0 2026-03-10T08:56:03.039 INFO:tasks.workunit.client.0.vm05.stdout:2/677: fdatasync d0/d9/d1e/d20/d24/fbe 0 2026-03-10T08:56:03.039 INFO:tasks.workunit.client.0.vm05.stdout:3/805: dwrite d9/d2b/de7/df1/d6c/fb5 [0,4194304] 0 2026-03-10T08:56:03.039 INFO:tasks.workunit.client.1.vm08.stdout:7/956: unlink d0/d11/d1f/fb7 0 2026-03-10T08:56:03.039 INFO:tasks.workunit.client.1.vm08.stdout:7/957: chown d0/d11/d1f/d29/d3b/fac 40777884 1 2026-03-10T08:56:03.039 INFO:tasks.workunit.client.1.vm08.stdout:9/906: mknod d2/d54/d8e/da6/dd0/c133 0 2026-03-10T08:56:03.046 INFO:tasks.workunit.client.1.vm08.stdout:3/891: truncate d4/f97 684560 0 2026-03-10T08:56:03.047 INFO:tasks.workunit.client.1.vm08.stdout:5/844: write d0/d11/d3e/f48 [14775,96713] 0 2026-03-10T08:56:03.047 INFO:tasks.workunit.client.1.vm08.stdout:4/956: creat d5/d23/f156 x:0 0 0 2026-03-10T08:56:03.049 INFO:tasks.workunit.client.1.vm08.stdout:8/998: dwrite d1/d2c/f30 [0,4194304] 0 2026-03-10T08:56:03.057 INFO:tasks.workunit.client.1.vm08.stdout:7/958: creat d0/d14/d43/d62/f12b x:0 0 0 2026-03-10T08:56:03.058 INFO:tasks.workunit.client.1.vm08.stdout:7/959: chown d0/d11/d1f/d2c/d111/f112 210460724 1 2026-03-10T08:56:03.065 INFO:tasks.workunit.client.1.vm08.stdout:3/892: mkdir d4/d15/d8/d1d/da8/d132 0 2026-03-10T08:56:03.075 INFO:tasks.workunit.client.1.vm08.stdout:1/930: symlink 
d1/da/de/d24/d3d/d40/l13f 0 2026-03-10T08:56:03.075 INFO:tasks.workunit.client.0.vm05.stdout:1/792: fsync dd/d21/d37/f8c 0 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.0.vm05.stdout:1/793: chown dd/d13/d10b 94085244 1 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.0.vm05.stdout:6/762: rename d4/d7/d10 to d4/d7/d10/d105 22 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.0.vm05.stdout:6/763: chown d4/d2c/d84/d4a/l8e 123 1 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.1.vm08.stdout:7/960: dread d0/d11/d1f/d29/d3d/d40/f24 [0,4194304] 0 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.1.vm08.stdout:5/845: creat d0/d1b/d67/f108 x:0 0 0 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.1.vm08.stdout:0/911: creat d6/dd/f138 x:0 0 0 2026-03-10T08:56:03.076 INFO:tasks.workunit.client.1.vm08.stdout:0/912: chown d6/dd/d13/d17/d1f/d20/d2f/d26/c55 511 1 2026-03-10T08:56:03.086 INFO:tasks.workunit.client.0.vm05.stdout:5/671: link d5/d48/f69 d5/d86/d21/d89/ff2 0 2026-03-10T08:56:03.088 INFO:tasks.workunit.client.1.vm08.stdout:3/893: rmdir d4/d6f/d85 39 2026-03-10T08:56:03.090 INFO:tasks.workunit.client.0.vm05.stdout:2/678: fsync d0/d9/d7f/f80 0 2026-03-10T08:56:03.091 INFO:tasks.workunit.client.0.vm05.stdout:2/679: stat d0/f2 0 2026-03-10T08:56:03.103 INFO:tasks.workunit.client.0.vm05.stdout:3/806: symlink d9/d2b/de7/df1/d6c/dbe/lf8 0 2026-03-10T08:56:03.109 INFO:tasks.workunit.client.1.vm08.stdout:5/846: fsync d0/d11/d18/df5/dfc/f103 0 2026-03-10T08:56:03.110 INFO:tasks.workunit.client.1.vm08.stdout:5/847: write d0/d11/d3e/f48 [1587028,43820] 0 2026-03-10T08:56:03.115 INFO:tasks.workunit.client.1.vm08.stdout:4/957: dwrite d5/d23/d36/d99/db2/d5d/de3/df8/f119 [0,4194304] 0 2026-03-10T08:56:03.118 INFO:tasks.workunit.client.1.vm08.stdout:0/913: truncate f5 1904806 0 2026-03-10T08:56:03.129 INFO:tasks.workunit.client.0.vm05.stdout:4/739: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/ddd/ff2 x:0 0 0 2026-03-10T08:56:03.129 
INFO:tasks.workunit.client.0.vm05.stdout:7/714: dwrite d18/f95 [0,4194304] 0 2026-03-10T08:56:03.129 INFO:tasks.workunit.client.0.vm05.stdout:0/737: dwrite df/f4a [0,4194304] 0 2026-03-10T08:56:03.129 INFO:tasks.workunit.client.0.vm05.stdout:7/715: chown d18/d66/fae 1169459546 1 2026-03-10T08:56:03.129 INFO:tasks.workunit.client.0.vm05.stdout:8/754: dwrite d2/dd/d2c/d2e/d31/d4f/d80/dd0/fb6 [0,4194304] 0 2026-03-10T08:56:03.139 INFO:tasks.workunit.client.0.vm05.stdout:9/704: symlink d6/d15/le6 0 2026-03-10T08:56:03.155 INFO:tasks.workunit.client.0.vm05.stdout:9/705: dread d6/d12/d3a/d48/fa5 [0,4194304] 0 2026-03-10T08:56:03.155 INFO:tasks.workunit.client.0.vm05.stdout:1/794: dread - dd/d21/d37/d7c/d60/fe9 zero size 2026-03-10T08:56:03.155 INFO:tasks.workunit.client.0.vm05.stdout:5/672: unlink d5/df/d37/dd2/f7b 0 2026-03-10T08:56:03.157 INFO:tasks.workunit.client.1.vm08.stdout:7/961: mkdir d0/d11/d1f/d29/d3d/d12c 0 2026-03-10T08:56:03.158 INFO:tasks.workunit.client.1.vm08.stdout:7/962: readlink d0/d11/d1f/lae 0 2026-03-10T08:56:03.159 INFO:tasks.workunit.client.0.vm05.stdout:3/807: unlink d9/d8f/d55/fab 0 2026-03-10T08:56:03.160 INFO:tasks.workunit.client.0.vm05.stdout:3/808: readlink d9/d8f/d50/d5f/la2 0 2026-03-10T08:56:03.170 INFO:tasks.workunit.client.0.vm05.stdout:0/738: creat df/d59/fdf x:0 0 0 2026-03-10T08:56:03.176 INFO:tasks.workunit.client.1.vm08.stdout:8/999: rename d1/d10/d9/dd/d25/d27/f52 to d1/f169 0 2026-03-10T08:56:03.180 INFO:tasks.workunit.client.0.vm05.stdout:0/739: dread df/d1f/d85/d2b/d27/f60 [0,4194304] 0 2026-03-10T08:56:03.204 INFO:tasks.workunit.client.1.vm08.stdout:6/932: getdents d9/dc/d11/d23/d2c/d81/d63 0 2026-03-10T08:56:03.204 INFO:tasks.workunit.client.1.vm08.stdout:4/958: truncate d5/d23/d36/f44 4786522 0 2026-03-10T08:56:03.204 INFO:tasks.workunit.client.0.vm05.stdout:7/716: rename d18/d66/d25/d2e/d42 to d18/d38/dc7/de3 0 2026-03-10T08:56:03.204 INFO:tasks.workunit.client.0.vm05.stdout:8/755: mkdir d2/dd/d2c/d2e/d31/d3e/d5d/d9d/d107 
0 2026-03-10T08:56:03.204 INFO:tasks.workunit.client.0.vm05.stdout:1/795: rename dd/d10 to dd/d10/d19/d4d/d114 22 2026-03-10T08:56:03.204 INFO:tasks.workunit.client.0.vm05.stdout:8/756: read d2/dd/d2c/d2e/d31/d4f/d80/dd0/fb6 [2935329,108726] 0 2026-03-10T08:56:03.206 INFO:tasks.workunit.client.0.vm05.stdout:4/740: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/fe8 [0,4194304] 0 2026-03-10T08:56:03.214 INFO:tasks.workunit.client.0.vm05.stdout:7/717: dread - d18/d38/dc7/de3/dc6/fcd zero size 2026-03-10T08:56:03.214 INFO:tasks.workunit.client.0.vm05.stdout:1/796: symlink dd/d10/d19/d9b/l115 0 2026-03-10T08:56:03.224 INFO:tasks.workunit.client.0.vm05.stdout:3/809: mkdir d9/d8f/dde/df9 0 2026-03-10T08:56:03.226 INFO:tasks.workunit.client.1.vm08.stdout:1/931: write d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fe9 [4208823,22018] 0 2026-03-10T08:56:03.226 INFO:tasks.workunit.client.0.vm05.stdout:4/741: mkdir d0/d2e/dca/df3 0 2026-03-10T08:56:03.226 INFO:tasks.workunit.client.0.vm05.stdout:4/742: stat d0/f9 0 2026-03-10T08:56:03.231 INFO:tasks.workunit.client.0.vm05.stdout:5/673: rename d5/d86/l5e to d5/d48/lf3 0 2026-03-10T08:56:03.241 INFO:tasks.workunit.client.0.vm05.stdout:5/674: stat d5/d48/d64/d95/dac/dc6/fe9 0 2026-03-10T08:56:03.241 INFO:tasks.workunit.client.0.vm05.stdout:7/718: mknod d18/d38/dc7/de3/d74/ce4 0 2026-03-10T08:56:03.241 INFO:tasks.workunit.client.0.vm05.stdout:7/719: dwrite d18/f95 [0,4194304] 0 2026-03-10T08:56:03.244 INFO:tasks.workunit.client.0.vm05.stdout:6/764: sync 2026-03-10T08:56:03.245 INFO:tasks.workunit.client.0.vm05.stdout:9/706: sync 2026-03-10T08:56:03.250 INFO:tasks.workunit.client.0.vm05.stdout:3/810: mknod d9/d2b/de7/df1/d43/da3/cfa 0 2026-03-10T08:56:03.254 INFO:tasks.workunit.client.0.vm05.stdout:5/675: rmdir d5/d86/d24/d84/db8 39 2026-03-10T08:56:03.256 INFO:tasks.workunit.client.0.vm05.stdout:1/797: fsync dd/d10/d18/d20/df3/dfe/f111 0 2026-03-10T08:56:03.257 INFO:tasks.workunit.client.0.vm05.stdout:1/798: chown dd/d21/d37/d45/d8d/f99 
49002706 1 2026-03-10T08:56:03.259 INFO:tasks.workunit.client.0.vm05.stdout:5/676: dwrite d5/df/d37/d68/fe5 [0,4194304] 0 2026-03-10T08:56:03.278 INFO:tasks.workunit.client.0.vm05.stdout:9/707: truncate d6/d12/d3a/fdc 594011 0 2026-03-10T08:56:03.288 INFO:tasks.workunit.client.0.vm05.stdout:5/677: write d5/d86/d21/f5a [246862,61446] 0 2026-03-10T08:56:03.289 INFO:tasks.workunit.client.1.vm08.stdout:9/907: getdents d2/d54/d8e/da6 0 2026-03-10T08:56:03.290 INFO:tasks.workunit.client.1.vm08.stdout:3/894: symlink d4/d15/d8/l133 0 2026-03-10T08:56:03.292 INFO:tasks.workunit.client.1.vm08.stdout:3/895: chown d4/d15/d8/d2c/d9b/d79/d20/f84 255 1 2026-03-10T08:56:03.292 INFO:tasks.workunit.client.0.vm05.stdout:3/811: fsync d9/d4d/f5e 0 2026-03-10T08:56:03.292 INFO:tasks.workunit.client.1.vm08.stdout:3/896: stat d4/d15/d8/d2c/d9b/d79/cc5 0 2026-03-10T08:56:03.293 INFO:tasks.workunit.client.0.vm05.stdout:5/678: mknod d5/d86/d24/d2c/d41/d74/cf4 0 2026-03-10T08:56:03.296 INFO:tasks.workunit.client.0.vm05.stdout:9/708: mkdir d6/d15/de7 0 2026-03-10T08:56:03.297 INFO:tasks.workunit.client.1.vm08.stdout:1/932: creat d1/da/d20/d3f/f140 x:0 0 0 2026-03-10T08:56:03.297 INFO:tasks.workunit.client.0.vm05.stdout:9/709: write d6/d12/d3a/de5/f47 [913992,111449] 0 2026-03-10T08:56:03.298 INFO:tasks.workunit.client.1.vm08.stdout:7/963: read d0/d11/d4a/da3/fa9 [2246243,52906] 0 2026-03-10T08:56:03.305 INFO:tasks.workunit.client.1.vm08.stdout:3/897: dread - d4/d6f/d85/dd3/f121 zero size 2026-03-10T08:56:03.307 INFO:tasks.workunit.client.1.vm08.stdout:9/908: dread d2/dd/d15/d1e/d94/fd7 [0,4194304] 0 2026-03-10T08:56:03.309 INFO:tasks.workunit.client.1.vm08.stdout:3/898: chown d4/d15/f12 27590279 1 2026-03-10T08:56:03.317 INFO:tasks.workunit.client.1.vm08.stdout:9/909: rename d2/dd/d15/d1e/d25/d32/c88 to d2/dd/d11c/c134 0 2026-03-10T08:56:03.318 INFO:tasks.workunit.client.1.vm08.stdout:4/959: dread d5/d23/d36/d99/db2/d5d/f129 [0,4194304] 0 2026-03-10T08:56:03.323 
INFO:tasks.workunit.client.1.vm08.stdout:3/899: dread d4/d15/d8/d2c/d9b/d79/f5c [0,4194304] 0 2026-03-10T08:56:03.324 INFO:tasks.workunit.client.1.vm08.stdout:4/960: truncate d5/d23/d36/d76/fcf 234132 0 2026-03-10T08:56:03.324 INFO:tasks.workunit.client.1.vm08.stdout:3/900: truncate d4/d6f/d85/f110 632984 0 2026-03-10T08:56:03.327 INFO:tasks.workunit.client.0.vm05.stdout:9/710: sync 2026-03-10T08:56:03.342 INFO:tasks.workunit.client.1.vm08.stdout:3/901: unlink d4/d15/d8/d2c/d9b/c2e 0 2026-03-10T08:56:03.345 INFO:tasks.workunit.client.1.vm08.stdout:4/961: dread d5/d23/d36/d99/db2/d5a/d69/d11b/d114/f121 [0,4194304] 0 2026-03-10T08:56:03.347 INFO:tasks.workunit.client.1.vm08.stdout:4/962: truncate d5/d23/d36/d99/db2/d5d/f61 4166448 0 2026-03-10T08:56:03.350 INFO:tasks.workunit.client.1.vm08.stdout:4/963: truncate d5/d23/d36/d99/db2/d5a/d69/d11b/f41 2847296 0 2026-03-10T08:56:03.351 INFO:tasks.workunit.client.1.vm08.stdout:4/964: fsync d5/df5/f155 0 2026-03-10T08:56:03.353 INFO:tasks.workunit.client.1.vm08.stdout:4/965: rename d5/df5/c12a to d5/df5/c157 0 2026-03-10T08:56:03.356 INFO:tasks.workunit.client.1.vm08.stdout:4/966: dread d5/f95 [0,4194304] 0 2026-03-10T08:56:03.356 INFO:tasks.workunit.client.0.vm05.stdout:9/711: dread d6/d15/d3c/d4b/f67 [0,4194304] 0 2026-03-10T08:56:03.357 INFO:tasks.workunit.client.0.vm05.stdout:9/712: mkdir d6/d15/d37/de8 0 2026-03-10T08:56:03.358 INFO:tasks.workunit.client.1.vm08.stdout:4/967: dread d5/d23/d36/d99/db2/d5d/f129 [0,4194304] 0 2026-03-10T08:56:03.359 INFO:tasks.workunit.client.0.vm05.stdout:9/713: mkdir d6/d15/d3c/d4b/d82/de9 0 2026-03-10T08:56:03.361 INFO:tasks.workunit.client.0.vm05.stdout:9/714: rename d6/ca to d6/d19/d21/cea 0 2026-03-10T08:56:03.362 INFO:tasks.workunit.client.0.vm05.stdout:9/715: dread - d6/d27/fa6 zero size 2026-03-10T08:56:03.362 INFO:tasks.workunit.client.0.vm05.stdout:9/716: dread - d6/d12/d3a/fdd zero size 2026-03-10T08:56:03.370 INFO:tasks.workunit.client.1.vm08.stdout:4/968: mknod 
d5/d23/d36/d99/db2/d5a/d69/d11b/dea/d14e/c158 0 2026-03-10T08:56:03.373 INFO:tasks.workunit.client.0.vm05.stdout:2/680: dwrite d0/f16 [0,4194304] 0 2026-03-10T08:56:03.383 INFO:tasks.workunit.client.1.vm08.stdout:4/969: dread d5/d23/d36/d76/fa7 [0,4194304] 0 2026-03-10T08:56:03.386 INFO:tasks.workunit.client.0.vm05.stdout:9/717: mkdir d6/d19/d2c/d58/deb 0 2026-03-10T08:56:03.393 INFO:tasks.workunit.client.0.vm05.stdout:9/718: read d6/d19/d2c/d58/fc9 [489223,37066] 0 2026-03-10T08:56:03.397 INFO:tasks.workunit.client.1.vm08.stdout:4/970: creat d5/d23/d36/d99/dc6/dc8/d120/f159 x:0 0 0 2026-03-10T08:56:03.399 INFO:tasks.workunit.client.0.vm05.stdout:2/681: truncate d0/d9/d7f/d8f/f37 2058917 0 2026-03-10T08:56:03.400 INFO:tasks.workunit.client.1.vm08.stdout:5/848: dwrite d0/d11/d27/f3d [0,4194304] 0 2026-03-10T08:56:03.405 INFO:tasks.workunit.client.0.vm05.stdout:9/719: mknod d6/d15/d35/ddf/cec 0 2026-03-10T08:56:03.414 INFO:tasks.workunit.client.1.vm08.stdout:4/971: creat d5/d23/d36/d99/db2/d5a/d69/d11b/d96/d14f/f15a x:0 0 0 2026-03-10T08:56:03.420 INFO:tasks.workunit.client.1.vm08.stdout:0/914: dwrite d6/dd/d13/d17/d1f/d20/f43 [4194304,4194304] 0 2026-03-10T08:56:03.423 INFO:tasks.workunit.client.0.vm05.stdout:2/682: symlink d0/d55/db8/lc7 0 2026-03-10T08:56:03.435 INFO:tasks.workunit.client.0.vm05.stdout:9/720: creat d6/d19/d2a/d8d/fed x:0 0 0 2026-03-10T08:56:03.437 INFO:tasks.workunit.client.0.vm05.stdout:0/740: write df/d1f/d85/f53 [3350783,113040] 0 2026-03-10T08:56:03.440 INFO:tasks.workunit.client.0.vm05.stdout:9/721: fsync d6/d19/d2c/fbd 0 2026-03-10T08:56:03.452 INFO:tasks.workunit.client.1.vm08.stdout:0/915: symlink d6/dd/d13/d17/d1f/d20/d2f/l139 0 2026-03-10T08:56:03.452 INFO:tasks.workunit.client.1.vm08.stdout:0/916: write d6/dd/d13/d17/d1f/d2d/d38/d98/fda [208636,70161] 0 2026-03-10T08:56:03.452 INFO:tasks.workunit.client.0.vm05.stdout:0/741: chown df/d1f/d85/d2b/d65/d6e/d96/l54 571524 1 2026-03-10T08:56:03.452 
INFO:tasks.workunit.client.0.vm05.stdout:8/757: write d2/db/d1f/d67/fe9 [1221339,15030] 0 2026-03-10T08:56:03.452 INFO:tasks.workunit.client.0.vm05.stdout:0/742: symlink df/d1f/d85/d19/d55/le0 0 2026-03-10T08:56:03.463 INFO:tasks.workunit.client.1.vm08.stdout:0/917: chown d6/dd/d13/d17/d1f/d20/d2f/d26/d56/fdc 10373083 1 2026-03-10T08:56:03.470 INFO:tasks.workunit.client.0.vm05.stdout:2/683: creat d0/d9/d1e/d20/fc8 x:0 0 0 2026-03-10T08:56:03.473 INFO:tasks.workunit.client.0.vm05.stdout:8/758: dread d2/db/d28/fa6 [0,4194304] 0 2026-03-10T08:56:03.478 INFO:tasks.workunit.client.0.vm05.stdout:9/722: rename d6/d19/c1d to d6/d19/d2c/d58/deb/cee 0 2026-03-10T08:56:03.556 INFO:tasks.workunit.client.1.vm08.stdout:0/918: truncate d6/dd/d13/d17/d1f/d20/f100 2453665 0 2026-03-10T08:56:03.618 INFO:tasks.workunit.client.0.vm05.stdout:4/743: write d0/d2e/d42/d45/d4a/d36/dbe/d49/faf [4688717,40714] 0 2026-03-10T08:56:03.620 INFO:tasks.workunit.client.0.vm05.stdout:7/720: write d18/f4a [3633979,128536] 0 2026-03-10T08:56:03.634 INFO:tasks.workunit.client.1.vm08.stdout:0/919: creat d6/dd/d13/d17/d1f/d2d/d131/f13a x:0 0 0 2026-03-10T08:56:03.638 INFO:tasks.workunit.client.0.vm05.stdout:8/759: truncate d2/dd/d2c/d2e/d31/d4f/d80/f9f 1794913 0 2026-03-10T08:56:03.648 INFO:tasks.workunit.client.0.vm05.stdout:1/799: dwrite dd/d10/d19/d27/f4e [0,4194304] 0 2026-03-10T08:56:03.653 INFO:tasks.workunit.client.0.vm05.stdout:0/743: fsync df/f17 0 2026-03-10T08:56:03.663 INFO:tasks.workunit.client.0.vm05.stdout:6/765: dwrite d4/d7/d10/d15/d20/fa1 [0,4194304] 0 2026-03-10T08:56:03.679 INFO:tasks.workunit.client.0.vm05.stdout:2/684: creat d0/d9/d1e/d20/d21/d45/d4b/d75/fc9 x:0 0 0 2026-03-10T08:56:03.688 INFO:tasks.workunit.client.0.vm05.stdout:3/812: write d9/d4d/d51/fe1 [357221,37607] 0 2026-03-10T08:56:03.690 INFO:tasks.workunit.client.0.vm05.stdout:1/800: creat dd/d10/d18/d2d/d51/d58/f116 x:0 0 0 2026-03-10T08:56:03.691 INFO:tasks.workunit.client.0.vm05.stdout:5/679: write d5/d86/d24/d84/fb0 
[429315,73230] 0 2026-03-10T08:56:03.692 INFO:tasks.workunit.client.1.vm08.stdout:6/933: dwrite d9/dc/d11/d23/d2c/f8e [0,4194304] 0 2026-03-10T08:56:03.699 INFO:tasks.workunit.client.1.vm08.stdout:1/933: dwrite d1/da/de/d24/d3d/ff0 [0,4194304] 0 2026-03-10T08:56:03.703 INFO:tasks.workunit.client.1.vm08.stdout:7/964: write d0/d11/d1f/d29/d3d/d40/fb0 [3222600,31960] 0 2026-03-10T08:56:03.742 INFO:tasks.workunit.client.0.vm05.stdout:6/766: fdatasync d4/f61 0 2026-03-10T08:56:03.742 INFO:tasks.workunit.client.1.vm08.stdout:9/910: dwrite d2/d41/d4c/d66/fad [0,4194304] 0 2026-03-10T08:56:03.742 INFO:tasks.workunit.client.1.vm08.stdout:6/934: read d9/dc/d11/f55 [623696,36217] 0 2026-03-10T08:56:03.750 INFO:tasks.workunit.client.1.vm08.stdout:9/911: read d2/f86 [755199,69926] 0 2026-03-10T08:56:03.758 INFO:tasks.workunit.client.1.vm08.stdout:3/902: write f1 [756858,68875] 0 2026-03-10T08:56:03.767 INFO:tasks.workunit.client.0.vm05.stdout:4/744: creat d0/d2e/d71/ff4 x:0 0 0 2026-03-10T08:56:03.785 INFO:tasks.workunit.client.0.vm05.stdout:1/801: symlink dd/d10/d18/d20/d52/ddc/l117 0 2026-03-10T08:56:03.785 INFO:tasks.workunit.client.0.vm05.stdout:1/802: fsync dd/d10/d18/d2d/f10e 0 2026-03-10T08:56:03.785 INFO:tasks.workunit.client.0.vm05.stdout:0/744: rename df/d1f/d85/d2b/d27/f91 to df/d1f/d85/d19/fe1 0 2026-03-10T08:56:03.785 INFO:tasks.workunit.client.0.vm05.stdout:0/745: chown df/f79 138 1 2026-03-10T08:56:03.785 INFO:tasks.workunit.client.0.vm05.stdout:4/745: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dae/df5 0 2026-03-10T08:56:03.785 INFO:tasks.workunit.client.0.vm05.stdout:1/803: fdatasync f6 0 2026-03-10T08:56:03.788 INFO:tasks.workunit.client.1.vm08.stdout:5/849: dwrite d0/d11/d27/d68/d7c/f42 [0,4194304] 0 2026-03-10T08:56:03.791 INFO:tasks.workunit.client.1.vm08.stdout:7/965: sync 2026-03-10T08:56:03.808 INFO:tasks.workunit.client.1.vm08.stdout:7/966: chown d0/d11/d1f/d2c/d111 6589 1 2026-03-10T08:56:03.809 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:56:03 vm08.local ceph-mon[57559]: pgmap v10: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 26 MiB/s rd, 56 MiB/s wr, 163 op/s 2026-03-10T08:56:03.809 INFO:tasks.workunit.client.0.vm05.stdout:0/746: mknod df/d1f/d85/d19/d39/d4d/d9f/ce2 0 2026-03-10T08:56:03.809 INFO:tasks.workunit.client.0.vm05.stdout:6/767: creat d4/d7/d10/d15/d20/f106 x:0 0 0 2026-03-10T08:56:03.809 INFO:tasks.workunit.client.0.vm05.stdout:3/813: getdents d9/d8f/d50/d5f 0 2026-03-10T08:56:03.809 INFO:tasks.workunit.client.1.vm08.stdout:4/972: dwrite d5/d23/d36/d99/db2/d5a/d69/d11b/def/df2/f11f [0,4194304] 0 2026-03-10T08:56:03.823 INFO:tasks.workunit.client.1.vm08.stdout:7/967: symlink d0/d11/d1f/d29/d3d/dd1/l12d 0 2026-03-10T08:56:03.827 INFO:tasks.workunit.client.1.vm08.stdout:4/973: fsync d5/d23/d36/d99/dc6/dc8/f12c 0 2026-03-10T08:56:03.828 INFO:tasks.workunit.client.0.vm05.stdout:4/746: unlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/ld2 0 2026-03-10T08:56:03.829 INFO:tasks.workunit.client.1.vm08.stdout:7/968: mknod d0/d11/d1f/d2c/c12e 0 2026-03-10T08:56:03.833 INFO:tasks.workunit.client.1.vm08.stdout:4/974: fdatasync d5/d23/d36/d99/db2/d5a/f87 0 2026-03-10T08:56:03.838 INFO:tasks.workunit.client.0.vm05.stdout:4/747: unlink d0/d2e/d42/d45/d4a/d36/dbe/d49/f7a 0 2026-03-10T08:56:03.842 INFO:tasks.workunit.client.0.vm05.stdout:4/748: dwrite d0/d2e/d42/d45/d4a/d36/fd5 [0,4194304] 0 2026-03-10T08:56:03.849 INFO:tasks.workunit.client.0.vm05.stdout:4/749: dwrite d0/d2e/d42/d45/d4a/d36/f3d [0,4194304] 0 2026-03-10T08:56:03.851 INFO:tasks.workunit.client.0.vm05.stdout:0/747: getdents df/d1f/d95 0 2026-03-10T08:56:03.881 INFO:tasks.workunit.client.0.vm05.stdout:6/768: rmdir d4/d2d/d51/d87/da5 39 2026-03-10T08:56:03.885 INFO:tasks.workunit.client.1.vm08.stdout:4/975: sync 2026-03-10T08:56:03.885 INFO:tasks.workunit.client.0.vm05.stdout:6/769: truncate d4/d7/d10/d15/f17 3341653 0 2026-03-10T08:56:03.886 INFO:tasks.workunit.client.0.vm05.stdout:6/770: stat d4/d7/f14 0 
2026-03-10T08:56:03.889 INFO:tasks.workunit.client.1.vm08.stdout:4/976: link d5/f6b d5/d23/d36/d99/db2/d5a/d69/d11b/d114/f15b 0 2026-03-10T08:56:03.891 INFO:tasks.workunit.client.1.vm08.stdout:4/977: truncate d5/f8 4409488 0 2026-03-10T08:56:03.892 INFO:tasks.workunit.client.0.vm05.stdout:6/771: symlink d4/d8d/l107 0 2026-03-10T08:56:03.893 INFO:tasks.workunit.client.1.vm08.stdout:4/978: truncate d5/f85 3002899 0 2026-03-10T08:56:03.931 INFO:tasks.workunit.client.0.vm05.stdout:1/804: sync 2026-03-10T08:56:03.931 INFO:tasks.workunit.client.0.vm05.stdout:4/750: sync 2026-03-10T08:56:03.931 INFO:tasks.workunit.client.0.vm05.stdout:1/805: chown dd/d10/d18/ld8 986945 1 2026-03-10T08:56:03.938 INFO:tasks.workunit.client.0.vm05.stdout:4/751: mknod d0/d2e/d42/d45/d4a/d36/dbe/d49/cf6 0 2026-03-10T08:56:03.944 INFO:tasks.workunit.client.0.vm05.stdout:1/806: creat dd/d10/d19/d4d/f118 x:0 0 0 2026-03-10T08:56:03.945 INFO:tasks.workunit.client.0.vm05.stdout:1/807: write dd/f44 [2780906,85874] 0 2026-03-10T08:56:03.950 INFO:tasks.workunit.client.0.vm05.stdout:4/752: dwrite d0/d2e/d42/d45/d4a/d36/d37/fac [0,4194304] 0 2026-03-10T08:56:03.961 INFO:tasks.workunit.client.0.vm05.stdout:4/753: chown d0/d2e/l3b 21887 1 2026-03-10T08:56:03.961 INFO:tasks.workunit.client.1.vm08.stdout:3/903: dread f1 [0,4194304] 0 2026-03-10T08:56:03.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:03 vm05.local ceph-mon[49713]: pgmap v10: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 26 MiB/s rd, 56 MiB/s wr, 163 op/s 2026-03-10T08:56:04.072 INFO:tasks.workunit.client.0.vm05.stdout:9/723: write d6/f9f [3244052,90990] 0 2026-03-10T08:56:04.094 INFO:tasks.workunit.client.0.vm05.stdout:9/724: link d6/d19/d2c/f3d d6/d15/d37/de8/fef 0 2026-03-10T08:56:04.100 INFO:tasks.workunit.client.0.vm05.stdout:9/725: creat d6/d19/d2a/d4a/ff0 x:0 0 0 2026-03-10T08:56:04.110 INFO:tasks.workunit.client.0.vm05.stdout:9/726: fdatasync d6/d12/d3a/de5/f91 0 2026-03-10T08:56:04.110 
INFO:tasks.workunit.client.0.vm05.stdout:7/721: dwrite d18/d38/dc7/de3/f5a [0,4194304] 0 2026-03-10T08:56:04.116 INFO:tasks.workunit.client.0.vm05.stdout:7/722: read d18/f24 [3908826,111350] 0 2026-03-10T08:56:04.144 INFO:tasks.workunit.client.0.vm05.stdout:7/723: dread d18/d66/d79/f85 [0,4194304] 0 2026-03-10T08:56:04.144 INFO:tasks.workunit.client.0.vm05.stdout:9/727: dread - d6/d15/d3c/fda zero size 2026-03-10T08:56:04.150 INFO:tasks.workunit.client.0.vm05.stdout:5/680: symlink d5/d86/d24/d84/lf5 0 2026-03-10T08:56:04.174 INFO:tasks.workunit.client.1.vm08.stdout:9/912: mkdir d2/dd/d15/d1e/d25/d32/d135 0 2026-03-10T08:56:04.182 INFO:tasks.workunit.client.0.vm05.stdout:5/681: truncate d5/fc 2633389 0 2026-03-10T08:56:04.206 INFO:tasks.workunit.client.0.vm05.stdout:5/682: symlink d5/d86/d24/d84/lf6 0 2026-03-10T08:56:04.222 INFO:tasks.workunit.client.1.vm08.stdout:9/913: readlink d2/l36 0 2026-03-10T08:56:04.224 INFO:tasks.workunit.client.0.vm05.stdout:9/728: symlink d6/d15/d3c/d4b/lf1 0 2026-03-10T08:56:04.232 INFO:tasks.workunit.client.0.vm05.stdout:9/729: fdatasync d6/d19/d2a/f53 0 2026-03-10T08:56:04.237 INFO:tasks.workunit.client.0.vm05.stdout:9/730: mkdir d6/d15/d35/df2 0 2026-03-10T08:56:04.254 INFO:tasks.workunit.client.1.vm08.stdout:5/850: dwrite d0/d1b/d67/d80/fcc [0,4194304] 0 2026-03-10T08:56:04.287 INFO:tasks.workunit.client.0.vm05.stdout:3/814: write d9/d4d/f52 [4312049,99082] 0 2026-03-10T08:56:04.323 INFO:tasks.workunit.client.1.vm08.stdout:7/969: dwrite d0/d11/d1f/d29/d3b/d80/dd3/ffb [0,4194304] 0 2026-03-10T08:56:04.324 INFO:tasks.workunit.client.1.vm08.stdout:7/970: dread - d0/d11/db2/f8c zero size 2026-03-10T08:56:04.329 INFO:tasks.workunit.client.1.vm08.stdout:7/971: dwrite d0/d11/d1f/df0/df4/f11d [0,4194304] 0 2026-03-10T08:56:04.338 INFO:tasks.workunit.client.1.vm08.stdout:7/972: mkdir d0/d14/d43/de7/d12f 0 2026-03-10T08:56:04.343 INFO:tasks.workunit.client.1.vm08.stdout:7/973: dwrite d0/d11/d1f/d29/d3b/f4c [0,4194304] 0 
2026-03-10T08:56:04.343 INFO:tasks.workunit.client.1.vm08.stdout:7/974: dread - d0/d14/d43/d62/f12b zero size 2026-03-10T08:56:04.354 INFO:tasks.workunit.client.1.vm08.stdout:7/975: getdents d0/d11 0 2026-03-10T08:56:04.357 INFO:tasks.workunit.client.1.vm08.stdout:7/976: link d0/d14/d43/d62/d102/l125 d0/d14/d43/d9d/l130 0 2026-03-10T08:56:04.370 INFO:tasks.workunit.client.1.vm08.stdout:7/977: dread d0/d11/d1f/d29/d3b/d80/dd3/ffb [4194304,4194304] 0 2026-03-10T08:56:04.372 INFO:tasks.workunit.client.1.vm08.stdout:7/978: getdents d0/d11/d1f/d29/d36 0 2026-03-10T08:56:04.381 INFO:tasks.workunit.client.1.vm08.stdout:7/979: chown d0/d11/d4a/d95/dc5/d100 1 1 2026-03-10T08:56:04.438 INFO:tasks.workunit.client.0.vm05.stdout:8/760: rmdir d2/dd/d2c/d2e/d31/d4f/da3 39 2026-03-10T08:56:04.442 INFO:tasks.workunit.client.0.vm05.stdout:8/761: mkdir d2/dd/d2c/d2e/d108 0 2026-03-10T08:56:04.473 INFO:tasks.workunit.client.1.vm08.stdout:4/979: dwrite d5/d23/d36/d99/db2/d5d/ffe [0,4194304] 0 2026-03-10T08:56:04.490 INFO:tasks.workunit.client.1.vm08.stdout:4/980: creat d5/d23/d36/d76/f15c x:0 0 0 2026-03-10T08:56:04.510 INFO:tasks.workunit.client.1.vm08.stdout:4/981: rename d5/d23/d36/d99/db2/d5d/de3/df8/f11e to d5/d23/d36/d99/db2/d5d/de3/df8/f15d 0 2026-03-10T08:56:04.529 INFO:tasks.workunit.client.0.vm05.stdout:0/748: unlink df/d1f/c41 0 2026-03-10T08:56:04.547 INFO:tasks.workunit.client.1.vm08.stdout:6/935: mknod d9/dc/c136 0 2026-03-10T08:56:04.555 INFO:tasks.workunit.client.1.vm08.stdout:6/936: creat d9/dc/d11/d23/d2c/d7a/dce/d69/f137 x:0 0 0 2026-03-10T08:56:04.558 INFO:tasks.workunit.client.1.vm08.stdout:6/937: mknod d9/dc/d84/d80/d12c/c138 0 2026-03-10T08:56:04.559 INFO:tasks.workunit.client.0.vm05.stdout:1/808: write dd/d10/d18/d20/f34 [4193116,100085] 0 2026-03-10T08:56:04.568 INFO:tasks.workunit.client.0.vm05.stdout:1/809: symlink dd/d10/d18/d2d/d51/d58/d71/l119 0 2026-03-10T08:56:04.573 INFO:tasks.workunit.client.0.vm05.stdout:1/810: creat dd/d13/d10b/f11a x:0 0 0 
2026-03-10T08:56:04.591 INFO:tasks.workunit.client.0.vm05.stdout:4/754: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/f94 [0,4194304] 0 2026-03-10T08:56:04.613 INFO:tasks.workunit.client.0.vm05.stdout:4/755: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/lf7 0 2026-03-10T08:56:04.614 INFO:tasks.workunit.client.0.vm05.stdout:6/772: rename d4/d7/f80 to d4/d7/d10/d15/d1b/f108 0 2026-03-10T08:56:04.625 INFO:tasks.workunit.client.0.vm05.stdout:0/749: rename df/d1f/d85/d19/f8e to df/d1f/d85/d19/d39/d4d/fe3 0 2026-03-10T08:56:04.628 INFO:tasks.workunit.client.0.vm05.stdout:6/773: creat d4/d2c/dc8/f109 x:0 0 0 2026-03-10T08:56:04.631 INFO:tasks.workunit.client.0.vm05.stdout:4/756: truncate d0/d2c/d6a/fd8 878402 0 2026-03-10T08:56:04.634 INFO:tasks.workunit.client.0.vm05.stdout:0/750: truncate df/d1f/d85/d19/d5b/f78 1296312 0 2026-03-10T08:56:04.655 INFO:tasks.workunit.client.0.vm05.stdout:6/774: mkdir d4/d2d/d51/d87/da5/de9/d10a 0 2026-03-10T08:56:04.661 INFO:tasks.workunit.client.0.vm05.stdout:4/757: dread d0/d1d/f50 [0,4194304] 0 2026-03-10T08:56:04.674 INFO:tasks.workunit.client.0.vm05.stdout:6/775: creat d4/d2d/d51/f10b x:0 0 0 2026-03-10T08:56:04.677 INFO:tasks.workunit.client.0.vm05.stdout:7/724: dwrite d18/d1b/f50 [0,4194304] 0 2026-03-10T08:56:04.692 INFO:tasks.workunit.client.0.vm05.stdout:6/776: truncate d4/d7/f5d 4964136 0 2026-03-10T08:56:04.702 INFO:tasks.workunit.client.0.vm05.stdout:7/725: mkdir d18/d66/d25/de5 0 2026-03-10T08:56:04.706 INFO:tasks.workunit.client.0.vm05.stdout:6/777: mknod d4/d2d/d51/c10c 0 2026-03-10T08:56:04.709 INFO:tasks.workunit.client.0.vm05.stdout:7/726: fsync d18/d66/d78/f8b 0 2026-03-10T08:56:04.712 INFO:tasks.workunit.client.0.vm05.stdout:6/778: rmdir d4/d8d 39 2026-03-10T08:56:04.724 INFO:tasks.workunit.client.0.vm05.stdout:6/779: rename d4/d7/d10/d15/d1b/d22/f5c to d4/f10d 0 2026-03-10T08:56:04.726 INFO:tasks.workunit.client.0.vm05.stdout:6/780: creat d4/d7/d10/d1a/d8c/f10e x:0 0 0 2026-03-10T08:56:04.731 
INFO:tasks.workunit.client.1.vm08.stdout:9/914: dwrite d2/dd/d15/d1e/d94/fd7 [0,4194304] 0 2026-03-10T08:56:04.731 INFO:tasks.workunit.client.0.vm05.stdout:2/685: mknod d0/cca 0 2026-03-10T08:56:04.740 INFO:tasks.workunit.client.0.vm05.stdout:9/731: dwrite d6/d15/d35/f38 [8388608,4194304] 0 2026-03-10T08:56:04.741 INFO:tasks.workunit.client.0.vm05.stdout:9/732: write d6/f7 [8600945,18951] 0 2026-03-10T08:56:04.742 INFO:tasks.workunit.client.0.vm05.stdout:9/733: write d6/d12/d3a/d48/fa8 [832952,17172] 0 2026-03-10T08:56:04.749 INFO:tasks.workunit.client.1.vm08.stdout:9/915: sync 2026-03-10T08:56:04.750 INFO:tasks.workunit.client.1.vm08.stdout:9/916: chown d2/d54/d8e/da6/dd0/f59 32580167 1 2026-03-10T08:56:04.754 INFO:tasks.workunit.client.0.vm05.stdout:2/686: mkdir d0/d9/d1e/d20/d21/d45/d4b/d8d/dcb 0 2026-03-10T08:56:04.765 INFO:tasks.workunit.client.1.vm08.stdout:5/851: write d0/d11/d3e/d45/f5b [2557117,83310] 0 2026-03-10T08:56:04.766 INFO:tasks.workunit.client.1.vm08.stdout:0/920: rename d6/dd/d13/d17/d1f/d20/d2f/c30 to d6/dd/d13/d17/d1f/d20/c13b 0 2026-03-10T08:56:04.770 INFO:tasks.workunit.client.0.vm05.stdout:5/683: dread d5/d86/d24/d2c/d41/fe4 [0,4194304] 0 2026-03-10T08:56:04.772 INFO:tasks.workunit.client.0.vm05.stdout:9/734: symlink d6/d15/d35/ddf/lf3 0 2026-03-10T08:56:04.773 INFO:tasks.workunit.client.0.vm05.stdout:9/735: readlink d6/d27/l39 0 2026-03-10T08:56:04.778 INFO:tasks.workunit.client.0.vm05.stdout:3/815: dwrite d9/d2b/de7/df1/d43/d71/fac [0,4194304] 0 2026-03-10T08:56:04.792 INFO:tasks.workunit.client.1.vm08.stdout:1/934: rename d1/da/d20/d91/d83/df4/d113 to d1/da/de/d24/d3d/d40/d5b/d141 0 2026-03-10T08:56:04.794 INFO:tasks.workunit.client.0.vm05.stdout:2/687: mkdir d0/d55/db8/dcc 0 2026-03-10T08:56:04.799 INFO:tasks.workunit.client.1.vm08.stdout:7/980: dwrite d0/d14/d43/f6e [4194304,4194304] 0 2026-03-10T08:56:04.807 INFO:tasks.workunit.client.1.vm08.stdout:9/917: getdents d2/dd/d11c 0 2026-03-10T08:56:04.808 
INFO:tasks.workunit.client.1.vm08.stdout:9/918: chown d2/dd/d15/d1e/d25/d32/d5c/dc2/d101 6047987 1 2026-03-10T08:56:04.815 INFO:tasks.workunit.client.0.vm05.stdout:8/762: write d2/dd/d2c/d2e/d31/d3e/f95 [597269,52592] 0 2026-03-10T08:56:04.834 INFO:tasks.workunit.client.0.vm05.stdout:1/811: dwrite dd/d10/d18/d20/f89 [0,4194304] 0 2026-03-10T08:56:04.838 INFO:tasks.workunit.client.0.vm05.stdout:1/812: chown dd/d10/d18/d2d/d5c/dac 8297886 1 2026-03-10T08:56:04.861 INFO:tasks.workunit.client.1.vm08.stdout:3/904: rename d4/d6f/l118 to d4/d15/d8/d2c/d9b/d79/d8f/l134 0 2026-03-10T08:56:04.862 INFO:tasks.workunit.client.1.vm08.stdout:1/935: chown d1/da/d20/c120 612096 1 2026-03-10T08:56:04.862 INFO:tasks.workunit.client.1.vm08.stdout:7/981: mkdir d0/d11/d1f/d29/d36/d131 0 2026-03-10T08:56:04.863 INFO:tasks.workunit.client.0.vm05.stdout:9/736: truncate d6/d19/d2a/d4a/f88 598879 0 2026-03-10T08:56:04.868 INFO:tasks.workunit.client.0.vm05.stdout:3/816: creat d9/d2b/d53/ffb x:0 0 0 2026-03-10T08:56:04.900 INFO:tasks.workunit.client.1.vm08.stdout:9/919: truncate d2/f77 96682 0 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:2/688: unlink d0/d9/d7f/lc6 0 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:4/758: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/f60 [472455,22236] 0 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:0/751: dwrite df/dd8/f71 [0,4194304] 0 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:0/752: chown df/dd8/d90 3013 1 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:0/753: chown df/dd8/d67/f80 1780 1 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:0/754: dwrite df/d1f/fd9 [0,4194304] 0 2026-03-10T08:56:04.901 INFO:tasks.workunit.client.0.vm05.stdout:6/781: getdents d4 0 2026-03-10T08:56:04.909 INFO:tasks.workunit.client.0.vm05.stdout:7/727: dwrite d18/d38/dc7/de3/d74/f86 [0,4194304] 0 2026-03-10T08:56:04.923 INFO:tasks.workunit.client.1.vm08.stdout:7/982: unlink 
d0/d11/d1f/d29/d3d/d89/fee 0 2026-03-10T08:56:04.930 INFO:tasks.workunit.client.0.vm05.stdout:0/755: dread df/d1f/d85/d19/d47/d84/d8a/f93 [0,4194304] 0 2026-03-10T08:56:04.933 INFO:tasks.workunit.client.1.vm08.stdout:4/982: rename d5/d23/d36/d99/db2/d5d/dae/ddf/fbe to d5/d23/d36/d99/db2/d5d/de3/df8/f15e 0 2026-03-10T08:56:04.936 INFO:tasks.workunit.client.0.vm05.stdout:5/684: mkdir d5/d86/d24/d84/df7 0 2026-03-10T08:56:04.946 INFO:tasks.workunit.client.0.vm05.stdout:3/817: chown d9/d4d/cb3 0 1 2026-03-10T08:56:04.947 INFO:tasks.workunit.client.0.vm05.stdout:2/689: truncate d0/d9/d1e/d20/d24/fb3 959020 0 2026-03-10T08:56:04.948 INFO:tasks.workunit.client.0.vm05.stdout:2/690: chown d0/d9 1339395 1 2026-03-10T08:56:04.948 INFO:tasks.workunit.client.0.vm05.stdout:4/759: rename d0/d2e/d42/d45/d4a/d36/d37/f97 to d0/d2e/d71/de3/ff8 0 2026-03-10T08:56:04.949 INFO:tasks.workunit.client.0.vm05.stdout:3/818: chown d9/d2b/de7/df1/d43/d71/le0 8587862 1 2026-03-10T08:56:04.959 INFO:tasks.workunit.client.1.vm08.stdout:6/938: rename d9/d10/d1e/d92/fcc to d9/d10/f139 0 2026-03-10T08:56:04.967 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:04 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:04.967 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:04 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:04.967 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:04 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:04.967 INFO:tasks.workunit.client.1.vm08.stdout:5/852: rename d0/d11/d27/f61 to d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/f109 0 2026-03-10T08:56:04.976 INFO:tasks.workunit.client.1.vm08.stdout:3/905: rename d4/d15/d8/d71/fce to d4/d6f/d85/df1/f135 0 2026-03-10T08:56:04.976 INFO:tasks.workunit.client.1.vm08.stdout:5/853: symlink d0/d11/d27/d68/d7c/d4b/d4e/da5/l10a 0 
2026-03-10T08:56:04.977 INFO:tasks.workunit.client.1.vm08.stdout:5/854: fsync d0/d11/d27/d68/d7c/f42 0 2026-03-10T08:56:04.979 INFO:tasks.workunit.client.1.vm08.stdout:3/906: rmdir d4/d15/d8/d2c/d9b/d79/d20 39 2026-03-10T08:56:04.994 INFO:tasks.workunit.client.0.vm05.stdout:0/756: fdatasync df/d1f/d85/d19/d47/d84/dae/fc9 0 2026-03-10T08:56:04.994 INFO:tasks.workunit.client.0.vm05.stdout:4/760: dread d0/d1d/f22 [0,4194304] 0 2026-03-10T08:56:04.999 INFO:tasks.workunit.client.1.vm08.stdout:5/855: getdents d0/d11/d3e/d45 0 2026-03-10T08:56:05.001 INFO:tasks.workunit.client.1.vm08.stdout:5/856: dread d0/d11/d27/d68/d7c/d4b/d4e/f89 [0,4194304] 0 2026-03-10T08:56:05.006 INFO:tasks.workunit.client.0.vm05.stdout:2/691: symlink d0/d9/d7f/db4/lcd 0 2026-03-10T08:56:05.014 INFO:tasks.workunit.client.0.vm05.stdout:9/737: creat d6/ff4 x:0 0 0 2026-03-10T08:56:05.026 INFO:tasks.workunit.client.0.vm05.stdout:0/757: mkdir df/d1f/dc6/de4 0 2026-03-10T08:56:05.029 INFO:tasks.workunit.client.1.vm08.stdout:6/939: sync 2026-03-10T08:56:05.029 INFO:tasks.workunit.client.1.vm08.stdout:6/940: readlink d9/dc/lbf 0 2026-03-10T08:56:05.032 INFO:tasks.workunit.client.1.vm08.stdout:3/907: link d4/d15/d8/d2c/d9b/d79/d20/fe5 d4/d6f/d85/dd3/f136 0 2026-03-10T08:56:05.041 INFO:tasks.workunit.client.0.vm05.stdout:0/758: symlink df/d1f/d85/d19/d55/le5 0 2026-03-10T08:56:05.041 INFO:tasks.workunit.client.1.vm08.stdout:6/941: read d9/d10/f8c [3988193,46883] 0 2026-03-10T08:56:05.042 INFO:tasks.workunit.client.1.vm08.stdout:6/942: chown d9/dc/d11/d106 1227137 1 2026-03-10T08:56:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:04 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:04 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:04 vm08.local ceph-mon[57559]: from='mgr.24459 
192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:05.060 INFO:tasks.workunit.client.0.vm05.stdout:4/761: getdents d0/d2e/d42/d45/d4a 0 2026-03-10T08:56:05.066 INFO:tasks.workunit.client.1.vm08.stdout:6/943: creat d9/d10/d1e/d7b/f13a x:0 0 0 2026-03-10T08:56:05.075 INFO:tasks.workunit.client.1.vm08.stdout:0/921: write d6/dd/d13/d17/f1d [397949,85957] 0 2026-03-10T08:56:05.078 INFO:tasks.workunit.client.0.vm05.stdout:9/738: link d6/f7f d6/d19/ff5 0 2026-03-10T08:56:05.079 INFO:tasks.workunit.client.0.vm05.stdout:9/739: stat d6/d15/d37/c46 0 2026-03-10T08:56:05.079 INFO:tasks.workunit.client.0.vm05.stdout:1/813: write dd/d10/d18/d20/d52/d80/fa5 [97242,62031] 0 2026-03-10T08:56:05.092 INFO:tasks.workunit.client.1.vm08.stdout:9/920: dwrite d2/dd/f2e [0,4194304] 0 2026-03-10T08:56:05.098 INFO:tasks.workunit.client.1.vm08.stdout:7/983: dwrite d0/d11/d4a/d95/fa7 [8388608,4194304] 0 2026-03-10T08:56:05.098 INFO:tasks.workunit.client.1.vm08.stdout:4/983: dwrite d5/f95 [0,4194304] 0 2026-03-10T08:56:05.106 INFO:tasks.workunit.client.1.vm08.stdout:6/944: mknod d9/d50/de9/dea/dfc/c13b 0 2026-03-10T08:56:05.106 INFO:tasks.workunit.client.1.vm08.stdout:6/945: chown d9/dc/d11/f73 724607 1 2026-03-10T08:56:05.135 INFO:tasks.workunit.client.0.vm05.stdout:6/782: write d4/d2d/d51/d87/fdb [663164,61812] 0 2026-03-10T08:56:05.135 INFO:tasks.workunit.client.0.vm05.stdout:4/762: rmdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79 39 2026-03-10T08:56:05.139 INFO:tasks.workunit.client.1.vm08.stdout:9/921: rmdir d2/d41/d4c/d66/d82 39 2026-03-10T08:56:05.142 INFO:tasks.workunit.client.1.vm08.stdout:7/984: rmdir d0/d11/d1f/d2c 39 2026-03-10T08:56:05.143 INFO:tasks.workunit.client.1.vm08.stdout:4/984: rename d5/d23/d49/f4d to d5/d23/d36/d99/db2/d5a/d69/d11b/dea/f15f 0 2026-03-10T08:56:05.144 INFO:tasks.workunit.client.1.vm08.stdout:4/985: stat d5/d23/d109/c12e 0 2026-03-10T08:56:05.145 
INFO:tasks.workunit.client.0.vm05.stdout:1/814: creat dd/d10/d18/d2d/d51/d58/f11b x:0 0 0 2026-03-10T08:56:05.145 INFO:tasks.workunit.client.0.vm05.stdout:0/759: mkdir df/d1f/dcd/de6 0 2026-03-10T08:56:05.145 INFO:tasks.workunit.client.1.vm08.stdout:6/946: symlink d9/d10/d1e/d7b/l13c 0 2026-03-10T08:56:05.148 INFO:tasks.workunit.client.1.vm08.stdout:7/985: dwrite d0/d11/d1f/df0/df4/f11d [0,4194304] 0 2026-03-10T08:56:05.148 INFO:tasks.workunit.client.1.vm08.stdout:1/936: creat d1/da/d20/d91/d83/df4/d4e/f142 x:0 0 0 2026-03-10T08:56:05.152 INFO:tasks.workunit.client.0.vm05.stdout:4/763: dread d0/d2c/d6a/fd8 [0,4194304] 0 2026-03-10T08:56:05.160 INFO:tasks.workunit.client.0.vm05.stdout:1/815: dread dd/d21/d37/f72 [0,4194304] 0 2026-03-10T08:56:05.166 INFO:tasks.workunit.client.0.vm05.stdout:4/764: dread d0/d2e/d42/d45/d4a/d36/d37/f68 [0,4194304] 0 2026-03-10T08:56:05.177 INFO:tasks.workunit.client.1.vm08.stdout:0/922: unlink d6/dd/d13/d8f/fd9 0 2026-03-10T08:56:05.177 INFO:tasks.workunit.client.1.vm08.stdout:5/857: write d0/d11/d27/d50/f9d [5080057,65148] 0 2026-03-10T08:56:05.177 INFO:tasks.workunit.client.0.vm05.stdout:8/763: truncate d2/dd/d2c/d2e/d31/d3e/fe3 3620062 0 2026-03-10T08:56:05.177 INFO:tasks.workunit.client.0.vm05.stdout:5/685: write d5/f9 [11154,70449] 0 2026-03-10T08:56:05.180 INFO:tasks.workunit.client.0.vm05.stdout:9/740: mkdir d6/df6 0 2026-03-10T08:56:05.183 INFO:tasks.workunit.client.0.vm05.stdout:7/728: dwrite d18/d38/d43/d6e/fd1 [0,4194304] 0 2026-03-10T08:56:05.184 INFO:tasks.workunit.client.0.vm05.stdout:3/819: dwrite d9/d8f/d55/f8c [0,4194304] 0 2026-03-10T08:56:05.186 INFO:tasks.workunit.client.0.vm05.stdout:3/820: chown d9/le 479068515 1 2026-03-10T08:56:05.189 INFO:tasks.workunit.client.1.vm08.stdout:3/908: dwrite d4/d15/fda [0,4194304] 0 2026-03-10T08:56:05.190 INFO:tasks.workunit.client.0.vm05.stdout:2/692: dwrite d0/d9/f4e [0,4194304] 0 2026-03-10T08:56:05.205 INFO:tasks.workunit.client.0.vm05.stdout:7/729: dread d18/d1b/f50 
[0,4194304] 0 2026-03-10T08:56:05.205 INFO:tasks.workunit.client.0.vm05.stdout:2/693: read d0/d55/f60 [527533,99034] 0 2026-03-10T08:56:05.213 INFO:tasks.workunit.client.0.vm05.stdout:0/760: read - df/d1f/d85/d19/d39/d4d/fc7 zero size 2026-03-10T08:56:05.215 INFO:tasks.workunit.client.0.vm05.stdout:6/783: symlink d4/d2d/d51/d87/da5/de9/d10a/l10f 0 2026-03-10T08:56:05.231 INFO:tasks.workunit.client.0.vm05.stdout:9/741: rmdir d6/d19/d2c/d84 39 2026-03-10T08:56:05.244 INFO:tasks.workunit.client.0.vm05.stdout:3/821: mknod d9/d8f/d50/d5f/dd8/cfc 0 2026-03-10T08:56:05.252 INFO:tasks.workunit.client.0.vm05.stdout:0/761: truncate df/d1f/d85/f24 998184 0 2026-03-10T08:56:05.261 INFO:tasks.workunit.client.0.vm05.stdout:9/742: fsync d6/d15/d3c/d4b/d90/fe0 0 2026-03-10T08:56:05.264 INFO:tasks.workunit.client.0.vm05.stdout:3/822: truncate d9/d2b/fe9 970629 0 2026-03-10T08:56:05.266 INFO:tasks.workunit.client.0.vm05.stdout:7/730: symlink d18/d1b/le6 0 2026-03-10T08:56:05.267 INFO:tasks.workunit.client.0.vm05.stdout:2/694: symlink d0/d9/d1e/d20/d21/d45/lce 0 2026-03-10T08:56:05.269 INFO:tasks.workunit.client.0.vm05.stdout:0/762: truncate df/fab 1966933 0 2026-03-10T08:56:05.272 INFO:tasks.workunit.client.1.vm08.stdout:0/923: mknod d6/dd/d13/d17/d1f/d2d/d85/d93/c13c 0 2026-03-10T08:56:05.293 INFO:tasks.workunit.client.1.vm08.stdout:0/924: dread - d6/dd/d13/d17/d1f/d20/d2f/d57/d109/f12e zero size 2026-03-10T08:56:05.293 INFO:tasks.workunit.client.1.vm08.stdout:3/909: truncate d4/d15/d8/d2c/d9b/d79/d8f/fa1 935924 0 2026-03-10T08:56:05.293 INFO:tasks.workunit.client.0.vm05.stdout:8/764: creat d2/dd/d2c/d2e/d31/d3e/f109 x:0 0 0 2026-03-10T08:56:05.293 INFO:tasks.workunit.client.0.vm05.stdout:2/695: dwrite d0/d9/d1e/d20/d21/f41 [0,4194304] 0 2026-03-10T08:56:05.293 INFO:tasks.workunit.client.0.vm05.stdout:8/765: dwrite d2/dd/d2c/d2e/d31/d3e/f109 [0,4194304] 0 2026-03-10T08:56:05.293 INFO:tasks.workunit.client.0.vm05.stdout:9/743: creat d6/d15/d3c/ff7 x:0 0 0 2026-03-10T08:56:05.296 
INFO:tasks.workunit.client.0.vm05.stdout:0/763: fdatasync df/d1f/d85/d19/d47/f8f 0 2026-03-10T08:56:05.315 INFO:tasks.workunit.client.0.vm05.stdout:1/816: link dd/d10/d18/d2d/d5c/dac/lcf dd/d10/d19/d4d/l11c 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:8/766: mkdir d2/db/d1f/d67/d8d/d10a 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:1/817: read dd/d10/d18/d2d/d5c/f100 [29469,122609] 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:7/731: rename d18/d66/d79 to d18/d66/d25/d2e/de7 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:3/823: getdents d9/d2b/de7/df1/d43/da3 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:8/767: read d2/dd/f3f [122992,127700] 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:1/818: write dd/d21/d37/f72 [3430474,106998] 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:7/732: mkdir d18/d38/dc7/de3/d9c/de8 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:7/986: mknod d0/d14/d43/de7/d12f/c132 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:4/986: creat d5/d23/d36/d99/db2/d5d/de3/f160 x:0 0 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:9/922: getdents d2/d54/d8e/da6/dd0/dc8 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:3/910: mknod d4/d15/d8/d2c/d9b/d119/c137 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:1/937: getdents d1/da 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:1/938: readlink d1/da/d20/d3f/l70 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:7/987: mknod d0/d11/d4a/d5e/d120/c133 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:9/923: dwrite d2/dd/d15/d1e/d94/fd7 [0,4194304] 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:1/939: mknod d1/da/de/dcf/c143 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:7/988: symlink 
d0/d14/d2f/l134 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:7/989: chown d0/d51/cf2 1167698696 1 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.1.vm08.stdout:9/924: symlink d2/dd/d11c/de4/l136 0 2026-03-10T08:56:05.316 INFO:tasks.workunit.client.0.vm05.stdout:7/733: chown d18/d38/dc7/de3/d9c/dac/f4c 2504 1 2026-03-10T08:56:05.325 INFO:tasks.workunit.client.0.vm05.stdout:4/765: sync 2026-03-10T08:56:05.326 INFO:tasks.workunit.client.1.vm08.stdout:0/925: dread d6/fe [0,4194304] 0 2026-03-10T08:56:05.327 INFO:tasks.workunit.client.0.vm05.stdout:9/744: getdents d6/d19/d2a/d4a 0 2026-03-10T08:56:05.328 INFO:tasks.workunit.client.1.vm08.stdout:9/925: symlink d2/dd/d15/d1e/d25/d32/d5c/dc2/l137 0 2026-03-10T08:56:05.328 INFO:tasks.workunit.client.0.vm05.stdout:3/824: rename d9/d8f/d50/d5f/d7b/lb7 to d9/d2b/de7/df1/lfd 0 2026-03-10T08:56:05.329 INFO:tasks.workunit.client.0.vm05.stdout:8/768: creat d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/f10b x:0 0 0 2026-03-10T08:56:05.331 INFO:tasks.workunit.client.0.vm05.stdout:2/696: read d0/d9/d7f/d8f/f37 [660583,75507] 0 2026-03-10T08:56:05.334 INFO:tasks.workunit.client.0.vm05.stdout:2/697: readlink d0/d9/d89/da3/lb0 0 2026-03-10T08:56:05.338 INFO:tasks.workunit.client.1.vm08.stdout:0/926: rmdir d6/dd/d13/d17/d1f/d20/d2f/d57/d109 39 2026-03-10T08:56:05.344 INFO:tasks.workunit.client.0.vm05.stdout:2/698: dwrite d0/d9/d1e/d20/d21/d45/d4b/d75/fc9 [0,4194304] 0 2026-03-10T08:56:05.344 INFO:tasks.workunit.client.1.vm08.stdout:9/926: chown d2/d41/d4c/d66/d82/lc6 5321 1 2026-03-10T08:56:05.344 INFO:tasks.workunit.client.1.vm08.stdout:3/911: dread d4/d15/d8/d2c/d9b/d79/d8f/de2/f113 [0,4194304] 0 2026-03-10T08:56:05.351 INFO:tasks.workunit.client.0.vm05.stdout:1/819: rmdir dd/d10/d18/d2d/d51 39 2026-03-10T08:56:05.352 INFO:tasks.workunit.client.0.vm05.stdout:4/766: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/ddd/ff9 x:0 0 0 2026-03-10T08:56:05.354 INFO:tasks.workunit.client.1.vm08.stdout:4/987: dread 
d5/d23/d36/d99/db2/d5a/ddb/fe9 [0,4194304] 0 2026-03-10T08:56:05.356 INFO:tasks.workunit.client.0.vm05.stdout:9/745: fdatasync d6/d15/d3c/d4b/f5b 0 2026-03-10T08:56:05.364 INFO:tasks.workunit.client.1.vm08.stdout:0/927: write d6/dd/d13/d61/dc7/fdf [679198,41245] 0 2026-03-10T08:56:05.364 INFO:tasks.workunit.client.0.vm05.stdout:3/825: rmdir d9/d8f/d50/d5f/d7b 39 2026-03-10T08:56:05.364 INFO:tasks.workunit.client.0.vm05.stdout:2/699: truncate d0/d9/d1e/d20/d21/f77 1211439 0 2026-03-10T08:56:05.367 INFO:tasks.workunit.client.0.vm05.stdout:4/767: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/ded to d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dae/dee/dfa 0 2026-03-10T08:56:05.367 INFO:tasks.workunit.client.0.vm05.stdout:8/769: dread d2/fa [4194304,4194304] 0 2026-03-10T08:56:05.370 INFO:tasks.workunit.client.0.vm05.stdout:3/826: creat d9/d2b/de7/df1/dd6/ffe x:0 0 0 2026-03-10T08:56:05.376 INFO:tasks.workunit.client.0.vm05.stdout:8/770: creat d2/db/da4/f10c x:0 0 0 2026-03-10T08:56:05.378 INFO:tasks.workunit.client.0.vm05.stdout:2/700: getdents d0/d55/d9f 0 2026-03-10T08:56:05.379 INFO:tasks.workunit.client.0.vm05.stdout:3/827: dread - d9/d2b/d53/fa7 zero size 2026-03-10T08:56:05.382 INFO:tasks.workunit.client.0.vm05.stdout:4/768: creat d0/d2e/d42/d45/d4a/d36/dbe/ffb x:0 0 0 2026-03-10T08:56:05.383 INFO:tasks.workunit.client.1.vm08.stdout:4/988: sync 2026-03-10T08:56:05.384 INFO:tasks.workunit.client.0.vm05.stdout:1/820: dread dd/d21/d37/d7c/dab/db7/fc0 [4194304,4194304] 0 2026-03-10T08:56:05.388 INFO:tasks.workunit.client.0.vm05.stdout:3/828: creat d9/d2b/de7/fff x:0 0 0 2026-03-10T08:56:05.392 INFO:tasks.workunit.client.1.vm08.stdout:4/989: dread - d5/f10f zero size 2026-03-10T08:56:05.392 INFO:tasks.workunit.client.0.vm05.stdout:2/701: rmdir d0/d9/d1e/d20/d21/d45/d4b 39 2026-03-10T08:56:05.392 INFO:tasks.workunit.client.0.vm05.stdout:8/771: creat d2/db/d28/f10d x:0 0 0 2026-03-10T08:56:05.395 INFO:tasks.workunit.client.0.vm05.stdout:3/829: symlink d9/d2b/d2f/d57/dd0/l100 0 
2026-03-10T08:56:05.397 INFO:tasks.workunit.client.1.vm08.stdout:4/990: symlink d5/d23/d36/d99/db2/d5a/d69/d11b/def/df2/l161 0 2026-03-10T08:56:05.407 INFO:tasks.workunit.client.0.vm05.stdout:2/702: mknod d0/d9/d7f/d8f/ccf 0 2026-03-10T08:56:05.410 INFO:tasks.workunit.client.0.vm05.stdout:1/821: link dd/d10/d18/d2d/f10e dd/d10/d18/d20/d52/ddc/f11d 0 2026-03-10T08:56:05.417 INFO:tasks.workunit.client.0.vm05.stdout:8/772: dread d2/dd/f1a [0,4194304] 0 2026-03-10T08:56:05.420 INFO:tasks.workunit.client.1.vm08.stdout:5/858: dread d0/d11/d27/d68/d7c/de5/f57 [0,4194304] 0 2026-03-10T08:56:05.423 INFO:tasks.workunit.client.0.vm05.stdout:8/773: read d2/dd/d2c/f86 [324006,74364] 0 2026-03-10T08:56:05.425 INFO:tasks.workunit.client.0.vm05.stdout:2/703: unlink d0/d9/d7f/d8f/c3f 0 2026-03-10T08:56:05.426 INFO:tasks.workunit.client.0.vm05.stdout:2/704: stat d0/d9/d7f/d8f/fab 0 2026-03-10T08:56:05.440 INFO:tasks.workunit.client.0.vm05.stdout:1/822: symlink dd/d21/d37/d7c/d60/l11e 0 2026-03-10T08:56:05.444 INFO:tasks.workunit.client.0.vm05.stdout:5/686: write d5/d86/d21/fd9 [420953,128966] 0 2026-03-10T08:56:05.452 INFO:tasks.workunit.client.0.vm05.stdout:3/830: creat d9/d2b/f101 x:0 0 0 2026-03-10T08:56:05.457 INFO:tasks.workunit.client.1.vm08.stdout:4/991: rename d5/d23/d36/d99/db2/d5d/de3/df8/f15e to d5/d23/d36/d99/f162 0 2026-03-10T08:56:05.466 INFO:tasks.workunit.client.0.vm05.stdout:1/823: symlink dd/d21/d37/d7c/dab/db7/dde/l11f 0 2026-03-10T08:56:05.473 INFO:tasks.workunit.client.0.vm05.stdout:8/774: link d2/dd/d2c/d2e/d31/d4f/d7b/fd8 d2/db/d1f/d67/d8d/f10e 0 2026-03-10T08:56:05.473 INFO:tasks.workunit.client.0.vm05.stdout:1/824: creat dd/d10/d18/d2d/d51/d58/d71/d73/d8b/f120 x:0 0 0 2026-03-10T08:56:05.475 INFO:tasks.workunit.client.0.vm05.stdout:5/687: sync 2026-03-10T08:56:05.481 INFO:tasks.workunit.client.1.vm08.stdout:4/992: dread d5/d23/d36/d99/db2/d5d/f66 [0,4194304] 0 2026-03-10T08:56:05.489 INFO:tasks.workunit.client.0.vm05.stdout:1/825: unlink 
dd/d10/d18/d2d/d51/f6e 0 2026-03-10T08:56:05.493 INFO:tasks.workunit.client.0.vm05.stdout:8/775: rmdir d2/dd/d2c/d2e/d31/d4f/da3 39 2026-03-10T08:56:05.515 INFO:tasks.workunit.client.1.vm08.stdout:6/947: write d9/d10/d1e/d7e/f119 [503500,18898] 0 2026-03-10T08:56:05.528 INFO:tasks.workunit.client.0.vm05.stdout:6/784: dwrite d4/d7/d10/d15/d20/f64 [0,4194304] 0 2026-03-10T08:56:05.544 INFO:tasks.workunit.client.0.vm05.stdout:0/764: dwrite df/d1f/d85/fb5 [0,4194304] 0 2026-03-10T08:56:05.550 INFO:tasks.workunit.client.0.vm05.stdout:5/688: symlink d5/d86/d24/d2c/d41/lf8 0 2026-03-10T08:56:05.552 INFO:tasks.workunit.client.1.vm08.stdout:6/948: creat d9/dc/d11/d23/d2c/d41/f13d x:0 0 0 2026-03-10T08:56:05.552 INFO:tasks.workunit.client.1.vm08.stdout:6/949: stat d9/d10/d1e/d7b/f13a 0 2026-03-10T08:56:05.553 INFO:tasks.workunit.client.0.vm05.stdout:5/689: chown d5/df/d37/d68/fd7 975 1 2026-03-10T08:56:05.575 INFO:tasks.workunit.client.1.vm08.stdout:4/993: symlink d5/d23/d36/d99/l163 0 2026-03-10T08:56:05.584 INFO:tasks.workunit.client.1.vm08.stdout:6/950: mkdir d9/d13/d13e 0 2026-03-10T08:56:05.585 INFO:tasks.workunit.client.1.vm08.stdout:4/994: mknod d5/c164 0 2026-03-10T08:56:05.586 INFO:tasks.workunit.client.0.vm05.stdout:8/776: mknod d2/dd/d2c/d2e/d31/d3e/c10f 0 2026-03-10T08:56:05.586 INFO:tasks.workunit.client.0.vm05.stdout:6/785: mknod d4/d7/d10/d15/d20/c110 0 2026-03-10T08:56:05.588 INFO:tasks.workunit.client.1.vm08.stdout:7/990: dwrite d0/d11/f11e [0,4194304] 0 2026-03-10T08:56:05.596 INFO:tasks.workunit.client.1.vm08.stdout:1/940: dwrite d1/da/d20/d91/d83/f100 [0,4194304] 0 2026-03-10T08:56:05.597 INFO:tasks.workunit.client.1.vm08.stdout:6/951: mkdir d9/dc/d11/d23/d2c/d81/d63/d13f 0 2026-03-10T08:56:05.599 INFO:tasks.workunit.client.0.vm05.stdout:5/690: creat d5/df/d37/d68/ff9 x:0 0 0 2026-03-10T08:56:05.602 INFO:tasks.workunit.client.1.vm08.stdout:9/927: write d2/f86 [576440,58993] 0 2026-03-10T08:56:05.613 INFO:tasks.workunit.client.1.vm08.stdout:7/991: rename 
d0/d11/d1f/d29/d3d/d89/fe4 to d0/d11/d1f/d29/d3b/d80/dd3/de1/f135 0 2026-03-10T08:56:05.620 INFO:tasks.workunit.client.1.vm08.stdout:7/992: stat d0/d14/d43/d9d/dbb 0 2026-03-10T08:56:05.631 INFO:tasks.workunit.client.1.vm08.stdout:9/928: symlink d2/d54/d8e/db7/l138 0 2026-03-10T08:56:05.631 INFO:tasks.workunit.client.1.vm08.stdout:4/995: sync 2026-03-10T08:56:05.632 INFO:tasks.workunit.client.1.vm08.stdout:7/993: fsync d0/d11/f6a 0 2026-03-10T08:56:05.634 INFO:tasks.workunit.client.0.vm05.stdout:8/777: creat d2/dd/d2c/da5/f110 x:0 0 0 2026-03-10T08:56:05.634 INFO:tasks.workunit.client.0.vm05.stdout:7/734: mkdir d18/d66/d25/d2e/d2f/d6d/dc1/dd4/de9 0 2026-03-10T08:56:05.635 INFO:tasks.workunit.client.1.vm08.stdout:4/996: mkdir d5/d23/d36/d99/d149/d165 0 2026-03-10T08:56:05.637 INFO:tasks.workunit.client.0.vm05.stdout:8/778: unlink d2/d45/l60 0 2026-03-10T08:56:05.638 INFO:tasks.workunit.client.1.vm08.stdout:4/997: dread d5/d23/d36/d99/db2/d5d/f129 [0,4194304] 0 2026-03-10T08:56:05.650 INFO:tasks.workunit.client.0.vm05.stdout:7/735: dwrite d18/d66/d25/d2e/d2f/d6d/fba [0,4194304] 0 2026-03-10T08:56:05.650 INFO:tasks.workunit.client.1.vm08.stdout:6/952: creat d9/d10/d1e/f140 x:0 0 0 2026-03-10T08:56:05.650 INFO:tasks.workunit.client.1.vm08.stdout:7/994: mknod d0/d11/d1f/d29/d36/d10c/c136 0 2026-03-10T08:56:05.657 INFO:tasks.workunit.client.0.vm05.stdout:7/736: dread d18/d38/d43/d6e/f76 [0,4194304] 0 2026-03-10T08:56:05.659 INFO:tasks.workunit.client.0.vm05.stdout:7/737: chown d18/d38/dc7/de3/d9c/de8 1 1 2026-03-10T08:56:05.668 INFO:tasks.workunit.client.1.vm08.stdout:9/929: dread d2/d54/d8e/da6/dd0/f3e [0,4194304] 0 2026-03-10T08:56:05.668 INFO:tasks.workunit.client.1.vm08.stdout:7/995: rename d0/d14/d43 to d0/d11/db2/d137 0 2026-03-10T08:56:05.668 INFO:tasks.workunit.client.1.vm08.stdout:9/930: chown d2/d54/d8e/da6/dd0/dc8/l128 1178487 1 2026-03-10T08:56:05.668 INFO:tasks.workunit.client.1.vm08.stdout:7/996: mkdir d0/d51/d138 0 2026-03-10T08:56:05.670 
INFO:tasks.workunit.client.1.vm08.stdout:7/997: creat d0/d11/d1f/d29/d36/d10c/f139 x:0 0 0 2026-03-10T08:56:05.678 INFO:tasks.workunit.client.1.vm08.stdout:3/912: write d4/d15/d8/d1d/f98 [1735730,9420] 0 2026-03-10T08:56:05.681 INFO:tasks.workunit.client.1.vm08.stdout:3/913: truncate d4/d15/d8/d1d/da8/fc9 324759 0 2026-03-10T08:56:05.683 INFO:tasks.workunit.client.1.vm08.stdout:3/914: chown d4/d15/d8/d2c/d9b/d79/d8f/de2/lea 238716 1 2026-03-10T08:56:05.684 INFO:tasks.workunit.client.0.vm05.stdout:7/738: mknod d18/d66/d25/de5/cea 0 2026-03-10T08:56:05.685 INFO:tasks.workunit.client.1.vm08.stdout:9/931: dread d2/d41/d4c/d66/fb0 [0,4194304] 0 2026-03-10T08:56:05.685 INFO:tasks.workunit.client.0.vm05.stdout:7/739: read - d18/d66/d25/d2e/d2f/fd8 zero size 2026-03-10T08:56:05.689 INFO:tasks.workunit.client.1.vm08.stdout:3/915: rename d4/d15/d8/d2c/d55/cd9 to d4/d15/d8/d2c/d9b/d79/d8f/de2/c138 0 2026-03-10T08:56:05.696 INFO:tasks.workunit.client.1.vm08.stdout:9/932: creat d2/dd/d15/d1e/d25/d32/dc4/f139 x:0 0 0 2026-03-10T08:56:05.696 INFO:tasks.workunit.client.0.vm05.stdout:7/740: rmdir d18/d66/d25/de5 39 2026-03-10T08:56:05.696 INFO:tasks.workunit.client.1.vm08.stdout:0/928: dwrite d6/dd/d13/d17/d1f/d2d/d85/f111 [0,4194304] 0 2026-03-10T08:56:05.699 INFO:tasks.workunit.client.0.vm05.stdout:9/746: dwrite d6/d19/d2c/d58/f6c [0,4194304] 0 2026-03-10T08:56:05.701 INFO:tasks.workunit.client.0.vm05.stdout:9/747: chown d6/d15/d3c/cd5 664520 1 2026-03-10T08:56:05.704 INFO:tasks.workunit.client.1.vm08.stdout:3/916: readlink d4/d15/d8/d2c/lfe 0 2026-03-10T08:56:05.726 INFO:tasks.workunit.client.0.vm05.stdout:4/769: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/fdb [338951,107792] 0 2026-03-10T08:56:05.734 INFO:tasks.workunit.client.0.vm05.stdout:3/831: write d9/d2b/d2f/fb9 [279839,58548] 0 2026-03-10T08:56:05.737 INFO:tasks.workunit.client.1.vm08.stdout:5/859: dwrite d0/d11/d18/faf [0,4194304] 0 2026-03-10T08:56:05.737 INFO:tasks.workunit.client.0.vm05.stdout:2/705: dwrite 
d0/d9/d1e/d20/d21/f3d [0,4194304] 0 2026-03-10T08:56:05.774 INFO:tasks.workunit.client.0.vm05.stdout:7/741: stat d18/d66/d78 0 2026-03-10T08:56:05.778 INFO:tasks.workunit.client.1.vm08.stdout:9/933: symlink d2/d54/d8e/db7/l13a 0 2026-03-10T08:56:05.780 INFO:tasks.workunit.client.0.vm05.stdout:1/826: dwrite fa [0,4194304] 0 2026-03-10T08:56:05.781 INFO:tasks.workunit.client.0.vm05.stdout:1/827: stat dd/d10/d19/f95 0 2026-03-10T08:56:05.806 INFO:tasks.workunit.client.0.vm05.stdout:0/765: write df/d1f/d85/d2b/d27/f4f [4226254,94222] 0 2026-03-10T08:56:05.812 INFO:tasks.workunit.client.1.vm08.stdout:5/860: unlink d0/d11/d27/d68/d7c/fd6 0 2026-03-10T08:56:05.815 INFO:tasks.workunit.client.0.vm05.stdout:3/832: mkdir d9/d2b/de7/d102 0 2026-03-10T08:56:05.827 INFO:tasks.workunit.client.1.vm08.stdout:5/861: creat d0/d11/d27/d100/f10b x:0 0 0 2026-03-10T08:56:05.828 INFO:tasks.workunit.client.1.vm08.stdout:1/941: dwrite d1/f65 [0,4194304] 0 2026-03-10T08:56:05.830 INFO:tasks.workunit.client.1.vm08.stdout:1/942: read d1/da/de/d24/d3d/ff0 [393500,46546] 0 2026-03-10T08:56:05.830 INFO:tasks.workunit.client.1.vm08.stdout:5/862: chown d0/d11/d27/d68/d7c/d4b/d4e/l59 1 1 2026-03-10T08:56:05.840 INFO:tasks.workunit.client.0.vm05.stdout:6/786: dwrite d4/d7/fab [0,4194304] 0 2026-03-10T08:56:05.852 INFO:tasks.workunit.client.1.vm08.stdout:6/953: rmdir d9/d10/d1e 39 2026-03-10T08:56:05.852 INFO:tasks.workunit.client.1.vm08.stdout:9/934: mkdir d2/dd/d15/d1e/d21/d118/d13b 0 2026-03-10T08:56:05.853 INFO:tasks.workunit.client.1.vm08.stdout:9/935: fdatasync d2/d41/d53/d103/f121 0 2026-03-10T08:56:05.854 INFO:tasks.workunit.client.1.vm08.stdout:4/998: write d5/f7e [3871713,52478] 0 2026-03-10T08:56:05.855 INFO:tasks.workunit.client.1.vm08.stdout:4/999: chown d5/d23/d49/d8f/da4/f145 13727 1 2026-03-10T08:56:05.868 INFO:tasks.workunit.client.0.vm05.stdout:7/742: rename d18/d66/d25/d2e/d2f/da0 to d18/d38/dc7/de3/d74/deb 0 2026-03-10T08:56:05.876 INFO:tasks.workunit.client.0.vm05.stdout:5/691: 
dwrite d5/fd [0,4194304] 0 2026-03-10T08:56:05.896 INFO:tasks.workunit.client.0.vm05.stdout:0/766: creat df/d1f/d95/fe7 x:0 0 0 2026-03-10T08:56:05.897 INFO:tasks.workunit.client.1.vm08.stdout:7/998: dwrite d0/d11/d1f/d29/d3d/d40/f38 [8388608,4194304] 0 2026-03-10T08:56:05.936 INFO:tasks.workunit.client.1.vm08.stdout:0/929: write d6/dd/d13/d17/ff8 [1020740,27063] 0 2026-03-10T08:56:05.937 INFO:tasks.workunit.client.1.vm08.stdout:3/917: write d4/d15/d8/d2c/d9b/d79/fef [386328,9968] 0 2026-03-10T08:56:05.938 INFO:tasks.workunit.client.1.vm08.stdout:7/999: dread d0/d11/d1f/d29/fcf [0,4194304] 0 2026-03-10T08:56:05.942 INFO:tasks.workunit.client.0.vm05.stdout:8/779: truncate d2/dd/d2c/d2e/d31/d3e/fe3 4267899 0 2026-03-10T08:56:05.942 INFO:tasks.workunit.client.0.vm05.stdout:9/748: creat d6/ff8 x:0 0 0 2026-03-10T08:56:05.948 INFO:tasks.workunit.client.0.vm05.stdout:7/743: fsync d18/d38/d43/d5c/fa7 0 2026-03-10T08:56:05.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:05 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:05.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:05 vm05.local ceph-mon[49713]: pgmap v11: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 26 MiB/s rd, 56 MiB/s wr, 163 op/s 2026-03-10T08:56:05.966 INFO:tasks.workunit.client.0.vm05.stdout:4/770: creat d0/d2e/ffc x:0 0 0 2026-03-10T08:56:05.980 INFO:tasks.workunit.client.0.vm05.stdout:5/692: symlink d5/d48/d64/d95/dac/lfa 0 2026-03-10T08:56:05.980 INFO:tasks.workunit.client.0.vm05.stdout:1/828: write dd/d10/d18/dd5/fbf [1820999,6027] 0 2026-03-10T08:56:05.980 INFO:tasks.workunit.client.0.vm05.stdout:5/693: chown d5/d86/d24/d84/fb0 219602 1 2026-03-10T08:56:05.987 INFO:tasks.workunit.client.1.vm08.stdout:5/863: dwrite d0/d1b/d67/f9b [0,4194304] 0 2026-03-10T08:56:05.990 INFO:tasks.workunit.client.1.vm08.stdout:1/943: write 
d1/da/de/d24/d3d/d40/d92/ff7 [1033403,28512] 0 2026-03-10T08:56:05.991 INFO:tasks.workunit.client.1.vm08.stdout:6/954: write d9/d10/f9d [968554,45341] 0 2026-03-10T08:56:06.001 INFO:tasks.workunit.client.1.vm08.stdout:9/936: dwrite d2/d41/d4c/d66/d82/fa8 [0,4194304] 0 2026-03-10T08:56:06.037 INFO:tasks.workunit.client.1.vm08.stdout:6/955: mkdir d9/dc/d84/d80/d141 0 2026-03-10T08:56:06.047 INFO:tasks.workunit.client.1.vm08.stdout:9/937: truncate d2/dd/d15/d1e/d39/d4e/f55 3496311 0 2026-03-10T08:56:06.048 INFO:tasks.workunit.client.1.vm08.stdout:9/938: stat d2/dd/d15/d1e/d25/d32/d5c/fab 0 2026-03-10T08:56:06.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:05 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:06.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:05 vm08.local ceph-mon[57559]: pgmap v11: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 26 MiB/s rd, 56 MiB/s wr, 163 op/s 2026-03-10T08:56:06.057 INFO:tasks.workunit.client.1.vm08.stdout:0/930: write d6/dd/d13/d17/d1f/d2d/f130 [223222,104178] 0 2026-03-10T08:56:06.060 INFO:tasks.workunit.client.0.vm05.stdout:0/767: dwrite df/d1f/d85/d19/d47/da3/fca [0,4194304] 0 2026-03-10T08:56:06.063 INFO:tasks.workunit.client.1.vm08.stdout:1/944: write d1/da/d18/d3b/d62/fc7 [749150,8193] 0 2026-03-10T08:56:06.078 INFO:tasks.workunit.client.1.vm08.stdout:5/864: symlink d0/d11/d27/d68/d7c/d4b/l10c 0 2026-03-10T08:56:06.082 INFO:tasks.workunit.client.1.vm08.stdout:3/918: rename d4/d15/d8/d1d/f98 to d4/d15/d8/d2c/d6d/dfa/f139 0 2026-03-10T08:56:06.083 INFO:tasks.workunit.client.1.vm08.stdout:3/919: write d4/d15/d8/d2c/d6d/dfa/d100/f10b [1883442,86979] 0 2026-03-10T08:56:06.101 INFO:tasks.workunit.client.1.vm08.stdout:1/945: mkdir d1/da/de/d24/d35/d6d/d116/d144 0 2026-03-10T08:56:06.102 INFO:tasks.workunit.client.1.vm08.stdout:1/946: stat 
d1/da/de/d5c/c129 0 2026-03-10T08:56:06.111 INFO:tasks.workunit.client.1.vm08.stdout:3/920: write d4/d15/d8/d2c/d9b/d79/d8f/de2/f113 [4930646,91141] 0 2026-03-10T08:56:06.125 INFO:tasks.workunit.client.1.vm08.stdout:0/931: creat d6/dd/d13/d17/d1f/d20/d2f/d57/d109/f13d x:0 0 0 2026-03-10T08:56:06.130 INFO:tasks.workunit.client.1.vm08.stdout:5/865: symlink d0/l10d 0 2026-03-10T08:56:06.132 INFO:tasks.workunit.client.1.vm08.stdout:3/921: rmdir d4/d15/d8/d1d 39 2026-03-10T08:56:06.139 INFO:tasks.workunit.client.1.vm08.stdout:3/922: dwrite d4/d6f/d85/dd3/d10d/f12d [0,4194304] 0 2026-03-10T08:56:06.140 INFO:tasks.workunit.client.1.vm08.stdout:3/923: chown d4/d15/d8/d2c/d9b/d79/cc5 1 1 2026-03-10T08:56:06.157 INFO:tasks.workunit.client.0.vm05.stdout:6/787: mkdir d4/d7/d10/d111 0 2026-03-10T08:56:06.160 INFO:tasks.workunit.client.1.vm08.stdout:1/947: dread - d1/da/de/d24/d3d/d40/d56/d7a/f12a zero size 2026-03-10T08:56:06.161 INFO:tasks.workunit.client.1.vm08.stdout:1/948: readlink d1/da/d18/d3b/lf9 0 2026-03-10T08:56:06.165 INFO:tasks.workunit.client.0.vm05.stdout:7/744: rmdir d18/d38/dc7/de3 39 2026-03-10T08:56:06.172 INFO:tasks.workunit.client.1.vm08.stdout:3/924: truncate d4/f106 563449 0 2026-03-10T08:56:06.174 INFO:tasks.workunit.client.0.vm05.stdout:1/829: symlink dd/d21/d37/d45/d8d/l121 0 2026-03-10T08:56:06.182 INFO:tasks.workunit.client.0.vm05.stdout:4/771: dwrite d0/d2c/f74 [0,4194304] 0 2026-03-10T08:56:06.183 INFO:tasks.workunit.client.0.vm05.stdout:4/772: chown d0/d2c 166521810 1 2026-03-10T08:56:06.185 INFO:tasks.workunit.client.0.vm05.stdout:5/694: fdatasync d5/d86/d21/f9e 0 2026-03-10T08:56:06.196 INFO:tasks.workunit.client.0.vm05.stdout:0/768: stat df/d1f/d85/d2b/d27/d32/d4e/d6a/fbf 0 2026-03-10T08:56:06.201 INFO:tasks.workunit.client.0.vm05.stdout:2/706: link d0/d9/d7f/d8f/d7a/cbd d0/d9/d1e/d20/d24/cd0 0 2026-03-10T08:56:06.215 INFO:tasks.workunit.client.0.vm05.stdout:9/749: symlink d6/d15/lf9 0 2026-03-10T08:56:06.227 
INFO:tasks.workunit.client.0.vm05.stdout:1/830: dwrite dd/d10/d19/f1d [0,4194304] 0 2026-03-10T08:56:06.240 INFO:tasks.workunit.client.0.vm05.stdout:4/773: write d0/d2e/d42/d45/d4a/f26 [4494118,56276] 0 2026-03-10T08:56:06.249 INFO:tasks.workunit.client.1.vm08.stdout:1/949: fsync d1/da/de/d24/d35/d6d/d82/f131 0 2026-03-10T08:56:06.251 INFO:tasks.workunit.client.0.vm05.stdout:3/833: getdents d9/d8f 0 2026-03-10T08:56:06.253 INFO:tasks.workunit.client.1.vm08.stdout:5/866: mkdir d0/d11/d27/d68/d7c/d4b/d10e 0 2026-03-10T08:56:06.254 INFO:tasks.workunit.client.0.vm05.stdout:2/707: symlink d0/d9/d7f/d8f/d6d/ld1 0 2026-03-10T08:56:06.254 INFO:tasks.workunit.client.0.vm05.stdout:2/708: chown d0/d55 0 1 2026-03-10T08:56:06.261 INFO:tasks.workunit.client.0.vm05.stdout:6/788: truncate d4/d2d/d7f/fc0 1199751 0 2026-03-10T08:56:06.263 INFO:tasks.workunit.client.0.vm05.stdout:0/769: dwrite df/d1f/d85/d2b/d27/d32/d4e/d6a/fbf [0,4194304] 0 2026-03-10T08:56:06.274 INFO:tasks.workunit.client.1.vm08.stdout:3/925: mkdir d4/d6f/d85/d13a 0 2026-03-10T08:56:06.276 INFO:tasks.workunit.client.0.vm05.stdout:8/780: creat d2/dd/d2c/d2e/d31/f111 x:0 0 0 2026-03-10T08:56:06.280 INFO:tasks.workunit.client.1.vm08.stdout:9/939: truncate d2/dd/d15/d1e/d39/d4e/f55 1143061 0 2026-03-10T08:56:06.290 INFO:tasks.workunit.client.0.vm05.stdout:9/750: dread d6/d15/fb4 [0,4194304] 0 2026-03-10T08:56:06.297 INFO:tasks.workunit.client.1.vm08.stdout:0/932: rename d6/dd/d13/d61/dc7 to d6/dd/d13/d17/d1f/d2d/d13e 0 2026-03-10T08:56:06.308 INFO:tasks.workunit.client.1.vm08.stdout:1/950: dread - d1/da/d18/d3a/da7/f12b zero size 2026-03-10T08:56:06.314 INFO:tasks.workunit.client.0.vm05.stdout:1/831: dwrite dd/d21/f3a [8388608,4194304] 0 2026-03-10T08:56:06.325 INFO:tasks.workunit.client.1.vm08.stdout:6/956: link d9/d10/d1e/d104/l111 d9/dc/d11/d23/d2c/d81/d63/dcf/l142 0 2026-03-10T08:56:06.327 INFO:tasks.workunit.client.0.vm05.stdout:4/774: mknod d0/d2e/d42/d45/cfd 0 2026-03-10T08:56:06.328 
INFO:tasks.workunit.client.0.vm05.stdout:4/775: chown d0/d2e/d42/d45/d4a/d36/dbe/dbf/dbd/de2/c65 4056 1 2026-03-10T08:56:06.329 INFO:tasks.workunit.client.0.vm05.stdout:4/776: chown d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dae/df5 17313294 1 2026-03-10T08:56:06.331 INFO:tasks.workunit.client.1.vm08.stdout:3/926: symlink d4/d6f/d85/dd3/l13b 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.1.vm08.stdout:0/933: mknod d6/dd/d13/d17/d1f/da3/c13f 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.1.vm08.stdout:5/867: creat d0/d11/d27/d68/d7c/df8/f10f x:0 0 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.1.vm08.stdout:6/957: creat d9/dc/d84/d80/d12c/f143 x:0 0 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.1.vm08.stdout:6/958: dread - d9/dc/d11/d23/f113 zero size 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:5/695: creat d5/d48/d64/dc4/ffb x:0 0 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:2/709: rename d0/d55/db8/lc7 to d0/d9/d89/ld2 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:6/789: symlink d4/d7/d10/d15/d1b/d22/l112 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:0/770: fdatasync df/d1f/d85/d19/f99 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:0/771: write df/d1f/d85/d19/d5b/fb0 [1195207,127512] 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:0/772: chown df/d1f/d85/d19/d55/fa9 113831 1 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:0/773: stat df/d1f/d85/d2b/d27/f60 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:9/751: mknod d6/d12/db2/cfa 0 2026-03-10T08:56:06.361 INFO:tasks.workunit.client.0.vm05.stdout:9/752: readlink d6/d15/d3c/d4b/d90/le3 0 2026-03-10T08:56:06.367 INFO:tasks.workunit.client.1.vm08.stdout:6/959: readlink d9/dc/d11/d23/d2c/d81/lee 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:4/777: dread - d0/d2e/d42/d45/d4a/d36/dbe/fc8 zero size 2026-03-10T08:56:06.400 
INFO:tasks.workunit.client.0.vm05.stdout:4/778: write d0/d2e/d71/ff4 [247148,54255] 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:4/779: chown d0/d1d 3 1 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:5/696: mknod d5/d48/cfc 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:0/774: chown df/d1f/d85/d19/d5b/f78 134 1 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:7/745: getdents d18/d38/d43/d5c/daf 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:9/753: mknod d6/d19/d2c/d58/deb/cfb 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:3/927: creat d4/d15/d8/d2c/f13c x:0 0 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:5/868: link d0/d11/d27/d100/c54 d0/d11/c110 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:6/960: mkdir d9/d144 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:6/961: chown d9/dc/f105 2041404520 1 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:6/962: mkdir d9/d50/de9/dea/dfc/d145 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:6/963: readlink d9/dc/d11/d23/d2c/d81/lee 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.1.vm08.stdout:3/928: creat d4/d15/d8/d1d/f13d x:0 0 0 2026-03-10T08:56:06.400 INFO:tasks.workunit.client.0.vm05.stdout:9/754: chown d6/d19/d2a/d4a/c51 195055 1 2026-03-10T08:56:06.406 INFO:tasks.workunit.client.1.vm08.stdout:6/964: rename d9/d13/cd2 to d9/d10/d1e/d104/c146 0 2026-03-10T08:56:06.413 INFO:tasks.workunit.client.0.vm05.stdout:6/790: write d4/d7/f14 [895412,28771] 0 2026-03-10T08:56:06.414 INFO:tasks.workunit.client.1.vm08.stdout:9/940: dwrite d2/d41/d4c/de2/f11b [0,4194304] 0 2026-03-10T08:56:06.417 INFO:tasks.workunit.client.1.vm08.stdout:5/869: sync 2026-03-10T08:56:06.422 INFO:tasks.workunit.client.1.vm08.stdout:1/951: dwrite d1/da/de/d24/d35/d43/fb2 [0,4194304] 0 2026-03-10T08:56:06.426 
INFO:tasks.workunit.client.1.vm08.stdout:0/934: dwrite d6/dd/d13/d17/d1f/d2d/d38/d98/d12f/fd6 [0,4194304] 0 2026-03-10T08:56:06.426 INFO:tasks.workunit.client.1.vm08.stdout:1/952: stat d1/da/f93 0 2026-03-10T08:56:06.426 INFO:tasks.workunit.client.1.vm08.stdout:9/941: truncate d2/d41/d4c/d66/d82/fa8 4469759 0 2026-03-10T08:56:06.431 INFO:tasks.workunit.client.1.vm08.stdout:3/929: symlink d4/d15/d8/d2c/d9b/d119/l13e 0 2026-03-10T08:56:06.447 INFO:tasks.workunit.client.1.vm08.stdout:6/965: dwrite d9/d50/f134 [4194304,4194304] 0 2026-03-10T08:56:06.462 INFO:tasks.workunit.client.1.vm08.stdout:5/870: symlink d0/d1b/d67/l111 0 2026-03-10T08:56:06.466 INFO:tasks.workunit.client.0.vm05.stdout:1/832: dwrite dd/d21/f10a [0,4194304] 0 2026-03-10T08:56:06.479 INFO:tasks.workunit.client.1.vm08.stdout:9/942: dread d2/dd/faf [0,4194304] 0 2026-03-10T08:56:06.480 INFO:tasks.workunit.client.1.vm08.stdout:1/953: dread - d1/da/d18/d3a/d77/f11c zero size 2026-03-10T08:56:06.482 INFO:tasks.workunit.client.1.vm08.stdout:1/954: truncate d1/da/de/d24/d3d/d40/d8e/f136 917153 0 2026-03-10T08:56:06.488 INFO:tasks.workunit.client.0.vm05.stdout:8/781: link d2/db/d1f/d67/la7 d2/dd/d2c/d2e/d31/d4f/d80/de2/l112 0 2026-03-10T08:56:06.488 INFO:tasks.workunit.client.0.vm05.stdout:8/782: chown d2/dd/d2c/d2e/d31/fee 3 1 2026-03-10T08:56:06.500 INFO:tasks.workunit.client.0.vm05.stdout:0/775: dread df/f79 [0,4194304] 0 2026-03-10T08:56:06.509 INFO:tasks.workunit.client.0.vm05.stdout:9/755: creat d6/d15/d3c/d4b/d90/ffc x:0 0 0 2026-03-10T08:56:06.510 INFO:tasks.workunit.client.1.vm08.stdout:3/930: dread d4/d6f/d85/df1/f135 [0,4194304] 0 2026-03-10T08:56:06.512 INFO:tasks.workunit.client.1.vm08.stdout:6/966: dread - d9/dc/de0/f11e zero size 2026-03-10T08:56:06.513 INFO:tasks.workunit.client.1.vm08.stdout:6/967: write d9/d10/d1e/d7e/f119 [982135,53484] 0 2026-03-10T08:56:06.559 INFO:tasks.workunit.client.0.vm05.stdout:2/710: symlink d0/d9/d1e/d20/d21/d45/d4b/d70/ld3 0 2026-03-10T08:56:06.559 
INFO:tasks.workunit.client.0.vm05.stdout:8/783: mknod d2/dd/d2c/d2e/d93/c113 0 2026-03-10T08:56:06.561 INFO:tasks.workunit.client.0.vm05.stdout:8/784: truncate d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/f10b 835727 0 2026-03-10T08:56:06.578 INFO:tasks.workunit.client.0.vm05.stdout:0/776: unlink df/d1f/d85/d19/d39/f63 0 2026-03-10T08:56:06.584 INFO:tasks.workunit.client.0.vm05.stdout:4/780: rename d0/d2e/d42/d45/d4a/d36/dbe/dbf/dbd to d0/dfe 0 2026-03-10T08:56:06.594 INFO:tasks.workunit.client.1.vm08.stdout:3/931: rmdir d4/d15/d8/d2c 39 2026-03-10T08:56:06.594 INFO:tasks.workunit.client.1.vm08.stdout:0/935: write d6/dd/d13/d17/d1f/f67 [1879668,8333] 0 2026-03-10T08:56:06.605 INFO:tasks.workunit.client.0.vm05.stdout:9/756: fdatasync d6/d12/f34 0 2026-03-10T08:56:06.615 INFO:tasks.workunit.client.1.vm08.stdout:1/955: dwrite d1/da/de/d24/d35/d6d/d82/da2/dbb/fd8 [0,4194304] 0 2026-03-10T08:56:06.624 INFO:tasks.workunit.client.0.vm05.stdout:5/697: dwrite d5/fc1 [0,4194304] 0 2026-03-10T08:56:06.634 INFO:tasks.workunit.client.1.vm08.stdout:6/968: mkdir d9/dc/d11/d147 0 2026-03-10T08:56:06.639 INFO:tasks.workunit.client.0.vm05.stdout:3/834: getdents d9/d8f/d50/d5f/d7b 0 2026-03-10T08:56:06.670 INFO:tasks.workunit.client.1.vm08.stdout:9/943: rename d2/dd/d15/d1e to d2/d41/d4c/dd2/d13c 0 2026-03-10T08:56:06.678 INFO:tasks.workunit.client.1.vm08.stdout:5/871: dwrite d0/d11/f60 [0,4194304] 0 2026-03-10T08:56:06.681 INFO:tasks.workunit.client.1.vm08.stdout:9/944: sync 2026-03-10T08:56:06.702 INFO:tasks.workunit.client.1.vm08.stdout:0/936: rename d6/dd/d13/d17/d1f/d2d/d85/d93/fc0 to d6/dd/d13/d17/d1f/d2d/d85/d93/f140 0 2026-03-10T08:56:06.709 INFO:tasks.workunit.client.0.vm05.stdout:2/711: dread d0/f56 [0,4194304] 0 2026-03-10T08:56:06.710 INFO:tasks.workunit.client.0.vm05.stdout:2/712: readlink d0/d9/d1e/d20/d21/d45/d4b/l5a 0 2026-03-10T08:56:06.717 INFO:tasks.workunit.client.0.vm05.stdout:7/746: rename d18/d1b/f30 to d18/d66/d25/d2e/d2f/fec 0 2026-03-10T08:56:06.735 
INFO:tasks.workunit.client.1.vm08.stdout:0/937: dread d6/dd/d13/d32/f34 [0,4194304] 0 2026-03-10T08:56:06.736 INFO:tasks.workunit.client.1.vm08.stdout:1/956: write d1/da/de/dcf/fdd [4545390,7302] 0 2026-03-10T08:56:06.736 INFO:tasks.workunit.client.1.vm08.stdout:0/938: readlink d6/dd/d13/d17/d1f/d2d/d38/d98/d12f/l110 0 2026-03-10T08:56:06.738 INFO:tasks.workunit.client.0.vm05.stdout:5/698: creat d5/d86/d39/ffd x:0 0 0 2026-03-10T08:56:06.748 INFO:tasks.workunit.client.1.vm08.stdout:6/969: dwrite d9/dc/de0/f11e [0,4194304] 0 2026-03-10T08:56:06.749 INFO:tasks.workunit.client.0.vm05.stdout:3/835: creat d9/d2b/de7/df1/d43/d6e/f103 x:0 0 0 2026-03-10T08:56:06.752 INFO:tasks.workunit.client.1.vm08.stdout:6/970: dread d9/dc/d11/d23/d2c/f8e [0,4194304] 0 2026-03-10T08:56:06.752 INFO:tasks.workunit.client.1.vm08.stdout:5/872: symlink d0/d11/d18/df5/dfc/l112 0 2026-03-10T08:56:06.753 INFO:tasks.workunit.client.1.vm08.stdout:5/873: chown d0/d11/d27/l6b 11929690 1 2026-03-10T08:56:06.754 INFO:tasks.workunit.client.0.vm05.stdout:1/833: creat dd/d10/d18/f122 x:0 0 0 2026-03-10T08:56:06.760 INFO:tasks.workunit.client.1.vm08.stdout:9/945: symlink d2/dd/d11c/l13d 0 2026-03-10T08:56:06.760 INFO:tasks.workunit.client.0.vm05.stdout:8/785: getdents d2/dd/d2c/d2e/d31/d4f/d7b/d9e/df8 0 2026-03-10T08:56:06.761 INFO:tasks.workunit.client.0.vm05.stdout:8/786: chown d2/db/d1f/d67/d8d 193170 1 2026-03-10T08:56:06.764 INFO:tasks.workunit.client.1.vm08.stdout:3/932: mkdir d4/d15/d8/d13f 0 2026-03-10T08:56:06.765 INFO:tasks.workunit.client.0.vm05.stdout:0/777: mkdir df/de8 0 2026-03-10T08:56:06.774 INFO:tasks.workunit.client.0.vm05.stdout:2/713: mkdir d0/d55/dd4 0 2026-03-10T08:56:06.776 INFO:tasks.workunit.client.1.vm08.stdout:0/939: truncate d6/dd/d13/d17/d1f/d20/f21 4440434 0 2026-03-10T08:56:06.782 INFO:tasks.workunit.client.0.vm05.stdout:6/791: rename d4/d7/dc4 to d4/d2d/d51/d62/d113 0 2026-03-10T08:56:06.782 INFO:tasks.workunit.client.0.vm05.stdout:7/747: truncate d18/d66/d25/f56 635220 0 
2026-03-10T08:56:06.785 INFO:tasks.workunit.client.1.vm08.stdout:0/940: sync 2026-03-10T08:56:06.796 INFO:tasks.workunit.client.1.vm08.stdout:1/957: dwrite d1/da/de/d24/d35/d6d/fa8 [0,4194304] 0 2026-03-10T08:56:06.809 INFO:tasks.workunit.client.1.vm08.stdout:5/874: creat d0/d11/d27/d68/dc1/f113 x:0 0 0 2026-03-10T08:56:06.814 INFO:tasks.workunit.client.0.vm05.stdout:5/699: mkdir d5/df/dbb/d43/dfe 0 2026-03-10T08:56:06.827 INFO:tasks.workunit.client.1.vm08.stdout:3/933: fdatasync d4/d15/f12 0 2026-03-10T08:56:06.832 INFO:tasks.workunit.client.0.vm05.stdout:0/778: creat df/d1f/fe9 x:0 0 0 2026-03-10T08:56:06.833 INFO:tasks.workunit.client.1.vm08.stdout:6/971: dread d9/dc/d11/d106/fa8 [0,4194304] 0 2026-03-10T08:56:06.839 INFO:tasks.workunit.client.0.vm05.stdout:2/714: creat d0/d9/d1e/d20/d21/d8a/fd5 x:0 0 0 2026-03-10T08:56:06.844 INFO:tasks.workunit.client.0.vm05.stdout:4/781: rename d0/l96 to d0/d2e/d42/d45/lff 0 2026-03-10T08:56:06.849 INFO:tasks.workunit.client.1.vm08.stdout:1/958: creat d1/da/d20/d91/f145 x:0 0 0 2026-03-10T08:56:06.853 INFO:tasks.workunit.client.1.vm08.stdout:5/875: unlink d0/l10d 0 2026-03-10T08:56:06.854 INFO:tasks.workunit.client.1.vm08.stdout:5/876: dread - d0/d11/d3e/d45/ff7 zero size 2026-03-10T08:56:06.857 INFO:tasks.workunit.client.1.vm08.stdout:0/941: write d6/dd/d13/d17/d1f/d2d/d85/d93/fea [805422,82487] 0 2026-03-10T08:56:06.858 INFO:tasks.workunit.client.1.vm08.stdout:9/946: dwrite d2/d54/d8e/fba [0,4194304] 0 2026-03-10T08:56:06.866 INFO:tasks.workunit.client.0.vm05.stdout:8/787: dwrite d2/dd/f1a [0,4194304] 0 2026-03-10T08:56:06.891 INFO:tasks.workunit.client.0.vm05.stdout:9/757: rmdir d6/d15/d35/df2 0 2026-03-10T08:56:06.898 INFO:tasks.workunit.client.0.vm05.stdout:1/834: symlink dd/d21/d37/d7c/dc9/l123 0 2026-03-10T08:56:06.918 INFO:tasks.workunit.client.1.vm08.stdout:6/972: creat d9/dc/d11/d23/d2c/d7a/dce/d69/da2/f148 x:0 0 0 2026-03-10T08:56:06.918 INFO:tasks.workunit.client.0.vm05.stdout:0/779: truncate 
df/d1f/d85/d19/d47/d84/dae/fc9 454027 0 2026-03-10T08:56:06.918 INFO:tasks.workunit.client.0.vm05.stdout:0/780: chown df/d1f/d85/d19/d5b/fb0 194 1 2026-03-10T08:56:06.918 INFO:tasks.workunit.client.0.vm05.stdout:2/715: dread - d0/d9/d7f/d8f/d7e/fb9 zero size 2026-03-10T08:56:06.928 INFO:tasks.workunit.client.1.vm08.stdout:1/959: truncate d1/da/de/d24/d81/d121/f124 780363 0 2026-03-10T08:56:06.946 INFO:tasks.workunit.client.0.vm05.stdout:6/792: creat d4/d7/d10/d15/d1b/dfc/f114 x:0 0 0 2026-03-10T08:56:06.954 INFO:tasks.workunit.client.0.vm05.stdout:6/793: dread - d4/d7/d10/d1a/d8c/f10e zero size 2026-03-10T08:56:06.960 INFO:tasks.workunit.client.1.vm08.stdout:0/942: mknod d6/dd/d13/d17/d1f/d2d/d38/c141 0 2026-03-10T08:56:06.966 INFO:tasks.workunit.client.0.vm05.stdout:4/782: read d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/f51 [37660,91716] 0 2026-03-10T08:56:06.966 INFO:tasks.workunit.client.1.vm08.stdout:9/947: fsync d2/d41/d4c/dd2/d13c/d39/f57 0 2026-03-10T08:56:06.972 INFO:tasks.workunit.client.0.vm05.stdout:7/748: truncate d18/d66/d25/d2e/de7/fcf 7603964 0 2026-03-10T08:56:06.972 INFO:tasks.workunit.client.0.vm05.stdout:7/749: read d18/d38/d43/d6e/f76 [347163,72844] 0 2026-03-10T08:56:06.974 INFO:tasks.workunit.client.0.vm05.stdout:8/788: mknod d2/db/d1f/c114 0 2026-03-10T08:56:06.981 INFO:tasks.workunit.client.0.vm05.stdout:9/758: read - d6/d19/d2a/d4a/d8c/fd0 zero size 2026-03-10T08:56:06.985 INFO:tasks.workunit.client.0.vm05.stdout:1/835: creat dd/d10/d18/d20/d52/d80/f124 x:0 0 0 2026-03-10T08:56:06.985 INFO:tasks.workunit.client.0.vm05.stdout:1/836: fsync dd/d10/d18/dd5/fbf 0 2026-03-10T08:56:06.986 INFO:tasks.workunit.client.1.vm08.stdout:5/877: creat d0/d11/d18/f114 x:0 0 0 2026-03-10T08:56:06.988 INFO:tasks.workunit.client.1.vm08.stdout:6/973: creat d9/dc/d11/d23/f149 x:0 0 0 2026-03-10T08:56:07.000 INFO:tasks.workunit.client.1.vm08.stdout:1/960: creat d1/da/de/d24/d3d/d40/d56/f146 x:0 0 0 2026-03-10T08:56:07.005 INFO:tasks.workunit.client.1.vm08.stdout:0/943: 
dread d6/dd/d13/d61/d6f/f102 [0,4194304] 0 2026-03-10T08:56:07.006 INFO:tasks.workunit.client.1.vm08.stdout:1/961: unlink d1/f105 0 2026-03-10T08:56:07.006 INFO:tasks.workunit.client.1.vm08.stdout:1/962: readlink d1/da/d20/d3f/l70 0 2026-03-10T08:56:07.007 INFO:tasks.workunit.client.1.vm08.stdout:6/974: creat d9/dc/d84/d80/d141/f14a x:0 0 0 2026-03-10T08:56:07.010 INFO:tasks.workunit.client.1.vm08.stdout:0/944: mkdir d6/dd/d13/d17/d1f/d20/d2f/d24/d142 0 2026-03-10T08:56:07.012 INFO:tasks.workunit.client.1.vm08.stdout:1/963: rename d1/da/de/d24/d35/d6d/d116/d144 to d1/da/de/d24/d35/d43/d109/d147 0 2026-03-10T08:56:07.013 INFO:tasks.workunit.client.1.vm08.stdout:1/964: chown d1/da/de/d24/d35/d6d/d82/da2/fcd 1655 1 2026-03-10T08:56:07.013 INFO:tasks.workunit.client.1.vm08.stdout:5/878: link d0/c20 d0/d11/d27/d68/d7c/df8/c115 0 2026-03-10T08:56:07.023 INFO:tasks.workunit.client.1.vm08.stdout:1/965: fsync d1/da/de/d24/d35/d6d/d82/da2/ff5 0 2026-03-10T08:56:07.025 INFO:tasks.workunit.client.1.vm08.stdout:1/966: read d1/da/de/d24/d3d/d40/d92/ff7 [745523,28001] 0 2026-03-10T08:56:07.026 INFO:tasks.workunit.client.1.vm08.stdout:6/975: dread d9/dc/d11/d23/f8a [0,4194304] 0 2026-03-10T08:56:07.033 INFO:tasks.workunit.client.0.vm05.stdout:2/716: creat d0/d9/fd6 x:0 0 0 2026-03-10T08:56:07.033 INFO:tasks.workunit.client.1.vm08.stdout:1/967: fsync d1/da/d20/d3f/f140 0 2026-03-10T08:56:07.035 INFO:tasks.workunit.client.0.vm05.stdout:2/717: dread d0/d9/d1e/d20/d21/f3d [0,4194304] 0 2026-03-10T08:56:07.036 INFO:tasks.workunit.client.1.vm08.stdout:6/976: creat d9/d50/de9/f14b x:0 0 0 2026-03-10T08:56:07.046 INFO:tasks.workunit.client.0.vm05.stdout:8/789: dread d2/db/d28/f2d [0,4194304] 0 2026-03-10T08:56:07.048 INFO:tasks.workunit.client.1.vm08.stdout:6/977: dread - d9/d10/d1e/d32/f101 zero size 2026-03-10T08:56:07.053 INFO:tasks.workunit.client.0.vm05.stdout:5/700: link d5/d48/d64/d95/dac/cb4 d5/d86/d24/d84/df7/cff 0 2026-03-10T08:56:07.065 
INFO:tasks.workunit.client.1.vm08.stdout:6/978: creat d9/dc/d11/d23/d2c/d81/d63/d13f/f14c x:0 0 0 2026-03-10T08:56:07.065 INFO:tasks.workunit.client.1.vm08.stdout:6/979: dread - d9/dc/d84/d80/f9e zero size 2026-03-10T08:56:07.065 INFO:tasks.workunit.client.0.vm05.stdout:9/759: rmdir d6/d19/d2a/d8d 39 2026-03-10T08:56:07.074 INFO:tasks.workunit.client.0.vm05.stdout:0/781: mknod df/d1f/d85/d19/d47/d84/cea 0 2026-03-10T08:56:07.090 INFO:tasks.workunit.client.1.vm08.stdout:9/948: dwrite d2/d41/d4c/dd2/d13c/d21/fc7 [4194304,4194304] 0 2026-03-10T08:56:07.091 INFO:tasks.workunit.client.1.vm08.stdout:3/934: write d4/d15/d8/d2c/d9b/f86 [4196364,6104] 0 2026-03-10T08:56:07.099 INFO:tasks.workunit.client.0.vm05.stdout:3/836: rename d9/d2b/d53/c9a to d9/c104 0 2026-03-10T08:56:07.100 INFO:tasks.workunit.client.1.vm08.stdout:6/980: creat d9/d10/dd0/f14d x:0 0 0 2026-03-10T08:56:07.103 INFO:tasks.workunit.client.0.vm05.stdout:6/794: fdatasync d4/d7/f4d 0 2026-03-10T08:56:07.112 INFO:tasks.workunit.client.1.vm08.stdout:3/935: fdatasync d4/d15/d8/fec 0 2026-03-10T08:56:07.115 INFO:tasks.workunit.client.1.vm08.stdout:6/981: symlink d9/d10/dd0/l14e 0 2026-03-10T08:56:07.119 INFO:tasks.workunit.client.1.vm08.stdout:6/982: dwrite d9/d50/f134 [4194304,4194304] 0 2026-03-10T08:56:07.131 INFO:tasks.workunit.client.1.vm08.stdout:5/879: write d0/d11/d27/d68/d7c/de5/feb [175445,92463] 0 2026-03-10T08:56:07.132 INFO:tasks.workunit.client.1.vm08.stdout:0/945: dwrite d6/dd/d13/d17/d1f/d2d/fb2 [0,4194304] 0 2026-03-10T08:56:07.139 INFO:tasks.workunit.client.0.vm05.stdout:5/701: unlink d5/d86/d24/c9b 0 2026-03-10T08:56:07.141 INFO:tasks.workunit.client.1.vm08.stdout:1/968: dwrite d1/da/de/d24/d26/d5d/f104 [0,4194304] 0 2026-03-10T08:56:07.144 INFO:tasks.workunit.client.1.vm08.stdout:1/969: dread d1/f65 [0,4194304] 0 2026-03-10T08:56:07.155 INFO:tasks.workunit.client.1.vm08.stdout:6/983: creat d9/dc/d11/d106/f14f x:0 0 0 2026-03-10T08:56:07.158 INFO:tasks.workunit.client.0.vm05.stdout:4/783: 
write d0/d2e/d42/d45/fb1 [341580,15894] 0 2026-03-10T08:56:07.164 INFO:tasks.workunit.client.0.vm05.stdout:7/750: write d18/d66/d25/d2e/d2f/fc4 [108312,7711] 0 2026-03-10T08:56:07.180 INFO:tasks.workunit.client.1.vm08.stdout:5/880: dread - d0/d11/d27/fe1 zero size 2026-03-10T08:56:07.180 INFO:tasks.workunit.client.1.vm08.stdout:0/946: creat d6/dd/d13/d17/d1f/d2d/d38/d98/f143 x:0 0 0 2026-03-10T08:56:07.193 INFO:tasks.workunit.client.0.vm05.stdout:5/702: fdatasync d5/df/dbb/fd4 0 2026-03-10T08:56:07.203 INFO:tasks.workunit.client.0.vm05.stdout:6/795: dread d4/d2c/f7a [0,4194304] 0 2026-03-10T08:56:07.212 INFO:tasks.workunit.client.0.vm05.stdout:1/837: getdents dd/d21/d37/d45/d8d 0 2026-03-10T08:56:07.215 INFO:tasks.workunit.client.0.vm05.stdout:1/838: stat dd/d10/d18/d20/c66 0 2026-03-10T08:56:07.215 INFO:tasks.workunit.client.0.vm05.stdout:0/782: link df/d1f/d85/d2b/d27/d32/f5d df/d1f/d85/d2b/d65/d6e/feb 0 2026-03-10T08:56:07.224 INFO:tasks.workunit.client.0.vm05.stdout:8/790: write d2/db/d1f/f44 [1976742,93087] 0 2026-03-10T08:56:07.226 INFO:tasks.workunit.client.1.vm08.stdout:9/949: dwrite d2/d41/d4c/dd2/d13c/d21/f2d [4194304,4194304] 0 2026-03-10T08:56:07.229 INFO:tasks.workunit.client.0.vm05.stdout:9/760: dwrite d6/d15/d3c/fda [0,4194304] 0 2026-03-10T08:56:07.245 INFO:tasks.workunit.client.1.vm08.stdout:3/936: write d4/d15/d8/d1d/d4f/fa2 [2136840,118752] 0 2026-03-10T08:56:07.248 INFO:tasks.workunit.client.1.vm08.stdout:1/970: write d1/f8 [4979174,107180] 0 2026-03-10T08:56:07.249 INFO:tasks.workunit.client.0.vm05.stdout:3/837: dwrite d9/d8f/d55/f6b [0,4194304] 0 2026-03-10T08:56:07.262 INFO:tasks.workunit.client.0.vm05.stdout:5/703: rename d5/df/d37/d68/d85 to d5/df/d37/dc8/d100 0 2026-03-10T08:56:07.264 INFO:tasks.workunit.client.0.vm05.stdout:6/796: creat d4/d92/db0/f115 x:0 0 0 2026-03-10T08:56:07.268 INFO:tasks.workunit.client.0.vm05.stdout:7/751: dwrite d18/d38/dc7/de3/d9c/dac/f4c [0,4194304] 0 2026-03-10T08:56:07.271 
INFO:tasks.workunit.client.0.vm05.stdout:2/718: getdents d0/d9/d7f/db4 0 2026-03-10T08:56:07.274 INFO:tasks.workunit.client.0.vm05.stdout:0/783: fsync df/d1f/d85/d19/d39/f42 0 2026-03-10T08:56:07.282 INFO:tasks.workunit.client.0.vm05.stdout:3/838: rename d9/d2b/de7/df1/d6c/fb5 to d9/d4d/d51/d64/f105 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:6/984: mkdir d9/dc/d11/d106/d150 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:5/881: rename d0/f8a to d0/d11/d18/df5/f116 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:6/985: creat d9/dc/d11/d23/d2c/d7a/f151 x:0 0 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:1/971: rename d1/da/de/d24/d35/d6d/d82/f7b to d1/da/de/d24/d35/d6d/d82/da2/dbb/f148 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:3/937: mknod d4/d6f/c140 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:9/950: creat d2/d41/d4c/dd2/d13c/d25/d98/d9d/f13e x:0 0 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:6/986: mkdir d9/dc/d84/d152 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.1.vm08.stdout:1/972: fdatasync d1/da/d18/d3a/f57 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:5/704: symlink d5/df/d37/dd2/d76/l101 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:2/719: creat d0/d55/fd7 x:0 0 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:0/784: unlink df/d1f/d85/d2b/d65/d6e/d96/l54 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:3/839: mkdir d9/d8f/d50/d5f/d7b/d106 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:8/791: creat d2/db/f115 x:0 0 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:0/785: truncate df/d1f/d85/fc0 190125 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:0/786: dread - df/d1f/d85/d19/d47/f8f zero size 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:0/787: 
chown df/d1f/d85/fb5 6132 1 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:3/840: mkdir d9/d2b/de7/df1/dd6/d107 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:3/841: read d9/d2b/de7/df1/d43/d71/fac [844421,27232] 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:5/705: mknod d5/c102 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:0/788: dread df/dd8/f71 [0,4194304] 0 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:5/706: chown d5/d86/d21/l57 202284 1 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:0/789: chown df/d1f/d85/d2b/d27/f2e 32092 1 2026-03-10T08:56:07.301 INFO:tasks.workunit.client.0.vm05.stdout:5/707: write d5/f9 [3993634,102651] 0 2026-03-10T08:56:07.304 INFO:tasks.workunit.client.0.vm05.stdout:1/839: getdents dd/d10/d18/d2d/d51/d58/d71 0 2026-03-10T08:56:07.306 INFO:tasks.workunit.client.1.vm08.stdout:1/973: rmdir d1/da/de/d24/d3d/d40 39 2026-03-10T08:56:07.306 INFO:tasks.workunit.client.0.vm05.stdout:8/792: creat d2/dd/d2c/d2e/d31/d3e/dde/d63/f116 x:0 0 0 2026-03-10T08:56:07.307 INFO:tasks.workunit.client.0.vm05.stdout:5/708: dwrite d5/df/d37/d68/fe5 [0,4194304] 0 2026-03-10T08:56:07.308 INFO:tasks.workunit.client.0.vm05.stdout:9/761: getdents d6/d15/d37/de8 0 2026-03-10T08:56:07.310 INFO:tasks.workunit.client.1.vm08.stdout:9/951: unlink d2/d41/d4c/dd2/d13c/c7d 0 2026-03-10T08:56:07.311 INFO:tasks.workunit.client.0.vm05.stdout:9/762: read d6/f7 [6188701,16947] 0 2026-03-10T08:56:07.311 INFO:tasks.workunit.client.1.vm08.stdout:1/974: rmdir d1/da/de/d24/d35/d43 39 2026-03-10T08:56:07.312 INFO:tasks.workunit.client.0.vm05.stdout:9/763: truncate d6/ff8 967451 0 2026-03-10T08:56:07.313 INFO:tasks.workunit.client.1.vm08.stdout:1/975: read d1/da/de/f12 [4443488,43511] 0 2026-03-10T08:56:07.313 INFO:tasks.workunit.client.0.vm05.stdout:3/842: stat d9/c46 0 2026-03-10T08:56:07.315 INFO:tasks.workunit.client.1.vm08.stdout:9/952: dread - 
d2/d41/d4c/dd2/d13c/d25/d32/d79/f113 zero size 2026-03-10T08:56:07.317 INFO:tasks.workunit.client.0.vm05.stdout:1/840: read dd/d10/d18/d2d/d51/d58/fa0 [128653,39375] 0 2026-03-10T08:56:07.317 INFO:tasks.workunit.client.0.vm05.stdout:5/709: fsync d5/df/d37/dd2/fa5 0 2026-03-10T08:56:07.317 INFO:tasks.workunit.client.1.vm08.stdout:9/953: chown d2/d41/d4c/dd2/d13c/d25/d32/d5c/dc2/cf7 2 1 2026-03-10T08:56:07.317 INFO:tasks.workunit.client.1.vm08.stdout:1/976: dwrite d1/da/d20/d3f/f140 [0,4194304] 0 2026-03-10T08:56:07.328 INFO:tasks.workunit.client.0.vm05.stdout:9/764: creat d6/d15/d37/de8/ffd x:0 0 0 2026-03-10T08:56:07.328 INFO:tasks.workunit.client.1.vm08.stdout:9/954: chown d2/d41/d4c/dd2/d13c/d25/d32/d5c/dc2/f12f 3534569 1 2026-03-10T08:56:07.328 INFO:tasks.workunit.client.1.vm08.stdout:3/938: dread d4/d15/d8/d1d/d4f/fee [0,4194304] 0 2026-03-10T08:56:07.328 INFO:tasks.workunit.client.1.vm08.stdout:3/939: chown d4/d15/d8/d2c/d6d/dfa/d100 4928 1 2026-03-10T08:56:07.328 INFO:tasks.workunit.client.0.vm05.stdout:0/790: mknod df/d1f/d85/d2b/d27/cec 0 2026-03-10T08:56:07.329 INFO:tasks.workunit.client.0.vm05.stdout:4/784: sync 2026-03-10T08:56:07.332 INFO:tasks.workunit.client.0.vm05.stdout:3/843: dwrite d9/d4d/d51/d64/f105 [0,4194304] 0 2026-03-10T08:56:07.333 INFO:tasks.workunit.client.0.vm05.stdout:3/844: chown d9/d2b/d2f/d57 62021 1 2026-03-10T08:56:07.335 INFO:tasks.workunit.client.0.vm05.stdout:3/845: readlink d9/le 0 2026-03-10T08:56:07.349 INFO:tasks.workunit.client.0.vm05.stdout:8/793: symlink d2/db/d28/d100/l117 0 2026-03-10T08:56:07.350 INFO:tasks.workunit.client.1.vm08.stdout:1/977: readlink d1/da/de/d24/d26/d5d/l133 0 2026-03-10T08:56:07.350 INFO:tasks.workunit.client.0.vm05.stdout:1/841: unlink dd/d10/c105 0 2026-03-10T08:56:07.350 INFO:tasks.workunit.client.0.vm05.stdout:8/794: chown d2/dd/d2c/d2e/d93/f9b 5525931 1 2026-03-10T08:56:07.356 INFO:tasks.workunit.client.0.vm05.stdout:0/791: dread df/d1f/d85/d2b/d27/f4f [0,4194304] 0 2026-03-10T08:56:07.359 
INFO:tasks.workunit.client.1.vm08.stdout:3/940: creat d4/d15/d8/d2c/d89/f141 x:0 0 0
2026-03-10T08:56:07.368 INFO:tasks.workunit.client.1.vm08.stdout:1/978: unlink d1/da/fd6 0
2026-03-10T08:56:07.369 INFO:tasks.workunit.client.0.vm05.stdout:9/765: truncate d6/d12/d3a/de5/f91 1317183 0
2026-03-10T08:56:07.369 INFO:tasks.workunit.client.1.vm08.stdout:3/941: stat d4/d15/d8/d2c/d9b/f63 0
2026-03-10T08:56:07.370 INFO:tasks.workunit.client.1.vm08.stdout:9/955: rename d2/d41/d4c/dd2/d13c/d21/f75 to d2/d41/d4c/f13f 0
2026-03-10T08:56:07.375 INFO:tasks.workunit.client.0.vm05.stdout:1/842: dread - dd/d10/d18/d2d/d5c/f8e zero size
2026-03-10T08:56:07.376 INFO:tasks.workunit.client.0.vm05.stdout:1/843: write dd/d10/d18/d20/d52/d80/fa5 [734514,120576] 0
2026-03-10T08:56:07.381 INFO:tasks.workunit.client.1.vm08.stdout:9/956: creat d2/d41/d4c/dd2/d13c/d94/f140 x:0 0 0
2026-03-10T08:56:07.386 INFO:tasks.workunit.client.0.vm05.stdout:8/795: dread - d2/db/d28/d99/fd5 zero size
2026-03-10T08:56:07.386 INFO:tasks.workunit.client.0.vm05.stdout:5/710: read d5/d86/d39/f78 [239720,18403] 0
2026-03-10T08:56:07.387 INFO:tasks.workunit.client.1.vm08.stdout:9/957: mknod d2/dd/c141 0
2026-03-10T08:56:07.410 INFO:tasks.workunit.client.0.vm05.stdout:1/844: truncate dd/d10/d18/d20/fd6 50523 0
2026-03-10T08:56:07.415 INFO:tasks.workunit.client.1.vm08.stdout:9/958: dread d2/dd/d15/f44 [0,4194304] 0
2026-03-10T08:56:07.427 INFO:tasks.workunit.client.1.vm08.stdout:9/959: fdatasync d2/d41/d4c/dd2/d13c/ff9 0
2026-03-10T08:56:07.427 INFO:tasks.workunit.client.0.vm05.stdout:5/711: dread d5/d86/d39/fce [0,4194304] 0
2026-03-10T08:56:07.433 INFO:tasks.workunit.client.0.vm05.stdout:1/845: rmdir dd/d10/d18/d20/df3 39
2026-03-10T08:56:07.437 INFO:tasks.workunit.client.1.vm08.stdout:9/960: symlink d2/dd/d15/dd9/l142 0
2026-03-10T08:56:07.437 INFO:tasks.workunit.client.0.vm05.stdout:0/792: creat df/d1f/d85/d2b/d65/fed x:0 0 0
2026-03-10T08:56:07.441 INFO:tasks.workunit.client.0.vm05.stdout:3/846: link d9/d4d/f88 d9/d4d/d51/d64/d89/dc2/f108 0
2026-03-10T08:56:07.442 INFO:tasks.workunit.client.0.vm05.stdout:3/847: fdatasync d9/d2b/f101 0
2026-03-10T08:56:07.448 INFO:tasks.workunit.client.0.vm05.stdout:6/797: write d4/f11 [3296171,17792] 0
2026-03-10T08:56:07.456 INFO:tasks.workunit.client.0.vm05.stdout:1/846: truncate dd/d21/fe4 91454 0
2026-03-10T08:56:07.457 INFO:tasks.workunit.client.0.vm05.stdout:1/847: readlink dd/d21/d37/d7c/dc9/le3 0
2026-03-10T08:56:07.459 INFO:tasks.workunit.client.0.vm05.stdout:7/752: write d18/d38/dc7/de3/dc6/fcd [986788,64829] 0
2026-03-10T08:56:07.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:07 vm05.local ceph-mon[49713]: pgmap v12: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 40 MiB/s rd, 86 MiB/s wr, 255 op/s
2026-03-10T08:56:07.464 INFO:tasks.workunit.client.0.vm05.stdout:0/793: unlink df/d1f/d85/d2b/d27/d32/l7f 0
2026-03-10T08:56:07.469 INFO:tasks.workunit.client.0.vm05.stdout:5/712: creat d5/df/d37/dd2/d76/dde/f103 x:0 0 0
2026-03-10T08:56:07.472 INFO:tasks.workunit.client.1.vm08.stdout:9/961: rename d2/dd/d15/f44 to d2/d41/d4c/f143 0
2026-03-10T08:56:07.472 INFO:tasks.workunit.client.1.vm08.stdout:9/962: chown d2/d54/d8e/da6/dd0/f59 159619068 1
2026-03-10T08:56:07.473 INFO:tasks.workunit.client.1.vm08.stdout:9/963: write d2/d41/d4c/d66/fad [3963607,1305] 0
2026-03-10T08:56:07.485 INFO:tasks.workunit.client.1.vm08.stdout:9/964: creat d2/d41/d53/f144 x:0 0 0
2026-03-10T08:56:07.491 INFO:tasks.workunit.client.0.vm05.stdout:7/753: unlink d18/d66/d25/d2e/d2f/f59 0
2026-03-10T08:56:07.493 INFO:tasks.workunit.client.0.vm05.stdout:0/794: chown df/d1f/d85/d2b/d27/d32/d4e/l5a 64375664 1
2026-03-10T08:56:07.495 INFO:tasks.workunit.client.1.vm08.stdout:5/882: write d0/d11/d27/d68/d7c/d8e/df0/ffd [191413,46534] 0
2026-03-10T08:56:07.497 INFO:tasks.workunit.client.1.vm08.stdout:0/947: truncate d6/dd/d13/d17/d1f/d2d/d85/f111 2300159 0
2026-03-10T08:56:07.499 INFO:tasks.workunit.client.0.vm05.stdout:5/713: creat d5/d48/d64/d95/dac/dc6/f104 x:0 0 0
2026-03-10T08:56:07.500 INFO:tasks.workunit.client.0.vm05.stdout:2/720: write d0/d9/d1e/d20/f8b [75483,80542] 0
2026-03-10T08:56:07.500 INFO:tasks.workunit.client.0.vm05.stdout:5/714: chown d5/df/dbb/l50 39529 1
2026-03-10T08:56:07.501 INFO:tasks.workunit.client.0.vm05.stdout:2/721: chown d0/d9/d1e/d20/d21/d45/d4b 0 1
2026-03-10T08:56:07.505 INFO:tasks.workunit.client.0.vm05.stdout:5/715: truncate d5/d48/d64/d95/dac/dc6/f104 135331 0
2026-03-10T08:56:07.510 INFO:tasks.workunit.client.1.vm08.stdout:5/883: creat d0/d11/d18/df5/f117 x:0 0 0
2026-03-10T08:56:07.516 INFO:tasks.workunit.client.1.vm08.stdout:0/948: fdatasync d6/dd/d13/d61/fb1 0
2026-03-10T08:56:07.528 INFO:tasks.workunit.client.0.vm05.stdout:2/722: symlink d0/d9/d1e/d20/d21/d8a/d92/ld8 0
2026-03-10T08:56:07.531 INFO:tasks.workunit.client.1.vm08.stdout:6/987: dwrite d9/d13/f36 [4194304,4194304] 0
2026-03-10T08:56:07.538 INFO:tasks.workunit.client.0.vm05.stdout:0/795: dread df/dd8/f83 [0,4194304] 0
2026-03-10T08:56:07.549 INFO:tasks.workunit.client.0.vm05.stdout:5/716: unlink d5/d86/f59 0
2026-03-10T08:56:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:07 vm08.local ceph-mon[57559]: pgmap v12: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 40 MiB/s rd, 86 MiB/s wr, 255 op/s
2026-03-10T08:56:07.559 INFO:tasks.workunit.client.1.vm08.stdout:9/965: getdents d2/d54/d8e/da6/dd0/dc8/de1 0
2026-03-10T08:56:07.573 INFO:tasks.workunit.client.1.vm08.stdout:5/884: mknod d0/d11/d18/c118 0
2026-03-10T08:56:07.581 INFO:tasks.workunit.client.1.vm08.stdout:6/988: fsync d9/fc5 0
2026-03-10T08:56:07.592 INFO:tasks.workunit.client.1.vm08.stdout:9/966: rename d2/d41/d4c/dd2/d13c/ca9 to d2/d41/d4c/dd2/d13c/d21/d118/d13b/c145 0
2026-03-10T08:56:07.603 INFO:tasks.workunit.client.1.vm08.stdout:5/885: fsync d0/d11/d18/f23 0
2026-03-10T08:56:07.629 INFO:tasks.workunit.client.1.vm08.stdout:3/942: dwrite d4/d15/d8/d2c/d9b/d79/d8f/de2/f12a [0,4194304] 0
2026-03-10T08:56:07.630 INFO:tasks.workunit.client.1.vm08.stdout:3/943: stat d4/d15/d8/d2c/d9b/d79/fdf 0
2026-03-10T08:56:07.633 INFO:tasks.workunit.client.0.vm05.stdout:7/754: rename d18/cdf to d18/d38/d43/d5c/ced 0
2026-03-10T08:56:07.639 INFO:tasks.workunit.client.0.vm05.stdout:2/723: fdatasync d0/d9/d1e/f59 0
2026-03-10T08:56:07.641 INFO:tasks.workunit.client.0.vm05.stdout:4/785: dwrite d0/d2e/fdf [0,4194304] 0
2026-03-10T08:56:07.655 INFO:tasks.workunit.client.1.vm08.stdout:9/967: dread - d2/d54/d8e/db7/f112 zero size
2026-03-10T08:56:07.655 INFO:tasks.workunit.client.1.vm08.stdout:9/968: readlink d2/dd/d11c/l10a 0
2026-03-10T08:56:07.659 INFO:tasks.workunit.client.1.vm08.stdout:9/969: dwrite d2/dd/f2e [0,4194304] 0
2026-03-10T08:56:07.671 INFO:tasks.workunit.client.0.vm05.stdout:9/766: dwrite d6/d19/d2a/d4a/d8c/fd0 [0,4194304] 0
2026-03-10T08:56:07.681 INFO:tasks.workunit.client.0.vm05.stdout:0/796: fdatasync df/d1f/d85/d2b/d65/d6e/d96/f66 0
2026-03-10T08:56:07.697 INFO:tasks.workunit.client.0.vm05.stdout:8/796: dwrite d2/dd/d2c/d2e/f3b [4194304,4194304] 0
2026-03-10T08:56:07.698 INFO:tasks.workunit.client.1.vm08.stdout:0/949: getdents d6/dd/d13/d17/d1f/d2d/d85/d93 0
2026-03-10T08:56:07.705 INFO:tasks.workunit.client.1.vm08.stdout:3/944: rmdir d4/d15/d8/d2c/d9b/d119 39
2026-03-10T08:56:07.707 INFO:tasks.workunit.client.0.vm05.stdout:3/848: dwrite d9/d2b/de7/df1/d43/d71/d86/fb8 [0,4194304] 0
2026-03-10T08:56:07.714 INFO:tasks.workunit.client.0.vm05.stdout:9/767: truncate d6/d19/d21/f32 3287668 0
2026-03-10T08:56:07.714 INFO:tasks.workunit.client.0.vm05.stdout:6/798: write d4/d2d/d51/d62/da9/fe4 [3590863,130383] 0
2026-03-10T08:56:07.719 INFO:tasks.workunit.client.0.vm05.stdout:0/797: mkdir df/d1f/dee 0
2026-03-10T08:56:07.726 INFO:tasks.workunit.client.0.vm05.stdout:1/848: dwrite dd/d10/d18/d20/df3/f108 [0,4194304] 0
2026-03-10T08:56:07.730 INFO:tasks.workunit.client.0.vm05.stdout:1/849: stat dd/d21/d37/d7c/dab/db7/dde 0
2026-03-10T08:56:07.747 INFO:tasks.workunit.client.1.vm08.stdout:3/945: creat d4/d6f/dca/f142 x:0 0 0
2026-03-10T08:56:07.750 INFO:tasks.workunit.client.1.vm08.stdout:9/970: fdatasync d2/d41/d4c/f7c 0
2026-03-10T08:56:07.763 INFO:tasks.workunit.client.1.vm08.stdout:6/989: getdents d9/d10/dd0 0
2026-03-10T08:56:07.780 INFO:tasks.workunit.client.1.vm08.stdout:3/946: truncate d4/d15/d8/fa0 617982 0
2026-03-10T08:56:07.780 INFO:tasks.workunit.client.1.vm08.stdout:3/947: dread d4/d6f/d85/dd3/d10d/f12d [0,4194304] 0
2026-03-10T08:56:07.780 INFO:tasks.workunit.client.1.vm08.stdout:6/990: rmdir d9/d10/d1e/d7e 39
2026-03-10T08:56:07.780 INFO:tasks.workunit.client.1.vm08.stdout:6/991: chown d9/f77 198432 1
2026-03-10T08:56:07.781 INFO:tasks.workunit.client.0.vm05.stdout:9/768: truncate d6/d15/d3c/d4b/d90/fe0 602358 0
2026-03-10T08:56:07.782 INFO:tasks.workunit.client.0.vm05.stdout:9/769: chown d6/d15/f86 1947 1
2026-03-10T08:56:07.783 INFO:tasks.workunit.client.0.vm05.stdout:1/850: creat dd/d21/d37/d45/d8d/f125 x:0 0 0
2026-03-10T08:56:07.783 INFO:tasks.workunit.client.0.vm05.stdout:1/851: chown dd/d21/d37/d7c/d60 554705 1
2026-03-10T08:56:07.784 INFO:tasks.workunit.client.0.vm05.stdout:1/852: dread - dd/d21/d37/d7c/d60/fe9 zero size
2026-03-10T08:56:07.786 INFO:tasks.workunit.client.0.vm05.stdout:5/717: link d5/d86/d21/f9e d5/d86/d24/d84/db8/dcc/f105 0
2026-03-10T08:56:07.789 INFO:tasks.workunit.client.1.vm08.stdout:6/992: chown d9/dc/d11/d23/d2c/c10d 475 1
2026-03-10T08:56:07.792 INFO:tasks.workunit.client.0.vm05.stdout:1/853: dwrite dd/d21/f10a [4194304,4194304] 0
2026-03-10T08:56:07.798 INFO:tasks.workunit.client.1.vm08.stdout:3/948: creat d4/d15/d8/d2c/d9b/d119/f143 x:0 0 0
2026-03-10T08:56:07.825 INFO:tasks.workunit.client.0.vm05.stdout:4/786: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/fdb to d0/f100 0
2026-03-10T08:56:07.825 INFO:tasks.workunit.client.0.vm05.stdout:5/718: mknod d5/df/d37/dc8/d100/c106 0
2026-03-10T08:56:07.825 INFO:tasks.workunit.client.0.vm05.stdout:1/854: unlink dd/d21/d37/f85 0
2026-03-10T08:56:07.834 INFO:tasks.workunit.client.0.vm05.stdout:5/719: fsync d5/f40 0
2026-03-10T08:56:07.834 INFO:tasks.workunit.client.0.vm05.stdout:3/849: getdents d9/d2b/de7 0
2026-03-10T08:56:07.837 INFO:tasks.workunit.client.0.vm05.stdout:1/855: mkdir dd/d10/d126 0
2026-03-10T08:56:07.845 INFO:tasks.workunit.client.0.vm05.stdout:0/798: sync
2026-03-10T08:56:07.846 INFO:tasks.workunit.client.0.vm05.stdout:0/799: chown df/d1f/d85/caa 419018027 1
2026-03-10T08:56:07.847 INFO:tasks.workunit.client.0.vm05.stdout:0/800: chown df/dd8/f83 0 1
2026-03-10T08:56:07.852 INFO:tasks.workunit.client.0.vm05.stdout:1/856: unlink dd/d21/d37/d7c/dab/cc6 0
2026-03-10T08:56:07.858 INFO:tasks.workunit.client.0.vm05.stdout:1/857: fdatasync dd/f44 0
2026-03-10T08:56:07.858 INFO:tasks.workunit.client.0.vm05.stdout:1/858: stat dd/d10/d18/d2d/d51 0
2026-03-10T08:56:07.858 INFO:tasks.workunit.client.0.vm05.stdout:0/801: symlink df/dd8/d67/lef 0
2026-03-10T08:56:07.858 INFO:tasks.workunit.client.0.vm05.stdout:0/802: dread - df/dd8/d67/f80 zero size
2026-03-10T08:56:07.858 INFO:tasks.workunit.client.0.vm05.stdout:5/720: dread d5/df/d37/f47 [0,4194304] 0
2026-03-10T08:56:07.870 INFO:tasks.workunit.client.0.vm05.stdout:5/721: dwrite d5/fd [4194304,4194304] 0
2026-03-10T08:56:07.877 INFO:tasks.workunit.client.0.vm05.stdout:5/722: truncate d5/d48/d64/dc4/ffb 776763 0
2026-03-10T08:56:07.878 INFO:tasks.workunit.client.0.vm05.stdout:5/723: read d5/df/d37/d68/fe5 [2556534,55719] 0
2026-03-10T08:56:07.885 INFO:tasks.workunit.client.0.vm05.stdout:1/859: creat dd/d10/d18/dd1/f127 x:0 0 0
2026-03-10T08:56:07.896 INFO:tasks.workunit.client.0.vm05.stdout:2/724: write d0/d9/d7f/d8f/f54 [2764329,69472] 0
2026-03-10T08:56:07.896 INFO:tasks.workunit.client.1.vm08.stdout:5/886: dwrite d0/d11/d3e/fdd [0,4194304] 0
2026-03-10T08:56:07.903 INFO:tasks.workunit.client.1.vm08.stdout:1/979: write d1/da/de/d5c/fdb [572234,51943] 0
2026-03-10T08:56:07.906 INFO:tasks.workunit.client.0.vm05.stdout:8/797: write d2/dd/d2c/f4d [6119154,22758] 0
2026-03-10T08:56:07.918 INFO:tasks.workunit.client.1.vm08.stdout:0/950: dwrite d6/dd/d13/d17/d1f/d20/d2f/d24/fab [0,4194304] 0
2026-03-10T08:56:07.920 INFO:tasks.workunit.client.1.vm08.stdout:0/951: stat d6/dd/d13/d17/d1f/d2d/d85/dfc 0
2026-03-10T08:56:07.937 INFO:tasks.workunit.client.0.vm05.stdout:7/755: dwrite d18/d66/d25/d2e/fa4 [0,4194304] 0
2026-03-10T08:56:07.944 INFO:tasks.workunit.client.0.vm05.stdout:6/799: write d4/d7/d10/d15/fc5 [834188,49042] 0
2026-03-10T08:56:07.945 INFO:tasks.workunit.client.1.vm08.stdout:5/887: read d0/d11/d27/d68/d7c/d4b/d4e/d84/fa9 [1081510,83205] 0
2026-03-10T08:56:07.946 INFO:tasks.workunit.client.0.vm05.stdout:9/770: rename d6/d15/d3c/d4b/d90/fe0 to d6/d12/d3a/da2/ffe 0
2026-03-10T08:56:07.951 INFO:tasks.workunit.client.0.vm05.stdout:1/860: mkdir dd/d21/d37/d45/d8d/d128 0
2026-03-10T08:56:07.952 INFO:tasks.workunit.client.0.vm05.stdout:1/861: readlink dd/d10/d19/d9b/dc3/lfd 0
2026-03-10T08:56:07.961 INFO:tasks.workunit.client.1.vm08.stdout:3/949: dwrite d4/d15/d8/d1d/d4f/f10f [0,4194304] 0
2026-03-10T08:56:07.962 INFO:tasks.workunit.client.1.vm08.stdout:9/971: dwrite d2/d54/d8e/da6/dd0/dc8/fee [0,4194304] 0
2026-03-10T08:56:07.965 INFO:tasks.workunit.client.1.vm08.stdout:1/980: truncate d1/da/de/d24/d35/d43/d109/f12c 1668094 0
2026-03-10T08:56:07.977 INFO:tasks.workunit.client.1.vm08.stdout:6/993: write d9/d50/fa3 [3381920,43276] 0
2026-03-10T08:56:07.977 INFO:tasks.workunit.client.0.vm05.stdout:4/787: write d0/d2e/d71/fd9 [66079,25198] 0
2026-03-10T08:56:07.977 INFO:tasks.workunit.client.1.vm08.stdout:3/950: stat d4/d15/d8/d71/cdc 0
2026-03-10T08:56:07.977 INFO:tasks.workunit.client.1.vm08.stdout:9/972: chown d2/d41/d4c/dd2/d13c/d39/d4e/d87/c12b 1 1
2026-03-10T08:56:07.983 INFO:tasks.workunit.client.1.vm08.stdout:6/994: mkdir d9/d10/d1e/d7b/d153 0
2026-03-10T08:56:07.984 INFO:tasks.workunit.client.1.vm08.stdout:6/995: dread - d9/d50/de9/f14b zero size
2026-03-10T08:56:08.002 INFO:tasks.workunit.client.1.vm08.stdout:9/973: mkdir d2/dd/d15/d4f/df1/d102/d132/d146 0
2026-03-10T08:56:08.005 INFO:tasks.workunit.client.1.vm08.stdout:5/888: sync
2026-03-10T08:56:08.031 INFO:tasks.workunit.client.1.vm08.stdout:6/996: rename d9/d10/f67 to d9/dc/d11/d23/d2c/dc0/f154 0
2026-03-10T08:56:08.031 INFO:tasks.workunit.client.1.vm08.stdout:6/997: stat d9/dc/d11/d23/d2c/d7a/fd3 0
2026-03-10T08:56:08.036 INFO:tasks.workunit.client.1.vm08.stdout:6/998: dread d9/dc/d11/d106/f57 [0,4194304] 0
2026-03-10T08:56:08.048 INFO:tasks.workunit.client.1.vm08.stdout:5/889: mknod d0/d11/d27/d68/d7c/de5/c119 0
2026-03-10T08:56:08.049 INFO:tasks.workunit.client.0.vm05.stdout:8/798: chown d2/d45/cce 0 1
2026-03-10T08:56:08.053 INFO:tasks.workunit.client.0.vm05.stdout:0/803: symlink df/d1f/d85/d2b/lf0 0
2026-03-10T08:56:08.057 INFO:tasks.workunit.client.1.vm08.stdout:1/981: getdents d1/da/d20/d3f 0
2026-03-10T08:56:08.066 INFO:tasks.workunit.client.1.vm08.stdout:0/952: write d6/dd/d13/d17/d1f/d20/f46 [4422428,21165] 0
2026-03-10T08:56:08.067 INFO:tasks.workunit.client.0.vm05.stdout:7/756: rmdir d18 39
2026-03-10T08:56:08.088 INFO:tasks.workunit.client.1.vm08.stdout:6/999: getdents d9/dc/d11/d147 0
2026-03-10T08:56:08.096 INFO:tasks.workunit.client.0.vm05.stdout:3/850: dread d9/d2b/de7/df1/d43/d6e/feb [0,4194304] 0
2026-03-10T08:56:08.098 INFO:tasks.workunit.client.1.vm08.stdout:0/953: mknod d6/d8b/c144 0
2026-03-10T08:56:08.099 INFO:tasks.workunit.client.1.vm08.stdout:0/954: write d6/dd/d13/d17/d1f/d2d/d13e/fdf [1183846,59021] 0
2026-03-10T08:56:08.109 INFO:tasks.workunit.client.1.vm08.stdout:0/955: dread d6/dd/d13/d17/d1f/d2d/d85/d93/f140 [0,4194304] 0
2026-03-10T08:56:08.119 INFO:tasks.workunit.client.0.vm05.stdout:2/725: rename d0/d9/d1e/d20/d21/d45/d4b/d8d to d0/d55/db8/dcc/dd9 0
2026-03-10T08:56:08.123 INFO:tasks.workunit.client.0.vm05.stdout:9/771: creat d6/d19/d2a/dbc/fff x:0 0 0
2026-03-10T08:56:08.124 INFO:tasks.workunit.client.1.vm08.stdout:0/956: rmdir d6/dd/d13 39
2026-03-10T08:56:08.130 INFO:tasks.workunit.client.0.vm05.stdout:4/788: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/f101 x:0 0 0
2026-03-10T08:56:08.162 INFO:tasks.workunit.client.0.vm05.stdout:0/804: symlink df/d1f/d85/d19/d47/d84/dae/db3/lf1 0
2026-03-10T08:56:08.181 INFO:tasks.workunit.client.0.vm05.stdout:3/851: creat d9/d2b/de7/df1/d43/d6e/f109 x:0 0 0
2026-03-10T08:56:08.181 INFO:tasks.workunit.client.1.vm08.stdout:3/951: dwrite d4/d15/d8/d2c/d9b/d79/d20/f8b [0,4194304] 0
2026-03-10T08:56:08.186 INFO:tasks.workunit.client.0.vm05.stdout:2/726: unlink d0/d9/d7f/d8f/d6d/f81 0
2026-03-10T08:56:08.187 INFO:tasks.workunit.client.1.vm08.stdout:9/974: dwrite d2/d41/d53/ffd [0,4194304] 0
2026-03-10T08:56:08.187 INFO:tasks.workunit.client.1.vm08.stdout:9/975: readlink d2/d54/d8e/db7/l138 0
2026-03-10T08:56:08.203 INFO:tasks.workunit.client.0.vm05.stdout:4/789: readlink d0/dfe/de2/la4 0
2026-03-10T08:56:08.208 INFO:tasks.workunit.client.0.vm05.stdout:6/800: dwrite d4/d7/d10/d1a/d8c/ff9 [0,4194304] 0
2026-03-10T08:56:08.208 INFO:tasks.workunit.client.0.vm05.stdout:5/724: dwrite d5/f9c [0,4194304] 0
2026-03-10T08:56:08.219 INFO:tasks.workunit.client.0.vm05.stdout:0/805: mknod df/d1f/d85/d19/d62/cf2 0
2026-03-10T08:56:08.219 INFO:tasks.workunit.client.0.vm05.stdout:3/852: symlink d9/d4d/d51/d64/d89/l10a 0
2026-03-10T08:56:08.226 INFO:tasks.workunit.client.0.vm05.stdout:2/727: mkdir d0/d9/d7f/d8f/d6d/dda 0
2026-03-10T08:56:08.227 INFO:tasks.workunit.client.0.vm05.stdout:2/728: chown d0/d9/d1e/d20/d24/la6 1477152 1
2026-03-10T08:56:08.227 INFO:tasks.workunit.client.0.vm05.stdout:2/729: readlink d0/d55/da2/lb1 0
2026-03-10T08:56:08.239 INFO:tasks.workunit.client.0.vm05.stdout:4/790: symlink d0/d2e/d42/d45/d4a/d36/d37/l102 0
2026-03-10T08:56:08.239 INFO:tasks.workunit.client.0.vm05.stdout:6/801: dread - d4/d2d/d51/d87/da5/fe5 zero size
2026-03-10T08:56:08.240 INFO:tasks.workunit.client.0.vm05.stdout:7/757: unlink d18/d38/d43/d5c/l80 0
2026-03-10T08:56:08.246 INFO:tasks.workunit.client.0.vm05.stdout:1/862: rename dd/d10/d18/d2d/d5c/dac/f102 to dd/d10/d19/f129 0
2026-03-10T08:56:08.255 INFO:tasks.workunit.client.0.vm05.stdout:4/791: unlink d0/f10 0
2026-03-10T08:56:08.257 INFO:tasks.workunit.client.0.vm05.stdout:5/725: mknod d5/d86/d24/d2c/d41/d74/c107 0
2026-03-10T08:56:08.258 INFO:tasks.workunit.client.0.vm05.stdout:6/802: dread - d4/d7/d10/d1a/ff4 zero size
2026-03-10T08:56:08.261 INFO:tasks.workunit.client.0.vm05.stdout:7/758: unlink d18/d66/d25/d2e/d2f/cc8 0
2026-03-10T08:56:08.264 INFO:tasks.workunit.client.0.vm05.stdout:1/863: symlink dd/d10/d18/dd1/l12a 0
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:2/730: getdents d0/d9/d7f/db4 0
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:2/731: chown d0/d55/fd7 5 1
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:2/732: chown d0/d9/d1e/d20/f7c 175 1
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:4/792: chown d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fea 25369438 1
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:4/793: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67 1617 1
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:4/794: stat d0/d1d/l9a 0
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:4/795: chown d0/d2e/d42/d45/cc7 1623 1
2026-03-10T08:56:08.272 INFO:tasks.workunit.client.0.vm05.stdout:5/726: truncate d5/d48/f7e 2937219 0
2026-03-10T08:56:08.273 INFO:tasks.workunit.client.1.vm08.stdout:5/890: link d0/d11/d18/l24 d0/d11/d27/d68/d7c/d8e/df0/db5/l11a 0
2026-03-10T08:56:08.274 INFO:tasks.workunit.client.0.vm05.stdout:5/727: read d5/d86/d21/f1f [2766115,92456] 0
2026-03-10T08:56:08.279 INFO:tasks.workunit.client.0.vm05.stdout:8/799: rename d2/dd/l13 to d2/db/d1f/d67/l118 0
2026-03-10T08:56:08.283 INFO:tasks.workunit.client.1.vm08.stdout:1/982: write d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fb9 [2030324,48738] 0
2026-03-10T08:56:08.286 INFO:tasks.workunit.client.0.vm05.stdout:9/772: write d6/d19/d2a/d4a/faa [1007111,115461] 0
2026-03-10T08:56:08.297 INFO:tasks.workunit.client.1.vm08.stdout:0/957: fsync d6/dd/d13/d17/d1f/d20/d2f/d24/dc2/fc3 0
2026-03-10T08:56:08.310 INFO:tasks.workunit.client.0.vm05.stdout:4/796: chown d0/d2e/d42/d45/c8a 42555003 1
2026-03-10T08:56:08.320 INFO:tasks.workunit.client.0.vm05.stdout:3/853: dwrite d9/d2b/de7/df1/f44 [4194304,4194304] 0
2026-03-10T08:56:08.336 INFO:tasks.workunit.client.1.vm08.stdout:5/891: truncate d0/d11/d27/fe1 905726 0
2026-03-10T08:56:08.338 INFO:tasks.workunit.client.1.vm08.stdout:1/983: rename d1/da/de/d24/d3d/d40/d8e/f136 to d1/da/de/d24/d35/d6d/d82/da2/f149 0
2026-03-10T08:56:08.339 INFO:tasks.workunit.client.0.vm05.stdout:9/773: mknod d6/d15/d3c/d4b/d90/c100 0
2026-03-10T08:56:08.342 INFO:tasks.workunit.client.0.vm05.stdout:2/733: mknod d0/d9/d1e/cdb 0
2026-03-10T08:56:08.352 INFO:tasks.workunit.client.1.vm08.stdout:1/984: read d1/fc [4110803,74840] 0
2026-03-10T08:56:08.361 INFO:tasks.workunit.client.0.vm05.stdout:6/803: write d4/d7/f54 [834424,101458] 0
2026-03-10T08:56:08.361 INFO:tasks.workunit.client.1.vm08.stdout:3/952: write d4/d15/f7 [1650083,97322] 0
2026-03-10T08:56:08.365 INFO:tasks.workunit.client.0.vm05.stdout:1/864: dwrite dd/d21/d37/d45/d8d/fae [0,4194304] 0
2026-03-10T08:56:08.380 INFO:tasks.workunit.client.0.vm05.stdout:4/797: truncate d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f6b 2536141 0
2026-03-10T08:56:08.383 INFO:tasks.workunit.client.1.vm08.stdout:9/976: dwrite d2/d41/d4c/dd2/d13c/d39/d4e/fcf [0,4194304] 0
2026-03-10T08:56:08.385 INFO:tasks.workunit.client.0.vm05.stdout:5/728: dwrite d5/d86/d24/d2c/d41/d74/fa8 [0,4194304] 0
2026-03-10T08:56:08.402 INFO:tasks.workunit.client.0.vm05.stdout:8/800: link d2/dd/d2c/d2e/d31/f111 d2/dd/d74/d78/f119 0
2026-03-10T08:56:08.402 INFO:tasks.workunit.client.0.vm05.stdout:6/804: rmdir d4/d7/d10/d1a/d89 39
2026-03-10T08:56:08.410 INFO:tasks.workunit.client.0.vm05.stdout:0/806: rename df/d1f/d48/l7c to df/d1f/d85/d2b/d65/d6e/d96/lf3 0
2026-03-10T08:56:08.411 INFO:tasks.workunit.client.0.vm05.stdout:5/729: mkdir d5/df/dbb/d108 0
2026-03-10T08:56:08.411 INFO:tasks.workunit.client.0.vm05.stdout:4/798: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/fa2 [46690,86039] 0
2026-03-10T08:56:08.416 INFO:tasks.workunit.client.1.vm08.stdout:1/985: rename d1/da/d20 to d1/da/de/d24/d35/d6d/d14a 0
2026-03-10T08:56:08.417 INFO:tasks.workunit.client.0.vm05.stdout:0/807: dread df/d1f/d85/fb5 [0,4194304] 0
2026-03-10T08:56:08.422 INFO:tasks.workunit.client.0.vm05.stdout:2/734: link d0/d9/d1e/d20/fc8 d0/d9/d1e/d20/d21/d45/d4b/d75/fdc 0
2026-03-10T08:56:08.441 INFO:tasks.workunit.client.0.vm05.stdout:4/799: symlink d0/dfe/l103 0
2026-03-10T08:56:08.441 INFO:tasks.workunit.client.0.vm05.stdout:7/759: rename d18/d66/d25/d2e/d2f/d6d/fba to d18/d66/d25/d2e/d2f/d6d/dc1/fee 0
2026-03-10T08:56:08.441 INFO:tasks.workunit.client.0.vm05.stdout:5/730: mkdir d5/df/dbb/d43/dcd/d109 0
2026-03-10T08:56:08.441 INFO:tasks.workunit.client.0.vm05.stdout:7/760: symlink d18/d38/dc7/de3/lef 0
2026-03-10T08:56:08.441 INFO:tasks.workunit.client.0.vm05.stdout:8/801: dread d2/dd/d2c/d2e/d31/d4f/d7b/f8a [0,4194304] 0
2026-03-10T08:56:08.443 INFO:tasks.workunit.client.0.vm05.stdout:9/774: sync
2026-03-10T08:56:08.446 INFO:tasks.workunit.client.1.vm08.stdout:3/953: dread d4/d6f/dca/fcc [0,4194304] 0
2026-03-10T08:56:08.452 INFO:tasks.workunit.client.0.vm05.stdout:2/735: creat d0/d9/d1e/d20/d21/fdd x:0 0 0
2026-03-10T08:56:08.457 INFO:tasks.workunit.client.0.vm05.stdout:4/800: creat d0/d2e/d42/d45/d4a/d36/dbe/f104 x:0 0 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.1.vm08.stdout:3/954: link d4/c38 d4/d15/d8/d1d/d107/d10a/d120/c144 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.1.vm08.stdout:3/955: symlink d4/d15/d8/d2c/d131/l145 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.1.vm08.stdout:3/956: stat d4/d15/d8/d2c/d9b/d79/d8f/l134 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.0.vm05.stdout:7/761: mkdir d18/d1b/df0 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.0.vm05.stdout:7/762: read d18/d66/d25/d2e/fa4 [1934893,63839] 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.0.vm05.stdout:7/763: rename d18/d38/dc7/de3/d53/f7e to d18/d38/d43/d6e/ff1 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.0.vm05.stdout:7/764: stat d18/d66/d25/d2e/d2f/d6d/dc1/le2 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.0.vm05.stdout:8/802: unlink d2/db/d47/l56 0
2026-03-10T08:56:08.483 INFO:tasks.workunit.client.0.vm05.stdout:7/765: rename d18/d38/dc7/de3/dc6/le0 to d18/d66/d25/d2e/de7/lf2 0
2026-03-10T08:56:08.624 INFO:tasks.workunit.client.1.vm08.stdout:0/958: dwrite d6/dd/d13/d17/d1f/da3/f134 [0,4194304] 0
2026-03-10T08:56:08.645 INFO:tasks.workunit.client.0.vm05.stdout:3/854: dwrite d9/d2b/d2f/fee [0,4194304] 0
2026-03-10T08:56:08.647 INFO:tasks.workunit.client.1.vm08.stdout:0/959: rmdir d6/dd/d13/d17/d1f/d2d/d38 39
2026-03-10T08:56:08.648 INFO:tasks.workunit.client.0.vm05.stdout:8/803: mknod d2/dd/c11a 0
2026-03-10T08:56:08.650 INFO:tasks.workunit.client.1.vm08.stdout:5/892: dwrite d0/d11/d3e/f73 [0,4194304] 0
2026-03-10T08:56:08.651 INFO:tasks.workunit.client.1.vm08.stdout:9/977: dwrite d2/d41/d4c/dd2/d13c/d25/d32/d5c/f105 [0,4194304] 0
2026-03-10T08:56:08.656 INFO:tasks.workunit.client.0.vm05.stdout:3/855: stat d9/d2b/de7/df1/d43/d6e/f9f 0
2026-03-10T08:56:08.656 INFO:tasks.workunit.client.1.vm08.stdout:5/893: chown d0/d11/d18/fe6 62042728 1
2026-03-10T08:56:08.661 INFO:tasks.workunit.client.0.vm05.stdout:1/865: dwrite dd/d10/d19/f2e [0,4194304] 0
2026-03-10T08:56:08.690 INFO:tasks.workunit.client.0.vm05.stdout:9/775: getdents d6/d19/d2c/d84 0
2026-03-10T08:56:08.690 INFO:tasks.workunit.client.0.vm05.stdout:8/804: unlink d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/lf2 0
2026-03-10T08:56:08.690 INFO:tasks.workunit.client.0.vm05.stdout:3/856: rename d9/d2b/de7/df1/dd6/ffe to d9/d8f/dde/f10b 0
2026-03-10T08:56:08.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:08 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:08 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:08 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:08 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.716 INFO:tasks.workunit.client.0.vm05.stdout:0/808: write df/d1f/d85/d19/d5b/f78 [355473,126789] 0
2026-03-10T08:56:08.721 INFO:tasks.workunit.client.0.vm05.stdout:6/805: dwrite d4/d7/d10/d1a/d89/fc1 [0,4194304] 0
2026-03-10T08:56:08.724 INFO:tasks.workunit.client.0.vm05.stdout:6/806: stat d4/d7/d10/d15/d1b/d22 0
2026-03-10T08:56:08.730 INFO:tasks.workunit.client.0.vm05.stdout:3/857: getdents d9 0
2026-03-10T08:56:08.731 INFO:tasks.workunit.client.0.vm05.stdout:3/858: chown d9/d2b/de7/df1/d6c/fb2 251644 1
2026-03-10T08:56:08.733 INFO:tasks.workunit.client.0.vm05.stdout:3/859: readlink d9/d4d/d51/d64/d89/l10a 0
2026-03-10T08:56:08.734 INFO:tasks.workunit.client.0.vm05.stdout:6/807: unlink d4/d7/le 0
2026-03-10T08:56:08.735 INFO:tasks.workunit.client.0.vm05.stdout:9/776: sync
2026-03-10T08:56:08.736 INFO:tasks.workunit.client.0.vm05.stdout:9/777: chown d6/fb0 4 1
2026-03-10T08:56:08.737 INFO:tasks.workunit.client.0.vm05.stdout:6/808: rmdir d4/d92/db0 39
2026-03-10T08:56:08.741 INFO:tasks.workunit.client.0.vm05.stdout:9/778: creat d6/d15/d3c/d4b/d82/de9/f101 x:0 0 0
2026-03-10T08:56:08.747 INFO:tasks.workunit.client.0.vm05.stdout:0/809: dread df/d1f/d85/f29 [0,4194304] 0
2026-03-10T08:56:08.750 INFO:tasks.workunit.client.0.vm05.stdout:9/779: dwrite d6/d19/d2a/dbc/fff [0,4194304] 0
2026-03-10T08:56:08.752 INFO:tasks.workunit.client.0.vm05.stdout:6/809: symlink d4/d2d/d51/d87/l116 0
2026-03-10T08:56:08.768 INFO:tasks.workunit.client.0.vm05.stdout:5/731: dwrite d5/d86/d24/d2c/d41/d74/da9/fee [4194304,4194304] 0
2026-03-10T08:56:08.771 INFO:tasks.workunit.client.0.vm05.stdout:9/780: rename d6/d12/d3a/da2/ccb to d6/d12/d3a/de5/dd4/c102 0
2026-03-10T08:56:08.781 INFO:tasks.workunit.client.0.vm05.stdout:6/810: truncate d4/d7/d10/d15/d20/f48 202372 0
2026-03-10T08:56:08.783 INFO:tasks.workunit.client.0.vm05.stdout:2/736: write d0/d9/d7f/d8f/f37 [1751243,10234] 0
2026-03-10T08:56:08.786 INFO:tasks.workunit.client.0.vm05.stdout:5/732: unlink d5/d86/d21/f1f 0
2026-03-10T08:56:08.793 INFO:tasks.workunit.client.0.vm05.stdout:0/810: creat df/ff4 x:0 0 0
2026-03-10T08:56:08.800 INFO:tasks.workunit.client.0.vm05.stdout:6/811: sync
2026-03-10T08:56:08.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:08 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:08 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:08 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:08 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu'
2026-03-10T08:56:08.806 INFO:tasks.workunit.client.0.vm05.stdout:4/801: truncate d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/f94 2133282 0
2026-03-10T08:56:08.814 INFO:tasks.workunit.client.0.vm05.stdout:6/812: truncate d4/f61 1692153 0
2026-03-10T08:56:08.832 INFO:tasks.workunit.client.0.vm05.stdout:5/733: creat d5/df/dbb/d108/f10a x:0 0 0
2026-03-10T08:56:08.839 INFO:tasks.workunit.client.0.vm05.stdout:6/813: dread - d4/d7/d10/d15/d1b/f108 zero size
2026-03-10T08:56:08.867 INFO:tasks.workunit.client.0.vm05.stdout:4/802: mknod d0/d2e/d42/d45/d4a/d36/dbe/d32/c105 0
2026-03-10T08:56:08.881 INFO:tasks.workunit.client.0.vm05.stdout:4/803: rmdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7 39
2026-03-10T08:56:08.913 INFO:tasks.workunit.client.0.vm05.stdout:5/734: getdents d5/d48/d64/dc4 0
2026-03-10T08:56:08.927 INFO:tasks.workunit.client.0.vm05.stdout:5/735: dread d5/d86/d21/f30 [0,4194304] 0
2026-03-10T08:56:08.976 INFO:tasks.workunit.client.0.vm05.stdout:5/736: dread d5/d86/d21/f9e [0,4194304] 0
2026-03-10T08:56:08.991 INFO:tasks.workunit.client.0.vm05.stdout:5/737: link d5/d86/d21/l63 d5/d86/d24/d2c/d41/l10b 0
2026-03-10T08:56:08.994 INFO:tasks.workunit.client.0.vm05.stdout:4/804: rename d0/d2e/d71/de3 to d0/d2c/d6a/dc9/d106 0
2026-03-10T08:56:09.014 INFO:tasks.workunit.client.0.vm05.stdout:7/766: dwrite d18/d66/d25/d2e/d2f/d6d/fbc [0,4194304] 0
2026-03-10T08:56:09.016 INFO:tasks.workunit.client.1.vm08.stdout:3/957: dwrite d4/f106 [0,4194304] 0
2026-03-10T08:56:09.019 INFO:tasks.workunit.client.0.vm05.stdout:7/767: read d18/d38/d43/d5c/f5f [2609771,100654] 0
2026-03-10T08:56:09.032 INFO:tasks.workunit.client.0.vm05.stdout:5/738: link d5/d48/d64/dc4/ffb d5/df/d37/dd2/d76/dde/f10c 0
2026-03-10T08:56:09.037 INFO:tasks.workunit.client.0.vm05.stdout:4/805: rename d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/ddd to d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107 0
2026-03-10T08:56:09.043 INFO:tasks.workunit.client.0.vm05.stdout:4/806: dread d0/d2e/d71/fd9 [0,4194304] 0
2026-03-10T08:56:09.053 INFO:tasks.workunit.client.0.vm05.stdout:7/768: mknod d18/d38/dc7/de3/d74/cf3 0
2026-03-10T08:56:09.054 INFO:tasks.workunit.client.0.vm05.stdout:7/769: chown d18/d66/d25/d2e/de7/l98 399 1
2026-03-10T08:56:09.075 INFO:tasks.workunit.client.0.vm05.stdout:7/770: dread d18/fb1 [0,4194304] 0
2026-03-10T08:56:09.112 INFO:tasks.workunit.client.1.vm08.stdout:9/978: dwrite d2/d41/d4c/dd2/d13c/d94/f106 [0,4194304] 0
2026-03-10T08:56:09.115 INFO:tasks.workunit.client.0.vm05.stdout:1/866: write dd/d10/d19/d27/ffa [1001341,99836] 0
2026-03-10T08:56:09.122 INFO:tasks.workunit.client.0.vm05.stdout:1/867: dwrite dd/d10/d18/dd5/fbf [0,4194304] 0
2026-03-10T08:56:09.122 INFO:tasks.workunit.client.0.vm05.stdout:1/868: write dd/d10/d18/d20/d52/d80/f124 [771978,126295] 0
2026-03-10T08:56:09.123 INFO:tasks.workunit.client.1.vm08.stdout:9/979: symlink d2/d54/d8e/da6/dd0/dc8/de1/l147 0
2026-03-10T08:56:09.127 INFO:tasks.workunit.client.0.vm05.stdout:1/869: creat dd/d13/dd2/f12b x:0 0 0
2026-03-10T08:56:09.128 INFO:tasks.workunit.client.1.vm08.stdout:9/980: mkdir d2/dd/d15/d4f/df1/d102/d132/d146/d148 0
2026-03-10T08:56:09.132 INFO:tasks.workunit.client.1.vm08.stdout:0/960: dwrite d6/dd/d13/d17/d1f/d20/f100 [0,4194304] 0
2026-03-10T08:56:09.136 INFO:tasks.workunit.client.0.vm05.stdout:8/805: write d2/dd/d74/d78/f119 [884099,82958] 0
2026-03-10T08:56:09.143 INFO:tasks.workunit.client.1.vm08.stdout:0/961: readlink d6/dd/d13/d17/d1f/d2d/d38/d98/d12f/l110 0
2026-03-10T08:56:09.145 INFO:tasks.workunit.client.1.vm08.stdout:0/962: chown d6/dd/d13/d17/d1f/d2d/d38/le0 48993 1
2026-03-10T08:56:09.146 INFO:tasks.workunit.client.1.vm08.stdout:1/986: dwrite d1/da/de/d24/d35/d6d/d116/f78 [0,4194304] 0
2026-03-10T08:56:09.151 INFO:tasks.workunit.client.0.vm05.stdout:8/806: rename d2/dd/d2c/d2e/d31/d3e/dde/lc6 to d2/dfc/l11b 0
2026-03-10T08:56:09.159 INFO:tasks.workunit.client.0.vm05.stdout:8/807: mknod d2/dd/d2c/d2e/d31/d4f/d7b/d9e/c11c 0
2026-03-10T08:56:09.160 INFO:tasks.workunit.client.0.vm05.stdout:8/808: chown d2/db/d1f/d67/d8d/cef 30786 1
2026-03-10T08:56:09.164 INFO:tasks.workunit.client.0.vm05.stdout:8/809: fsync d2/dd/d2c/d2e/d31/d4f/d7b/d9e/ff6 0
2026-03-10T08:56:09.169 INFO:tasks.workunit.client.0.vm05.stdout:8/810: creat d2/db/d28/d99/f11d x:0 0 0
2026-03-10T08:56:09.169 INFO:tasks.workunit.client.0.vm05.stdout:8/811: stat d2/dd/d2c/d2e/d108 0
2026-03-10T08:56:09.187 INFO:tasks.workunit.client.0.vm05.stdout:0/811: dwrite df/d1f/d85/d19/d47/fa5 [0,4194304] 0
2026-03-10T08:56:09.189 INFO:tasks.workunit.client.0.vm05.stdout:2/737: write d0/f8 [569570,65420] 0
2026-03-10T08:56:09.194 INFO:tasks.workunit.client.0.vm05.stdout:2/738: rename d0/d9/d89/da3 to d0/d55/dde 0
2026-03-10T08:56:09.194 INFO:tasks.workunit.client.0.vm05.stdout:0/812: chown df/d1f/d85/d19/d62/lb7 39 1
2026-03-10T08:56:09.201 INFO:tasks.workunit.client.0.vm05.stdout:0/813: mknod df/d1f/dcd/de6/cf5 0
2026-03-10T08:56:09.209 INFO:tasks.workunit.client.0.vm05.stdout:0/814: link df/d1f/d85/fd4 df/d1f/d95/ff6 0
2026-03-10T08:56:09.209 INFO:tasks.workunit.client.0.vm05.stdout:6/814: write d4/f10d [1978679,24523] 0
2026-03-10T08:56:09.228 INFO:tasks.workunit.client.0.vm05.stdout:6/815: dread - d4/d2d/d51/d87/ffb zero size
2026-03-10T08:56:09.237 INFO:tasks.workunit.client.0.vm05.stdout:0/815: getdents df/d1f/d85/d2b 0
2026-03-10T08:56:09.240 INFO:tasks.workunit.client.1.vm08.stdout:9/981: dread d2/d41/d4c/d66/d82/ff6 [0,4194304] 0
2026-03-10T08:56:09.241 INFO:tasks.workunit.client.1.vm08.stdout:9/982: readlink d2/l4a 0
2026-03-10T08:56:09.242 INFO:tasks.workunit.client.0.vm05.stdout:0/816: rmdir df/d1f/dcd/de6 39
2026-03-10T08:56:09.364 INFO:tasks.workunit.client.0.vm05.stdout:9/781: rmdir d6/d12/d3a/d48 39
2026-03-10T08:56:09.365 INFO:tasks.workunit.client.0.vm05.stdout:9/782: fdatasync d6/d19/f29 0
2026-03-10T08:56:09.367 INFO:tasks.workunit.client.0.vm05.stdout:9/783: mknod d6/d19/d2c/d58/c103 0
2026-03-10T08:56:09.378 INFO:tasks.workunit.client.1.vm08.stdout:5/894: rename d0/d11/d27/d68/d7c/d4b/l63 to d0/d11/l11b 0
2026-03-10T08:56:09.381 INFO:tasks.workunit.client.1.vm08.stdout:3/958: rename d4/d15/d8/d2c/d9b/d79/d20/f99 to d4/d15/d8/d2c/d9b/d79/f146 0
2026-03-10T08:56:09.382 INFO:tasks.workunit.client.1.vm08.stdout:3/959: chown d4/d15/c57 27649 1
2026-03-10T08:56:09.385 INFO:tasks.workunit.client.1.vm08.stdout:0/963: rename d6/dd/d13/d17/d1f/d20/d2f/d26/c9d to d6/dd/d13/d17/d1f/d20/d2f/d24/d142/c145 0
2026-03-10T08:56:09.386 INFO:tasks.workunit.client.1.vm08.stdout:3/960: read - d4/d15/d8/d1d/d117/f124 zero size
2026-03-10T08:56:09.388 INFO:tasks.workunit.client.1.vm08.stdout:1/987: rename d1/da/d18/d3b/lf9 to d1/da/de/d24/d35/d6d/d116/d9c/d139/l14b 0
2026-03-10T08:56:09.389 INFO:tasks.workunit.client.0.vm05.stdout:5/739: write d5/df/d37/dd2/fa5 [925286,111583] 0
2026-03-10T08:56:09.390 INFO:tasks.workunit.client.1.vm08.stdout:1/988: write d1/da/de/d24/d3d/d40/d8e/dd2/d7f/fb9 [261033,63452] 0
2026-03-10T08:56:09.395 INFO:tasks.workunit.client.1.vm08.stdout:1/989: fdatasync d1/f65 0
2026-03-10T08:56:09.396 INFO:tasks.workunit.client.0.vm05.stdout:7/771: write d18/d66/f3f [3467342,2900] 0
2026-03-10T08:56:09.396 INFO:tasks.workunit.client.1.vm08.stdout:1/990: write d1/da/de/d24/d3d/d40/d56/f146 [275923,115227] 0
2026-03-10T08:56:09.398 INFO:tasks.workunit.client.1.vm08.stdout:3/961: dread d4/d15/d8/d71/faf [0,4194304] 0
2026-03-10T08:56:09.401 INFO:tasks.workunit.client.0.vm05.stdout:5/740: rename d5/df/d37/d68/fd7 to d5/d86/d24/d2c/d41/dca/f10d 0
2026-03-10T08:56:09.404 INFO:tasks.workunit.client.0.vm05.stdout:7/772: mknod d18/d38/dc7/de3/d9c/dac/cf4 0
2026-03-10T08:56:09.408 INFO:tasks.workunit.client.0.vm05.stdout:5/741: rename d5/d48/d64/d95/dac/fd1 to d5/df/dbb/d43/f10e 0
2026-03-10T08:56:09.408 INFO:tasks.workunit.client.1.vm08.stdout:3/962: symlink d4/d6f/l147 0
2026-03-10T08:56:09.411 INFO:tasks.workunit.client.0.vm05.stdout:5/742: mknod d5/d86/d24/d84/db8/dcc/c10f 0
2026-03-10T08:56:09.411 INFO:tasks.workunit.client.0.vm05.stdout:5/743: stat d5/d86/la0 0
2026-03-10T08:56:09.412 INFO:tasks.workunit.client.0.vm05.stdout:5/744: chown d5/f9 300422 1
2026-03-10T08:56:09.415 INFO:tasks.workunit.client.0.vm05.stdout:1/870: dwrite dd/d21/d37/d7c/dab/f101 [0,4194304] 0
2026-03-10T08:56:09.424 INFO:tasks.workunit.client.0.vm05.stdout:5/745: mkdir d5/df/d37/d68/d110 0
2026-03-10T08:56:09.424 INFO:tasks.workunit.client.0.vm05.stdout:1/871: mknod dd/d10/d19/d4d/d88/c12c 0
2026-03-10T08:56:09.436 INFO:tasks.workunit.client.0.vm05.stdout:5/746: symlink d5/d86/d39/l111 0
2026-03-10T08:56:09.459 INFO:tasks.workunit.client.0.vm05.stdout:1/872: mkdir dd/d10/d19/d27/d12d 0
2026-03-10T08:56:09.459 INFO:tasks.workunit.client.0.vm05.stdout:5/747: symlink d5/d86/d24/d84/db8/l112 0
2026-03-10T08:56:09.459 INFO:tasks.workunit.client.0.vm05.stdout:5/748: stat d5/df 0
2026-03-10T08:56:09.459 INFO:tasks.workunit.client.0.vm05.stdout:5/749: dwrite d5/d48/d64/d95/dac/dc6/f104 [0,4194304] 0
2026-03-10T08:56:09.459 INFO:tasks.workunit.client.0.vm05.stdout:8/812: truncate d2/dd/d2c/d2e/f3b 3092397 0
2026-03-10T08:56:09.460 INFO:tasks.workunit.client.0.vm05.stdout:8/813: chown d2/dd/f1a 3112 1
2026-03-10T08:56:09.460 INFO:tasks.workunit.client.0.vm05.stdout:8/814: rename d2/dd/d2c/d2e/d31/d3e/d5d/f92 to d2/db/d1f/d67/d8d/d10a/f11e 0
2026-03-10T08:56:09.460 INFO:tasks.workunit.client.0.vm05.stdout:1/873: getdents dd/d10/d18/d20/d52/d80 0
2026-03-10T08:56:09.460 INFO:tasks.workunit.client.0.vm05.stdout:1/874: fsync dd/d10/d18/d2d/d5c/f100 0
2026-03-10T08:56:09.460 INFO:tasks.workunit.client.0.vm05.stdout:1/875: readlink dd/d10/d19/d4d/l110 0
2026-03-10T08:56:09.468 INFO:tasks.workunit.client.1.vm08.stdout:0/964: rmdir d6/dd/d13/d32 39
2026-03-10T08:56:09.468 INFO:tasks.workunit.client.1.vm08.stdout:1/991: dread d1/da/de/d24/d3d/d40/d8e/dd2/fdc [0,4194304] 0
2026-03-10T08:56:09.470 INFO:tasks.workunit.client.1.vm08.stdout:3/963: sync
2026-03-10T08:56:09.476 INFO:tasks.workunit.client.0.vm05.stdout:1/876: fsync dd/d10/d19/d4d/f74 0
2026-03-10T08:56:09.477 INFO:tasks.workunit.client.1.vm08.stdout:0/965: dread - d6/dd/d13/d17/d1f/d2d/d13e/fd1 zero size
2026-03-10T08:56:09.479 INFO:tasks.workunit.client.0.vm05.stdout:3/860: rmdir d9/d2b/de7/df1/d6c 39
2026-03-10T08:56:09.483 INFO:tasks.workunit.client.1.vm08.stdout:1/992: mkdir d1/da/de/d24/d3d/d40/d8e/dd2/d7f/d14c 0
2026-03-10T08:56:09.484 INFO:tasks.workunit.client.1.vm08.stdout:1/993: dread - d1/da/de/d24/d35/d6d/d116/f61 zero size
2026-03-10T08:56:09.488 INFO:tasks.workunit.client.0.vm05.stdout:6/816: write d4/d7/f34 [7163170,97398] 0
2026-03-10T08:56:09.489 INFO:tasks.workunit.client.0.vm05.stdout:2/739: dwrite d0/d9/d1e/d20/d21/f77 [0,4194304] 0
2026-03-10T08:56:09.503 INFO:tasks.workunit.client.1.vm08.stdout:3/964: creat d4/d15/d8/d2c/d9b/d79/d20/f148 x:0 0 0
2026-03-10T08:56:09.504 INFO:tasks.workunit.client.1.vm08.stdout:9/983: dwrite d2/d41/d4c/dd2/d13c/d25/f119 [0,4194304] 0
2026-03-10T08:56:09.509 INFO:tasks.workunit.client.0.vm05.stdout:0/817: write df/d1f/d85/d19/d39/f61 [519118,27974] 0
2026-03-10T08:56:09.510 INFO:tasks.workunit.client.1.vm08.stdout:1/994: mknod d1/da/de/d24/d81/d121/c14d 0
2026-03-10T08:56:09.514 INFO:tasks.workunit.client.0.vm05.stdout:9/784: write d6/d27/fa6 [801082,18333] 0
2026-03-10T08:56:09.518 INFO:tasks.workunit.client.1.vm08.stdout:5/895: dwrite d0/d11/d27/d50/ffb [0,4194304] 0
2026-03-10T08:56:09.520 INFO:tasks.workunit.client.1.vm08.stdout:5/896: chown d0/d11/d27/d68/cee 15276270 1
2026-03-10T08:56:09.521 INFO:tasks.workunit.client.0.vm05.stdout:3/861: symlink d9/d8f/d50/d5f/d7b/l10c 0
2026-03-10T08:56:09.528 INFO:tasks.workunit.client.1.vm08.stdout:3/965: mknod d4/d6f/dca/df9/c149 0
2026-03-10T08:56:09.538 INFO:tasks.workunit.client.0.vm05.stdout:2/740: fdatasync d0/d55/f60 0
2026-03-10T08:56:09.539 INFO:tasks.workunit.client.0.vm05.stdout:0/818: mkdir df/d1f/d85/d19/d39/d4d/d9f/df7 0
2026-03-10T08:56:09.539 INFO:tasks.workunit.client.0.vm05.stdout:0/819: readlink df/d1f/d85/d19/d62/lb8 0
2026-03-10T08:56:09.539 INFO:tasks.workunit.client.0.vm05.stdout:0/820: dread - df/dd8/d67/f80 zero size
2026-03-10T08:56:09.539 INFO:tasks.workunit.client.0.vm05.stdout:7/773: write d18/d38/d43/d6e/f9a [433183,2207] 0
2026-03-10T08:56:09.539 INFO:tasks.workunit.client.0.vm05.stdout:7/774: chown d18/d66/d25/d2e/d2f/fc4 159 1
2026-03-10T08:56:09.542 INFO:tasks.workunit.client.1.vm08.stdout:5/897: mknod d0/d11/d18/df5/dfc/c11c 0
2026-03-10T08:56:09.544 INFO:tasks.workunit.client.0.vm05.stdout:4/807: link
d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fa7 d0/d2e/f108 0 2026-03-10T08:56:09.545 INFO:tasks.workunit.client.0.vm05.stdout:4/808: fdatasync d0/d2e/d42/d45/d4a/f26 0 2026-03-10T08:56:09.549 INFO:tasks.workunit.client.0.vm05.stdout:3/862: mkdir d9/d2b/de7/df1/d43/d6e/dba/d10d 0 2026-03-10T08:56:09.556 INFO:tasks.workunit.client.0.vm05.stdout:5/750: write d5/df/d37/dd2/d76/dde/f10c [1707989,94606] 0 2026-03-10T08:56:09.556 INFO:tasks.workunit.client.0.vm05.stdout:8/815: dwrite d2/db/f1b [0,4194304] 0 2026-03-10T08:56:09.578 INFO:tasks.workunit.client.1.vm08.stdout:1/995: creat d1/da/de/d24/d35/d6d/f14e x:0 0 0 2026-03-10T08:56:09.582 INFO:tasks.workunit.client.0.vm05.stdout:7/775: readlink d18/d66/l96 0 2026-03-10T08:56:09.586 INFO:tasks.workunit.client.0.vm05.stdout:9/785: mkdir d6/d15/d104 0 2026-03-10T08:56:09.588 INFO:tasks.workunit.client.1.vm08.stdout:1/996: symlink d1/da/de/d24/d35/d6d/d116/d9c/d139/d13d/l14f 0 2026-03-10T08:56:09.588 INFO:tasks.workunit.client.0.vm05.stdout:6/817: sync 2026-03-10T08:56:09.604 INFO:tasks.workunit.client.1.vm08.stdout:5/898: creat d0/d11/d27/d68/d7c/d4b/f11d x:0 0 0 2026-03-10T08:56:09.616 INFO:tasks.workunit.client.0.vm05.stdout:3/863: mkdir d9/d2b/de7/df1/d43/d71/d86/d10e 0 2026-03-10T08:56:09.617 INFO:tasks.workunit.client.1.vm08.stdout:1/997: creat d1/da/de/d24/d81/d121/f150 x:0 0 0 2026-03-10T08:56:09.627 INFO:tasks.workunit.client.1.vm08.stdout:0/966: write d6/dd/d13/d17/d50/fac [772389,98497] 0 2026-03-10T08:56:09.628 INFO:tasks.workunit.client.1.vm08.stdout:5/899: symlink d0/d11/d27/d68/d7c/d4b/d10e/l11e 0 2026-03-10T08:56:09.630 INFO:tasks.workunit.client.0.vm05.stdout:9/786: creat d6/d15/d3c/d4b/d82/f105 x:0 0 0 2026-03-10T08:56:09.643 INFO:tasks.workunit.client.1.vm08.stdout:9/984: dwrite d2/d41/d4c/dd2/d13c/d25/d32/f8c [0,4194304] 0 2026-03-10T08:56:09.647 INFO:tasks.workunit.client.0.vm05.stdout:7/776: dread d18/d66/f2d [0,4194304] 0 2026-03-10T08:56:09.653 INFO:tasks.workunit.client.0.vm05.stdout:6/818: creat 
d4/d7/d10/d1a/d8c/f117 x:0 0 0 2026-03-10T08:56:09.653 INFO:tasks.workunit.client.0.vm05.stdout:1/877: write dd/d10/d18/dd1/ff9 [5563307,117702] 0 2026-03-10T08:56:09.654 INFO:tasks.workunit.client.1.vm08.stdout:3/966: write d4/d15/d8/d2c/d9b/d79/f146 [2822534,124791] 0 2026-03-10T08:56:09.654 INFO:tasks.workunit.client.0.vm05.stdout:6/819: chown d4/d7/d10/d1a/l1c 7 1 2026-03-10T08:56:09.660 INFO:tasks.workunit.client.1.vm08.stdout:0/967: creat d6/dd/d13/d17/d1f/d2d/d39/f146 x:0 0 0 2026-03-10T08:56:09.660 INFO:tasks.workunit.client.0.vm05.stdout:0/821: write df/d1f/d85/d19/d39/d4d/fe3 [588854,56038] 0 2026-03-10T08:56:09.679 INFO:tasks.workunit.client.1.vm08.stdout:1/998: dwrite d1/da/de/d24/d35/d43/ffb [0,4194304] 0 2026-03-10T08:56:09.680 INFO:tasks.workunit.client.0.vm05.stdout:3/864: rename d9/d2b/d2f/d57/c98 to d9/d2b/de7/df1/dd6/c10f 0 2026-03-10T08:56:09.681 INFO:tasks.workunit.client.0.vm05.stdout:3/865: readlink d9/d2b/l47 0 2026-03-10T08:56:09.682 INFO:tasks.workunit.client.1.vm08.stdout:1/999: chown d1/da/de/d24/d35/d6d/d116/d9c/f110 330362 1 2026-03-10T08:56:09.692 INFO:tasks.workunit.client.0.vm05.stdout:4/809: write d0/d2e/d42/d45/d4a/d36/dbe/f28 [2644642,16350] 0 2026-03-10T08:56:09.692 INFO:tasks.workunit.client.1.vm08.stdout:5/900: write d0/d11/d27/d68/d7c/d8e/df0/db5/fcd [707894,48646] 0 2026-03-10T08:56:09.703 INFO:tasks.workunit.client.1.vm08.stdout:0/968: creat d6/dd/d13/d17/d1f/d2d/d85/d93/f147 x:0 0 0 2026-03-10T08:56:09.705 INFO:tasks.workunit.client.0.vm05.stdout:7/777: mkdir d18/d38/dc7/de3/d9c/dac/df5 0 2026-03-10T08:56:09.707 INFO:tasks.workunit.client.0.vm05.stdout:7/778: dread - d18/d66/d25/d2e/d2f/fd8 zero size 2026-03-10T08:56:09.713 INFO:tasks.workunit.client.0.vm05.stdout:7/779: dread d18/d66/d25/f47 [0,4194304] 0 2026-03-10T08:56:09.718 INFO:tasks.workunit.client.1.vm08.stdout:9/985: write d2/d41/d4c/f62 [1126356,119541] 0 2026-03-10T08:56:09.719 INFO:tasks.workunit.client.1.vm08.stdout:9/986: read d2/d41/d4c/dd2/d13c/d25/f4b 
[4309350,99801] 0 2026-03-10T08:56:09.720 INFO:tasks.workunit.client.0.vm05.stdout:1/878: fdatasync dd/d21/d37/d45/f47 0 2026-03-10T08:56:09.727 INFO:tasks.workunit.client.0.vm05.stdout:6/820: unlink d4/d2d/d51/d62/d113/fca 0 2026-03-10T08:56:09.730 INFO:tasks.workunit.client.1.vm08.stdout:5/901: fdatasync d0/d11/d27/d50/f55 0 2026-03-10T08:56:09.733 INFO:tasks.workunit.client.0.vm05.stdout:0/822: unlink f6 0 2026-03-10T08:56:09.733 INFO:tasks.workunit.client.0.vm05.stdout:0/823: chown df/d1f/d85/d19/d47/d84 27 1 2026-03-10T08:56:09.745 INFO:tasks.workunit.client.1.vm08.stdout:3/967: dwrite d4/d15/d8/d2c/d9b/d79/d8f/f11c [0,4194304] 0 2026-03-10T08:56:09.747 INFO:tasks.workunit.client.0.vm05.stdout:2/741: link d0/d9/d1e/c5b d0/d55/dde/cdf 0 2026-03-10T08:56:09.748 INFO:tasks.workunit.client.1.vm08.stdout:5/902: symlink d0/d11/d27/d50/l11f 0 2026-03-10T08:56:09.749 INFO:tasks.workunit.client.0.vm05.stdout:9/787: rename d6/d15/cd3 to d6/d15/d3c/c106 0 2026-03-10T08:56:09.751 INFO:tasks.workunit.client.0.vm05.stdout:9/788: chown d6/d15/d3c/d4b/d90/d93/l9e 122 1 2026-03-10T08:56:09.756 INFO:tasks.workunit.client.0.vm05.stdout:9/789: read d6/d19/d2c/d58/fc9 [2332751,48394] 0 2026-03-10T08:56:09.761 INFO:tasks.workunit.client.0.vm05.stdout:3/866: truncate d9/d2b/d53/f93 324971 0 2026-03-10T08:56:09.764 INFO:tasks.workunit.client.0.vm05.stdout:3/867: dwrite d9/d2b/de7/df1/d43/d71/d86/fb8 [0,4194304] 0 2026-03-10T08:56:09.772 INFO:tasks.workunit.client.1.vm08.stdout:0/969: mkdir d6/dd/d148 0 2026-03-10T08:56:09.774 INFO:tasks.workunit.client.1.vm08.stdout:3/968: creat d4/d15/d8/d1d/d4f/f14a x:0 0 0 2026-03-10T08:56:09.777 INFO:tasks.workunit.client.0.vm05.stdout:7/780: dread d18/d38/dc7/de3/d74/deb/fbb [0,4194304] 0 2026-03-10T08:56:09.781 INFO:tasks.workunit.client.0.vm05.stdout:0/824: fsync df/d1f/d85/d19/d55/fa9 0 2026-03-10T08:56:09.794 INFO:tasks.workunit.client.0.vm05.stdout:5/751: getdents d5/d86/d24/d84/db8/dcc 0 2026-03-10T08:56:09.794 
INFO:tasks.workunit.client.0.vm05.stdout:0/825: chown df/d1f/d85/d19/d62/lb7 154943 1 2026-03-10T08:56:09.794 INFO:tasks.workunit.client.0.vm05.stdout:9/790: read d6/d19/d21/f8a [3846236,50979] 0 2026-03-10T08:56:09.794 INFO:tasks.workunit.client.0.vm05.stdout:3/868: rename d9/d2b/de7/df1/d43/d6e/f103 to d9/d2b/d2f/d57/f110 0 2026-03-10T08:56:09.795 INFO:tasks.workunit.client.0.vm05.stdout:8/816: link d2/dd/d2c/d2e/d31/d3e/f73 d2/dd/d2c/d2e/d31/d3e/dde/f11f 0 2026-03-10T08:56:09.795 INFO:tasks.workunit.client.0.vm05.stdout:2/742: sync 2026-03-10T08:56:09.797 INFO:tasks.workunit.client.0.vm05.stdout:8/817: chown d2/dd/d2c/d2e/d31/d3e/dde/d63/fe0 22 1 2026-03-10T08:56:09.798 INFO:tasks.workunit.client.0.vm05.stdout:2/743: dread - d0/d9/d7f/d8f/d7a/fa1 zero size 2026-03-10T08:56:09.802 INFO:tasks.workunit.client.1.vm08.stdout:3/969: link d4/d15/d8/d2c/d9b/c50 d4/d6f/d85/dd3/c14b 0 2026-03-10T08:56:09.802 INFO:tasks.workunit.client.0.vm05.stdout:7/781: creat d18/d38/dc7/de3/d9c/dac/ff6 x:0 0 0 2026-03-10T08:56:09.802 INFO:tasks.workunit.client.1.vm08.stdout:3/970: chown d4/d15/d8/d1d/l46 10589 1 2026-03-10T08:56:09.805 INFO:tasks.workunit.client.1.vm08.stdout:5/903: dread d0/f6c [0,4194304] 0 2026-03-10T08:56:09.816 INFO:tasks.workunit.client.1.vm08.stdout:9/987: dwrite d2/d41/d4c/f7c [0,4194304] 0 2026-03-10T08:56:09.818 INFO:tasks.workunit.client.1.vm08.stdout:3/971: dread d4/d6f/d85/fed [4194304,4194304] 0 2026-03-10T08:56:09.824 INFO:tasks.workunit.client.1.vm08.stdout:5/904: mknod d0/d11/c120 0 2026-03-10T08:56:09.827 INFO:tasks.workunit.client.0.vm05.stdout:1/879: write dd/d21/d37/d7c/dab/db7/fc0 [3948524,1817] 0 2026-03-10T08:56:09.828 INFO:tasks.workunit.client.0.vm05.stdout:1/880: write dd/d10/d19/d27/ffa [1058145,7812] 0 2026-03-10T08:56:09.832 INFO:tasks.workunit.client.0.vm05.stdout:1/881: dwrite dd/d10/d18/d20/d52/d80/f124 [0,4194304] 0 2026-03-10T08:56:09.838 INFO:tasks.workunit.client.1.vm08.stdout:0/970: write d6/dd/d13/d17/d1f/fd7 [2463570,63086] 0 
2026-03-10T08:56:09.844 INFO:tasks.workunit.client.0.vm05.stdout:0/826: dread df/d1f/d85/fb5 [0,4194304] 0 2026-03-10T08:56:09.846 INFO:tasks.workunit.client.0.vm05.stdout:9/791: mknod d6/d15/d3c/d4b/d82/de9/c107 0 2026-03-10T08:56:09.847 INFO:tasks.workunit.client.0.vm05.stdout:9/792: readlink d6/d19/d2c/d84/lbe 0 2026-03-10T08:56:09.849 INFO:tasks.workunit.client.1.vm08.stdout:9/988: creat d2/d41/d4c/dd2/d13c/d25/d98/d9d/f149 x:0 0 0 2026-03-10T08:56:09.852 INFO:tasks.workunit.client.0.vm05.stdout:4/810: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/ddc/de4/f109 x:0 0 0 2026-03-10T08:56:09.879 INFO:tasks.workunit.client.0.vm05.stdout:7/782: symlink d18/d66/d25/d2e/de7/lf7 0 2026-03-10T08:56:09.881 INFO:tasks.workunit.client.0.vm05.stdout:8/818: read d2/dd/d2c/d2e/d31/d4f/d80/f9f [1456426,11445] 0 2026-03-10T08:56:09.882 INFO:tasks.workunit.client.0.vm05.stdout:6/821: link d4/d7/d10/d1a/d1f/f98 d4/d2d/d51/d87/da5/de9/f118 0 2026-03-10T08:56:09.886 INFO:tasks.workunit.client.0.vm05.stdout:6/822: chown d4/d2c/c9c 122501200 1 2026-03-10T08:56:09.886 INFO:tasks.workunit.client.0.vm05.stdout:2/744: dwrite d0/d55/fb6 [0,4194304] 0 2026-03-10T08:56:09.908 INFO:tasks.workunit.client.1.vm08.stdout:3/972: dwrite d4/d15/d8/d2c/d55/f75 [0,4194304] 0 2026-03-10T08:56:09.923 INFO:tasks.workunit.client.1.vm08.stdout:9/989: creat d2/dd/d15/f14a x:0 0 0 2026-03-10T08:56:09.925 INFO:tasks.workunit.client.1.vm08.stdout:3/973: symlink d4/d6f/dca/df9/l14c 0 2026-03-10T08:56:09.926 INFO:tasks.workunit.client.0.vm05.stdout:9/793: mknod d6/d15/d3c/d4b/d82/de9/c108 0 2026-03-10T08:56:09.927 INFO:tasks.workunit.client.1.vm08.stdout:5/905: dwrite d0/fb [0,4194304] 0 2026-03-10T08:56:09.932 INFO:tasks.workunit.client.0.vm05.stdout:4/811: mknod d0/d2e/d42/d45/c10a 0 2026-03-10T08:56:09.932 INFO:tasks.workunit.client.1.vm08.stdout:9/990: dread d2/d41/d4c/dd2/d13c/f91 [0,4194304] 0 2026-03-10T08:56:09.934 INFO:tasks.workunit.client.1.vm08.stdout:9/991: dread d2/d54/d8e/fba [0,4194304] 0 
2026-03-10T08:56:09.935 INFO:tasks.workunit.client.1.vm08.stdout:9/992: readlink d2/d41/d4c/dd2/d13c/d21/lb4 0 2026-03-10T08:56:09.936 INFO:tasks.workunit.client.1.vm08.stdout:9/993: read - d2/d54/d8e/da6/dd0/dc8/de1/f11d zero size 2026-03-10T08:56:09.943 INFO:tasks.workunit.client.0.vm05.stdout:3/869: creat d9/d8f/d50/d5f/dd8/dd9/de2/f111 x:0 0 0 2026-03-10T08:56:09.944 INFO:tasks.workunit.client.0.vm05.stdout:3/870: truncate d9/d2b/f101 27555 0 2026-03-10T08:56:09.949 INFO:tasks.workunit.client.1.vm08.stdout:0/971: dwrite d6/dd/d13/d17/f82 [0,4194304] 0 2026-03-10T08:56:09.954 INFO:tasks.workunit.client.0.vm05.stdout:1/882: dwrite dd/f9e [4194304,4194304] 0 2026-03-10T08:56:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:09 vm05.local ceph-mon[49713]: pgmap v13: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 31 MiB/s rd, 60 MiB/s wr, 176 op/s 2026-03-10T08:56:09.971 INFO:tasks.workunit.client.1.vm08.stdout:5/906: fdatasync d0/d11/d27/d68/d7c/d4b/d4e/d84/fbb 0 2026-03-10T08:56:09.978 INFO:tasks.workunit.client.1.vm08.stdout:3/974: symlink d4/d15/d8/d2c/d9b/l14d 0 2026-03-10T08:56:09.991 INFO:tasks.workunit.client.1.vm08.stdout:9/994: symlink d2/d41/d4c/dd2/d13c/d94/l14b 0 2026-03-10T08:56:09.994 INFO:tasks.workunit.client.0.vm05.stdout:6/823: creat d4/f119 x:0 0 0 2026-03-10T08:56:09.996 INFO:tasks.workunit.client.1.vm08.stdout:0/972: rmdir d6/dd/d13/d17/d1f/d20/d2f/d57/dd5 39 2026-03-10T08:56:10.000 INFO:tasks.workunit.client.1.vm08.stdout:0/973: dwrite d6/dd/d13/d17/d50/fac [0,4194304] 0 2026-03-10T08:56:10.001 INFO:tasks.workunit.client.0.vm05.stdout:5/752: link d5/l16 d5/d48/d64/d95/l113 0 2026-03-10T08:56:10.004 INFO:tasks.workunit.client.1.vm08.stdout:5/907: sync 2026-03-10T08:56:10.024 INFO:tasks.workunit.client.1.vm08.stdout:3/975: mkdir d4/d15/d8/d71/d14e 0 2026-03-10T08:56:10.028 INFO:tasks.workunit.client.1.vm08.stdout:9/995: creat d2/dd/d11c/de4/df2/f14c x:0 0 0 2026-03-10T08:56:10.045 
INFO:tasks.workunit.client.1.vm08.stdout:5/908: creat d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/dfe/f121 x:0 0 0 2026-03-10T08:56:10.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:09 vm08.local ceph-mon[57559]: pgmap v13: 65 pgs: 65 active+clean; 3.7 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 31 MiB/s rd, 60 MiB/s wr, 176 op/s 2026-03-10T08:56:10.060 INFO:tasks.workunit.client.1.vm08.stdout:9/996: creat d2/dd/d61/f14d x:0 0 0 2026-03-10T08:56:10.064 INFO:tasks.workunit.client.1.vm08.stdout:9/997: dread d2/d41/d4c/dd2/d13c/d39/d4e/fcf [0,4194304] 0 2026-03-10T08:56:10.066 INFO:tasks.workunit.client.1.vm08.stdout:9/998: sync 2026-03-10T08:56:10.123 INFO:tasks.workunit.client.1.vm08.stdout:0/974: link d6/dd/d13/d17/d1f/da3/f10d d6/dd/d13/d17/d1f/d2d/d13e/dc8/f149 0 2026-03-10T08:56:10.126 INFO:tasks.workunit.client.1.vm08.stdout:5/909: dwrite d0/f6c [0,4194304] 0 2026-03-10T08:56:10.134 INFO:tasks.workunit.client.1.vm08.stdout:5/910: dwrite d0/d11/d27/d50/ffb [0,4194304] 0 2026-03-10T08:56:10.145 INFO:tasks.workunit.client.1.vm08.stdout:9/999: mknod d2/d41/d4c/dd2/d13c/d25/d9b/c14e 0 2026-03-10T08:56:10.146 INFO:tasks.workunit.client.1.vm08.stdout:3/976: creat d4/d15/d8/d2c/f14f x:0 0 0 2026-03-10T08:56:10.146 INFO:tasks.workunit.client.1.vm08.stdout:0/975: fdatasync d6/dd/d13/d8f/ffe 0 2026-03-10T08:56:10.147 INFO:tasks.workunit.client.1.vm08.stdout:3/977: chown d4/d15/d8/d71/la3 182895 1 2026-03-10T08:56:10.149 INFO:tasks.workunit.client.1.vm08.stdout:3/978: dread d4/d15/d8/d2c/f8c [0,4194304] 0 2026-03-10T08:56:10.171 INFO:tasks.workunit.client.0.vm05.stdout:8/819: mkdir d2/dd/d2c/d2e/d31/d3e/dde/d63/db8/d120 0 2026-03-10T08:56:10.196 INFO:tasks.workunit.client.1.vm08.stdout:0/976: write d6/dd/d13/d17/d1f/d2d/d85/f10e [9544,58919] 0 2026-03-10T08:56:10.196 INFO:tasks.workunit.client.0.vm05.stdout:3/871: write d9/d2b/d2f/f4b [5326995,64915] 0 2026-03-10T08:56:10.205 INFO:tasks.workunit.client.1.vm08.stdout:3/979: write d4/d15/d8/d71/f127 [352755,84064] 0 
2026-03-10T08:56:10.213 INFO:tasks.workunit.client.0.vm05.stdout:1/883: dwrite dd/d10/fb5 [0,4194304] 0 2026-03-10T08:56:10.238 INFO:tasks.workunit.client.0.vm05.stdout:5/753: write d5/d86/d21/d71/f9a [1814109,24751] 0 2026-03-10T08:56:10.249 INFO:tasks.workunit.client.0.vm05.stdout:0/827: creat df/d1f/d85/d2b/d27/d32/ff8 x:0 0 0 2026-03-10T08:56:10.279 INFO:tasks.workunit.client.0.vm05.stdout:7/783: creat d18/d38/d43/ff8 x:0 0 0 2026-03-10T08:56:10.279 INFO:tasks.workunit.client.0.vm05.stdout:8/820: dread - d2/dd/d2c/d2e/d31/d3e/dde/ff7 zero size 2026-03-10T08:56:10.279 INFO:tasks.workunit.client.0.vm05.stdout:7/784: readlink d18/d38/l45 0 2026-03-10T08:56:10.281 INFO:tasks.workunit.client.0.vm05.stdout:2/745: rename d0/d9/l42 to d0/d9/d1e/d20/d21/d45/d4b/le0 0 2026-03-10T08:56:10.283 INFO:tasks.workunit.client.0.vm05.stdout:3/872: mkdir d9/d8f/d50/d5f/dd8/dd9/de2/d112 0 2026-03-10T08:56:10.286 INFO:tasks.workunit.client.0.vm05.stdout:5/754: creat d5/df/dbb/f114 x:0 0 0 2026-03-10T08:56:10.287 INFO:tasks.workunit.client.0.vm05.stdout:5/755: read d5/d86/d24/d2c/d41/d74/fa8 [109157,103434] 0 2026-03-10T08:56:10.287 INFO:tasks.workunit.client.0.vm05.stdout:9/794: link d6/d15/d37/l69 d6/d15/d35/ddf/l109 0 2026-03-10T08:56:10.288 INFO:tasks.workunit.client.0.vm05.stdout:4/812: creat d0/d2e/d42/d45/d4a/d36/dbe/f10b x:0 0 0 2026-03-10T08:56:10.292 INFO:tasks.workunit.client.0.vm05.stdout:8/821: chown d2/dd/d2c/d2e/d31/d3e/f73 9 1 2026-03-10T08:56:10.293 INFO:tasks.workunit.client.0.vm05.stdout:1/884: rename dd/d10/d18/d20/fa1 to dd/d10/d19/d9b/f12e 0 2026-03-10T08:56:10.293 INFO:tasks.workunit.client.0.vm05.stdout:6/824: dread d4/d7/d10/d1a/ff7 [0,4194304] 0 2026-03-10T08:56:10.298 INFO:tasks.workunit.client.1.vm08.stdout:0/977: mknod d6/c14a 0 2026-03-10T08:56:10.307 INFO:tasks.workunit.client.1.vm08.stdout:5/911: write d0/fa4 [935001,54308] 0 2026-03-10T08:56:10.308 INFO:tasks.workunit.client.1.vm08.stdout:5/912: truncate d0/d11/f60 4457852 0 2026-03-10T08:56:10.309 
INFO:tasks.workunit.client.0.vm05.stdout:0/828: symlink df/d1f/d85/d9e/lf9 0 2026-03-10T08:56:10.313 INFO:tasks.workunit.client.0.vm05.stdout:2/746: dwrite d0/d9/d1e/d20/d21/d45/d4b/f58 [0,4194304] 0 2026-03-10T08:56:10.314 INFO:tasks.workunit.client.1.vm08.stdout:3/980: rmdir d4/d15/d8/d71/d14e 0 2026-03-10T08:56:10.318 INFO:tasks.workunit.client.0.vm05.stdout:4/813: mkdir d0/d2e/d71/d7c/d10c 0 2026-03-10T08:56:10.320 INFO:tasks.workunit.client.1.vm08.stdout:3/981: dread d4/d15/d8/d2c/d9b/d79/fef [0,4194304] 0 2026-03-10T08:56:10.321 INFO:tasks.workunit.client.0.vm05.stdout:7/785: rename d18/l87 to d18/d38/dc7/de3/d74/deb/lf9 0 2026-03-10T08:56:10.322 INFO:tasks.workunit.client.0.vm05.stdout:7/786: readlink d18/d38/dc7/de3/d9c/dac/l68 0 2026-03-10T08:56:10.322 INFO:tasks.workunit.client.0.vm05.stdout:7/787: fsync d18/d38/dc7/de3/d9c/dac/ff6 0 2026-03-10T08:56:10.323 INFO:tasks.workunit.client.0.vm05.stdout:7/788: truncate d18/d38/d43/ff8 353507 0 2026-03-10T08:56:10.324 INFO:tasks.workunit.client.1.vm08.stdout:0/978: symlink d6/dd/d13/d17/d1f/l14b 0 2026-03-10T08:56:10.334 INFO:tasks.workunit.client.0.vm05.stdout:9/795: write d6/d12/d3a/d9c/fb6 [4447138,33788] 0 2026-03-10T08:56:10.334 INFO:tasks.workunit.client.1.vm08.stdout:5/913: dread d0/d11/d27/d68/d7c/f6a [0,4194304] 0 2026-03-10T08:56:10.341 INFO:tasks.workunit.client.0.vm05.stdout:3/873: link d9/d2b/f101 d9/d2b/de7/df1/dd6/d107/f113 0 2026-03-10T08:56:10.342 INFO:tasks.workunit.client.0.vm05.stdout:3/874: dread - d9/d2b/d53/fa7 zero size 2026-03-10T08:56:10.344 INFO:tasks.workunit.client.0.vm05.stdout:0/829: fsync df/d1f/d85/d19/d47/f8f 0 2026-03-10T08:56:10.344 INFO:tasks.workunit.client.1.vm08.stdout:3/982: dread d4/d15/d8/d2c/d55/d93/fa5 [0,4194304] 0 2026-03-10T08:56:10.347 INFO:tasks.workunit.client.1.vm08.stdout:5/914: rename d0/d11/d27/d68/d7c/d8e/fde to d0/d11/f122 0 2026-03-10T08:56:10.360 INFO:tasks.workunit.client.1.vm08.stdout:0/979: mkdir d6/dd/d13/d32/d14c 0 2026-03-10T08:56:10.361 
INFO:tasks.workunit.client.1.vm08.stdout:0/980: chown d6/dd/d13/d17/d1f/d2d/d39/f87 714598163 1 2026-03-10T08:56:10.362 INFO:tasks.workunit.client.1.vm08.stdout:0/981: chown d6/dd/d13/d17/d1f/d2d/d38/cd0 51 1 2026-03-10T08:56:10.370 INFO:tasks.workunit.client.1.vm08.stdout:3/983: symlink d4/d6f/d85/l150 0 2026-03-10T08:56:10.372 INFO:tasks.workunit.client.1.vm08.stdout:0/982: sync 2026-03-10T08:56:10.374 INFO:tasks.workunit.client.1.vm08.stdout:5/915: creat d0/d11/d18/df5/f123 x:0 0 0 2026-03-10T08:56:10.380 INFO:tasks.workunit.client.0.vm05.stdout:4/814: truncate d0/d2c/d6a/fd8 582664 0 2026-03-10T08:56:10.381 INFO:tasks.workunit.client.0.vm05.stdout:2/747: write d0/f30 [355851,93351] 0 2026-03-10T08:56:10.382 INFO:tasks.workunit.client.1.vm08.stdout:3/984: write d4/d15/d8/d2c/d6d/fc3 [1635400,43926] 0 2026-03-10T08:56:10.384 INFO:tasks.workunit.client.0.vm05.stdout:5/756: dwrite d5/f33 [0,4194304] 0 2026-03-10T08:56:10.385 INFO:tasks.workunit.client.0.vm05.stdout:8/822: rename d2/db/d28/f2d to d2/dfc/f121 0 2026-03-10T08:56:10.386 INFO:tasks.workunit.client.1.vm08.stdout:0/983: truncate d6/dd/d13/d17/fbf 1237394 0 2026-03-10T08:56:10.386 INFO:tasks.workunit.client.0.vm05.stdout:8/823: chown d2/db/d28/d99/f11d 49927343 1 2026-03-10T08:56:10.388 INFO:tasks.workunit.client.1.vm08.stdout:5/916: mknod d0/d1b/d67/c124 0 2026-03-10T08:56:10.389 INFO:tasks.workunit.client.1.vm08.stdout:0/984: chown d6/dd/d13/d17/d1f/d2d/c4c 8 1 2026-03-10T08:56:10.407 INFO:tasks.workunit.client.1.vm08.stdout:5/917: stat d0/d11/d18/l31 0 2026-03-10T08:56:10.410 INFO:tasks.workunit.client.1.vm08.stdout:3/985: rename d4/d15/d8/d1d/f13d to d4/d15/d8/d1d/d117/f151 0 2026-03-10T08:56:10.410 INFO:tasks.workunit.client.1.vm08.stdout:5/918: write d0/d11/d27/d68/dc1/f113 [325242,102269] 0 2026-03-10T08:56:10.411 INFO:tasks.workunit.client.1.vm08.stdout:3/986: readlink d4/d15/d8/l133 0 2026-03-10T08:56:10.412 INFO:tasks.workunit.client.1.vm08.stdout:5/919: write d0/d11/d18/df5/f123 [271526,88548] 0 
2026-03-10T08:56:10.414 INFO:tasks.workunit.client.1.vm08.stdout:5/920: chown d0/d1b/d67/d7a/ff2 111685126 1 2026-03-10T08:56:10.417 INFO:tasks.workunit.client.1.vm08.stdout:3/987: readlink d4/d15/d8/d2c/d9b/d79/d20/lf3 0 2026-03-10T08:56:10.419 INFO:tasks.workunit.client.1.vm08.stdout:5/921: rmdir d0 39 2026-03-10T08:56:10.423 INFO:tasks.workunit.client.1.vm08.stdout:0/985: creat d6/dd/d148/f14d x:0 0 0 2026-03-10T08:56:10.428 INFO:tasks.workunit.client.0.vm05.stdout:1/885: rename dd/dfb/l109 to dd/d10/d112/d113/l12f 0 2026-03-10T08:56:10.428 INFO:tasks.workunit.client.0.vm05.stdout:1/886: readlink dd/lad 0 2026-03-10T08:56:10.429 INFO:tasks.workunit.client.0.vm05.stdout:1/887: dread - dd/d10/d18/d2d/d51/d58/d71/d62/f96 zero size 2026-03-10T08:56:10.431 INFO:tasks.workunit.client.0.vm05.stdout:1/888: read dd/d10/d19/d27/fc8 [398362,106566] 0 2026-03-10T08:56:10.432 INFO:tasks.workunit.client.1.vm08.stdout:5/922: chown d0/d11/d27/d68/d7c/d4b/f11d 93540 1 2026-03-10T08:56:10.436 INFO:tasks.workunit.client.1.vm08.stdout:0/986: mknod d6/dd/d13/d17/d1f/d2d/d38/d98/d12f/c14e 0 2026-03-10T08:56:10.440 INFO:tasks.workunit.client.0.vm05.stdout:5/757: creat d5/d48/d64/dc4/f115 x:0 0 0 2026-03-10T08:56:10.442 INFO:tasks.workunit.client.0.vm05.stdout:8/824: sync 2026-03-10T08:56:10.443 INFO:tasks.workunit.client.0.vm05.stdout:8/825: fsync d2/dd/d2c/d2e/d31/f111 0 2026-03-10T08:56:10.449 INFO:tasks.workunit.client.0.vm05.stdout:3/875: write d9/fb4 [4772051,67974] 0 2026-03-10T08:56:10.449 INFO:tasks.workunit.client.0.vm05.stdout:4/815: write d0/d2e/d42/f59 [5733546,92000] 0 2026-03-10T08:56:10.449 INFO:tasks.workunit.client.0.vm05.stdout:2/748: write d0/d9/d1e/d20/d21/fb7 [981502,128492] 0 2026-03-10T08:56:10.450 INFO:tasks.workunit.client.0.vm05.stdout:2/749: chown d0/d55/cc0 0 1 2026-03-10T08:56:10.451 INFO:tasks.workunit.client.0.vm05.stdout:3/876: write d9/d2b/de7/df1/d43/d71/d86/fb8 [3879881,109014] 0 2026-03-10T08:56:10.453 INFO:tasks.workunit.client.1.vm08.stdout:0/987: 
rmdir d6/dd/d13/d17/d1f/d20/d2f/d24 39 2026-03-10T08:56:10.455 INFO:tasks.workunit.client.1.vm08.stdout:3/988: creat d4/d15/d8/d2c/f152 x:0 0 0 2026-03-10T08:56:10.465 INFO:tasks.workunit.client.0.vm05.stdout:9/796: rename d6/d19/d2c/d58/deb to d6/d27/d10a 0 2026-03-10T08:56:10.465 INFO:tasks.workunit.client.0.vm05.stdout:7/789: link d18/d38/dc7/de3/d9c/dac/c57 d18/d38/dc7/de3/d9c/dac/df5/cfa 0 2026-03-10T08:56:10.465 INFO:tasks.workunit.client.0.vm05.stdout:1/889: creat dd/d21/d37/d45/d8d/f130 x:0 0 0 2026-03-10T08:56:10.465 INFO:tasks.workunit.client.1.vm08.stdout:3/989: fdatasync d4/d15/d8/fec 0 2026-03-10T08:56:10.465 INFO:tasks.workunit.client.1.vm08.stdout:3/990: chown d4/d15/d8/d2c/d6d/dfa/d100 351 1 2026-03-10T08:56:10.469 INFO:tasks.workunit.client.1.vm08.stdout:0/988: read d6/dd/d13/d17/fb4 [511194,52092] 0 2026-03-10T08:56:10.479 INFO:tasks.workunit.client.1.vm08.stdout:5/923: write d0/d11/d3e/d45/fad [1434481,1622] 0 2026-03-10T08:56:10.480 INFO:tasks.workunit.client.1.vm08.stdout:5/924: dread - d0/d11/d18/fe6 zero size 2026-03-10T08:56:10.487 INFO:tasks.workunit.client.0.vm05.stdout:4/816: fdatasync d0/d2e/d42/d45/d4a/d36/dbe/fc8 0 2026-03-10T08:56:10.490 INFO:tasks.workunit.client.1.vm08.stdout:5/925: truncate d0/d11/d27/d50/fa1 2719429 0 2026-03-10T08:56:10.490 INFO:tasks.workunit.client.0.vm05.stdout:2/750: creat d0/d9/d7f/db4/fe1 x:0 0 0 2026-03-10T08:56:10.490 INFO:tasks.workunit.client.0.vm05.stdout:2/751: chown d0/d9/d1e/d20/f32 4 1 2026-03-10T08:56:10.491 INFO:tasks.workunit.client.0.vm05.stdout:2/752: write d0/d9/d1e/d20/d21/fdd [295175,70336] 0 2026-03-10T08:56:10.493 INFO:tasks.workunit.client.1.vm08.stdout:5/926: dread d0/d11/d3e/fdd [0,4194304] 0 2026-03-10T08:56:10.494 INFO:tasks.workunit.client.0.vm05.stdout:3/877: truncate d9/d4d/d51/fe1 656203 0 2026-03-10T08:56:10.496 INFO:tasks.workunit.client.0.vm05.stdout:0/830: link df/d1f/d85/d2b/d27/f60 df/dd8/d67/d7b/ffa 0 2026-03-10T08:56:10.497 INFO:tasks.workunit.client.0.vm05.stdout:1/890: 
fdatasync dd/d10/d19/f95 0 2026-03-10T08:56:10.499 INFO:tasks.workunit.client.1.vm08.stdout:3/991: getdents d4/d15/d8/d1d/d4f 0 2026-03-10T08:56:10.504 INFO:tasks.workunit.client.1.vm08.stdout:0/989: link d6/dd/d13/d17/d1f/d20/d2f/d24/lbc d6/dd/d13/d17/d1f/d2d/d85/d93/l14f 0 2026-03-10T08:56:10.510 INFO:tasks.workunit.client.0.vm05.stdout:6/825: creat d4/d2d/d5f/f11a x:0 0 0 2026-03-10T08:56:10.510 INFO:tasks.workunit.client.0.vm05.stdout:6/826: write d4/d7/d10/d15/fc5 [872182,77425] 0 2026-03-10T08:56:10.511 INFO:tasks.workunit.client.0.vm05.stdout:6/827: readlink d4/d2d/l7e 0 2026-03-10T08:56:10.520 INFO:tasks.workunit.client.0.vm05.stdout:8/826: mknod d2/dd/d2c/c122 0 2026-03-10T08:56:10.522 INFO:tasks.workunit.client.1.vm08.stdout:0/990: fsync d6/dd/d13/d17/d1f/d2d/d85/d93/f140 0 2026-03-10T08:56:10.525 INFO:tasks.workunit.client.1.vm08.stdout:0/991: dread d6/dd/d13/d61/d6f/f102 [0,4194304] 0 2026-03-10T08:56:10.533 INFO:tasks.workunit.client.0.vm05.stdout:9/797: dwrite d6/d15/d35/fc0 [0,4194304] 0 2026-03-10T08:56:10.540 INFO:tasks.workunit.client.0.vm05.stdout:2/753: unlink d0/d9/d1e/d20/d24/fbe 0 2026-03-10T08:56:10.553 INFO:tasks.workunit.client.0.vm05.stdout:2/754: chown d0/d9/d1e/d20/d24/cd0 14 1 2026-03-10T08:56:10.553 INFO:tasks.workunit.client.0.vm05.stdout:3/878: write d9/d2b/d2f/d57/f110 [598508,130198] 0 2026-03-10T08:56:10.553 INFO:tasks.workunit.client.1.vm08.stdout:5/927: write d0/d1b/d67/d80/fd9 [849120,123272] 0 2026-03-10T08:56:10.553 INFO:tasks.workunit.client.1.vm08.stdout:3/992: dwrite d4/d15/fa [0,4194304] 0 2026-03-10T08:56:10.553 INFO:tasks.workunit.client.1.vm08.stdout:0/992: creat d6/dd/d13/d17/d1f/d20/d2f/d24/d142/f150 x:0 0 0 2026-03-10T08:56:10.567 INFO:tasks.workunit.client.0.vm05.stdout:7/790: rename d18/d38/dc7/de3/d9c/dac/f3d to d18/d38/dc7/de3/d9c/de1/ffb 0 2026-03-10T08:56:10.571 INFO:tasks.workunit.client.1.vm08.stdout:0/993: symlink d6/dd/d13/d17/d1f/d20/d2f/d57/dd5/l151 0 2026-03-10T08:56:10.572 
INFO:tasks.workunit.client.1.vm08.stdout:5/928: read d0/d11/d27/d68/d7c/d4b/d4e/f89 [2494968,13387] 0 2026-03-10T08:56:10.575 INFO:tasks.workunit.client.1.vm08.stdout:0/994: dwrite d6/fe [0,4194304] 0 2026-03-10T08:56:10.584 INFO:tasks.workunit.client.0.vm05.stdout:9/798: symlink d6/d12/db2/l10b 0 2026-03-10T08:56:10.585 INFO:tasks.workunit.client.0.vm05.stdout:9/799: chown d6/d19/d2a/d4a/faa 25926620 1 2026-03-10T08:56:10.586 INFO:tasks.workunit.client.1.vm08.stdout:3/993: link d4/d15/d8/d1d/d4f/lc8 d4/d15/d8/d2c/d6d/dfa/d100/l153 0 2026-03-10T08:56:10.592 INFO:tasks.workunit.client.0.vm05.stdout:3/879: chown d9/c22 238318867 1 2026-03-10T08:56:10.592 INFO:tasks.workunit.client.1.vm08.stdout:0/995: mknod d6/dd/d13/d17/d1f/d2d/d13e/dc8/c152 0 2026-03-10T08:56:10.592 INFO:tasks.workunit.client.1.vm08.stdout:5/929: creat d0/d11/d27/d68/d7c/d4b/dbe/f125 x:0 0 0 2026-03-10T08:56:10.592 INFO:tasks.workunit.client.1.vm08.stdout:0/996: unlink d6/dd/d13/d17/d1f/d2d/c2e 0 2026-03-10T08:56:10.592 INFO:tasks.workunit.client.1.vm08.stdout:3/994: dwrite d4/d15/d8/d2c/d9b/d79/d8f/de2/f113 [4194304,4194304] 0 2026-03-10T08:56:10.592 INFO:tasks.workunit.client.1.vm08.stdout:3/995: stat d4/d15/dfd/l12e 0 2026-03-10T08:56:10.601 INFO:tasks.workunit.client.1.vm08.stdout:0/997: rename d6/dd/d13/d17/d1f/d2d/d38/c5d to d6/dd/d13/d61/c153 0 2026-03-10T08:56:10.603 INFO:tasks.workunit.client.1.vm08.stdout:0/998: rename d6/dd/d13/d17/d1f/d2d to d6/dd/d13/d17/d1f/d2d/d38/d154 22 2026-03-10T08:56:10.603 INFO:tasks.workunit.client.0.vm05.stdout:7/791: unlink d18/d66/d25/l9d 0 2026-03-10T08:56:10.604 INFO:tasks.workunit.client.1.vm08.stdout:0/999: chown d6/dd/d13/d17/d1f/d2d 22585 1 2026-03-10T08:56:10.604 INFO:tasks.workunit.client.0.vm05.stdout:5/758: getdents d5 0 2026-03-10T08:56:10.604 INFO:tasks.workunit.client.0.vm05.stdout:6/828: dread - d4/d2c/d84/db6/dc6/f100 zero size 2026-03-10T08:56:10.615 INFO:tasks.workunit.client.1.vm08.stdout:3/996: dread d4/d15/d8/d2c/d6d/dfa/f12b [0,4194304] 
0 2026-03-10T08:56:10.618 INFO:tasks.workunit.client.0.vm05.stdout:3/880: stat d9/d2b/de7/df1/d43/c58 0 2026-03-10T08:56:10.625 INFO:tasks.workunit.client.1.vm08.stdout:3/997: dread d4/d15/d8/d2c/f32 [0,4194304] 0 2026-03-10T08:56:10.631 INFO:tasks.workunit.client.0.vm05.stdout:6/829: dread d4/d2d/d51/d62/da9/fe4 [0,4194304] 0 2026-03-10T08:56:10.651 INFO:tasks.workunit.client.1.vm08.stdout:3/998: rmdir d4/d15/d8/d2c/d9b/d79/d8f/de2 39 2026-03-10T08:56:10.652 INFO:tasks.workunit.client.1.vm08.stdout:3/999: readlink d4/d15/d8/lba 0 2026-03-10T08:56:10.652 INFO:tasks.workunit.client.0.vm05.stdout:2/755: link d0/d9/d7f/d8f/d7e/fb9 d0/d55/da2/fe2 0 2026-03-10T08:56:10.652 INFO:tasks.workunit.client.0.vm05.stdout:5/759: chown d5/d86/d24/d2c/d41/cdb 32188 1 2026-03-10T08:56:10.652 INFO:tasks.workunit.client.0.vm05.stdout:0/831: getdents df/d1f/d85/d19/d47 0 2026-03-10T08:56:10.652 INFO:tasks.workunit.client.0.vm05.stdout:7/792: mkdir d18/dfc 0 2026-03-10T08:56:10.652 INFO:tasks.workunit.client.0.vm05.stdout:5/760: dwrite d5/fd [8388608,4194304] 0 2026-03-10T08:56:10.657 INFO:tasks.workunit.client.0.vm05.stdout:2/756: dread d0/f30 [0,4194304] 0 2026-03-10T08:56:10.659 INFO:tasks.workunit.client.0.vm05.stdout:7/793: truncate d18/d38/d43/d5c/f67 3944288 0 2026-03-10T08:56:10.664 INFO:tasks.workunit.client.0.vm05.stdout:3/881: creat d9/d2b/de7/df1/d6c/dbf/f114 x:0 0 0 2026-03-10T08:56:10.667 INFO:tasks.workunit.client.0.vm05.stdout:2/757: dread d0/d9/f4e [0,4194304] 0 2026-03-10T08:56:10.673 INFO:tasks.workunit.client.0.vm05.stdout:0/832: creat df/d1f/dcd/de6/ffb x:0 0 0 2026-03-10T08:56:10.688 INFO:tasks.workunit.client.0.vm05.stdout:5/761: fsync d5/d86/d21/d89/f90 0 2026-03-10T08:56:10.693 INFO:tasks.workunit.client.0.vm05.stdout:0/833: truncate df/d59/fdf 464128 0 2026-03-10T08:56:10.697 INFO:tasks.workunit.client.0.vm05.stdout:4/817: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 [0,4194304] 0 2026-03-10T08:56:10.700 INFO:tasks.workunit.client.0.vm05.stdout:3/882: truncate 
d9/d4d/d51/fe1 210268 0 2026-03-10T08:56:10.700 INFO:tasks.workunit.client.0.vm05.stdout:4/818: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/cc6 2399399 1 2026-03-10T08:56:10.709 INFO:tasks.workunit.client.0.vm05.stdout:1/891: dwrite dd/d10/d18/d20/d52/d80/ffc [0,4194304] 0 2026-03-10T08:56:10.714 INFO:tasks.workunit.client.0.vm05.stdout:8/827: dwrite d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/ff0 [0,4194304] 0 2026-03-10T08:56:10.719 INFO:tasks.workunit.client.1.vm08.stdout:5/930: dwrite d0/ffa [0,4194304] 0 2026-03-10T08:56:10.727 INFO:tasks.workunit.client.0.vm05.stdout:9/800: dwrite d6/f7 [4194304,4194304] 0 2026-03-10T08:56:10.729 INFO:tasks.workunit.client.0.vm05.stdout:9/801: fdatasync d6/d19/d2c/d58/f6c 0 2026-03-10T08:56:10.731 INFO:tasks.workunit.client.0.vm05.stdout:0/834: truncate df/d59/f45 3147612 0 2026-03-10T08:56:10.733 INFO:tasks.workunit.client.0.vm05.stdout:6/830: write d4/d2c/d84/f3a [1246495,8481] 0 2026-03-10T08:56:10.751 INFO:tasks.workunit.client.1.vm08.stdout:5/931: creat d0/d11/d27/d68/d7c/d4b/dbe/f126 x:0 0 0 2026-03-10T08:56:10.757 INFO:tasks.workunit.client.0.vm05.stdout:7/794: write d18/f1d [4304067,117190] 0 2026-03-10T08:56:10.761 INFO:tasks.workunit.client.0.vm05.stdout:2/758: write d0/d9/d1e/d20/d21/d45/d4b/f9c [2170677,102541] 0 2026-03-10T08:56:10.761 INFO:tasks.workunit.client.0.vm05.stdout:2/759: stat d0/d9/d7f/d8f/d7e/dc3 0 2026-03-10T08:56:10.764 INFO:tasks.workunit.client.0.vm05.stdout:8/828: creat d2/dd/d2c/d2e/d31/d3e/d5d/f123 x:0 0 0 2026-03-10T08:56:10.765 INFO:tasks.workunit.client.0.vm05.stdout:8/829: write d2/db/d28/d99/f11d [870170,59796] 0 2026-03-10T08:56:10.765 INFO:tasks.workunit.client.1.vm08.stdout:5/932: dread d0/d11/d27/d68/d7c/d4b/d4e/f56 [0,4194304] 0 2026-03-10T08:56:10.767 INFO:tasks.workunit.client.0.vm05.stdout:9/802: readlink d6/d12/d3a/de5/l7b 0 2026-03-10T08:56:10.775 INFO:tasks.workunit.client.1.vm08.stdout:5/933: unlink d0/d11/d27/d68/d7c/de5/ce0 0 2026-03-10T08:56:10.778 
INFO:tasks.workunit.client.0.vm05.stdout:1/892: mkdir dd/d10/d19/d131 0 2026-03-10T08:56:10.786 INFO:tasks.workunit.client.1.vm08.stdout:5/934: rename d0/d11/d27/ff4 to d0/d11/d27/d68/d7c/d4b/dbe/f127 0 2026-03-10T08:56:10.790 INFO:tasks.workunit.client.1.vm08.stdout:5/935: mknod d0/d11/d27/d68/d7c/d4b/d4e/d84/c128 0 2026-03-10T08:56:10.806 INFO:tasks.workunit.client.1.vm08.stdout:5/936: dread - d0/d11/d27/d68/d7c/d8e/df0/db5/fd2 zero size 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:1/893: mkdir dd/d10/d18/d2d/d51/d58/deb/d132 0 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:1/894: chown dd/d10/d19/d9b/dc3 18 1 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:8/830: mknod d2/dd/c124 0 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:9/803: rename d6/d19/d21/cea to d6/d15/d104/c10c 0 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:0/835: creat df/d1f/d85/d2b/d27/d32/ffc x:0 0 0 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:9/804: rmdir d6/d19/d2c/d58 39 2026-03-10T08:56:10.807 INFO:tasks.workunit.client.0.vm05.stdout:9/805: chown d6/d15/d3c 246 1 2026-03-10T08:56:10.818 INFO:tasks.workunit.client.0.vm05.stdout:8/831: dread d2/db/f9a [0,4194304] 0 2026-03-10T08:56:10.818 INFO:tasks.workunit.client.0.vm05.stdout:8/832: chown d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/ced 26186667 1 2026-03-10T08:56:10.822 INFO:tasks.workunit.client.1.vm08.stdout:5/937: dread d0/d11/d3e/f48 [0,4194304] 0 2026-03-10T08:56:10.823 INFO:tasks.workunit.client.0.vm05.stdout:9/806: creat d6/d19/d21/f10d x:0 0 0 2026-03-10T08:56:10.823 INFO:tasks.workunit.client.1.vm08.stdout:5/938: chown d0/d1b/d67/d7a 253723386 1 2026-03-10T08:56:10.828 INFO:tasks.workunit.client.0.vm05.stdout:2/760: dread d0/d9/d1e/d20/d21/d45/d4b/fa7 [0,4194304] 0 2026-03-10T08:56:10.830 INFO:tasks.workunit.client.0.vm05.stdout:2/761: dread d0/d9/d1e/d20/d21/f77 [0,4194304] 0 2026-03-10T08:56:10.832 
INFO:tasks.workunit.client.0.vm05.stdout:5/762: write d5/f3b [220436,36496] 0 2026-03-10T08:56:10.836 INFO:tasks.workunit.client.0.vm05.stdout:4/819: dwrite d0/d78/f95 [0,4194304] 0 2026-03-10T08:56:10.846 INFO:tasks.workunit.client.0.vm05.stdout:3/883: write d9/d4d/d51/d64/d89/dc2/fcd [497708,83127] 0 2026-03-10T08:56:10.851 INFO:tasks.workunit.client.0.vm05.stdout:6/831: dwrite d4/d2d/d51/f10b [0,4194304] 0 2026-03-10T08:56:10.863 INFO:tasks.workunit.client.0.vm05.stdout:8/833: creat d2/dd/d74/f125 x:0 0 0 2026-03-10T08:56:10.876 INFO:tasks.workunit.client.0.vm05.stdout:9/807: truncate d6/d15/d35/f9a 1986814 0 2026-03-10T08:56:10.876 INFO:tasks.workunit.client.0.vm05.stdout:1/895: dread dd/d10/d19/f129 [0,4194304] 0 2026-03-10T08:56:10.876 INFO:tasks.workunit.client.0.vm05.stdout:5/763: creat d5/d48/d64/d95/dac/dc6/f116 x:0 0 0 2026-03-10T08:56:10.876 INFO:tasks.workunit.client.0.vm05.stdout:4/820: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/l10d 0 2026-03-10T08:56:10.879 INFO:tasks.workunit.client.0.vm05.stdout:3/884: mkdir d9/d2b/de7/df1/d43/d6e/dba/d115 0 2026-03-10T08:56:10.888 INFO:tasks.workunit.client.0.vm05.stdout:8/834: dread d2/dd/d2c/d2e/d31/d4f/d80/dd0/fb6 [0,4194304] 0 2026-03-10T08:56:10.888 INFO:tasks.workunit.client.0.vm05.stdout:6/832: rmdir d4/d7/d10/d1a/d8c 39 2026-03-10T08:56:10.891 INFO:tasks.workunit.client.0.vm05.stdout:5/764: dread d5/df/f2f [0,4194304] 0 2026-03-10T08:56:10.894 INFO:tasks.workunit.client.0.vm05.stdout:0/836: rename df/d1f/d48 to df/d1f/dcd/dfd 0 2026-03-10T08:56:10.912 INFO:tasks.workunit.client.0.vm05.stdout:7/795: write d18/d38/fca [378847,23914] 0 2026-03-10T08:56:10.913 INFO:tasks.workunit.client.0.vm05.stdout:7/796: readlink d18/d38/dc7/de3/d74/l93 0 2026-03-10T08:56:10.929 INFO:tasks.workunit.client.1.vm08.stdout:5/939: write d0/d11/d18/fc0 [3299222,24744] 0 2026-03-10T08:56:10.943 INFO:tasks.workunit.client.1.vm08.stdout:5/940: sync 2026-03-10T08:56:10.946 INFO:tasks.workunit.client.0.vm05.stdout:2/762: dwrite d0/fa 
[0,4194304] 0 2026-03-10T08:56:10.947 INFO:tasks.workunit.client.0.vm05.stdout:4/821: unlink d0/d2e/f4e 0 2026-03-10T08:56:10.948 INFO:tasks.workunit.client.0.vm05.stdout:3/885: fdatasync d9/d4d/d51/f59 0 2026-03-10T08:56:10.949 INFO:tasks.workunit.client.1.vm08.stdout:5/941: dwrite d0/d1b/d67/d80/fd9 [0,4194304] 0 2026-03-10T08:56:10.955 INFO:tasks.workunit.client.0.vm05.stdout:8/835: fsync d2/db/d1f/d67/fe9 0 2026-03-10T08:56:10.955 INFO:tasks.workunit.client.0.vm05.stdout:8/836: chown d2/cf1 74892854 1 2026-03-10T08:56:10.956 INFO:tasks.workunit.client.0.vm05.stdout:8/837: stat d2/dd/d2c/d2e/c39 0 2026-03-10T08:56:10.958 INFO:tasks.workunit.client.1.vm08.stdout:5/942: dwrite d0/d11/d27/d68/d7c/d4b/dbe/f125 [0,4194304] 0 2026-03-10T08:56:10.959 INFO:tasks.workunit.client.0.vm05.stdout:5/765: creat d5/d86/d24/d2c/d41/d74/f117 x:0 0 0 2026-03-10T08:56:10.960 INFO:tasks.workunit.client.0.vm05.stdout:5/766: chown d5/d86/d24/d84/db8/l112 36041184 1 2026-03-10T08:56:10.973 INFO:tasks.workunit.client.0.vm05.stdout:9/808: mknod d6/d19/d2a/d8d/c10e 0 2026-03-10T08:56:10.975 INFO:tasks.workunit.client.1.vm08.stdout:5/943: fdatasync d0/f7f 0 2026-03-10T08:56:10.985 INFO:tasks.workunit.client.0.vm05.stdout:7/797: mkdir d18/d38/dc7/de3/d9c/dfd 0 2026-03-10T08:56:10.985 INFO:tasks.workunit.client.1.vm08.stdout:5/944: stat d0/d11/d27/d68/d7c/d4b/d4e/da5/l10a 0 2026-03-10T08:56:10.985 INFO:tasks.workunit.client.1.vm08.stdout:5/945: fsync d0/d1b/d67/d7a/ff2 0 2026-03-10T08:56:11.008 INFO:tasks.workunit.client.0.vm05.stdout:2/763: symlink d0/d55/dde/le3 0 2026-03-10T08:56:11.010 INFO:tasks.workunit.client.0.vm05.stdout:4/822: creat d0/d2e/d71/d7c/f10e x:0 0 0 2026-03-10T08:56:11.019 INFO:tasks.workunit.client.0.vm05.stdout:7/798: rename d18/d38/dc7/de3/dc6/fcd to d18/d38/dc7/de3/dc6/ffe 0 2026-03-10T08:56:11.030 INFO:tasks.workunit.client.0.vm05.stdout:6/833: dwrite d4/d2d/d51/d87/da5/fe5 [0,4194304] 0 2026-03-10T08:56:11.043 INFO:tasks.workunit.client.0.vm05.stdout:4/823: 
truncate d0/fc 1261464 0 2026-03-10T08:56:11.049 INFO:tasks.workunit.client.1.vm08.stdout:5/946: truncate d0/d1b/d67/f9b 1166754 0 2026-03-10T08:56:11.053 INFO:tasks.workunit.client.0.vm05.stdout:3/886: write d9/d4d/f5e [434710,4848] 0 2026-03-10T08:56:11.055 INFO:tasks.workunit.client.0.vm05.stdout:3/887: stat d9/d2b/de7/df1/d43/d71/d86/fb8 0 2026-03-10T08:56:11.057 INFO:tasks.workunit.client.0.vm05.stdout:5/767: dwrite d5/d86/d24/d84/db8/fdc [0,4194304] 0 2026-03-10T08:56:11.058 INFO:tasks.workunit.client.0.vm05.stdout:1/896: rename dd/d10/d18/d20/d52/lf2 to dd/d10/d18/l133 0 2026-03-10T08:56:11.061 INFO:tasks.workunit.client.0.vm05.stdout:9/809: symlink d6/d12/l10f 0 2026-03-10T08:56:11.063 INFO:tasks.workunit.client.0.vm05.stdout:1/897: dread dd/d10/d18/d20/d69/fb1 [0,4194304] 0 2026-03-10T08:56:11.078 INFO:tasks.workunit.client.0.vm05.stdout:7/799: truncate d18/d38/dc7/de3/f71 4934074 0 2026-03-10T08:56:11.085 INFO:tasks.workunit.client.1.vm08.stdout:5/947: dread d0/d11/f29 [0,4194304] 0 2026-03-10T08:56:11.087 INFO:tasks.workunit.client.0.vm05.stdout:2/764: symlink d0/d9/d1e/d20/le4 0 2026-03-10T08:56:11.128 INFO:tasks.workunit.client.1.vm08.stdout:5/948: rmdir d0/d11/d27 39 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.1.vm08.stdout:5/949: creat d0/d11/d27/d68/d7c/d4b/d4e/d84/f129 x:0 0 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.1.vm08.stdout:5/950: unlink d0/d11/d27/fdc 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:4/824: fsync d0/d2e/d42/d45/d4a/d36/dbe/d32/f76 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:8/838: creat d2/dd/d2c/d2e/d31/d3e/f126 x:0 0 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:5/768: creat d5/df/dbb/d108/f118 x:0 0 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:0/837: rename df/d1f/l7d to df/d1f/d85/d19/d47/d84/d8a/lfe 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:9/810: symlink d6/d15/d3c/l110 0 2026-03-10T08:56:11.129 
INFO:tasks.workunit.client.0.vm05.stdout:7/800: unlink d18/f24 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:2/765: truncate d0/f36 115571 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:4/825: mknod d0/d2e/dca/c10f 0 2026-03-10T08:56:11.129 INFO:tasks.workunit.client.0.vm05.stdout:5/769: symlink d5/d48/d64/dc4/l119 0 2026-03-10T08:56:11.132 INFO:tasks.workunit.client.0.vm05.stdout:6/834: rename d4/d7/d10/d1a/d89/deb to d4/d2d/d51/d62/d113/d11b 0 2026-03-10T08:56:11.134 INFO:tasks.workunit.client.0.vm05.stdout:1/898: sync 2026-03-10T08:56:11.135 INFO:tasks.workunit.client.0.vm05.stdout:7/801: sync 2026-03-10T08:56:11.148 INFO:tasks.workunit.client.1.vm08.stdout:5/951: write d0/d11/d27/d68/d7c/d4b/d4e/d84/fa9 [3028355,124756] 0 2026-03-10T08:56:11.149 INFO:tasks.workunit.client.0.vm05.stdout:9/811: creat d6/d15/d104/f111 x:0 0 0 2026-03-10T08:56:11.151 INFO:tasks.workunit.client.0.vm05.stdout:4/826: mknod d0/d1d/c110 0 2026-03-10T08:56:11.152 INFO:tasks.workunit.client.0.vm05.stdout:2/766: dread d0/d9/d1e/d20/d21/f77 [0,4194304] 0 2026-03-10T08:56:11.153 INFO:tasks.workunit.client.0.vm05.stdout:4/827: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 55770009 1 2026-03-10T08:56:11.155 INFO:tasks.workunit.client.0.vm05.stdout:8/839: write d2/db/d1f/f84 [204027,19760] 0 2026-03-10T08:56:11.159 INFO:tasks.workunit.client.0.vm05.stdout:8/840: dwrite d2/dd/d2c/d2e/d31/d3e/f126 [0,4194304] 0 2026-03-10T08:56:11.174 INFO:tasks.workunit.client.0.vm05.stdout:3/888: creat d9/d4d/f116 x:0 0 0 2026-03-10T08:56:11.182 INFO:tasks.workunit.client.0.vm05.stdout:5/770: rmdir d5/df/d37/d68 39 2026-03-10T08:56:11.183 INFO:tasks.workunit.client.0.vm05.stdout:5/771: dread - d5/d48/d64/dc4/f115 zero size 2026-03-10T08:56:11.198 INFO:tasks.workunit.client.0.vm05.stdout:0/838: write df/f13 [1270702,18424] 0 2026-03-10T08:56:11.201 INFO:tasks.workunit.client.1.vm08.stdout:5/952: dwrite d0/d11/d18/f23 [4194304,4194304] 0 2026-03-10T08:56:11.220 
INFO:tasks.workunit.client.0.vm05.stdout:9/812: dread d6/f9f [0,4194304] 0 2026-03-10T08:56:11.226 INFO:tasks.workunit.client.0.vm05.stdout:8/841: mkdir d2/dd/d2c/d2e/d31/db4/d127 0 2026-03-10T08:56:11.228 INFO:tasks.workunit.client.0.vm05.stdout:3/889: rmdir d9/d8f/d50/d5f/dd8/dec 39 2026-03-10T08:56:11.236 INFO:tasks.workunit.client.0.vm05.stdout:5/772: dread d5/d48/f7e [0,4194304] 0 2026-03-10T08:56:11.242 INFO:tasks.workunit.client.0.vm05.stdout:2/767: dwrite d0/d9/d1e/d20/fc8 [0,4194304] 0 2026-03-10T08:56:11.242 INFO:tasks.workunit.client.0.vm05.stdout:4/828: dwrite d0/d2e/d42/d45/fcc [0,4194304] 0 2026-03-10T08:56:11.262 INFO:tasks.workunit.client.0.vm05.stdout:7/802: mkdir d18/d66/dff 0 2026-03-10T08:56:11.269 INFO:tasks.workunit.client.0.vm05.stdout:1/899: mknod dd/d21/d37/d45/d8d/d128/c134 0 2026-03-10T08:56:11.277 INFO:tasks.workunit.client.1.vm08.stdout:5/953: mknod d0/d1b/d67/c12a 0 2026-03-10T08:56:11.286 INFO:tasks.workunit.client.0.vm05.stdout:5/773: dread d5/d86/d24/d2c/d41/d74/fb1 [4194304,4194304] 0 2026-03-10T08:56:11.291 INFO:tasks.workunit.client.0.vm05.stdout:5/774: truncate d5/d86/d24/d2c/d41/d74/fa8 5061560 0 2026-03-10T08:56:11.291 INFO:tasks.workunit.client.0.vm05.stdout:5/775: chown d5/d86/d24/d2c/d41/dca 50 1 2026-03-10T08:56:11.291 INFO:tasks.workunit.client.0.vm05.stdout:4/829: dread d0/d2e/d42/d45/d4a/d36/dbe/f93 [0,4194304] 0 2026-03-10T08:56:11.292 INFO:tasks.workunit.client.1.vm08.stdout:5/954: sync 2026-03-10T08:56:11.297 INFO:tasks.workunit.client.0.vm05.stdout:1/900: truncate dd/d21/d37/d7c/d60/fe9 827127 0 2026-03-10T08:56:11.300 INFO:tasks.workunit.client.0.vm05.stdout:0/839: write df/d1f/d85/f29 [1083405,20207] 0 2026-03-10T08:56:11.304 INFO:tasks.workunit.client.0.vm05.stdout:9/813: write d6/d19/d2c/fbd [303404,94489] 0 2026-03-10T08:56:11.308 INFO:tasks.workunit.client.1.vm08.stdout:5/955: mkdir d0/d11/d3e/d45/d12b 0 2026-03-10T08:56:11.309 INFO:tasks.workunit.client.0.vm05.stdout:3/890: creat d9/d2b/de7/d102/f117 x:0 0 0 
2026-03-10T08:56:11.311 INFO:tasks.workunit.client.0.vm05.stdout:6/835: rename d4/d2c/d84/d4a/l8a to d4/d7/d10/d15/l11c 0 2026-03-10T08:56:11.313 INFO:tasks.workunit.client.0.vm05.stdout:2/768: mknod d0/d9/d1e/d20/d21/ce5 0 2026-03-10T08:56:11.316 INFO:tasks.workunit.client.0.vm05.stdout:5/776: rmdir d5/d48/d64/d95/dac 39 2026-03-10T08:56:11.317 INFO:tasks.workunit.client.0.vm05.stdout:5/777: chown d5/d48/l70 5 1 2026-03-10T08:56:11.323 INFO:tasks.workunit.client.0.vm05.stdout:7/803: symlink d18/dfc/l100 0 2026-03-10T08:56:11.323 INFO:tasks.workunit.client.1.vm08.stdout:5/956: getdents d0/d11/d27/d68/d7c/d8e 0 2026-03-10T08:56:11.324 INFO:tasks.workunit.client.1.vm08.stdout:5/957: write d0/d11/d18/f114 [172937,95870] 0 2026-03-10T08:56:11.325 INFO:tasks.workunit.client.0.vm05.stdout:4/830: unlink d0/d2e/d42/d45/d4a/f86 0 2026-03-10T08:56:11.329 INFO:tasks.workunit.client.0.vm05.stdout:1/901: creat dd/d10/d18/d2d/d51/d58/d71/d73/f135 x:0 0 0 2026-03-10T08:56:11.332 INFO:tasks.workunit.client.1.vm08.stdout:5/958: unlink d0/d1b/d67/d80/c9e 0 2026-03-10T08:56:11.335 INFO:tasks.workunit.client.1.vm08.stdout:5/959: fdatasync d0/d11/d27/f3b 0 2026-03-10T08:56:11.338 INFO:tasks.workunit.client.0.vm05.stdout:8/842: rename d2/cf1 to d2/dd/d2c/d2e/d31/d3e/d5d/d9d/dd9/c128 0 2026-03-10T08:56:11.345 INFO:tasks.workunit.client.0.vm05.stdout:7/804: chown d18/d38/dc7/de3/d9c/de1/ffb 32535917 1 2026-03-10T08:56:11.347 INFO:tasks.workunit.client.0.vm05.stdout:1/902: creat dd/d10/d18/dd1/f136 x:0 0 0 2026-03-10T08:56:11.351 INFO:tasks.workunit.client.0.vm05.stdout:0/840: mkdir df/d1f/d85/d2b/d27/dff 0 2026-03-10T08:56:11.355 INFO:tasks.workunit.client.1.vm08.stdout:5/960: dwrite d0/d11/d18/faf [0,4194304] 0 2026-03-10T08:56:11.355 INFO:tasks.workunit.client.0.vm05.stdout:3/891: mknod d9/d2b/de7/df1/d43/d71/d86/d10e/c118 0 2026-03-10T08:56:11.356 INFO:tasks.workunit.client.1.vm08.stdout:5/961: fdatasync d0/d11/d27/d68/d7c/d4b/dbe/f126 0 2026-03-10T08:56:11.360 
INFO:tasks.workunit.client.1.vm08.stdout:5/962: fdatasync d0/d11/d27/d68/d7c/d4b/d4e/fbd 0 2026-03-10T08:56:11.364 INFO:tasks.workunit.client.1.vm08.stdout:5/963: dwrite d0/d11/d18/fc0 [0,4194304] 0 2026-03-10T08:56:11.366 INFO:tasks.workunit.client.0.vm05.stdout:7/805: chown d18/d66/d25/f47 4095185 1 2026-03-10T08:56:11.368 INFO:tasks.workunit.client.1.vm08.stdout:5/964: dread d0/d11/d27/d68/d7c/de5/feb [0,4194304] 0 2026-03-10T08:56:11.377 INFO:tasks.workunit.client.0.vm05.stdout:1/903: truncate dd/d10/d18/d2d/d51/d58/fa0 1766914 0 2026-03-10T08:56:11.384 INFO:tasks.workunit.client.1.vm08.stdout:5/965: creat d0/d11/d27/d68/d7c/f12c x:0 0 0 2026-03-10T08:56:11.385 INFO:tasks.workunit.client.0.vm05.stdout:9/814: link d6/d15/d3c/d4b/d90/c100 d6/d15/d3c/c112 0 2026-03-10T08:56:11.385 INFO:tasks.workunit.client.0.vm05.stdout:9/815: chown d6/ff4 49 1 2026-03-10T08:56:11.385 INFO:tasks.workunit.client.0.vm05.stdout:9/816: stat d6/d15/d35/ddf/cec 0 2026-03-10T08:56:11.388 INFO:tasks.workunit.client.1.vm08.stdout:5/966: truncate d0/d11/f1e 572251 0 2026-03-10T08:56:11.389 INFO:tasks.workunit.client.0.vm05.stdout:8/843: creat d2/dd/d2c/d2e/d108/f129 x:0 0 0 2026-03-10T08:56:11.391 INFO:tasks.workunit.client.0.vm05.stdout:2/769: creat d0/d9/d1e/d20/d21/fe6 x:0 0 0 2026-03-10T08:56:11.392 INFO:tasks.workunit.client.0.vm05.stdout:2/770: stat d0/d9/d7f/d8f/f38 0 2026-03-10T08:56:11.394 INFO:tasks.workunit.client.0.vm05.stdout:5/778: link d5/d48/cfc d5/d86/d24/d84/db8/dcc/c11a 0 2026-03-10T08:56:11.397 INFO:tasks.workunit.client.0.vm05.stdout:1/904: read - dd/d10/d18/dd1/f106 zero size 2026-03-10T08:56:11.405 INFO:tasks.workunit.client.1.vm08.stdout:5/967: getdents d0/d11/d27/d68/d7c/de5/de2 0 2026-03-10T08:56:11.411 INFO:tasks.workunit.client.0.vm05.stdout:6/836: write d4/d2d/d7f/fc0 [302286,104817] 0 2026-03-10T08:56:11.420 INFO:tasks.workunit.client.0.vm05.stdout:4/831: write d0/d2e/d42/d45/d4a/d36/fd5 [1242598,119735] 0 2026-03-10T08:56:11.422 
INFO:tasks.workunit.client.0.vm05.stdout:0/841: dwrite df/f15 [0,4194304] 0 2026-03-10T08:56:11.423 INFO:tasks.workunit.client.0.vm05.stdout:2/771: creat d0/d55/dde/fe7 x:0 0 0 2026-03-10T08:56:11.424 INFO:tasks.workunit.client.0.vm05.stdout:5/779: creat d5/df/d37/dd2/f11b x:0 0 0 2026-03-10T08:56:11.424 INFO:tasks.workunit.client.0.vm05.stdout:5/780: chown d5/d86/c29 9 1 2026-03-10T08:56:11.425 INFO:tasks.workunit.client.0.vm05.stdout:5/781: chown d5/d86/d24/d2c/d41/d74/f9f 324726 1 2026-03-10T08:56:11.428 INFO:tasks.workunit.client.0.vm05.stdout:7/806: rename d18/d66/d25/d2e/d2f/fec to d18/d38/dc7/de3/d9c/dac/df5/f101 0 2026-03-10T08:56:11.439 INFO:tasks.workunit.client.0.vm05.stdout:8/844: write d2/db/d28/fae [1005463,29149] 0 2026-03-10T08:56:11.447 INFO:tasks.workunit.client.1.vm08.stdout:5/968: link d0/fa4 d0/f12d 0 2026-03-10T08:56:11.457 INFO:tasks.workunit.client.1.vm08.stdout:5/969: fsync d0/d1b/d67/f9b 0 2026-03-10T08:56:11.466 INFO:tasks.workunit.client.0.vm05.stdout:6/837: dwrite d4/d2c/d84/db6/dc6/f100 [0,4194304] 0 2026-03-10T08:56:11.468 INFO:tasks.workunit.client.0.vm05.stdout:4/832: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/ddc/l111 0 2026-03-10T08:56:11.476 INFO:tasks.workunit.client.1.vm08.stdout:5/970: creat d0/d1b/d67/d7a/f12e x:0 0 0 2026-03-10T08:56:11.477 INFO:tasks.workunit.client.1.vm08.stdout:5/971: chown d0/d11/d27/c26 993287103 1 2026-03-10T08:56:11.480 INFO:tasks.workunit.client.1.vm08.stdout:5/972: dread d0/f6c [0,4194304] 0 2026-03-10T08:56:11.483 INFO:tasks.workunit.client.0.vm05.stdout:6/838: dread d4/d7/d10/d15/d1b/d22/f36 [0,4194304] 0 2026-03-10T08:56:11.486 INFO:tasks.workunit.client.0.vm05.stdout:5/782: rmdir d5/df/dbb/d43 39 2026-03-10T08:56:11.496 INFO:tasks.workunit.client.0.vm05.stdout:5/783: dwrite d5/d86/d21/d71/f9a [0,4194304] 0 2026-03-10T08:56:11.509 INFO:tasks.workunit.client.0.vm05.stdout:4/833: dread d0/d2e/d42/d45/d4a/d36/dbe/f28 [0,4194304] 0 2026-03-10T08:56:11.511 
INFO:tasks.workunit.client.0.vm05.stdout:1/905: mknod dd/d10/d18/d20/c137 0 2026-03-10T08:56:11.512 INFO:tasks.workunit.client.0.vm05.stdout:1/906: stat dd/d10/d18/dd1/l12a 0 2026-03-10T08:56:11.514 INFO:tasks.workunit.client.1.vm08.stdout:5/973: fsync d0/d11/d27/d68/d7c/d4b/f82 0 2026-03-10T08:56:11.515 INFO:tasks.workunit.client.1.vm08.stdout:5/974: readlink d0/d11/d27/d68/d7c/lae 0 2026-03-10T08:56:11.520 INFO:tasks.workunit.client.0.vm05.stdout:8/845: unlink d2/ca1 0 2026-03-10T08:56:11.526 INFO:tasks.workunit.client.1.vm08.stdout:5/975: chown d0/d11/d27/d68/d7c/d4b/dbe/f127 549193 1 2026-03-10T08:56:11.535 INFO:tasks.workunit.client.0.vm05.stdout:0/842: dwrite df/d1f/d95/fe7 [0,4194304] 0 2026-03-10T08:56:11.536 INFO:tasks.workunit.client.1.vm08.stdout:5/976: mkdir d0/d11/d27/d12f 0 2026-03-10T08:56:11.536 INFO:tasks.workunit.client.0.vm05.stdout:7/807: dwrite d18/d1b/f69 [0,4194304] 0 2026-03-10T08:56:11.536 INFO:tasks.workunit.client.1.vm08.stdout:5/977: dread - d0/d11/d18/fe6 zero size 2026-03-10T08:56:11.538 INFO:tasks.workunit.client.0.vm05.stdout:0/843: write df/d1f/d85/d19/d39/d4d/fe3 [1166730,127879] 0 2026-03-10T08:56:11.552 INFO:tasks.workunit.client.1.vm08.stdout:5/978: unlink d0/c20 0 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get", "entity": 
"client.admin"}]: dispatch 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: Failed to find standby mgr for failover. Retrying in 2 seconds 2026-03-10T08:56:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:11 vm08.local ceph-mon[57559]: pgmap v14: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 44 MiB/s rd, 99 MiB/s wr, 268 op/s 2026-03-10T08:56:11.555 INFO:tasks.workunit.client.1.vm08.stdout:5/979: mkdir d0/d11/d3e/d130 0 2026-03-10T08:56:11.562 INFO:tasks.workunit.client.0.vm05.stdout:4/834: unlink d0/d2e/ffc 0 2026-03-10T08:56:11.574 INFO:tasks.workunit.client.0.vm05.stdout:9/817: getdents d6/d19 0 2026-03-10T08:56:11.574 INFO:tasks.workunit.client.0.vm05.stdout:3/892: getdents d9/d8f/d50/d5f/dd8/dec 0 2026-03-10T08:56:11.574 INFO:tasks.workunit.client.0.vm05.stdout:3/893: chown d9/d2b/de7/d102/f117 0 1 2026-03-10T08:56:11.579 INFO:tasks.workunit.client.0.vm05.stdout:2/772: link d0/d9/d7f/d8f/d7e/lb5 d0/d9/d7f/d8f/le8 0 2026-03-10T08:56:11.579 INFO:tasks.workunit.client.0.vm05.stdout:8/846: sync 2026-03-10T08:56:11.580 INFO:tasks.workunit.client.0.vm05.stdout:8/847: readlink d2/dd/d2c/d2e/d31/d3e/dde/l52 0 2026-03-10T08:56:11.583 
INFO:tasks.workunit.client.0.vm05.stdout:5/784: mknod d5/df/d37/d68/c11c 0 2026-03-10T08:56:11.583 INFO:tasks.workunit.client.1.vm08.stdout:5/980: sync 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: Upgrade: Need to upgrade myself (mgr.vm08.rpongu) 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: Failed to find standby mgr for failover. 
Retrying in 2 seconds 2026-03-10T08:56:11.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:11 vm05.local ceph-mon[49713]: pgmap v14: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 44 MiB/s rd, 99 MiB/s wr, 268 op/s 2026-03-10T08:56:11.599 INFO:tasks.workunit.client.0.vm05.stdout:5/785: sync 2026-03-10T08:56:11.600 INFO:tasks.workunit.client.0.vm05.stdout:5/786: write d5/d86/d24/d2c/d41/d74/da9/fee [7900612,34288] 0 2026-03-10T08:56:11.611 INFO:tasks.workunit.client.1.vm08.stdout:5/981: dread d0/fe [0,4194304] 0 2026-03-10T08:56:11.619 INFO:tasks.workunit.client.0.vm05.stdout:9/818: write f4 [558507,89635] 0 2026-03-10T08:56:11.622 INFO:tasks.workunit.client.0.vm05.stdout:1/907: dwrite dd/d21/fe4 [0,4194304] 0 2026-03-10T08:56:11.625 INFO:tasks.workunit.client.0.vm05.stdout:3/894: dwrite d9/d2b/f3b [0,4194304] 0 2026-03-10T08:56:11.627 INFO:tasks.workunit.client.0.vm05.stdout:3/895: chown d9/d2b/d2f 140713946 1 2026-03-10T08:56:11.629 INFO:tasks.workunit.client.1.vm08.stdout:5/982: mknod d0/d11/d27/d68/d7c/d4b/d10e/c131 0 2026-03-10T08:56:11.632 INFO:tasks.workunit.client.0.vm05.stdout:0/844: mknod df/d1f/dee/c100 0 2026-03-10T08:56:11.632 INFO:tasks.workunit.client.0.vm05.stdout:7/808: dwrite d18/d66/d78/fb8 [0,4194304] 0 2026-03-10T08:56:11.637 INFO:tasks.workunit.client.0.vm05.stdout:7/809: chown d18/d66/d25/d2e/d2f/fd8 43309831 1 2026-03-10T08:56:11.644 INFO:tasks.workunit.client.0.vm05.stdout:3/896: dwrite d9/d2b/d2f/d57/dd0/fef [0,4194304] 0 2026-03-10T08:56:11.684 INFO:tasks.workunit.client.1.vm08.stdout:5/983: mknod d0/d11/d27/d68/d7c/d4b/d4e/da5/c132 0 2026-03-10T08:56:11.690 INFO:tasks.workunit.client.1.vm08.stdout:5/984: read - d0/d11/d27/d68/d7c/d4b/d4e/d84/df9/fc9 zero size 2026-03-10T08:56:11.691 INFO:tasks.workunit.client.1.vm08.stdout:5/985: read d0/d11/d27/d68/d7c/d4b/dbe/f125 [2689456,48389] 0 2026-03-10T08:56:11.693 INFO:tasks.workunit.client.0.vm05.stdout:5/787: chown d5/df/dbb/d43/f6d 2219 1 
2026-03-10T08:56:11.701 INFO:tasks.workunit.client.1.vm08.stdout:5/986: creat d0/d11/d27/d68/d7c/d8e/df0/db5/f133 x:0 0 0 2026-03-10T08:56:11.707 INFO:tasks.workunit.client.0.vm05.stdout:9/819: read - d6/d12/d3a/fdd zero size 2026-03-10T08:56:11.710 INFO:tasks.workunit.client.0.vm05.stdout:8/848: dread d2/dd/d2c/d2e/d31/d3e/d5d/fc0 [0,4194304] 0 2026-03-10T08:56:11.713 INFO:tasks.workunit.client.0.vm05.stdout:9/820: dwrite d6/ff8 [0,4194304] 0 2026-03-10T08:56:11.716 INFO:tasks.workunit.client.0.vm05.stdout:9/821: truncate d6/d19/d2c/fbd 765543 0 2026-03-10T08:56:11.719 INFO:tasks.workunit.client.1.vm08.stdout:5/987: creat d0/d11/d27/d68/d7c/d4b/f134 x:0 0 0 2026-03-10T08:56:11.724 INFO:tasks.workunit.client.1.vm08.stdout:5/988: mkdir d0/d11/d27/d68/d7c/d4b/dbe/d135 0 2026-03-10T08:56:11.730 INFO:tasks.workunit.client.0.vm05.stdout:0/845: rename df/d1f/d85/d2b/d27/d32/d4e/d6a/fbf to df/d1f/dc6/f101 0 2026-03-10T08:56:11.737 INFO:tasks.workunit.client.0.vm05.stdout:0/846: write df/d1f/d95/fe7 [3167312,107057] 0 2026-03-10T08:56:11.746 INFO:tasks.workunit.client.1.vm08.stdout:5/989: mkdir d0/d11/d27/d12f/d136 0 2026-03-10T08:56:11.749 INFO:tasks.workunit.client.0.vm05.stdout:3/897: symlink d9/d2b/de7/df1/d43/d71/l119 0 2026-03-10T08:56:11.751 INFO:tasks.workunit.client.0.vm05.stdout:5/788: creat d5/d86/d24/d2c/d41/d74/da9/f11d x:0 0 0 2026-03-10T08:56:11.751 INFO:tasks.workunit.client.0.vm05.stdout:5/789: dread - d5/f40 zero size 2026-03-10T08:56:11.752 INFO:tasks.workunit.client.0.vm05.stdout:5/790: fsync d5/df/d37/dd2/d76/dde/f103 0 2026-03-10T08:56:11.754 INFO:tasks.workunit.client.1.vm08.stdout:5/990: truncate d0/d11/d27/d68/d7c/d8e/df0/db5/fec 472451 0 2026-03-10T08:56:11.763 INFO:tasks.workunit.client.1.vm08.stdout:5/991: unlink d0/d11/d27/d100/f101 0 2026-03-10T08:56:11.763 INFO:tasks.workunit.client.0.vm05.stdout:0/847: sync 2026-03-10T08:56:11.767 INFO:tasks.workunit.client.0.vm05.stdout:2/773: write d0/d9/d7f/d8f/f66 [4883451,60485] 0 
2026-03-10T08:56:11.771 INFO:tasks.workunit.client.0.vm05.stdout:6/839: write d4/d92/db0/fb7 [288266,41428] 0 2026-03-10T08:56:11.776 INFO:tasks.workunit.client.0.vm05.stdout:1/908: dwrite dd/d10/d18/d2d/f10e [0,4194304] 0 2026-03-10T08:56:11.777 INFO:tasks.workunit.client.0.vm05.stdout:1/909: chown dd/d10/d19/d27/ffa 264 1 2026-03-10T08:56:11.777 INFO:tasks.workunit.client.0.vm05.stdout:1/910: stat dd/d10/d18/d20/d52/ddc 0 2026-03-10T08:56:11.791 INFO:tasks.workunit.client.1.vm08.stdout:5/992: read - d0/d1b/d67/f108 zero size 2026-03-10T08:56:11.797 INFO:tasks.workunit.client.0.vm05.stdout:4/835: getdents d0/d2c/d6a 0 2026-03-10T08:56:11.797 INFO:tasks.workunit.client.0.vm05.stdout:8/849: dwrite d2/dd/d74/d78/fcf [0,4194304] 0 2026-03-10T08:56:11.804 INFO:tasks.workunit.client.0.vm05.stdout:3/898: truncate d9/d2b/f2d 1024397 0 2026-03-10T08:56:11.804 INFO:tasks.workunit.client.0.vm05.stdout:9/822: truncate d6/f4e 361216 0 2026-03-10T08:56:11.810 INFO:tasks.workunit.client.0.vm05.stdout:0/848: symlink df/d1f/d85/d19/d39/d4d/l102 0 2026-03-10T08:56:11.811 INFO:tasks.workunit.client.0.vm05.stdout:0/849: chown df/d1f/d85/d2b/lf0 53428369 1 2026-03-10T08:56:11.821 INFO:tasks.workunit.client.0.vm05.stdout:2/774: mknod d0/d9/d1e/d20/d21/d8a/d92/ce9 0 2026-03-10T08:56:11.829 INFO:tasks.workunit.client.1.vm08.stdout:5/993: dread d0/d1b/f39 [0,4194304] 0 2026-03-10T08:56:11.839 INFO:tasks.workunit.client.1.vm08.stdout:5/994: write d0/d11/d3e/f73 [3136419,65948] 0 2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:6/840: mkdir d4/d2d/d51/d62/d11d 0 2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:1/911: mkdir dd/d13/d10b/d138 0 2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:7/810: creat d18/d38/dc7/f102 x:0 0 0 2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:8/850: fsync d2/dd/d2c/d2e/d31/d4f/d80/f9f 0 2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:3/899: creat d9/d8f/dde/f11a x:0 0 0 
2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:3/900: write d9/fb4 [4409305,7524] 0
2026-03-10T08:56:11.840 INFO:tasks.workunit.client.0.vm05.stdout:3/901: chown d9/d2b/de7/df1/d43/d6e 68298083 1
2026-03-10T08:56:11.846 INFO:tasks.workunit.client.0.vm05.stdout:8/851: dread d2/dd/d2c/d2e/d31/d3e/dde/f11f [0,4194304] 0
2026-03-10T08:56:11.851 INFO:tasks.workunit.client.0.vm05.stdout:9/823: mkdir d6/d12/d3a/de5/dd4/d113 0
2026-03-10T08:56:11.855 INFO:tasks.workunit.client.0.vm05.stdout:4/836: dread d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f6b [0,4194304] 0
2026-03-10T08:56:11.859 INFO:tasks.workunit.client.0.vm05.stdout:4/837: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 [4194304,4194304] 0
2026-03-10T08:56:11.863 INFO:tasks.workunit.client.0.vm05.stdout:2/775: fdatasync d0/d9/f4e 0
2026-03-10T08:56:11.868 INFO:tasks.workunit.client.0.vm05.stdout:8/852: truncate d2/dd/f3f 586156 0
2026-03-10T08:56:11.878 INFO:tasks.workunit.client.0.vm05.stdout:5/791: rename d5/d48/d64/d95/dac/cb4 to d5/d86/d24/d2c/d41/d74/da9/c11e 0
2026-03-10T08:56:11.878 INFO:tasks.workunit.client.0.vm05.stdout:9/824: readlink d6/d15/d35/ddf/l109 0
2026-03-10T08:56:11.878 INFO:tasks.workunit.client.0.vm05.stdout:4/838: fdatasync d0/f9 0
2026-03-10T08:56:11.878 INFO:tasks.workunit.client.0.vm05.stdout:6/841: mknod d4/d7/d10/d1a/d8c/c11e 0
2026-03-10T08:56:11.878 INFO:tasks.workunit.client.0.vm05.stdout:7/811: getdents d18/d1b/ddd 0
2026-03-10T08:56:11.881 INFO:tasks.workunit.client.0.vm05.stdout:9/825: creat d6/d15/d3c/d4b/d82/f114 x:0 0 0
2026-03-10T08:56:11.882 INFO:tasks.workunit.client.0.vm05.stdout:4/839: fdatasync d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/ff2 0
2026-03-10T08:56:11.883 INFO:tasks.workunit.client.0.vm05.stdout:4/840: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41 59543003 1
2026-03-10T08:56:11.890 INFO:tasks.workunit.client.0.vm05.stdout:5/792: mkdir d5/d86/d24/d2c/d41/d11f 0
2026-03-10T08:56:11.890 INFO:tasks.workunit.client.0.vm05.stdout:9/826: mknod d6/d15/d37/de8/c115 0
2026-03-10T08:56:11.891 INFO:tasks.workunit.client.0.vm05.stdout:4/841: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/l112 0
2026-03-10T08:56:11.891 INFO:tasks.workunit.client.0.vm05.stdout:7/812: mknod d18/d66/d25/d2e/dd9/c103 0
2026-03-10T08:56:11.891 INFO:tasks.workunit.client.0.vm05.stdout:3/902: getdents d9/d2b/d2f/d96 0
2026-03-10T08:56:11.892 INFO:tasks.workunit.client.0.vm05.stdout:3/903: chown d9/d4d/d51/d64/ccf 1831036 1
2026-03-10T08:56:11.893 INFO:tasks.workunit.client.0.vm05.stdout:8/853: link d2/dd/d74/d78/ff9 d2/db/d1f/d67/f12a 0
2026-03-10T08:56:11.894 INFO:tasks.workunit.client.0.vm05.stdout:1/912: rename dd/d21/d37/d45/d8d/c9f to dd/c139 0
2026-03-10T08:56:11.895 INFO:tasks.workunit.client.0.vm05.stdout:5/793: creat d5/d86/d39/f120 x:0 0 0
2026-03-10T08:56:11.897 INFO:tasks.workunit.client.0.vm05.stdout:4/842: unlink d0/d2e/d42/d45/d4a/d36/d37/f68 0
2026-03-10T08:56:11.898 INFO:tasks.workunit.client.0.vm05.stdout:7/813: unlink d18/d66/d25/d2e/d2f/d6d/fbc 0
2026-03-10T08:56:11.899 INFO:tasks.workunit.client.0.vm05.stdout:3/904: rmdir d9/d2b/d2f/d57 39
2026-03-10T08:56:11.906 INFO:tasks.workunit.client.0.vm05.stdout:6/842: rename d4/d2c/d84/d4a/l9a to d4/d2c/d84/d4a/dd5/l11f 0
2026-03-10T08:56:11.913 INFO:tasks.workunit.client.0.vm05.stdout:1/913: mkdir dd/d10/d19/d9b/d13a 0
2026-03-10T08:56:11.913 INFO:tasks.workunit.client.0.vm05.stdout:5/794: rmdir d5/df/dbb/d43 39
2026-03-10T08:56:11.913 INFO:tasks.workunit.client.0.vm05.stdout:5/795: truncate d5/df/d37/dd2/d76/dde/f103 294954 0
2026-03-10T08:56:11.913 INFO:tasks.workunit.client.0.vm05.stdout:9/827: creat d6/df6/f116 x:0 0 0
2026-03-10T08:56:11.913 INFO:tasks.workunit.client.0.vm05.stdout:7/814: mknod d18/d38/d43/d6e/c104 0
2026-03-10T08:56:11.922 INFO:tasks.workunit.client.0.vm05.stdout:9/828: rename d6/d15/d3c/fda to d6/d15/d3c/d4b/d82/f117 0
2026-03-10T08:56:11.936 INFO:tasks.workunit.client.0.vm05.stdout:7/815: symlink d18/d38/d43/d6e/l105 0
2026-03-10T08:56:11.936 INFO:tasks.workunit.client.0.vm05.stdout:6/843: creat d4/d7/d10/d111/f120 x:0 0 0
2026-03-10T08:56:11.936 INFO:tasks.workunit.client.0.vm05.stdout:1/914: symlink dd/d10/d19/l13b 0
2026-03-10T08:56:11.936 INFO:tasks.workunit.client.0.vm05.stdout:9/829: mkdir d6/d19/d2c/d84/d118 0
2026-03-10T08:56:11.936 INFO:tasks.workunit.client.0.vm05.stdout:9/830: write d6/d15/d35/fc0 [3737763,129268] 0
2026-03-10T08:56:11.936 INFO:tasks.workunit.client.0.vm05.stdout:7/816: creat d18/d66/d25/d2e/dd9/f106 x:0 0 0
2026-03-10T08:56:11.938 INFO:tasks.workunit.client.0.vm05.stdout:1/915: creat dd/d10/d18/d2d/d51/f13c x:0 0 0
2026-03-10T08:56:11.939 INFO:tasks.workunit.client.0.vm05.stdout:9/831: fsync d6/d19/d2a/fd8 0
2026-03-10T08:56:11.943 INFO:tasks.workunit.client.0.vm05.stdout:4/843: getdents d0/d2e/d42/d45/d4a/d36/dbe/dbf 0
2026-03-10T08:56:11.948 INFO:tasks.workunit.client.0.vm05.stdout:9/832: unlink d6/ff4 0
2026-03-10T08:56:11.949 INFO:tasks.workunit.client.0.vm05.stdout:4/844: fdatasync d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/fe8 0
2026-03-10T08:56:11.952 INFO:tasks.workunit.client.0.vm05.stdout:6/844: creat d4/d2d/d51/f121 x:0 0 0
2026-03-10T08:56:11.953 INFO:tasks.workunit.client.0.vm05.stdout:7/817: creat d18/d66/d25/d2e/d2f/d6d/dc1/dd4/de9/f107 x:0 0 0
2026-03-10T08:56:11.958 INFO:tasks.workunit.client.0.vm05.stdout:4/845: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/l113 0
2026-03-10T08:56:11.958 INFO:tasks.workunit.client.0.vm05.stdout:6/845: mknod d4/d2c/d84/db6/c122 0
2026-03-10T08:56:11.958 INFO:tasks.workunit.client.0.vm05.stdout:7/818: unlink d18/d38/dc7/de3/l92 0
2026-03-10T08:56:11.959 INFO:tasks.workunit.client.0.vm05.stdout:6/846: chown d4/d7/d10/d15/d1b/dfc/f114 13 1
2026-03-10T08:56:11.960 INFO:tasks.workunit.client.0.vm05.stdout:4/846: fdatasync d0/d1d/f22 0
2026-03-10T08:56:11.961 INFO:tasks.workunit.client.0.vm05.stdout:4/847: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41 421 1
2026-03-10T08:56:11.961 INFO:tasks.workunit.client.0.vm05.stdout:4/848: chown d0/d2c/d6a/dc9 25009 1
2026-03-10T08:56:11.975 INFO:tasks.workunit.client.0.vm05.stdout:4/849: truncate d0/d2e/d42/d45/d4a/d36/dbe/f93 236399 0
2026-03-10T08:56:11.980 INFO:tasks.workunit.client.0.vm05.stdout:9/833: link d6/d19/d2a/d4a/ff0 d6/d15/d3c/d4b/d82/de9/f119 0
2026-03-10T08:56:11.980 INFO:tasks.workunit.client.0.vm05.stdout:0/850: sync
2026-03-10T08:56:11.987 INFO:tasks.workunit.client.0.vm05.stdout:0/851: creat df/d1f/dcd/de6/f103 x:0 0 0
2026-03-10T08:56:11.995 INFO:tasks.workunit.client.0.vm05.stdout:2/776: sync
2026-03-10T08:56:11.995 INFO:tasks.workunit.client.0.vm05.stdout:6/847: sync
2026-03-10T08:56:11.995 INFO:tasks.workunit.client.0.vm05.stdout:4/850: sync
2026-03-10T08:56:11.998 INFO:tasks.workunit.client.0.vm05.stdout:2/777: sync
2026-03-10T08:56:11.999 INFO:tasks.workunit.client.0.vm05.stdout:2/778: truncate d0/d9/d7f/d8f/d7a/fa1 405580 0
2026-03-10T08:56:12.018 INFO:tasks.workunit.client.0.vm05.stdout:4/851: mkdir d0/d2e/d42/d45/d4a/d36/d37/d114 0
2026-03-10T08:56:12.019 INFO:tasks.workunit.client.0.vm05.stdout:4/852: fdatasync d0/d2e/d42/d45/d4a/d36/dbe/f10b 0
2026-03-10T08:56:12.019 INFO:tasks.workunit.client.1.vm08.stdout:5/995: dwrite d0/d11/d27/d68/d7c/d8e/df0/db5/fec [0,4194304] 0
2026-03-10T08:56:12.020 INFO:tasks.workunit.client.0.vm05.stdout:2/779: write d0/d9/d7f/d8f/d7a/fbc [636568,103054] 0
2026-03-10T08:56:12.022 INFO:tasks.workunit.client.1.vm08.stdout:5/996: dread - d0/d11/d27/d68/d7c/d4b/f134 zero size
2026-03-10T08:56:12.023 INFO:tasks.workunit.client.0.vm05.stdout:6/848: fsync d4/d7/d10/d15/d1b/d22/fcf 0
2026-03-10T08:56:12.033 INFO:tasks.workunit.client.0.vm05.stdout:4/853: unlink d0/f1e 0
2026-03-10T08:56:12.038 INFO:tasks.workunit.client.1.vm08.stdout:5/997: unlink d0/l7 0
2026-03-10T08:56:12.039 INFO:tasks.workunit.client.0.vm05.stdout:2/780: symlink d0/d9/d89/lea 0
2026-03-10T08:56:12.041 INFO:tasks.workunit.client.0.vm05.stdout:6/849: unlink d4/d7/d10/d15/ld8 0
2026-03-10T08:56:12.046 INFO:tasks.workunit.client.0.vm05.stdout:8/854: dwrite d2/dd/d2c/d2e/d31/d3e/f95 [0,4194304] 0
2026-03-10T08:56:12.049 INFO:tasks.workunit.client.0.vm05.stdout:4/854: unlink d0/d2e/d42/d45/d4a/d36/dbe/d32/lc2 0
2026-03-10T08:56:12.054 INFO:tasks.workunit.client.1.vm08.stdout:5/998: creat d0/d11/d27/d12f/d136/f137 x:0 0 0
2026-03-10T08:56:12.054 INFO:tasks.workunit.client.1.vm08.stdout:5/999: chown d0/d11/d27/d68/d7c/de5/f91 533982 1
2026-03-10T08:56:12.058 INFO:tasks.workunit.client.1.vm08.stderr:+ rm -rf -- ./tmp.I077c4cT8V
2026-03-10T08:56:12.061 INFO:tasks.workunit.client.0.vm05.stdout:6/850: creat d4/d7/d10/d8f/f123 x:0 0 0
2026-03-10T08:56:12.062 INFO:tasks.workunit.client.0.vm05.stdout:3/905: write d9/d4d/d51/d64/f85 [1430572,128335] 0
2026-03-10T08:56:12.062 INFO:tasks.workunit.client.0.vm05.stdout:3/906: chown d9/d8f/d55/f6b 14565989 1
2026-03-10T08:56:12.066 INFO:tasks.workunit.client.0.vm05.stdout:5/796: write d5/d86/f9d [1696167,19284] 0
2026-03-10T08:56:12.075 INFO:tasks.workunit.client.0.vm05.stdout:6/851: mkdir d4/d7/d10/d15/d1b/d124 0
2026-03-10T08:56:12.076 INFO:tasks.workunit.client.0.vm05.stdout:3/907: write d9/d8f/dde/f10b [23967,123008] 0
2026-03-10T08:56:12.077 INFO:tasks.workunit.client.0.vm05.stdout:3/908: stat d9/d2b/d2f/d96 0
2026-03-10T08:56:12.084 INFO:tasks.workunit.client.0.vm05.stdout:6/852: dread - d4/d7/d10/d1a/d1f/f98 zero size
2026-03-10T08:56:12.084 INFO:tasks.workunit.client.0.vm05.stdout:3/909: rmdir d9/d8f 39
2026-03-10T08:56:12.085 INFO:tasks.workunit.client.0.vm05.stdout:1/916: write dd/d10/d18/dd1/f106 [177976,39249] 0
2026-03-10T08:56:12.085 INFO:tasks.workunit.client.0.vm05.stdout:1/917: chown dd/d21/f10a 431747988 1
2026-03-10T08:56:12.086 INFO:tasks.workunit.client.0.vm05.stdout:6/853: creat d4/d7/d10/dc3/f125 x:0 0 0
2026-03-10T08:56:12.090 INFO:tasks.workunit.client.0.vm05.stdout:3/910: rmdir d9/d4d/d51/d64/d89/dc2 39
2026-03-10T08:56:12.092 INFO:tasks.workunit.client.0.vm05.stdout:1/918: stat dd/d10/d18/d2d/d51/l78 0
2026-03-10T08:56:12.093 INFO:tasks.workunit.client.0.vm05.stdout:7/819: rmdir d18/d38/dc7/de3 39
2026-03-10T08:56:12.094 INFO:tasks.workunit.client.0.vm05.stdout:6/854: creat d4/d2d/d51/d62/d113/f126 x:0 0 0
2026-03-10T08:56:12.099 INFO:tasks.workunit.client.0.vm05.stdout:1/919: mknod dd/d10/d112/d113/c13d 0
2026-03-10T08:56:12.099 INFO:tasks.workunit.client.0.vm05.stdout:3/911: read d9/d8f/d55/f6b [6995914,129379] 0
2026-03-10T08:56:12.110 INFO:tasks.workunit.client.0.vm05.stdout:3/912: mknod d9/d4d/d51/d64/c11b 0
2026-03-10T08:56:12.113 INFO:tasks.workunit.client.0.vm05.stdout:7/820: creat d18/d66/d25/de5/f108 x:0 0 0
2026-03-10T08:56:12.118 INFO:tasks.workunit.client.0.vm05.stdout:3/913: creat d9/d4d/d51/f11c x:0 0 0
2026-03-10T08:56:12.126 INFO:tasks.workunit.client.0.vm05.stdout:7/821: dread d18/d38/f55 [4194304,4194304] 0
2026-03-10T08:56:12.133 INFO:tasks.workunit.client.0.vm05.stdout:7/822: getdents d18/d38/dc7 0
2026-03-10T08:56:12.140 INFO:tasks.workunit.client.0.vm05.stdout:7/823: unlink d18/d38/d43/d6e/fd1 0
2026-03-10T08:56:12.140 INFO:tasks.workunit.client.0.vm05.stdout:7/824: chown d18/d38/dc7/de3 4382661 1
2026-03-10T08:56:12.153 INFO:tasks.workunit.client.0.vm05.stdout:1/920: sync
2026-03-10T08:56:12.170 INFO:tasks.workunit.client.0.vm05.stdout:0/852: write df/f4a [1322456,32261] 0
2026-03-10T08:56:12.170 INFO:tasks.workunit.client.0.vm05.stdout:9/834: dwrite d6/d15/d37/f4c [0,4194304] 0
2026-03-10T08:56:12.179 INFO:tasks.workunit.client.0.vm05.stdout:1/921: chown dd/d10/fb0 9 1
2026-03-10T08:56:12.190 INFO:tasks.workunit.client.0.vm05.stdout:9/835: rmdir d6/d12/d3a/da2 39
2026-03-10T08:56:12.197 INFO:tasks.workunit.client.0.vm05.stdout:2/781: write d0/d9/d1e/d20/f71 [6102931,92570] 0
2026-03-10T08:56:12.206 INFO:tasks.workunit.client.0.vm05.stdout:1/922: read dd/d10/d18/d2d/d51/d58/d71/d73/fbb [969551,43743] 0
2026-03-10T08:56:12.229 INFO:tasks.workunit.client.0.vm05.stdout:6/855: write d4/d7/d10/d15/f2a [2771926,111631] 0
2026-03-10T08:56:12.239 INFO:tasks.workunit.client.0.vm05.stdout:4/855: dwrite d0/d2c/f2f [0,4194304] 0
2026-03-10T08:56:12.239 INFO:tasks.workunit.client.0.vm05.stdout:3/914: dwrite d9/d8f/d55/f8c [4194304,4194304] 0
2026-03-10T08:56:12.241 INFO:tasks.workunit.client.0.vm05.stdout:8/855: dwrite d2/db/d1f/d67/f79 [0,4194304] 0
2026-03-10T08:56:12.243 INFO:tasks.workunit.client.0.vm05.stdout:9/836: mknod d6/d15/d3c/d4b/d82/de9/c11a 0
2026-03-10T08:56:12.243 INFO:tasks.workunit.client.0.vm05.stdout:5/797: dwrite d5/d86/f1b [0,4194304] 0
2026-03-10T08:56:12.247 INFO:tasks.workunit.client.0.vm05.stdout:9/837: chown d6/d19/d2a/d4a/d8c/fd0 12636731 1
2026-03-10T08:56:12.249 INFO:tasks.workunit.client.0.vm05.stdout:7/825: write d18/d38/dc7/de3/fa3 [4274891,4159] 0
2026-03-10T08:56:12.250 INFO:tasks.workunit.client.0.vm05.stdout:7/826: chown d18/d38/dc7/de3/d9c/dac/cf4 0 1
2026-03-10T08:56:12.260 INFO:tasks.workunit.client.0.vm05.stdout:8/856: dread d2/db/d1f/d67/f79 [0,4194304] 0
2026-03-10T08:56:12.267 INFO:tasks.workunit.client.0.vm05.stdout:1/923: read - dd/d21/d37/ff5 zero size
2026-03-10T08:56:12.276 INFO:tasks.workunit.client.0.vm05.stdout:6/856: dwrite d4/d7/d10/d15/ff6 [0,4194304] 0
2026-03-10T08:56:12.277 INFO:tasks.workunit.client.0.vm05.stdout:6/857: stat d4/d2d/d51/f7d 0
2026-03-10T08:56:12.309 INFO:tasks.workunit.client.0.vm05.stdout:3/915: mkdir d9/d8f/d50/d5f/dd8/dd9/de2/d11d 0
2026-03-10T08:56:12.317 INFO:tasks.workunit.client.0.vm05.stdout:6/858: sync
2026-03-10T08:56:12.317 INFO:tasks.workunit.client.0.vm05.stdout:5/798: dread - d5/d48/d64/d95/dac/dc6/fe9 zero size
2026-03-10T08:56:12.317 INFO:tasks.workunit.client.0.vm05.stdout:5/799: dread - d5/d86/d24/f51 zero size
2026-03-10T08:56:12.321 INFO:tasks.workunit.client.0.vm05.stdout:5/800: dread d5/d86/d24/d2c/d41/d74/da9/feb [8388608,4194304] 0
2026-03-10T08:56:12.327 INFO:tasks.workunit.client.0.vm05.stdout:0/853: dwrite df/d1f/d85/d19/d5b/f72 [0,4194304] 0
2026-03-10T08:56:12.340 INFO:tasks.workunit.client.0.vm05.stdout:8/857: creat d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/f12b x:0 0 0
2026-03-10T08:56:12.344 INFO:tasks.workunit.client.0.vm05.stdout:1/924: rename fc to dd/d10/d18/d20/d69/f13e 0
2026-03-10T08:56:12.355 INFO:tasks.workunit.client.0.vm05.stdout:3/916: chown d9/d2b/de7/df1/dd6/c10f 372 1
2026-03-10T08:56:12.359 INFO:tasks.workunit.client.0.vm05.stdout:6/859: creat d4/d2c/d84/d4a/f127 x:0 0 0
2026-03-10T08:56:12.360 INFO:tasks.workunit.client.0.vm05.stdout:6/860: chown d4/d2d/d51/f10b 1451 1
2026-03-10T08:56:12.361 INFO:tasks.workunit.client.0.vm05.stdout:6/861: truncate d4/d7/d10/dc3/f125 381219 0
2026-03-10T08:56:12.368 INFO:tasks.workunit.client.0.vm05.stdout:0/854: readlink df/d1f/d85/d2b/d65/d6e/d96/lf3 0
2026-03-10T08:56:12.368 INFO:tasks.workunit.client.0.vm05.stdout:0/855: readlink df/d1f/d85/d19/d39/d4d/d9f/la1 0
2026-03-10T08:56:12.371 INFO:tasks.workunit.client.0.vm05.stdout:9/838: creat d6/d12/d3a/d48/f11b x:0 0 0
2026-03-10T08:56:12.373 INFO:tasks.workunit.client.0.vm05.stdout:8/858: stat d2/dd/d2c/d2e/d31/d3e/dde/l54 0
2026-03-10T08:56:12.378 INFO:tasks.workunit.client.0.vm05.stdout:8/859: dwrite d2/dd/d2c/d2e/d31/d3e/f95 [4194304,4194304] 0
2026-03-10T08:56:12.384 INFO:tasks.workunit.client.0.vm05.stdout:8/860: chown d2/dd/d74/cd7 16 1
2026-03-10T08:56:12.387 INFO:tasks.workunit.client.0.vm05.stdout:2/782: dwrite d0/f10 [0,4194304] 0
2026-03-10T08:56:12.393 INFO:tasks.workunit.client.0.vm05.stdout:1/925: truncate dd/d21/d37/d45/d8d/f125 120460 0
2026-03-10T08:56:12.405 INFO:tasks.workunit.client.0.vm05.stdout:0/856: rename df/d1f/d85/f29 to df/de8/f104 0
2026-03-10T08:56:12.406 INFO:tasks.workunit.client.0.vm05.stdout:9/839: rmdir d6/d12/d3a/de5/dd4 39
2026-03-10T08:56:12.415 INFO:tasks.workunit.client.0.vm05.stdout:8/861: creat d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/f12c x:0 0 0
2026-03-10T08:56:12.418 INFO:tasks.workunit.client.0.vm05.stdout:2/783: mkdir d0/d9/d1e/d20/d21/d45/d4b/d75/deb 0
2026-03-10T08:56:12.419 INFO:tasks.workunit.client.0.vm05.stdout:2/784: write d0/d9/d1e/d20/d21/f35 [2730959,24697] 0
2026-03-10T08:56:12.439 INFO:tasks.workunit.client.0.vm05.stdout:3/917: symlink d9/d2b/de7/df1/d6c/l11e 0
2026-03-10T08:56:12.439 INFO:tasks.workunit.client.0.vm05.stdout:4/856: write d0/d2e/d42/d45/d4a/d36/f3d [5265981,82445] 0
2026-03-10T08:56:12.441 INFO:tasks.workunit.client.0.vm05.stdout:4/857: chown d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/c81 0 1
2026-03-10T08:56:12.443 INFO:tasks.workunit.client.0.vm05.stdout:3/918: dwrite d9/d2b/de7/d102/f117 [0,4194304] 0
2026-03-10T08:56:12.445 INFO:tasks.workunit.client.0.vm05.stdout:3/919: write d9/d2b/de7/df1/d43/d6e/f109 [910599,80330] 0
2026-03-10T08:56:12.447 INFO:tasks.workunit.client.0.vm05.stdout:7/827: dwrite d18/d66/d25/f8d [0,4194304] 0
2026-03-10T08:56:12.467 INFO:tasks.workunit.client.0.vm05.stdout:6/862: truncate d4/d2d/d51/d87/da5/fe7 811619 0
2026-03-10T08:56:12.469 INFO:tasks.workunit.client.0.vm05.stdout:5/801: dwrite d5/d86/d39/fce [0,4194304] 0
2026-03-10T08:56:12.478 INFO:tasks.workunit.client.0.vm05.stdout:7/828: dread d18/d66/d25/f47 [0,4194304] 0
2026-03-10T08:56:12.478 INFO:tasks.workunit.client.0.vm05.stdout:0/857: chown df/d1f/d85/fd4 0 1
2026-03-10T08:56:12.480 INFO:tasks.workunit.client.0.vm05.stdout:7/829: fdatasync d18/d38/fca 0
2026-03-10T08:56:12.483 INFO:tasks.workunit.client.0.vm05.stdout:1/926: dwrite dd/d10/d18/d2d/d5c/f100 [0,4194304] 0
2026-03-10T08:56:12.485 INFO:tasks.workunit.client.0.vm05.stdout:8/862: mkdir d2/dd/d2c/d2e/d31/d3e/d5d/d9d/dd9/d12d 0
2026-03-10T08:56:12.494 INFO:tasks.workunit.client.0.vm05.stdout:2/785: rmdir d0/d9/d1e/d20/d21/d8a/d92 39
2026-03-10T08:56:12.510 INFO:tasks.workunit.client.0.vm05.stdout:3/920: truncate d9/d2b/de7/df1/d43/d71/ff6 221756 0
2026-03-10T08:56:12.516 INFO:tasks.workunit.client.0.vm05.stdout:6/863: mkdir d4/d2c/d84/db6/dc6/d128 0
2026-03-10T08:56:12.521 INFO:tasks.workunit.client.0.vm05.stdout:5/802: rename d5/f9c to d5/df/d37/d68/db6/f121 0
2026-03-10T08:56:12.525 INFO:tasks.workunit.client.0.vm05.stdout:0/858: creat df/d1f/d85/d19/d47/d84/dae/f105 x:0 0 0
2026-03-10T08:56:12.530 INFO:tasks.workunit.client.0.vm05.stdout:8/863: stat d2/dd/d2c/d2e/d31/d3e/l5c 0
2026-03-10T08:56:12.532 INFO:tasks.workunit.client.0.vm05.stdout:2/786: mkdir d0/d9/d7f/d8f/d7a/dec 0
2026-03-10T08:56:12.536 INFO:tasks.workunit.client.0.vm05.stdout:7/830: sync
2026-03-10T08:56:12.549 INFO:tasks.workunit.client.0.vm05.stdout:4/858: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/fc4 [1039201,14245] 0
2026-03-10T08:56:12.553 INFO:tasks.workunit.client.0.vm05.stdout:9/840: dwrite d6/f9f [4194304,4194304] 0
2026-03-10T08:56:12.566 INFO:tasks.workunit.client.0.vm05.stdout:8/864: creat d2/db/d28/d99/f12e x:0 0 0
2026-03-10T08:56:12.570 INFO:tasks.workunit.client.0.vm05.stdout:8/865: dwrite d2/dd/d74/d78/fcf [4194304,4194304] 0
2026-03-10T08:56:12.578 INFO:tasks.workunit.client.0.vm05.stdout:6/864: mknod d4/d2d/d51/d87/c129 0
2026-03-10T08:56:12.579 INFO:tasks.workunit.client.0.vm05.stdout:0/859: read df/d1f/d85/d2b/d65/d6e/d96/f7e [234277,113560] 0
2026-03-10T08:56:12.587 INFO:tasks.workunit.client.0.vm05.stdout:5/803: dread d5/d86/d39/f78 [0,4194304] 0
2026-03-10T08:56:12.603 INFO:tasks.workunit.client.0.vm05.stdout:9/841: truncate d6/d19/d2a/f53 2037425 0
2026-03-10T08:56:12.603 INFO:tasks.workunit.client.0.vm05.stdout:9/842: stat d6/d19/d21/f10d 0
2026-03-10T08:56:12.603 INFO:tasks.workunit.client.0.vm05.stdout:9/843: read d6/ff8 [1769738,54754] 0
2026-03-10T08:56:12.603 INFO:tasks.workunit.client.0.vm05.stdout:9/844: readlink d6/d15/d35/ddf/lf3 0
2026-03-10T08:56:12.609 INFO:tasks.workunit.client.0.vm05.stdout:4/859: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d115 0
2026-03-10T08:56:12.614 INFO:tasks.workunit.client.0.vm05.stdout:9/845: fdatasync d6/d19/d2c/f78 0
2026-03-10T08:56:12.619 INFO:tasks.workunit.client.0.vm05.stdout:2/787: rename d0/l1a to d0/d9/d1e/d20/d21/d45/d4b/led 0
2026-03-10T08:56:12.619 INFO:tasks.workunit.client.0.vm05.stdout:2/788: fdatasync d0/f8 0
2026-03-10T08:56:12.629 INFO:tasks.workunit.client.0.vm05.stdout:5/804: mknod d5/df/d37/dc8/c122 0
2026-03-10T08:56:12.630 INFO:tasks.workunit.client.0.vm05.stdout:5/805: write d5/d86/d24/d2c/d41/d74/da9/f11d [53416,82678] 0
2026-03-10T08:56:12.634 INFO:tasks.workunit.client.0.vm05.stdout:4/860: symlink d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/l116 0
2026-03-10T08:56:12.641 INFO:tasks.workunit.client.0.vm05.stdout:2/789: symlink d0/d9/d7f/d8f/d7a/lee 0
2026-03-10T08:56:12.651 INFO:tasks.workunit.client.0.vm05.stdout:9/846: creat d6/d15/f11c x:0 0 0
2026-03-10T08:56:12.653 INFO:tasks.workunit.client.0.vm05.stdout:5/806: creat d5/df/d37/f123 x:0 0 0
2026-03-10T08:56:12.659 INFO:tasks.workunit.client.0.vm05.stdout:5/807: symlink d5/d86/d24/d2c/d41/d11f/l124 0
2026-03-10T08:56:12.661 INFO:tasks.workunit.client.0.vm05.stdout:5/808: dread d5/df/d37/dd2/fa5 [0,4194304] 0
2026-03-10T08:56:12.663 INFO:tasks.workunit.client.0.vm05.stdout:5/809: dread - d5/fc0 zero size
2026-03-10T08:56:12.668 INFO:tasks.workunit.client.0.vm05.stdout:5/810: dwrite d5/df/d37/dd2/fa5 [0,4194304] 0
2026-03-10T08:56:12.679 INFO:tasks.workunit.client.0.vm05.stdout:5/811: getdents d5/d48/d64/dc4 0
2026-03-10T08:56:12.681 INFO:tasks.workunit.client.0.vm05.stdout:5/812: mkdir d5/d86/d24/d84/db8/dcc/d125 0
2026-03-10T08:56:12.689 INFO:tasks.workunit.client.0.vm05.stdout:2/790: sync
2026-03-10T08:56:12.693 INFO:tasks.workunit.client.0.vm05.stdout:3/921: write d9/d2b/de7/df1/ff7 [200330,130531] 0
2026-03-10T08:56:12.696 INFO:tasks.workunit.client.0.vm05.stdout:2/791: mkdir d0/d55/dd4/def 0
2026-03-10T08:56:12.699 INFO:tasks.workunit.client.0.vm05.stdout:3/922: symlink d9/d2b/de7/df1/d43/da3/l11f 0
2026-03-10T08:56:12.713 INFO:tasks.workunit.client.0.vm05.stdout:1/927: write dd/d10/d18/d20/fd6 [357174,116317] 0
2026-03-10T08:56:12.714 INFO:tasks.workunit.client.0.vm05.stdout:1/928: fdatasync dd/d10/d18/d2d/f10e 0
2026-03-10T08:56:12.723 INFO:tasks.workunit.client.0.vm05.stdout:7/831: dwrite d18/d66/d25/d2e/de7/fcf [4194304,4194304] 0
2026-03-10T08:56:12.726 INFO:tasks.workunit.client.0.vm05.stdout:8/866: write d2/dd/d2c/d2e/d31/d4f/d7b/d9e/ff6 [4249088,118324] 0
2026-03-10T08:56:12.733 INFO:tasks.workunit.client.0.vm05.stdout:0/860: write df/d1f/d85/d19/d47/d84/dae/fc9 [135887,107695] 0
2026-03-10T08:56:12.735 INFO:tasks.workunit.client.0.vm05.stdout:6/865: dwrite d4/fb4 [0,4194304] 0
2026-03-10T08:56:12.742 INFO:tasks.workunit.client.0.vm05.stdout:0/861: dwrite df/d1f/dcd/de6/f103 [0,4194304] 0
2026-03-10T08:56:12.743 INFO:tasks.workunit.client.0.vm05.stdout:7/832: rename d18/d38/dc7/de3/d74/ca8 to d18/d1b/ddd/c109 0
2026-03-10T08:56:12.748 INFO:tasks.workunit.client.0.vm05.stdout:4/861: write d0/f9 [1529058,88067] 0
2026-03-10T08:56:12.750 INFO:tasks.workunit.client.0.vm05.stdout:9/847: write d6/d12/d3a/fdc [1602675,14471] 0
2026-03-10T08:56:12.755 INFO:tasks.workunit.client.0.vm05.stdout:5/813: truncate d5/f3b 916627 0
2026-03-10T08:56:12.755 INFO:tasks.workunit.client.0.vm05.stdout:5/814: chown d5/d86/d24/d84/db8/dcc/f105 2125382 1
2026-03-10T08:56:12.771 INFO:tasks.workunit.client.0.vm05.stdout:2/792: truncate d0/d55/fb6 814605 0
2026-03-10T08:56:12.779 INFO:tasks.workunit.client.0.vm05.stdout:4/862: dwrite d0/f100 [0,4194304] 0
2026-03-10T08:56:12.786 INFO:tasks.workunit.client.0.vm05.stdout:0/862: fsync df/d59/fdf 0
2026-03-10T08:56:12.789 INFO:tasks.workunit.client.0.vm05.stdout:2/793: mkdir d0/d55/db8/dcc/dd9/dcb/df0 0
2026-03-10T08:56:12.793 INFO:tasks.workunit.client.0.vm05.stdout:2/794: dwrite d0/f10 [0,4194304] 0
2026-03-10T08:56:12.795 INFO:tasks.workunit.client.0.vm05.stdout:5/815: dread d5/d86/d24/d2c/d41/f4c [0,4194304] 0
2026-03-10T08:56:12.800 INFO:tasks.workunit.client.0.vm05.stdout:4/863: symlink d0/d2e/d42/d45/d4a/d36/dbe/l117 0
2026-03-10T08:56:12.803 INFO:tasks.workunit.client.0.vm05.stdout:4/864: dwrite d0/d2e/fdf [0,4194304] 0
2026-03-10T08:56:12.813 INFO:tasks.workunit.client.0.vm05.stdout:0/863: rename df/d1f/dc6/f101 to df/dd8/d67/f106 0
2026-03-10T08:56:12.814 INFO:tasks.workunit.client.0.vm05.stdout:3/923: write d9/d2b/de7/df1/f68 [376758,208] 0
2026-03-10T08:56:12.817 INFO:tasks.workunit.client.0.vm05.stdout:2/795: mkdir d0/d9/d89/df1 0
2026-03-10T08:56:12.820 INFO:tasks.workunit.client.0.vm05.stdout:1/929: dwrite dd/d10/d18/d2d/ff1 [4194304,4194304] 0
2026-03-10T08:56:12.825 INFO:tasks.workunit.client.0.vm05.stdout:5/816: dread d5/d86/d21/f5a [0,4194304] 0
2026-03-10T08:56:12.836 INFO:tasks.workunit.client.0.vm05.stdout:1/930: mkdir dd/d10/d18/d20/d69/d13f 0
2026-03-10T08:56:12.838 INFO:tasks.workunit.client.0.vm05.stdout:5/817: symlink d5/df/d37/dd2/d76/l126 0
2026-03-10T08:56:12.838 INFO:tasks.workunit.client.0.vm05.stdout:5/818: chown d5/d86/d39/f78 13238 1
2026-03-10T08:56:12.840 INFO:tasks.workunit.client.0.vm05.stdout:4/865: mkdir d0/d2e/dca/df3/d118 0
2026-03-10T08:56:12.841 INFO:tasks.workunit.client.0.vm05.stdout:4/866: readlink d0/d2e/l3b 0
2026-03-10T08:56:12.843 INFO:tasks.workunit.client.0.vm05.stdout:2/796: mkdir d0/d55/dd4/def/df2 0
2026-03-10T08:56:12.844 INFO:tasks.workunit.client.0.vm05.stdout:2/797: chown d0/d9/d1e/d20/d21/d45/d4b/d75 48016 1
2026-03-10T08:56:12.846 INFO:tasks.workunit.client.0.vm05.stdout:5/819: mknod d5/d48/d64/d95/dac/dc6/c127 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:1/931: dread dd/f1c [0,4194304] 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:2/798: mknod d0/d55/da2/cf3 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:2/799: fdatasync d0/d9/d1e/d20/d21/d45/d4b/f58 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:5/820: creat d5/d86/d24/f128 x:0 0 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:5/821: chown d5/d86/d24/d84 66839570 1
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:1/932: truncate dd/f16 3632814 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:4/867: link d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 d0/d2e/d42/d45/d4a/d36/dbe/f119 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:1/933: mknod dd/d13/d10b/d138/c140 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:1/934: truncate dd/d10/d18/d2d/d51/d58/d71/d73/fbb 35595 0
2026-03-10T08:56:12.864 INFO:tasks.workunit.client.0.vm05.stdout:1/935: write dd/d10/d18/dd1/f127 [745660,40651] 0
2026-03-10T08:56:12.871 INFO:tasks.workunit.client.0.vm05.stdout:4/868: creat d0/d2e/d42/d45/d4a/d36/dbe/f11a x:0 0 0
2026-03-10T08:56:12.872 INFO:tasks.workunit.client.0.vm05.stdout:6/866: write d4/d7/f4d [3406432,115668] 0
2026-03-10T08:56:12.874 INFO:tasks.workunit.client.0.vm05.stdout:1/936: stat dd/d21/d37/d45/d8d/cf4 0
2026-03-10T08:56:12.874 INFO:tasks.workunit.client.0.vm05.stdout:1/937: chown dd/d10/d18/d2d/d5c/dac/fcd 897194384 1
2026-03-10T08:56:12.885 INFO:tasks.workunit.client.0.vm05.stdout:1/938: unlink dd/d21/d37/d7c/dab/f101 0
2026-03-10T08:56:12.892 INFO:tasks.workunit.client.0.vm05.stdout:1/939: dwrite dd/d10/d18/d2d/d51/d58/d71/d73/f135 [0,4194304] 0
2026-03-10T08:56:12.905 INFO:tasks.workunit.client.0.vm05.stdout:7/833: dwrite d18/d38/d43/d5c/daf/fb6 [0,4194304] 0
2026-03-10T08:56:12.906 INFO:tasks.workunit.client.0.vm05.stdout:7/834: stat d18/d38/dc7/de3/d74/ldc 0
2026-03-10T08:56:12.922 INFO:tasks.workunit.client.0.vm05.stdout:9/848: write d6/d15/d3c/f6b [3859252,117857] 0
2026-03-10T08:56:12.925 INFO:tasks.workunit.client.0.vm05.stdout:9/849: write d6/d19/d2c/fbd [1662285,68594] 0
2026-03-10T08:56:12.926 INFO:tasks.workunit.client.0.vm05.stdout:8/867: dwrite d2/db/d1f/d67/f75 [0,4194304] 0
2026-03-10T08:56:12.934 INFO:tasks.workunit.client.0.vm05.stdout:1/940: truncate dd/d10/d19/d27/f4e 4171135 0
2026-03-10T08:56:12.944 INFO:tasks.workunit.client.0.vm05.stdout:7/835: symlink d18/d38/d43/d6e/l10a 0
2026-03-10T08:56:12.948 INFO:tasks.workunit.client.0.vm05.stdout:7/836: write d18/d66/d25/d2e/dd9/f106 [1013270,68358] 0
2026-03-10T08:56:12.960 INFO:tasks.workunit.client.0.vm05.stdout:3/924: write d9/d4d/d51/f67 [4977954,111269] 0
2026-03-10T08:56:12.962 INFO:tasks.workunit.client.0.vm05.stdout:0/864: dwrite df/d1f/d85/d19/d39/f6f [0,4194304] 0
2026-03-10T08:56:12.966 INFO:tasks.workunit.client.0.vm05.stdout:8/868: sync
2026-03-10T08:56:12.980 INFO:tasks.workunit.client.0.vm05.stdout:2/800: write d0/f16 [1465085,32799] 0
2026-03-10T08:56:12.986 INFO:tasks.workunit.client.0.vm05.stdout:5/822: write d5/d86/d21/f5a [104011,20205] 0
2026-03-10T08:56:12.987 INFO:tasks.workunit.client.0.vm05.stdout:5/823: dread - d5/df/d37/dd2/f11b zero size
2026-03-10T08:56:12.991 INFO:tasks.workunit.client.0.vm05.stdout:5/824: dread d5/d86/d21/f9e [0,4194304] 0
2026-03-10T08:56:13.001 INFO:tasks.workunit.client.0.vm05.stdout:6/867: write d4/d2d/d51/f7c [2961375,27655] 0
2026-03-10T08:56:13.014 INFO:tasks.workunit.client.0.vm05.stdout:4/869: write d0/d2e/d42/d45/d4a/d36/dbe/fe0 [932312,10411] 0
2026-03-10T08:56:13.022 INFO:tasks.workunit.client.0.vm05.stdout:4/870: dread d0/d1d/f50 [0,4194304] 0
2026-03-10T08:56:13.060 INFO:tasks.workunit.client.0.vm05.stdout:1/941: mknod dd/d21/d37/d45/d8d/d128/c141 0
2026-03-10T08:56:13.060 INFO:tasks.workunit.client.0.vm05.stdout:1/942: chown dd/d10/d18/d20/d69/d13f 60 1
2026-03-10T08:56:13.074 INFO:tasks.workunit.client.0.vm05.stdout:2/801: mknod d0/d55/dd4/def/cf4 0
2026-03-10T08:56:13.075 INFO:tasks.workunit.client.0.vm05.stdout:2/802: readlink d0/d9/d7f/db4/lcd 0
2026-03-10T08:56:13.078 INFO:tasks.workunit.client.0.vm05.stdout:2/803: dwrite d0/d9/d1e/d20/d21/d45/d4b/d75/fdc [0,4194304] 0
2026-03-10T08:56:13.087 INFO:tasks.workunit.client.0.vm05.stdout:5/825: truncate d5/fc0 62075 0
2026-03-10T08:56:13.089 INFO:tasks.workunit.client.0.vm05.stdout:5/826: dread d5/d86/d39/f78 [0,4194304] 0
2026-03-10T08:56:13.099 INFO:tasks.workunit.client.0.vm05.stdout:4/871: dread - d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/ff9 zero size
2026-03-10T08:56:13.109 INFO:tasks.workunit.client.0.vm05.stdout:9/850: creat d6/d12/d3a/de5/dd4/f11d x:0 0 0
2026-03-10T08:56:13.109 INFO:tasks.workunit.client.0.vm05.stdout:9/851: fsync d6/f7 0
2026-03-10T08:56:13.115 INFO:tasks.workunit.client.0.vm05.stdout:6/868: dwrite d4/d7/d10/d1a/f1e [0,4194304] 0
2026-03-10T08:56:13.119 INFO:tasks.workunit.client.0.vm05.stdout:1/943: rename dd/lad to dd/d13/dd2/l142 0
2026-03-10T08:56:13.125 INFO:tasks.workunit.client.0.vm05.stdout:7/837: symlink d18/d66/d78/l10b 0
2026-03-10T08:56:13.129 INFO:tasks.workunit.client.0.vm05.stdout:3/925: mknod d9/d2b/d2f/d57/c120 0
2026-03-10T08:56:13.140 INFO:tasks.workunit.client.0.vm05.stdout:5/827: truncate d5/d48/f93 2166650 0
2026-03-10T08:56:13.141 INFO:tasks.workunit.client.0.vm05.stdout:5/828: chown d5/d86/d21/d71 1287 1
2026-03-10T08:56:13.146 INFO:tasks.workunit.client.0.vm05.stdout:4/872: creat d0/d2e/dca/f11b x:0 0 0
2026-03-10T08:56:13.146 INFO:tasks.workunit.client.0.vm05.stdout:4/873: readlink d0/d2c/d6a/l77 0
2026-03-10T08:56:13.156 INFO:tasks.workunit.client.0.vm05.stdout:9/852: rmdir d6/d15/d35 39
2026-03-10T08:56:13.159 INFO:tasks.workunit.client.0.vm05.stdout:1/944: mknod dd/d10/d18/d20/df3/c143 0
2026-03-10T08:56:13.173 INFO:tasks.workunit.client.0.vm05.stdout:5/829: mkdir d5/d86/d24/d129 0
2026-03-10T08:56:13.182 INFO:tasks.workunit.client.0.vm05.stdout:4/874: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/fb6 [3459856,114231] 0
2026-03-10T08:56:13.191 INFO:tasks.workunit.client.0.vm05.stdout:6/869: creat d4/d2c/d84/db6/dc6/d128/f12a x:0 0 0
2026-03-10T08:56:13.194 INFO:tasks.workunit.client.0.vm05.stdout:9/853: write d6/fb0 [747535,123516] 0
2026-03-10T08:56:13.195 INFO:tasks.workunit.client.0.vm05.stdout:6/870: dwrite d4/d7/d10/d15/fc5 [0,4194304] 0
2026-03-10T08:56:13.205 INFO:tasks.workunit.client.0.vm05.stdout:1/945: chown f6 4 1
2026-03-10T08:56:13.210 INFO:tasks.workunit.client.0.vm05.stdout:3/926: getdents d9/d2b/d2f/d96 0
2026-03-10T08:56:13.213 INFO:tasks.workunit.client.0.vm05.stdout:0/865: link df/d1f/f21 df/d1f/d85/d19/f107 0
2026-03-10T08:56:13.213 INFO:tasks.workunit.client.0.vm05.stdout:0/866: chown df/d1f/d85/d19/d47/d84/d8a 27105294 1
2026-03-10T08:56:13.215 INFO:tasks.workunit.client.0.vm05.stdout:8/869: link d2/dd/d2c/c122 d2/dd/d2c/c12f 0
2026-03-10T08:56:13.220 INFO:tasks.workunit.client.0.vm05.stdout:2/804: rmdir d0/d9/d89/df1 0
2026-03-10T08:56:13.222 INFO:tasks.workunit.client.0.vm05.stdout:4/875: sync
2026-03-10T08:56:13.237 INFO:tasks.workunit.client.0.vm05.stdout:5/830: dwrite d5/d48/d64/d95/dac/dc6/fe9 [0,4194304] 0
2026-03-10T08:56:13.248 INFO:tasks.workunit.client.0.vm05.stdout:9/854: mknod d6/d15/d3c/c11e 0
2026-03-10T08:56:13.252 INFO:tasks.workunit.client.0.vm05.stdout:6/871: mknod d4/d7/d10/d1a/d8c/c12b 0
2026-03-10T08:56:13.264 INFO:tasks.workunit.client.0.vm05.stdout:1/946: rmdir dd/d21/d37/d7c/dab 39
2026-03-10T08:56:13.264 INFO:tasks.workunit.client.0.vm05.stdout:7/838: symlink d18/d38/dc7/de3/d74/deb/l10c 0
2026-03-10T08:56:13.264 INFO:tasks.workunit.client.0.vm05.stdout:3/927: truncate f2 2985245 0
2026-03-10T08:56:13.264 INFO:tasks.workunit.client.0.vm05.stdout:0/867: mkdir df/dd8/d67/d7b/d108 0
2026-03-10T08:56:13.264 INFO:tasks.workunit.client.0.vm05.stdout:0/868: stat df/d1f/dc6/de4 0
2026-03-10T08:56:13.268 INFO:tasks.workunit.client.0.vm05.stdout:8/870: truncate d2/dd/d2c/d2e/f64 703132 0
2026-03-10T08:56:13.273 INFO:tasks.workunit.client.0.vm05.stdout:4/876: rename d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/cb2 to d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d115/c11c 0
2026-03-10T08:56:13.290 INFO:tasks.workunit.client.0.vm05.stdout:5/831: creat d5/d86/d24/d2c/d41/d11f/f12a x:0 0 0
2026-03-10T08:56:13.292 INFO:tasks.workunit.client.0.vm05.stdout:9/855: creat d6/d19/d2a/dbc/f11f x:0 0 0
2026-03-10T08:56:13.293 INFO:tasks.workunit.client.0.vm05.stdout:9/856: dread - d6/d19/d2a/dbc/f11f zero size
2026-03-10T08:56:13.297 INFO:tasks.workunit.client.0.vm05.stdout:9/857: dread - d6/d15/d3c/d4b/d82/f105 zero size
2026-03-10T08:56:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:13 vm08.local ceph-mon[57559]: Failed to find standby mgr for failover. Retrying in 8 seconds
2026-03-10T08:56:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:13 vm08.local ceph-mon[57559]: pgmap v15: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 28 MiB/s rd, 68 MiB/s wr, 184 op/s
2026-03-10T08:56:13.304 INFO:tasks.workunit.client.0.vm05.stdout:5/832: dwrite d5/d86/f1b [4194304,4194304] 0
2026-03-10T08:56:13.307 INFO:tasks.workunit.client.0.vm05.stdout:5/833: fsync d5/d86/d24/d2c/d41/d74/f117 0
2026-03-10T08:56:13.311 INFO:tasks.workunit.client.0.vm05.stdout:6/872: dread d4/d7/d10/d15/d1b/d22/f56 [0,4194304] 0
2026-03-10T08:56:13.317 INFO:tasks.workunit.client.0.vm05.stdout:8/871: dwrite d2/dd/d2c/d2e/d31/d4f/da3/faa [0,4194304] 0
2026-03-10T08:56:13.340 INFO:tasks.workunit.client.0.vm05.stdout:9/858: creat d6/d19/f120 x:0 0 0
2026-03-10T08:56:13.340 INFO:tasks.workunit.client.0.vm05.stdout:9/859: dread - d6/d15/d3c/d4b/d82/f105 zero size
2026-03-10T08:56:13.343 INFO:tasks.workunit.client.0.vm05.stdout:1/947: symlink dd/d10/d19/d131/l144 0
2026-03-10T08:56:13.351 INFO:tasks.workunit.client.0.vm05.stdout:7/839: mknod d18/d38/c10d 0
2026-03-10T08:56:13.353 INFO:tasks.workunit.client.0.vm05.stdout:0/869: truncate df/d1f/d85/d19/fe1 1039115 0
2026-03-10T08:56:13.361 INFO:tasks.workunit.client.0.vm05.stdout:0/870: dread df/d1f/d85/d19/d55/fa9 [0,4194304] 0
2026-03-10T08:56:13.368 INFO:tasks.workunit.client.0.vm05.stdout:0/871: chown df/f17 457 1
2026-03-10T08:56:13.368 INFO:tasks.workunit.client.0.vm05.stdout:6/873: mknod d4/d2c/d84/db6/dc6/d128/c12c 0
2026-03-10T08:56:13.368 INFO:tasks.workunit.client.0.vm05.stdout:6/874: truncate d4/d7/d10/d111/f120 248984 0
2026-03-10T08:56:13.373 INFO:tasks.workunit.client.0.vm05.stdout:8/872: write d2/dd/d2c/d2e/d31/d4f/d80/dd0/fb6 [490524,128703] 0
2026-03-10T08:56:13.380 INFO:tasks.workunit.client.0.vm05.stdout:9/860: dread d6/d15/f96 [0,4194304] 0
2026-03-10T08:56:13.383 INFO:tasks.workunit.client.0.vm05.stdout:1/948: rename dd/d10/d18/d2d/fe0 to dd/d10/d112/f145 0
2026-03-10T08:56:13.383 INFO:tasks.workunit.client.0.vm05.stdout:1/949: chown dd/d10/d18/d2d/d51/d58/d71/cea 28 1
2026-03-10T08:56:13.387 INFO:tasks.workunit.client.0.vm05.stdout:7/840: creat d18/d66/d25/d2e/dd9/f10e x:0 0 0
2026-03-10T08:56:13.397 INFO:tasks.workunit.client.0.vm05.stdout:2/805: getdents d0/d55/dd4 0
2026-03-10T08:56:13.399 INFO:tasks.workunit.client.0.vm05.stdout:9/861: creat d6/df6/f121 x:0 0 0
2026-03-10T08:56:13.404 INFO:tasks.workunit.client.0.vm05.stdout:7/841: readlink d18/l4e 0
2026-03-10T08:56:13.406 INFO:tasks.workunit.client.0.vm05.stdout:3/928: getdents d9/d2b/de7/df1/d43/da3 0
2026-03-10T08:56:13.410 INFO:tasks.workunit.client.0.vm05.stdout:5/834: creat d5/d86/d24/d2c/d41/f12b x:0 0 0
2026-03-10T08:56:13.414 INFO:tasks.workunit.client.0.vm05.stdout:6/875: mkdir d4/d2c/d12d 0
2026-03-10T08:56:13.423 INFO:tasks.workunit.client.0.vm05.stdout:4/877: truncate d0/f9 7080000 0
2026-03-10T08:56:13.429 INFO:tasks.workunit.client.0.vm05.stdout:1/950: write dd/d10/d18/d20/df3/f108 [1127525,22376] 0
2026-03-10T08:56:13.429 INFO:tasks.workunit.client.0.vm05.stdout:0/872: dwrite df/dd8/d67/fda [0,4194304] 0
2026-03-10T08:56:13.435 INFO:tasks.workunit.client.0.vm05.stdout:1/951: dread dd/d10/d18/d20/fd6 [0,4194304] 0
2026-03-10T08:56:13.436 INFO:tasks.workunit.client.0.vm05.stdout:1/952: chown dd/d10/d18/d20/d69/fe2 120 1
2026-03-10T08:56:13.447 INFO:tasks.workunit.client.0.vm05.stdout:3/929: fsync d9/d4d/d51/d64/d89/fa1 0
2026-03-10T08:56:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:13 vm05.local ceph-mon[49713]: Failed to find standby mgr for failover. Retrying in 8 seconds
2026-03-10T08:56:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:13 vm05.local ceph-mon[49713]: pgmap v15: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 28 MiB/s rd, 68 MiB/s wr, 184 op/s
2026-03-10T08:56:13.464 INFO:tasks.workunit.client.0.vm05.stdout:9/862: write d6/d19/d2c/f78 [944032,106298] 0
2026-03-10T08:56:13.473 INFO:tasks.workunit.client.0.vm05.stdout:5/835: dwrite d5/d86/d39/f78 [0,4194304] 0
2026-03-10T08:56:13.474 INFO:tasks.workunit.client.0.vm05.stdout:5/836: stat d5/df/d37/dd2/d76/dde/f10c 0
2026-03-10T08:56:13.683 INFO:tasks.workunit.client.0.vm05.stdout:0/873: mknod df/d1f/d85/d19/d5b/c109 0
2026-03-10T08:56:13.698 INFO:tasks.workunit.client.0.vm05.stdout:3/930: fsync d9/d8f/fdc 0
2026-03-10T08:56:13.710 INFO:tasks.workunit.client.0.vm05.stdout:2/806: link d0/d55/l85 d0/d9/d7f/d8f/lf5 0
2026-03-10T08:56:13.711 INFO:tasks.workunit.client.0.vm05.stdout:8/873: link d2/db/d1f/d67/d8d/f8f d2/db/d1f/f130 0
2026-03-10T08:56:13.732 INFO:tasks.workunit.client.0.vm05.stdout:3/931: fsync d9/d4d/d51/d64/f9e 0
2026-03-10T08:56:13.738 INFO:tasks.workunit.client.0.vm05.stdout:2/807: rmdir d0/d9/d89 39
2026-03-10T08:56:13.741 INFO:tasks.workunit.client.0.vm05.stdout:2/808: dread d0/d9/d7f/d8f/f63 [0,4194304] 0
2026-03-10T08:56:13.742 INFO:tasks.workunit.client.0.vm05.stdout:2/809: chown d0/d9/d1e/d20/d21/d45/d4b/f97 91530 1
2026-03-10T08:56:13.743 INFO:tasks.workunit.client.0.vm05.stdout:8/874: dread - d2/fbd zero size
2026-03-10T08:56:13.745 INFO:tasks.workunit.client.0.vm05.stdout:8/875: write d2/dd/d2c/d2e/d31/d4f/d80/dd0/fb6 [2245301,16322] 0
2026-03-10T08:56:13.746 INFO:tasks.workunit.client.0.vm05.stdout:9/863: symlink d6/d12/d3a/de5/dd4/d113/l122 0
2026-03-10T08:56:13.753 INFO:tasks.workunit.client.0.vm05.stdout:1/953: link dd/d10/d18/d2d/ff1 dd/d21/d37/d7c/dab/db7/dde/f146 0
2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:1/954: truncate dd/d10/d18/d2d/d51/f13c 325487 0
2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:7/842: getdents d18/d38/dc7/de3/d74 0 2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:2/810: fsync d0/d9/d1e/d20/f8b 0 2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:1/955: fdatasync dd/d10/d18/d20/fd6 0 2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:1/956: fsync dd/d21/d37/d45/d8d/f130 0 2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:8/876: mkdir d2/dd/d2c/d131 0 2026-03-10T08:56:13.783 INFO:tasks.workunit.client.0.vm05.stdout:0/874: getdents df/d1f/d85/d19/d47/d84/dae 0 2026-03-10T08:56:13.790 INFO:tasks.workunit.client.0.vm05.stdout:1/957: mkdir dd/d10/d18/d2d/d51/d58/deb/d147 0 2026-03-10T08:56:13.802 INFO:tasks.workunit.client.0.vm05.stdout:1/958: truncate dd/d21/d37/d45/f47 5167031 0 2026-03-10T08:56:13.806 INFO:tasks.workunit.client.0.vm05.stdout:3/932: getdents d9/d8f/dde 0 2026-03-10T08:56:13.809 INFO:tasks.workunit.client.0.vm05.stdout:2/811: creat d0/d9/d1e/d20/d24/ff6 x:0 0 0 2026-03-10T08:56:13.813 INFO:tasks.workunit.client.0.vm05.stdout:3/933: dread d9/d2b/f40 [0,4194304] 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:0/875: mknod df/d1f/d85/d2b/d65/d6e/d96/c10a 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:0/876: truncate df/f15 5072482 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:0/877: dread df/d1f/d85/d19/f99 [0,4194304] 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:2/812: mknod d0/d9/d1e/d20/d21/d45/d4b/d75/cf7 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:3/934: fsync d9/d4d/d51/d64/d89/dc2/f108 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:0/878: creat df/d1f/d85/d19/f10b x:0 0 0 2026-03-10T08:56:13.846 INFO:tasks.workunit.client.0.vm05.stdout:2/813: link d0/d9/d1e/d20/f3a d0/d55/dd4/def/df2/ff8 0 2026-03-10T08:56:13.849 INFO:tasks.workunit.client.0.vm05.stdout:2/814: dwrite 
d0/d9/d1e/d20/d21/d45/d4b/f9c [0,4194304] 0 2026-03-10T08:56:13.861 INFO:tasks.workunit.client.0.vm05.stdout:3/935: symlink d9/d4d/d51/d64/d89/dc2/l121 0 2026-03-10T08:56:13.867 INFO:tasks.workunit.client.0.vm05.stdout:3/936: dread - d9/d4d/f95 zero size 2026-03-10T08:56:13.867 INFO:tasks.workunit.client.0.vm05.stdout:0/879: fsync df/d59/fdf 0 2026-03-10T08:56:13.870 INFO:tasks.workunit.client.0.vm05.stdout:2/815: symlink d0/d55/db8/dcc/dd9/lf9 0 2026-03-10T08:56:13.872 INFO:tasks.workunit.client.0.vm05.stdout:2/816: dread d0/d9/d1e/d20/fc8 [0,4194304] 0 2026-03-10T08:56:13.877 INFO:tasks.workunit.client.0.vm05.stdout:0/880: mknod df/d1f/d95/c10c 0 2026-03-10T08:56:13.882 INFO:tasks.workunit.client.0.vm05.stdout:2/817: fsync d0/d9/d89/f96 0 2026-03-10T08:56:13.891 INFO:tasks.workunit.client.0.vm05.stdout:7/843: sync 2026-03-10T08:56:13.900 INFO:tasks.workunit.client.0.vm05.stdout:0/881: creat df/d1f/d85/d2b/d27/d32/f10d x:0 0 0 2026-03-10T08:56:13.908 INFO:tasks.workunit.client.0.vm05.stdout:0/882: mkdir df/d1f/d85/d19/d47/d10e 0 2026-03-10T08:56:13.917 INFO:tasks.workunit.client.0.vm05.stdout:0/883: creat df/d1f/d85/d2b/d27/d32/f10f x:0 0 0 2026-03-10T08:56:13.938 INFO:tasks.workunit.client.0.vm05.stdout:5/837: rename d5/df/dbb/d43 to d5/d86/d21/d71/d12c 0 2026-03-10T08:56:13.940 INFO:tasks.workunit.client.0.vm05.stdout:0/884: mkdir df/d1f/d85/d19/d47/d84/d8a/d110 0 2026-03-10T08:56:13.942 INFO:tasks.workunit.client.0.vm05.stdout:4/878: write d0/f1 [1974570,86601] 0 2026-03-10T08:56:13.943 INFO:tasks.workunit.client.0.vm05.stdout:4/879: read - d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/ff2 zero size 2026-03-10T08:56:13.984 INFO:tasks.workunit.client.0.vm05.stdout:6/876: dwrite d4/d2c/d84/f6b [0,4194304] 0 2026-03-10T08:56:13.992 INFO:tasks.workunit.client.0.vm05.stdout:4/880: truncate d0/d2e/d42/d45/d4a/f47 2418318 0 2026-03-10T08:56:13.993 INFO:tasks.workunit.client.0.vm05.stdout:4/881: write d0/d2e/dca/f11b [439805,12825] 0 2026-03-10T08:56:13.995 
INFO:tasks.workunit.client.0.vm05.stdout:6/877: creat d4/d2c/d84/d4a/f12e x:0 0 0 2026-03-10T08:56:14.003 INFO:tasks.workunit.client.0.vm05.stdout:6/878: mknod d4/d7/d10/d15/d1b/dfc/c12f 0 2026-03-10T08:56:14.028 INFO:tasks.workunit.client.0.vm05.stdout:9/864: write d6/d15/fb4 [2395142,87089] 0 2026-03-10T08:56:14.032 INFO:tasks.workunit.client.0.vm05.stdout:9/865: mknod d6/d19/d2a/d4a/c123 0 2026-03-10T08:56:14.035 INFO:tasks.workunit.client.0.vm05.stdout:9/866: symlink d6/d15/d3c/d4b/d82/de9/l124 0 2026-03-10T08:56:14.038 INFO:tasks.workunit.client.0.vm05.stdout:9/867: rename d6/d12/d3a/de5/f52 to d6/d15/d37/de8/f125 0 2026-03-10T08:56:14.053 INFO:tasks.workunit.client.0.vm05.stdout:9/868: rename d6/d15/d3c/d4b/lf1 to d6/d15/d35/ddf/l126 0 2026-03-10T08:56:14.056 INFO:tasks.workunit.client.0.vm05.stdout:9/869: read d6/d12/f74 [11248,109212] 0 2026-03-10T08:56:14.056 INFO:tasks.workunit.client.0.vm05.stdout:8/877: write d2/db/d28/d99/df3/f104 [650710,120914] 0 2026-03-10T08:56:14.067 INFO:tasks.workunit.client.0.vm05.stdout:9/870: dread d6/d12/d3a/de5/f47 [0,4194304] 0 2026-03-10T08:56:14.068 INFO:tasks.workunit.client.0.vm05.stdout:9/871: fdatasync d6/d15/d35/fc0 0 2026-03-10T08:56:14.071 INFO:tasks.workunit.client.0.vm05.stdout:9/872: rename d6/d15/d3c/d4b/d90/fde to d6/d19/d2a/d8d/f127 0 2026-03-10T08:56:14.082 INFO:tasks.workunit.client.0.vm05.stdout:1/959: dwrite dd/d10/d19/f129 [0,4194304] 0 2026-03-10T08:56:14.084 INFO:tasks.workunit.client.0.vm05.stdout:1/960: chown dd/d10/d18/d2d/d51/d58/d71/d73/l76 5 1 2026-03-10T08:56:14.085 INFO:tasks.workunit.client.0.vm05.stdout:1/961: fsync dd/d10/d18/d2d/f10e 0 2026-03-10T08:56:14.085 INFO:tasks.workunit.client.0.vm05.stdout:1/962: readlink dd/d13/d10b/l10f 0 2026-03-10T08:56:14.097 INFO:tasks.workunit.client.0.vm05.stdout:1/963: chown dd/d10/d19/d27/l2c 941 1 2026-03-10T08:56:14.098 INFO:tasks.workunit.client.0.vm05.stdout:1/964: readlink dd/d10/d19/d4d/ldf 0 2026-03-10T08:56:14.126 
INFO:tasks.workunit.client.0.vm05.stdout:3/937: dwrite d9/d2b/de7/df1/d6c/fb2 [0,4194304] 0 2026-03-10T08:56:14.137 INFO:tasks.workunit.client.0.vm05.stdout:8/878: rmdir d2/db/d1f 39 2026-03-10T08:56:14.138 INFO:tasks.workunit.client.0.vm05.stdout:8/879: readlink d2/d45/l5b 0 2026-03-10T08:56:14.139 INFO:tasks.workunit.client.0.vm05.stdout:1/965: sync 2026-03-10T08:56:14.142 INFO:tasks.workunit.client.0.vm05.stdout:7/844: dwrite d18/d66/d25/fb9 [0,4194304] 0 2026-03-10T08:56:14.158 INFO:tasks.workunit.client.0.vm05.stdout:1/966: dread dd/d10/d18/d20/d52/ddc/f11d [0,4194304] 0 2026-03-10T08:56:14.175 INFO:tasks.workunit.client.0.vm05.stdout:5/838: write d5/d86/d24/d2c/d41/dca/f10d [633065,30855] 0 2026-03-10T08:56:14.175 INFO:tasks.workunit.client.0.vm05.stdout:0/885: write df/f17 [1623565,81497] 0 2026-03-10T08:56:14.184 INFO:tasks.workunit.client.0.vm05.stdout:0/886: dread df/d1f/d85/d19/d47/fa5 [0,4194304] 0 2026-03-10T08:56:14.191 INFO:tasks.workunit.client.0.vm05.stdout:4/882: write d0/d2e/d42/d45/d4a/d36/dbe/d32/f76 [723827,116469] 0 2026-03-10T08:56:14.191 INFO:tasks.workunit.client.0.vm05.stdout:0/887: read - df/d1f/d85/d2b/d65/fed zero size 2026-03-10T08:56:14.194 INFO:tasks.workunit.client.0.vm05.stdout:5/839: dwrite d5/f23 [4194304,4194304] 0 2026-03-10T08:56:14.208 INFO:tasks.workunit.client.0.vm05.stdout:2/818: unlink d0/d9/d1e/d20/d21/ce5 0 2026-03-10T08:56:14.211 INFO:tasks.workunit.client.0.vm05.stdout:0/888: symlink df/d1f/d85/d2b/d27/d32/d4e/d87/l111 0 2026-03-10T08:56:14.224 INFO:tasks.workunit.client.0.vm05.stdout:4/883: creat d0/d2e/dca/df3/f11d x:0 0 0 2026-03-10T08:56:14.232 INFO:tasks.workunit.client.0.vm05.stdout:5/840: rename d5/d86/d24/d84/db8 to d5/df/d37/d68/d12d 0 2026-03-10T08:56:14.232 INFO:tasks.workunit.client.0.vm05.stdout:0/889: mknod df/d1f/d85/d19/d47/d84/dae/c112 0 2026-03-10T08:56:14.232 INFO:tasks.workunit.client.0.vm05.stdout:5/841: symlink d5/df/d37/d68/l12e 0 2026-03-10T08:56:14.232 
INFO:tasks.workunit.client.0.vm05.stdout:6/879: dwrite d4/f61 [0,4194304] 0 2026-03-10T08:56:14.232 INFO:tasks.workunit.client.0.vm05.stdout:0/890: symlink df/d1f/d85/d19/d39/l113 0 2026-03-10T08:56:14.232 INFO:tasks.workunit.client.0.vm05.stdout:4/884: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f6b to d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f11e 0 2026-03-10T08:56:14.236 INFO:tasks.workunit.client.0.vm05.stdout:2/819: rename d0/d55/da2/fe2 to d0/d9/d1e/d20/d24/ffa 0 2026-03-10T08:56:14.240 INFO:tasks.workunit.client.0.vm05.stdout:5/842: sync 2026-03-10T08:56:14.241 INFO:tasks.workunit.client.0.vm05.stdout:9/873: dwrite d6/d15/d3c/d4b/f67 [0,4194304] 0 2026-03-10T08:56:14.244 INFO:tasks.workunit.client.0.vm05.stdout:4/885: fsync d0/d2e/d42/d45/d4a/d36/f88 0 2026-03-10T08:56:14.251 INFO:tasks.workunit.client.0.vm05.stdout:5/843: dread d5/d86/f9d [0,4194304] 0 2026-03-10T08:56:14.256 INFO:tasks.workunit.client.0.vm05.stdout:0/891: creat df/d1f/d85/d2b/d65/f114 x:0 0 0 2026-03-10T08:56:14.256 INFO:tasks.workunit.client.0.vm05.stdout:2/820: symlink d0/d9/lfb 0 2026-03-10T08:56:14.257 INFO:tasks.workunit.client.0.vm05.stdout:0/892: stat df/d1f/d85/d19/d39/d4d 0 2026-03-10T08:56:14.262 INFO:tasks.workunit.client.0.vm05.stdout:9/874: mkdir d6/d15/d3c/d128 0 2026-03-10T08:56:14.266 INFO:tasks.workunit.client.0.vm05.stdout:3/938: write d9/d2b/f2d [1702039,117630] 0 2026-03-10T08:56:14.305 INFO:tasks.workunit.client.0.vm05.stdout:7/845: write d18/d66/d78/dc3/fd0 [4830129,37283] 0 2026-03-10T08:56:14.306 INFO:tasks.workunit.client.0.vm05.stdout:8/880: dwrite d2/db/f19 [0,4194304] 0 2026-03-10T08:56:14.306 INFO:tasks.workunit.client.0.vm05.stdout:0/893: rename df/dd8/d67/fda to df/d1f/d85/d19/d47/da3/f115 0 2026-03-10T08:56:14.306 INFO:tasks.workunit.client.0.vm05.stdout:1/967: dwrite dd/d21/d37/d7c/d60/fe9 [0,4194304] 0 2026-03-10T08:56:14.306 INFO:tasks.workunit.client.0.vm05.stdout:9/875: unlink d6/d12/d3a/d48/lab 0 2026-03-10T08:56:14.306 
INFO:tasks.workunit.client.0.vm05.stdout:9/876: stat d6/d19/d2c/d84/lbb 0 2026-03-10T08:56:14.306 INFO:tasks.workunit.client.0.vm05.stdout:9/877: chown d6/d19/d2c/l8b 1062 1 2026-03-10T08:56:14.312 INFO:tasks.workunit.client.0.vm05.stdout:7/846: symlink d18/d66/d78/dc3/l10f 0 2026-03-10T08:56:14.312 INFO:tasks.workunit.client.0.vm05.stdout:7/847: fdatasync d18/d38/d43/ff8 0 2026-03-10T08:56:14.314 INFO:tasks.workunit.client.0.vm05.stdout:2/821: symlink d0/d9/d1e/d20/d21/d45/d4b/d70/lfc 0 2026-03-10T08:56:14.314 INFO:tasks.workunit.client.0.vm05.stdout:2/822: chown d0/d9/d89/l95 1277601739 1 2026-03-10T08:56:14.315 INFO:tasks.workunit.client.0.vm05.stdout:2/823: stat d0/c50 0 2026-03-10T08:56:14.320 INFO:tasks.workunit.client.0.vm05.stdout:0/894: symlink df/d59/l116 0 2026-03-10T08:56:14.325 INFO:tasks.workunit.client.0.vm05.stdout:1/968: truncate dd/d21/f10a 8053911 0 2026-03-10T08:56:14.327 INFO:tasks.workunit.client.0.vm05.stdout:1/969: read f6 [13576,70440] 0 2026-03-10T08:56:14.328 INFO:tasks.workunit.client.0.vm05.stdout:1/970: chown dd/d10/d19/d9b/dc3/lfd 24232545 1 2026-03-10T08:56:14.336 INFO:tasks.workunit.client.0.vm05.stdout:3/939: symlink d9/d2b/d2f/l122 0 2026-03-10T08:56:14.337 INFO:tasks.workunit.client.0.vm05.stdout:3/940: dread - d9/d8f/dde/f11a zero size 2026-03-10T08:56:14.340 INFO:tasks.workunit.client.0.vm05.stdout:0/895: mknod df/d1f/d85/d2b/d27/d32/d4e/d6a/c117 0 2026-03-10T08:56:14.342 INFO:tasks.workunit.client.0.vm05.stdout:7/848: symlink d18/d66/d78/l110 0 2026-03-10T08:56:14.342 INFO:tasks.workunit.client.0.vm05.stdout:3/941: unlink d9/d4d/d51/f67 0 2026-03-10T08:56:14.347 INFO:tasks.workunit.client.0.vm05.stdout:2/824: link d0/f8 d0/d9/d7f/d8f/d7e/ffd 0 2026-03-10T08:56:14.350 INFO:tasks.workunit.client.0.vm05.stdout:3/942: stat d9/d2b/d53/cd1 0 2026-03-10T08:56:14.353 INFO:tasks.workunit.client.0.vm05.stdout:7/849: mkdir d18/d66/d25/d2e/dd9/d111 0 2026-03-10T08:56:14.359 INFO:tasks.workunit.client.0.vm05.stdout:3/943: truncate 
d9/d8f/d55/f6b 7420165 0 2026-03-10T08:56:14.360 INFO:tasks.workunit.client.0.vm05.stdout:1/971: sync 2026-03-10T08:56:14.373 INFO:tasks.workunit.client.0.vm05.stdout:5/844: dread d5/df/d37/dd2/d76/dde/f10c [0,4194304] 0 2026-03-10T08:56:14.374 INFO:tasks.workunit.client.0.vm05.stdout:2/825: dread d0/d9/d1e/d20/d21/d8a/d92/fae [0,4194304] 0 2026-03-10T08:56:14.376 INFO:tasks.workunit.client.0.vm05.stdout:7/850: rename d18/d66/d25/f47 to d18/f112 0 2026-03-10T08:56:14.377 INFO:tasks.workunit.client.0.vm05.stdout:6/880: write d4/d2c/d84/db6/dc6/fc9 [4465969,124118] 0 2026-03-10T08:56:14.389 INFO:tasks.workunit.client.0.vm05.stdout:4/886: dwrite d0/d1d/f89 [0,4194304] 0 2026-03-10T08:56:14.394 INFO:tasks.workunit.client.0.vm05.stdout:8/881: write d2/db/d28/fa6 [1786165,14381] 0 2026-03-10T08:56:14.405 INFO:tasks.workunit.client.0.vm05.stdout:9/878: write d6/d19/d2a/f87 [186191,42794] 0 2026-03-10T08:56:14.410 INFO:tasks.workunit.client.0.vm05.stdout:0/896: dwrite df/d59/f3f [0,4194304] 0 2026-03-10T08:56:14.412 INFO:tasks.workunit.client.0.vm05.stdout:5/845: creat d5/df/d37/d68/db6/f12f x:0 0 0 2026-03-10T08:56:14.427 INFO:tasks.workunit.client.0.vm05.stdout:1/972: dwrite dd/d10/d18/d20/d52/d80/ffc [0,4194304] 0 2026-03-10T08:56:14.432 INFO:tasks.workunit.client.0.vm05.stdout:2/826: dread d0/f36 [0,4194304] 0 2026-03-10T08:56:14.432 INFO:tasks.workunit.client.0.vm05.stdout:2/827: chown d0/d9/d1e/d20/d21/f41 495 1 2026-03-10T08:56:14.439 INFO:tasks.workunit.client.0.vm05.stdout:3/944: rename d9/d2b/de7/df1/d43/d71 to d9/d8f/d50/d5f/dd8/dec/d123 0 2026-03-10T08:56:14.448 INFO:tasks.workunit.client.0.vm05.stdout:8/882: dread - d2/dfc/f103 zero size 2026-03-10T08:56:14.452 INFO:tasks.workunit.client.0.vm05.stdout:0/897: chown df/d1f/d85/d2b/d65/d6e/d96/l5f 7058 1 2026-03-10T08:56:14.456 INFO:tasks.workunit.client.0.vm05.stdout:9/879: dread d6/d19/d2c/d58/f6c [0,4194304] 0 2026-03-10T08:56:14.459 INFO:tasks.workunit.client.0.vm05.stdout:5/846: creat d5/d86/d21/d71/f130 x:0 
0 0 2026-03-10T08:56:14.466 INFO:tasks.workunit.client.0.vm05.stdout:2/828: creat d0/d55/db8/ffe x:0 0 0 2026-03-10T08:56:14.470 INFO:tasks.workunit.client.0.vm05.stdout:7/851: mkdir d18/d66/d25/d2e/dd9/d111/d113 0 2026-03-10T08:56:14.473 INFO:tasks.workunit.client.0.vm05.stdout:3/945: dread - d9/d2b/d2f/ff4 zero size 2026-03-10T08:56:14.476 INFO:tasks.workunit.client.0.vm05.stdout:2/829: dread d0/f16 [0,4194304] 0 2026-03-10T08:56:14.485 INFO:tasks.workunit.client.0.vm05.stdout:0/898: symlink df/d1f/d85/d9e/l118 0 2026-03-10T08:56:14.486 INFO:tasks.workunit.client.0.vm05.stdout:0/899: read df/d1f/d85/fb5 [1734270,93691] 0 2026-03-10T08:56:14.491 INFO:tasks.workunit.client.0.vm05.stdout:9/880: rmdir d6/d19 39 2026-03-10T08:56:14.508 INFO:tasks.workunit.client.0.vm05.stdout:4/887: fsync d0/d2e/d42/d45/d4a/d36/d37/fac 0 2026-03-10T08:56:14.509 INFO:tasks.workunit.client.0.vm05.stdout:2/830: dread d0/f2f [0,4194304] 0 2026-03-10T08:56:14.510 INFO:tasks.workunit.client.0.vm05.stdout:2/831: truncate d0/d55/db8/ffe 920821 0 2026-03-10T08:56:14.514 INFO:tasks.workunit.client.0.vm05.stdout:8/883: readlink d2/db/d1f/d67/l118 0 2026-03-10T08:56:14.514 INFO:tasks.workunit.client.0.vm05.stdout:8/884: chown d2/db/f22 357737513 1 2026-03-10T08:56:14.518 INFO:tasks.workunit.client.0.vm05.stdout:0/900: fsync df/d1f/d85/d2b/d27/f4f 0 2026-03-10T08:56:14.524 INFO:tasks.workunit.client.0.vm05.stdout:5/847: truncate d5/d86/d24/d2c/d41/fad 384470 0 2026-03-10T08:56:14.530 INFO:tasks.workunit.client.0.vm05.stdout:7/852: creat d18/d38/dc7/de3/d9c/de8/f114 x:0 0 0 2026-03-10T08:56:14.532 INFO:tasks.workunit.client.0.vm05.stdout:3/946: truncate d9/d2b/de7/df1/d6c/dbf/f9b 4361717 0 2026-03-10T08:56:14.538 INFO:tasks.workunit.client.0.vm05.stdout:6/881: rename d4/d7/d10/d8f/f123 to d4/d7/d10/d1a/f130 0 2026-03-10T08:56:14.542 INFO:tasks.workunit.client.0.vm05.stdout:2/832: write d0/f16 [4608270,37053] 0 2026-03-10T08:56:14.552 INFO:tasks.workunit.client.0.vm05.stdout:8/885: mkdir 
d2/dd/d2c/d2e/d31/d3e/dde/d132 0 2026-03-10T08:56:14.559 INFO:tasks.workunit.client.0.vm05.stdout:0/901: mknod df/d1f/d85/d19/d47/d84/d8a/c119 0 2026-03-10T08:56:14.566 INFO:tasks.workunit.client.0.vm05.stdout:9/881: dwrite d6/d15/d37/de8/f125 [0,4194304] 0 2026-03-10T08:56:14.577 INFO:tasks.workunit.client.0.vm05.stdout:7/853: mknod d18/d1b/ddd/c115 0 2026-03-10T08:56:14.580 INFO:tasks.workunit.client.0.vm05.stdout:3/947: creat d9/d2b/de7/df1/d6c/dbe/f124 x:0 0 0 2026-03-10T08:56:14.580 INFO:tasks.workunit.client.0.vm05.stdout:3/948: stat d9/ff 0 2026-03-10T08:56:14.587 INFO:tasks.workunit.client.0.vm05.stdout:6/882: read d4/fc [472046,778] 0 2026-03-10T08:56:14.596 INFO:tasks.workunit.client.0.vm05.stdout:8/886: fdatasync d2/db/d28/d99/fd5 0 2026-03-10T08:56:14.622 INFO:tasks.workunit.client.0.vm05.stdout:1/973: rename dd/d21/d37/f72 to dd/d10/d18/d2d/d51/d58/deb/d147/f148 0 2026-03-10T08:56:14.626 INFO:tasks.workunit.client.0.vm05.stdout:6/883: mknod d4/d2d/d51/d87/da5/de9/c131 0 2026-03-10T08:56:14.631 INFO:tasks.workunit.client.0.vm05.stdout:8/887: creat d2/dd/d2c/d2e/d31/d3e/d5d/d9d/f133 x:0 0 0 2026-03-10T08:56:14.632 INFO:tasks.workunit.client.0.vm05.stdout:0/902: mknod df/d1f/dcd/c11a 0 2026-03-10T08:56:14.644 INFO:tasks.workunit.client.0.vm05.stdout:9/882: dwrite d6/d12/f14 [0,4194304] 0 2026-03-10T08:56:14.646 INFO:tasks.workunit.client.0.vm05.stdout:7/854: symlink d18/l116 0 2026-03-10T08:56:14.662 INFO:tasks.workunit.client.0.vm05.stdout:4/888: getdents d0/d2c/d6a/dc9/d106 0 2026-03-10T08:56:14.662 INFO:tasks.workunit.client.0.vm05.stdout:4/889: chown d0/d1d/c110 1096 1 2026-03-10T08:56:14.662 INFO:tasks.workunit.client.0.vm05.stdout:4/890: write d0/d1d/f89 [3585382,103106] 0 2026-03-10T08:56:14.664 INFO:tasks.workunit.client.0.vm05.stdout:4/891: chown d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/l113 5 1 2026-03-10T08:56:14.672 INFO:tasks.workunit.client.0.vm05.stdout:6/884: unlink d4/d7/d10/d15/d1b/f108 0 2026-03-10T08:56:14.672 
INFO:tasks.workunit.client.0.vm05.stdout:6/885: dread - d4/d2c/d84/fb2 zero size 2026-03-10T08:56:14.677 INFO:tasks.workunit.client.0.vm05.stdout:1/974: dwrite dd/d10/d18/d2d/ff1 [4194304,4194304] 0 2026-03-10T08:56:14.679 INFO:tasks.workunit.client.0.vm05.stdout:1/975: chown dd/d10/d18/d2d/f93 66289 1 2026-03-10T08:56:14.689 INFO:tasks.workunit.client.0.vm05.stdout:8/888: symlink d2/dd/d2c/d2e/d31/d3e/dde/l134 0 2026-03-10T08:56:14.692 INFO:tasks.workunit.client.0.vm05.stdout:0/903: creat df/dd8/d67/d7b/f11b x:0 0 0 2026-03-10T08:56:14.695 INFO:tasks.workunit.client.0.vm05.stdout:0/904: dread df/d1f/d85/d19/d39/f6f [0,4194304] 0 2026-03-10T08:56:14.697 INFO:tasks.workunit.client.0.vm05.stdout:5/848: getdents d5/d48/d64/d95/dac/dc6 0 2026-03-10T08:56:14.710 INFO:tasks.workunit.client.0.vm05.stdout:7/855: mknod d18/d38/dc7/de3/d9c/c117 0 2026-03-10T08:56:14.714 INFO:tasks.workunit.client.0.vm05.stdout:3/949: link d9/d4d/lea d9/d2b/de7/df1/dd6/d107/l125 0 2026-03-10T08:56:14.714 INFO:tasks.workunit.client.0.vm05.stdout:7/856: dread d18/d38/f82 [0,4194304] 0 2026-03-10T08:56:14.724 INFO:tasks.workunit.client.0.vm05.stdout:3/950: dread d9/ff [4194304,4194304] 0 2026-03-10T08:56:14.737 INFO:tasks.workunit.client.0.vm05.stdout:4/892: dwrite d0/d2e/d42/d45/d4a/d36/dbe/ffb [0,4194304] 0 2026-03-10T08:56:14.745 INFO:tasks.workunit.client.0.vm05.stdout:6/886: fsync d4/d2d/f2f 0 2026-03-10T08:56:14.754 INFO:tasks.workunit.client.0.vm05.stdout:2/833: getdents d0/d55/db8/dcc/dd9/dcb 0 2026-03-10T08:56:14.758 INFO:tasks.workunit.client.0.vm05.stdout:2/834: chown d0/d9/f19 212813528 1 2026-03-10T08:56:14.761 INFO:tasks.workunit.client.0.vm05.stdout:8/889: dread d2/dd/d2c/d2e/d31/d3e/dde/f85 [0,4194304] 0 2026-03-10T08:56:14.766 INFO:tasks.workunit.client.0.vm05.stdout:0/905: symlink df/dd8/d67/d7b/l11c 0 2026-03-10T08:56:14.766 INFO:tasks.workunit.client.0.vm05.stdout:0/906: dread - df/d1f/d85/d2b/d27/d32/ff8 zero size 2026-03-10T08:56:14.769 
INFO:tasks.workunit.client.0.vm05.stdout:5/849: truncate d5/df/d37/d68/d12d/dcc/f105 949209 0 2026-03-10T08:56:14.770 INFO:tasks.workunit.client.0.vm05.stdout:1/976: write dd/d21/f4c [1192639,80672] 0 2026-03-10T08:56:14.783 INFO:tasks.workunit.client.0.vm05.stdout:7/857: chown d18/d38/dc7/de3/d9c/dac/df5/cfa 95 1 2026-03-10T08:56:14.787 INFO:tasks.workunit.client.0.vm05.stdout:3/951: mknod d9/d2b/de7/df1/d43/da3/c126 0 2026-03-10T08:56:14.797 INFO:tasks.workunit.client.0.vm05.stdout:4/893: unlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f/d5b/dd7/l113 0 2026-03-10T08:56:14.804 INFO:tasks.workunit.client.0.vm05.stdout:2/835: rmdir d0/d9/d89 39 2026-03-10T08:56:14.812 INFO:tasks.workunit.client.0.vm05.stdout:6/887: dread d4/d7/d10/d1a/d89/f93 [0,4194304] 0 2026-03-10T08:56:14.821 INFO:tasks.workunit.client.0.vm05.stdout:0/907: write df/dd8/f83 [2920279,98786] 0 2026-03-10T08:56:14.822 INFO:tasks.workunit.client.0.vm05.stdout:0/908: stat df/d1f/d85/d2b/f3b 0 2026-03-10T08:56:14.822 INFO:tasks.workunit.client.0.vm05.stdout:6/888: dread d4/d7/d10/d15/f2a [0,4194304] 0 2026-03-10T08:56:14.825 INFO:tasks.workunit.client.0.vm05.stdout:6/889: dwrite d4/d2c/d84/f3a [0,4194304] 0 2026-03-10T08:56:14.833 INFO:tasks.workunit.client.0.vm05.stdout:5/850: symlink d5/df/d37/dd2/d76/l131 0 2026-03-10T08:56:14.840 INFO:tasks.workunit.client.0.vm05.stdout:9/883: link d6/d12/d3a/f62 d6/d12/d3a/de5/dd4/d113/f129 0 2026-03-10T08:56:14.849 INFO:tasks.workunit.client.0.vm05.stdout:9/884: dread - d6/d15/f11c zero size 2026-03-10T08:56:14.849 INFO:tasks.workunit.client.0.vm05.stdout:7/858: creat d18/d38/d43/d5c/daf/f118 x:0 0 0 2026-03-10T08:56:14.849 INFO:tasks.workunit.client.0.vm05.stdout:7/859: chown d18/d66/d25/d2e/d2f/d6d/lb0 50 1 2026-03-10T08:56:14.852 INFO:tasks.workunit.client.0.vm05.stdout:3/952: truncate d9/d8f/d50/d5f/dd8/dec/d123/f91 1990058 0 2026-03-10T08:56:14.855 INFO:tasks.workunit.client.0.vm05.stdout:1/977: dwrite dd/d10/fb0 [0,4194304] 0 2026-03-10T08:56:14.871 
INFO:tasks.workunit.client.0.vm05.stdout:4/894: creat d0/d2e/d42/d45/f11f x:0 0 0 2026-03-10T08:56:14.873 INFO:tasks.workunit.client.0.vm05.stdout:4/895: dread d0/f100 [0,4194304] 0 2026-03-10T08:56:14.879 INFO:tasks.workunit.client.0.vm05.stdout:2/836: dread - d0/d55/db8/dcc/dd9/fbb zero size 2026-03-10T08:56:14.885 INFO:tasks.workunit.client.0.vm05.stdout:2/837: chown d0/d55/db8/dcc/dd9 7410495 1 2026-03-10T08:56:14.885 INFO:tasks.workunit.client.0.vm05.stdout:8/890: mknod d2/dd/c135 0 2026-03-10T08:56:14.889 INFO:tasks.workunit.client.0.vm05.stdout:0/909: truncate df/d1f/fd9 3655940 0 2026-03-10T08:56:14.894 INFO:tasks.workunit.client.0.vm05.stdout:6/890: mknod d4/d7/d10/dc3/c132 0 2026-03-10T08:56:14.895 INFO:tasks.workunit.client.0.vm05.stdout:6/891: chown d4/d2d/d51/d87/da5/de9/d10a/l10f 232 1 2026-03-10T08:56:14.896 INFO:tasks.workunit.client.0.vm05.stdout:6/892: chown d4/d2d/d51/d87/fef 0 1 2026-03-10T08:56:14.910 INFO:tasks.workunit.client.0.vm05.stdout:5/851: dwrite d5/df/dbb/fd4 [0,4194304] 0 2026-03-10T08:56:14.923 INFO:tasks.workunit.client.0.vm05.stdout:9/885: rmdir d6/d12/d3a/de5 39 2026-03-10T08:56:14.926 INFO:tasks.workunit.client.0.vm05.stdout:7/860: creat d18/d66/d25/de5/f119 x:0 0 0 2026-03-10T08:56:14.937 INFO:tasks.workunit.client.0.vm05.stdout:1/978: mkdir dd/d10/d18/d20/df3/d149 0 2026-03-10T08:56:14.937 INFO:tasks.workunit.client.0.vm05.stdout:1/979: chown dd/d21/d37/d7c/dab/db7/dde/l11f 155880 1 2026-03-10T08:56:14.938 INFO:tasks.workunit.client.0.vm05.stdout:8/891: truncate d2/dd/d2c/d2e/d31/d3e/dde/f85 1655337 0 2026-03-10T08:56:14.946 INFO:tasks.workunit.client.0.vm05.stdout:2/838: dwrite d0/d9/f4e [0,4194304] 0 2026-03-10T08:56:14.948 INFO:tasks.workunit.client.0.vm05.stdout:2/839: stat d0/d9/d1e/c5b 0 2026-03-10T08:56:14.948 INFO:tasks.workunit.client.0.vm05.stdout:2/840: chown d0/d9/d7f/d8f/f37 0 1 2026-03-10T08:56:14.966 INFO:tasks.workunit.client.0.vm05.stdout:0/910: rmdir df/d1f/d85/d19/d55 39 2026-03-10T08:56:14.974 
INFO:tasks.workunit.client.0.vm05.stdout:6/893: read d4/d7/d10/d15/f17 [1538941,13859] 0 2026-03-10T08:56:14.975 INFO:tasks.workunit.client.0.vm05.stdout:6/894: write d4/d7/d10/dc3/f125 [941122,95046] 0 2026-03-10T08:56:15.010 INFO:tasks.workunit.client.0.vm05.stdout:5/852: creat d5/df/d37/dd2/f132 x:0 0 0 2026-03-10T08:56:15.016 INFO:tasks.workunit.client.0.vm05.stdout:5/853: dread d5/df/f31 [0,4194304] 0 2026-03-10T08:56:15.070 INFO:tasks.workunit.client.0.vm05.stdout:3/953: symlink d9/d4d/d51/d64/d89/ddf/l127 0 2026-03-10T08:56:15.073 INFO:tasks.workunit.client.0.vm05.stdout:1/980: mknod dd/d21/d3f/c14a 0 2026-03-10T08:56:15.074 INFO:tasks.workunit.client.0.vm05.stdout:1/981: stat dd/d13/d10b/d138/c140 0 2026-03-10T08:56:15.078 INFO:tasks.workunit.client.0.vm05.stdout:1/982: dwrite dd/d10/d18/d2d/d5c/f100 [0,4194304] 0 2026-03-10T08:56:15.083 INFO:tasks.workunit.client.0.vm05.stdout:8/892: creat d2/dd/d2c/d2e/d31/d4f/d80/de2/f136 x:0 0 0 2026-03-10T08:56:15.084 INFO:tasks.workunit.client.0.vm05.stdout:0/911: readlink df/d1f/d85/d19/d55/le5 0 2026-03-10T08:56:15.084 INFO:tasks.workunit.client.0.vm05.stdout:2/841: dread - d0/d9/d1e/d20/d21/d45/f8c zero size 2026-03-10T08:56:15.085 INFO:tasks.workunit.client.0.vm05.stdout:0/912: dread - df/d1f/d85/d2b/d27/d32/f10f zero size 2026-03-10T08:56:15.086 INFO:tasks.workunit.client.0.vm05.stdout:6/895: creat d4/d2c/dc8/f133 x:0 0 0 2026-03-10T08:56:15.088 INFO:tasks.workunit.client.0.vm05.stdout:5/854: mkdir d5/d86/d24/d2c/d41/d74/d133 0 2026-03-10T08:56:15.088 INFO:tasks.workunit.client.0.vm05.stdout:5/855: fsync d5/d86/d24/d2c/d41/d74/f117 0 2026-03-10T08:56:15.090 INFO:tasks.workunit.client.0.vm05.stdout:9/886: truncate d6/d15/d35/fc0 784809 0 2026-03-10T08:56:15.094 INFO:tasks.workunit.client.0.vm05.stdout:1/983: mknod dd/d13/d10b/c14b 0 2026-03-10T08:56:15.101 INFO:tasks.workunit.client.0.vm05.stdout:0/913: creat df/d1f/d85/d19/d47/d84/dae/f11d x:0 0 0 2026-03-10T08:56:15.105 
INFO:tasks.workunit.client.0.vm05.stdout:0/914: write df/d1f/d85/d2b/d27/d32/ff8 [863660,87869] 0
2026-03-10T08:56:15.117 INFO:tasks.workunit.client.0.vm05.stdout:2/842: dread d0/d9/d1e/d20/d21/f23 [0,4194304] 0
2026-03-10T08:56:15.118 INFO:tasks.workunit.client.0.vm05.stdout:2/843: write d0/d9/d1e/d20/d24/ff6 [1018644,52368] 0
2026-03-10T08:56:15.119 INFO:tasks.workunit.client.0.vm05.stdout:5/856: creat d5/d48/d64/dc4/f134 x:0 0 0
2026-03-10T08:56:15.132 INFO:tasks.workunit.client.0.vm05.stdout:1/984: creat dd/d10/d18/d20/df3/f14c x:0 0 0
2026-03-10T08:56:15.134 INFO:tasks.workunit.client.0.vm05.stdout:4/896: link d0/d2e/d42/d45/d4a/d36/l7d d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/l120 0
2026-03-10T08:56:15.136 INFO:tasks.workunit.client.0.vm05.stdout:0/915: chown df/d1f/d85/d2b/d65/d6e/d96/c5e 14894 1
2026-03-10T08:56:15.137 INFO:tasks.workunit.client.0.vm05.stdout:0/916: write df/d1f/d85/d2b/d27/d32/f10f [162371,117081] 0
2026-03-10T08:56:15.138 INFO:tasks.workunit.client.0.vm05.stdout:2/844: creat d0/d55/fff x:0 0 0
2026-03-10T08:56:15.145 INFO:tasks.workunit.client.0.vm05.stdout:5/857: mkdir d5/d48/d64/dc4/d135 0
2026-03-10T08:56:15.150 INFO:tasks.workunit.client.0.vm05.stdout:7/861: write d18/d66/f2d [5703682,24712] 0
2026-03-10T08:56:15.153 INFO:tasks.workunit.client.0.vm05.stdout:8/893: write d2/dd/d2c/d2e/f37 [2973937,66082] 0
2026-03-10T08:56:15.156 INFO:tasks.workunit.client.0.vm05.stdout:9/887: write d6/d12/d3a/d48/f65 [2131364,110131] 0
2026-03-10T08:56:15.159 INFO:tasks.workunit.client.0.vm05.stdout:0/917: unlink df/c16 0
2026-03-10T08:56:15.160 INFO:tasks.workunit.client.0.vm05.stdout:0/918: stat df/d1f/d85/d2b/d65/fed 0
2026-03-10T08:56:15.162 INFO:tasks.workunit.client.0.vm05.stdout:4/897: dread d0/d2e/d71/f90 [0,4194304] 0
2026-03-10T08:56:15.170 INFO:tasks.workunit.client.0.vm05.stdout:6/896: truncate d4/d2d/d5f/f88 2750066 0
2026-03-10T08:56:15.173 INFO:tasks.workunit.client.0.vm05.stdout:5/858: creat d5/d86/d24/f136 x:0 0 0
2026-03-10T08:56:15.179 INFO:tasks.workunit.client.0.vm05.stdout:7/862: symlink d18/d66/d25/d2e/de7/l11a 0
2026-03-10T08:56:15.180 INFO:tasks.workunit.client.0.vm05.stdout:2/845: write d0/d9/d1e/d20/d21/f3d [526105,77050] 0
2026-03-10T08:56:15.188 INFO:tasks.workunit.client.0.vm05.stdout:3/954: getdents d9/d8f/d50/d5f 0
2026-03-10T08:56:15.189 INFO:tasks.workunit.client.0.vm05.stdout:3/955: write d9/d2b/de7/df1/ff7 [675423,53393] 0
2026-03-10T08:56:15.204 INFO:tasks.workunit.client.0.vm05.stdout:8/894: write d2/db/d47/f58 [3100147,26797] 0
2026-03-10T08:56:15.206 INFO:tasks.workunit.client.0.vm05.stdout:1/985: write dd/d21/d37/d45/f47 [686210,64330] 0
2026-03-10T08:56:15.207 INFO:tasks.workunit.client.0.vm05.stdout:1/986: chown dd/d21/d37/d7c 82438 1
2026-03-10T08:56:15.209 INFO:tasks.workunit.client.0.vm05.stdout:1/987: dread dd/d10/d19/f129 [0,4194304] 0
2026-03-10T08:56:15.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:14 vm05.local ceph-mon[49713]: pgmap v16: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 28 MiB/s rd, 68 MiB/s wr, 184 op/s
2026-03-10T08:56:15.215 INFO:tasks.workunit.client.0.vm05.stdout:0/919: read df/d1f/d85/d2b/f7a [4055269,92363] 0
2026-03-10T08:56:15.220 INFO:tasks.workunit.client.0.vm05.stdout:4/898: creat d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/f121 x:0 0 0
2026-03-10T08:56:15.221 INFO:tasks.workunit.client.0.vm05.stdout:6/897: fsync d4/d7/d10/d1a/d1f/f4b 0
2026-03-10T08:56:15.222 INFO:tasks.workunit.client.0.vm05.stdout:6/898: write d4/d92/db0/fb7 [446605,55166] 0
2026-03-10T08:56:15.242 INFO:tasks.workunit.client.0.vm05.stdout:2/846: dread d0/d55/dd4/def/df2/ff8 [0,4194304] 0
2026-03-10T08:56:15.243 INFO:tasks.workunit.client.0.vm05.stdout:2/847: dread d0/d9/d7f/d8f/f37 [0,4194304] 0
2026-03-10T08:56:15.244 INFO:tasks.workunit.client.0.vm05.stdout:7/863: write d18/f95 [4094698,108839] 0
2026-03-10T08:56:15.251 INFO:tasks.workunit.client.0.vm05.stdout:3/956: dwrite d9/d8f/d50/fc1 [0,4194304] 0
2026-03-10T08:56:15.285 INFO:tasks.workunit.client.0.vm05.stdout:5/859: getdents d5/d86/d24/d2c/d41/d74/d133 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:8/895: symlink d2/dd/d2c/d2e/d31/d3e/d5d/d9d/dd9/d12d/l137 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:6/899: mknod d4/d2d/d51/d62/d11d/c134 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:5/860: stat d5/f9 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:2/848: mknod d0/d9/d7f/d8f/d7e/dc3/c100 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:5/861: dread - d5/d86/d24/d2c/d41/d74/f117 zero size
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:5/862: fdatasync d5/d48/d64/d95/dac/dc6/f116 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:7/864: fsync d18/d38/dc7/de3/f71 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:6/900: dwrite d4/d2c/d84/db6/dc6/fc9 [4194304,4194304] 0
2026-03-10T08:56:15.286 INFO:tasks.workunit.client.0.vm05.stdout:7/865: chown d18/d1b/l7a 48657 1
2026-03-10T08:56:15.291 INFO:tasks.workunit.client.0.vm05.stdout:7/866: dwrite d18/d38/dc7/f102 [0,4194304] 0
2026-03-10T08:56:15.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:14 vm08.local ceph-mon[57559]: pgmap v16: 65 pgs: 65 active+clean; 4.0 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 28 MiB/s rd, 68 MiB/s wr, 184 op/s
2026-03-10T08:56:15.305 INFO:tasks.workunit.client.0.vm05.stdout:2/849: dread d0/d9/d1e/d20/d21/d45/d4b/f58 [0,4194304] 0
2026-03-10T08:56:15.305 INFO:tasks.workunit.client.0.vm05.stdout:2/850: stat d0/d9/d7f/d8f/d7a/dec 0
2026-03-10T08:56:15.306 INFO:tasks.workunit.client.0.vm05.stdout:2/851: chown d0/d9/d7f/d8f/d7a/dec 18011864 1
2026-03-10T08:56:15.315 INFO:tasks.workunit.client.0.vm05.stdout:9/888: dread d6/d15/d35/fc0 [0,4194304] 0
2026-03-10T08:56:15.315 INFO:tasks.workunit.client.0.vm05.stdout:8/896: mknod d2/db/d1f/d67/d8d/c138 0
2026-03-10T08:56:15.315 INFO:tasks.workunit.client.0.vm05.stdout:6/901: truncate d4/d2c/f7a 1515648 0
2026-03-10T08:56:15.315 INFO:tasks.workunit.client.0.vm05.stdout:0/920: sync
2026-03-10T08:56:15.336 INFO:tasks.workunit.client.0.vm05.stdout:2/852: rmdir d0/d55/dd4/def/df2 39
2026-03-10T08:56:15.341 INFO:tasks.workunit.client.0.vm05.stdout:9/889: dread d6/d12/d3a/de5/f47 [0,4194304] 0
2026-03-10T08:56:15.347 INFO:tasks.workunit.client.0.vm05.stdout:6/902: readlink d4/d7/d10/d15/l11c 0
2026-03-10T08:56:15.348 INFO:tasks.workunit.client.0.vm05.stdout:0/921: creat df/d1f/dcd/de6/f11e x:0 0 0
2026-03-10T08:56:15.351 INFO:tasks.workunit.client.0.vm05.stdout:4/899: write d0/d2c/d6a/dc9/d106/ff8 [2407000,103456] 0
2026-03-10T08:56:15.354 INFO:tasks.workunit.client.0.vm05.stdout:1/988: write dd/d10/d18/f82 [1674107,76147] 0
2026-03-10T08:56:15.359 INFO:tasks.workunit.client.0.vm05.stdout:7/867: write d18/d38/dc7/de3/d9c/fce [1445056,86167] 0
2026-03-10T08:56:15.361 INFO:tasks.workunit.client.0.vm05.stdout:3/957: dwrite d9/d2b/d2f/fee [4194304,4194304] 0
2026-03-10T08:56:15.364 INFO:tasks.workunit.client.0.vm05.stdout:5/863: dwrite d5/fc1 [0,4194304] 0
2026-03-10T08:56:15.369 INFO:tasks.workunit.client.0.vm05.stdout:2/853: fdatasync d0/d9/d7f/d8f/d7a/fa1 0
2026-03-10T08:56:15.369 INFO:tasks.workunit.client.0.vm05.stdout:8/897: mknod d2/dd/d2c/d2e/d31/d3e/c139 0
2026-03-10T08:56:15.374 INFO:tasks.workunit.client.0.vm05.stdout:4/900: truncate d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/ff9 94663 0
2026-03-10T08:56:15.376 INFO:tasks.workunit.client.0.vm05.stdout:3/958: unlink d9/d2b/l2e 0
2026-03-10T08:56:15.380 INFO:tasks.workunit.client.0.vm05.stdout:3/959: dread d9/d2b/de7/df1/dd6/d107/f113 [0,4194304] 0
2026-03-10T08:56:15.381 INFO:tasks.workunit.client.0.vm05.stdout:5/864: rename d5/d48/d64 to d5/d86/d24/d84/df7/d137 0
2026-03-10T08:56:15.389 INFO:tasks.workunit.client.0.vm05.stdout:8/898: fdatasync d2/db/f9a 0
2026-03-10T08:56:15.389 INFO:tasks.workunit.client.0.vm05.stdout:7/868: dread d18/d1b/f50 [0,4194304] 0
2026-03-10T08:56:15.396 INFO:tasks.workunit.client.0.vm05.stdout:0/922: symlink df/d1f/d85/d19/d39/d4d/d9f/df7/l11f 0
2026-03-10T08:56:15.398 INFO:tasks.workunit.client.0.vm05.stdout:1/989: mkdir dd/d10/d19/d4d/d7d/d14d 0
2026-03-10T08:56:15.402 INFO:tasks.workunit.client.0.vm05.stdout:1/990: dwrite dd/d21/d37/d45/f47 [0,4194304] 0
2026-03-10T08:56:15.405 INFO:tasks.workunit.client.0.vm05.stdout:1/991: chown dd/d10 747811 1
2026-03-10T08:56:15.417 INFO:tasks.workunit.client.0.vm05.stdout:3/960: creat d9/d8f/d50/d5f/f128 x:0 0 0
2026-03-10T08:56:15.418 INFO:tasks.workunit.client.0.vm05.stdout:5/865: read - d5/df/d37/d68/ff9 zero size
2026-03-10T08:56:15.431 INFO:tasks.workunit.client.0.vm05.stdout:0/923: mknod df/d1f/d85/d2b/d65/d6e/c120 0
2026-03-10T08:56:15.447 INFO:tasks.workunit.client.0.vm05.stdout:1/992: rename dd/d10/d18/d2d/d51/d58/d71/d73/lff to dd/d10/d18/d20/df3/l14e 0
2026-03-10T08:56:15.448 INFO:tasks.workunit.client.0.vm05.stdout:3/961: dread - d9/d2b/de7/fff zero size
2026-03-10T08:56:15.450 INFO:tasks.workunit.client.0.vm05.stdout:5/866: creat d5/d86/d39/f138 x:0 0 0
2026-03-10T08:56:15.451 INFO:tasks.workunit.client.0.vm05.stdout:6/903: getdents d4/d7 0
2026-03-10T08:56:15.453 INFO:tasks.workunit.client.0.vm05.stdout:0/924: unlink df/d1f/d85/d2b/d27/d32/d4e/fd2 0
2026-03-10T08:56:15.456 INFO:tasks.workunit.client.0.vm05.stdout:5/867: creat d5/df/d37/dd2/d76/f139 x:0 0 0
2026-03-10T08:56:15.458 INFO:tasks.workunit.client.0.vm05.stdout:6/904: mkdir d4/d7/d10/d1a/d1f/d135 0
2026-03-10T08:56:15.459 INFO:tasks.workunit.client.0.vm05.stdout:0/925: creat df/d1f/d85/d19/d39/f121 x:0 0 0
2026-03-10T08:56:15.462 INFO:tasks.workunit.client.0.vm05.stdout:5/868: creat d5/d86/d24/d2c/d41/d74/da9/f13a x:0 0 0
2026-03-10T08:56:15.463 INFO:tasks.workunit.client.0.vm05.stdout:6/905: dread - d4/d2c/dc8/ffd zero size
2026-03-10T08:56:15.465 INFO:tasks.workunit.client.0.vm05.stdout:7/869: getdents d18/d66/d25/d2e 0
2026-03-10T08:56:15.467 INFO:tasks.workunit.client.0.vm05.stdout:9/890: write d6/d12/d3a/d48/fa5 [3926857,123641] 0
2026-03-10T08:56:15.470 INFO:tasks.workunit.client.0.vm05.stdout:0/926: dread df/d1f/d85/d19/d47/d84/d8a/f93 [0,4194304] 0
2026-03-10T08:56:15.473 INFO:tasks.workunit.client.0.vm05.stdout:4/901: dread d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/ff9 [0,4194304] 0
2026-03-10T08:56:15.477 INFO:tasks.workunit.client.0.vm05.stdout:7/870: rmdir d18/d66/d25/de5 39
2026-03-10T08:56:15.482 INFO:tasks.workunit.client.0.vm05.stdout:5/869: dread d5/df/f53 [0,4194304] 0
2026-03-10T08:56:15.483 INFO:tasks.workunit.client.0.vm05.stdout:7/871: dread d18/f112 [0,4194304] 0
2026-03-10T08:56:15.487 INFO:tasks.workunit.client.0.vm05.stdout:7/872: dwrite d18/d66/d25/d2e/de7/fcf [4194304,4194304] 0
2026-03-10T08:56:15.487 INFO:tasks.workunit.client.0.vm05.stdout:9/891: truncate d6/d15/d35/fc0 1458730 0
2026-03-10T08:56:15.488 INFO:tasks.workunit.client.0.vm05.stdout:9/892: stat d6/d12/db2/l10b 0
2026-03-10T08:56:15.491 INFO:tasks.workunit.client.0.vm05.stdout:1/993: getdents dd/d10/d18 0
2026-03-10T08:56:15.493 INFO:tasks.workunit.client.0.vm05.stdout:2/854: write d0/d9/fd6 [64567,64834] 0
2026-03-10T08:56:15.504 INFO:tasks.workunit.client.0.vm05.stdout:0/927: sync
2026-03-10T08:56:15.505 INFO:tasks.workunit.client.0.vm05.stdout:4/902: rename d0/d2c/f74 to d0/d2e/dca/df3/d118/f122 0
2026-03-10T08:56:15.506 INFO:tasks.workunit.client.0.vm05.stdout:4/903: chown d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/lf7 45468761 1
2026-03-10T08:56:15.508 INFO:tasks.workunit.client.0.vm05.stdout:6/906: mkdir d4/d2c/d136 0
2026-03-10T08:56:15.509 INFO:tasks.workunit.client.0.vm05.stdout:8/899: write d2/dd/d2c/d2e/d31/d3e/dde/d63/fc8 [877478,94619] 0
2026-03-10T08:56:15.509 INFO:tasks.workunit.client.0.vm05.stdout:6/907: stat d4/d7/d10/d15/d1b/f31 0
2026-03-10T08:56:15.512 INFO:tasks.workunit.client.0.vm05.stdout:5/870: mkdir d5/df/dbb/d108/d13b 0
2026-03-10T08:56:15.515 INFO:tasks.workunit.client.0.vm05.stdout:7/873: mkdir d18/d66/d25/d2e/dd9/d11b 0
2026-03-10T08:56:15.520 INFO:tasks.workunit.client.0.vm05.stdout:2/855: creat d0/d55/da2/f101 x:0 0 0
2026-03-10T08:56:15.524 INFO:tasks.workunit.client.0.vm05.stdout:0/928: mkdir df/d1f/d85/d9e/d122 0
2026-03-10T08:56:15.525 INFO:tasks.workunit.client.0.vm05.stdout:0/929: write df/d1f/d85/d2b/d27/d32/ff8 [336193,123567] 0
2026-03-10T08:56:15.533 INFO:tasks.workunit.client.0.vm05.stdout:4/904: unlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fea 0
2026-03-10T08:56:15.542 INFO:tasks.workunit.client.0.vm05.stdout:6/908: write d4/d7/d10/d15/f17 [1788765,66623] 0
2026-03-10T08:56:15.544 INFO:tasks.workunit.client.0.vm05.stdout:6/909: write d4/d2c/d84/d4a/f12e [811477,92566] 0
2026-03-10T08:56:15.545 INFO:tasks.workunit.client.0.vm05.stdout:7/874: dread - d18/d66/d25/de5/f119 zero size
2026-03-10T08:56:15.549 INFO:tasks.workunit.client.0.vm05.stdout:9/893: mkdir d6/d15/d3c/d128/d12a 0
2026-03-10T08:56:15.550 INFO:tasks.workunit.client.0.vm05.stdout:1/994: creat dd/d10/d19/d4d/d7d/d14d/f14f x:0 0 0
2026-03-10T08:56:15.551 INFO:tasks.workunit.client.0.vm05.stdout:1/995: chown dd/d10/d19 675680607 1
2026-03-10T08:56:15.554 INFO:tasks.workunit.client.0.vm05.stdout:0/930: mkdir df/d1f/d85/d9e/d123 0
2026-03-10T08:56:15.556 INFO:tasks.workunit.client.0.vm05.stdout:4/905: rmdir d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67 39
2026-03-10T08:56:15.558 INFO:tasks.workunit.client.0.vm05.stdout:6/910: creat d4/d2d/d51/d62/d113/f137 x:0 0 0
2026-03-10T08:56:15.559 INFO:tasks.workunit.client.0.vm05.stdout:7/875: dread d18/d38/d43/d5c/daf/fb6 [0,4194304] 0
2026-03-10T08:56:15.561 INFO:tasks.workunit.client.0.vm05.stdout:7/876: dread d18/d38/f82 [0,4194304] 0
2026-03-10T08:56:15.568 INFO:tasks.workunit.client.0.vm05.stdout:1/996: rename dd/d10/d18/d20/d69/fe2 to dd/d10/d112/d113/f150 0
2026-03-10T08:56:15.570 INFO:tasks.workunit.client.0.vm05.stdout:1/997: dread - dd/d10/d18/dd1/f136 zero size
2026-03-10T08:56:15.570 INFO:tasks.workunit.client.0.vm05.stdout:7/877: dread d18/d66/d25/f8d [0,4194304] 0
2026-03-10T08:56:15.574 INFO:tasks.workunit.client.0.vm05.stdout:2/856: dwrite d0/d9/d1e/d20/d21/d45/f68 [0,4194304] 0
2026-03-10T08:56:15.578 INFO:tasks.workunit.client.0.vm05.stdout:2/857: fsync d0/d9/f4e 0
2026-03-10T08:56:15.581 INFO:tasks.workunit.client.0.vm05.stdout:4/906: fdatasync d0/fb 0
2026-03-10T08:56:15.582 INFO:tasks.workunit.client.0.vm05.stdout:8/900: creat d2/dd/d2c/f13a x:0 0 0
2026-03-10T08:56:15.588 INFO:tasks.workunit.client.0.vm05.stdout:6/911: creat d4/d2d/d51/d62/d11d/f138 x:0 0 0
2026-03-10T08:56:15.589 INFO:tasks.workunit.client.0.vm05.stdout:6/912: chown d4/d7/d10/d1a/d1f/l60 33 1
2026-03-10T08:56:15.595 INFO:tasks.workunit.client.0.vm05.stdout:7/878: truncate d18/d1b/f94 1104418 0
2026-03-10T08:56:15.596 INFO:tasks.workunit.client.0.vm05.stdout:9/894: dread d6/d19/d2a/d4a/faa [0,4194304] 0
2026-03-10T08:56:15.601 INFO:tasks.workunit.client.0.vm05.stdout:2/858: fdatasync d0/d9/d7f/d8f/d7e/fa4 0
2026-03-10T08:56:15.606 INFO:tasks.workunit.client.0.vm05.stdout:2/859: dwrite d0/d55/db8/ffe [0,4194304] 0
2026-03-10T08:56:15.615 INFO:tasks.workunit.client.0.vm05.stdout:5/871: getdents d5/df/d37/dd2/d76 0
2026-03-10T08:56:15.615 INFO:tasks.workunit.client.0.vm05.stdout:6/913: fdatasync d4/d7/d10/d1a/f130 0
2026-03-10T08:56:15.615 INFO:tasks.workunit.client.0.vm05.stdout:1/998: mknod dd/d21/d37/c151 0
2026-03-10T08:56:15.615 INFO:tasks.workunit.client.0.vm05.stdout:5/872: chown d5/df/d37/d68/l7f 209 1
2026-03-10T08:56:15.618 INFO:tasks.workunit.client.0.vm05.stdout:7/879: unlink f9 0
2026-03-10T08:56:15.620 INFO:tasks.workunit.client.0.vm05.stdout:3/962: getdents d9/d8f/d50/d5f/dd8/dec/d123/d86/d10e 0
2026-03-10T08:56:15.620 INFO:tasks.workunit.client.0.vm05.stdout:7/880: truncate d18/d38/d43/d5c/daf/f118 495157 0
2026-03-10T08:56:15.621 INFO:tasks.workunit.client.0.vm05.stdout:0/931: rmdir df/dd8/d67/d7b/d108 0
2026-03-10T08:56:15.627 INFO:tasks.workunit.client.0.vm05.stdout:2/860: rename d0/d9/d1e/d20/d21/d45/d4b/d75/fdc to d0/d9/d7f/d8f/d7a/dec/f102 0
2026-03-10T08:56:15.633 INFO:tasks.workunit.client.0.vm05.stdout:2/861: chown d0/f2f 12102686 1
2026-03-10T08:56:15.633 INFO:tasks.workunit.client.0.vm05.stdout:6/914: stat d4/d2c/f7a 0
2026-03-10T08:56:15.636 INFO:tasks.workunit.client.0.vm05.stdout:0/932: creat df/d1f/dc6/f124 x:0 0 0
2026-03-10T08:56:15.639 INFO:tasks.workunit.client.0.vm05.stdout:7/881: rename d18/d1b/ddd to d18/d38/dc7/de3/d9c/dfd/d11c 0
2026-03-10T08:56:15.643 INFO:tasks.workunit.client.0.vm05.stdout:4/907: getdents d0/d2e/d42/d45/d4a/d36/d37/d114 0
2026-03-10T08:56:15.648 INFO:tasks.workunit.client.0.vm05.stdout:2/862: sync
2026-03-10T08:56:15.648 INFO:tasks.workunit.client.0.vm05.stdout:2/863: chown d0/d9/d7f 800868135 1
2026-03-10T08:56:15.649 INFO:tasks.workunit.client.0.vm05.stdout:2/864: readlink d0/d9/d7f/d8f/d7a/l9e 0
2026-03-10T08:56:15.650 INFO:tasks.workunit.client.0.vm05.stdout:2/865: fsync d0/d9/d1e/d20/d21/fdd 0
2026-03-10T08:56:15.654 INFO:tasks.workunit.client.0.vm05.stdout:2/866: dread d0/d9/d1e/d20/d21/fb7 [0,4194304] 0
2026-03-10T08:56:15.657 INFO:tasks.workunit.client.0.vm05.stdout:9/895: creat d6/d19/d2c/f12b x:0 0 0
2026-03-10T08:56:15.658 INFO:tasks.workunit.client.0.vm05.stdout:2/867: sync
2026-03-10T08:56:15.664 INFO:tasks.workunit.client.0.vm05.stdout:0/933: unlink df/d1f/d85/d2b/d27/d32/d4e/d6a/c117 0
2026-03-10T08:56:15.665 INFO:tasks.workunit.client.0.vm05.stdout:0/934: chown df/d1f/d85/d2b/f7a 4 1
2026-03-10T08:56:15.667 INFO:tasks.workunit.client.0.vm05.stdout:0/935: sync
2026-03-10T08:56:15.669 INFO:tasks.workunit.client.0.vm05.stdout:7/882: read - d18/d38/dc7/de3/d74/deb/fd2 zero size
2026-03-10T08:56:15.672 INFO:tasks.workunit.client.0.vm05.stdout:8/901: getdents d2/dd/d2c/d2e/d93 0
2026-03-10T08:56:15.674 INFO:tasks.workunit.client.0.vm05.stdout:4/908: symlink d0/d2e/d42/d45/d4a/d36/dbe/dbf/l123 0
2026-03-10T08:56:15.674 INFO:tasks.workunit.client.0.vm05.stdout:4/909: readlink d0/dfe/de2/l80 0
2026-03-10T08:56:15.678 INFO:tasks.workunit.client.0.vm05.stdout:1/999: link dd/d10/d18/d20/fd9 dd/dfb/f152 0
2026-03-10T08:56:15.683 INFO:tasks.workunit.client.0.vm05.stdout:6/915: dwrite d4/d2c/d84/d4a/f63 [4194304,4194304] 0
2026-03-10T08:56:15.690 INFO:tasks.workunit.client.0.vm05.stdout:9/896: creat d6/d19/f12c x:0 0 0
2026-03-10T08:56:15.692 INFO:tasks.workunit.client.0.vm05.stdout:2/868: creat d0/d55/db8/dcc/dd9/f103 x:0 0 0
2026-03-10T08:56:15.693 INFO:tasks.workunit.client.0.vm05.stdout:2/869: stat d0/f10 0
2026-03-10T08:56:15.694 INFO:tasks.workunit.client.0.vm05.stdout:6/916: dread d4/d7/d10/f12 [0,4194304] 0
2026-03-10T08:56:15.695 INFO:tasks.workunit.client.0.vm05.stdout:3/963: creat d9/d2b/f129 x:0 0 0
2026-03-10T08:56:15.696 INFO:tasks.workunit.client.0.vm05.stdout:3/964: chown d9/d2b/de7/df1/d43/da3/lb0 286 1
2026-03-10T08:56:15.704 INFO:tasks.workunit.client.0.vm05.stdout:0/936: mkdir df/d1f/d85/d19/d39/d4d/d9f/df7/d125 0
2026-03-10T08:56:15.709 INFO:tasks.workunit.client.0.vm05.stdout:5/873: getdents d5/df/d37/dc8/d100 0
2026-03-10T08:56:15.713 INFO:tasks.workunit.client.0.vm05.stdout:9/897: symlink d6/d19/l12d 0
2026-03-10T08:56:15.713 INFO:tasks.workunit.client.0.vm05.stdout:2/870: creat d0/d9/d7f/d8f/d6d/f104 x:0 0 0
2026-03-10T08:56:15.713 INFO:tasks.workunit.client.0.vm05.stdout:3/965: dread d9/d8f/f8a [0,4194304] 0
2026-03-10T08:56:15.713 INFO:tasks.workunit.client.0.vm05.stdout:3/966: chown d9/d4d/dca/f87 59438099 1
2026-03-10T08:56:15.715 INFO:tasks.workunit.client.0.vm05.stdout:6/917: creat d4/d2c/d84/db6/f139 x:0 0 0
2026-03-10T08:56:15.719 INFO:tasks.workunit.client.0.vm05.stdout:7/883: dread d18/f4a [0,4194304] 0
2026-03-10T08:56:15.723 INFO:tasks.workunit.client.0.vm05.stdout:8/902: mknod d2/dd/d2c/d2e/d31/d4f/d7b/d9e/df8/c13b 0
2026-03-10T08:56:15.731 INFO:tasks.workunit.client.0.vm05.stdout:8/903: sync
2026-03-10T08:56:15.732 INFO:tasks.workunit.client.0.vm05.stdout:2/871: mkdir d0/d55/db8/d105 0
2026-03-10T08:56:15.735 INFO:tasks.workunit.client.0.vm05.stdout:3/967: fdatasync d9/d4d/d51/d64/d89/dc2/f108 0
2026-03-10T08:56:15.745 INFO:tasks.workunit.client.0.vm05.stdout:7/884: mknod d18/d38/dc7/de3/d9c/dfd/d11c/c11d 0
2026-03-10T08:56:15.746 INFO:tasks.workunit.client.0.vm05.stdout:7/885: dread - d18/d38/dc7/de3/d74/deb/fd2 zero size
2026-03-10T08:56:15.747 INFO:tasks.workunit.client.0.vm05.stdout:4/910: mknod d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/c124 0
2026-03-10T08:56:15.748 INFO:tasks.workunit.client.0.vm05.stdout:0/937: write df/d1f/d85/d2b/d65/d6e/d96/f8b [3845656,44905] 0
2026-03-10T08:56:15.749 INFO:tasks.workunit.client.0.vm05.stdout:5/874: symlink d5/d86/d24/d2c/d41/d74/d133/l13c 0
2026-03-10T08:56:15.769 INFO:tasks.workunit.client.0.vm05.stdout:7/886: creat d18/d66/d25/d2e/d2f/f11e x:0 0 0
2026-03-10T08:56:15.771 INFO:tasks.workunit.client.0.vm05.stdout:5/875: read - d5/d86/d39/fd3 zero size
2026-03-10T08:56:15.773 INFO:tasks.workunit.client.0.vm05.stdout:5/876: dread - d5/d86/d24/d2c/d41/d74/da9/f13a zero size
2026-03-10T08:56:15.773 INFO:tasks.workunit.client.0.vm05.stdout:8/904: dwrite d2/dd/f1a [0,4194304] 0
2026-03-10T08:56:15.776 INFO:tasks.workunit.client.0.vm05.stdout:3/968: write d9/d2b/f101 [287004,96514] 0
2026-03-10T08:56:15.785 INFO:tasks.workunit.client.0.vm05.stdout:5/877: dwrite d5/d86/d21/d71/f9a [0,4194304] 0
2026-03-10T08:56:15.787 INFO:tasks.workunit.client.0.vm05.stdout:6/918: dwrite d4/f30 [4194304,4194304] 0
2026-03-10T08:56:15.793 INFO:tasks.workunit.client.0.vm05.stdout:6/919: chown d4/d7/d10/d15/d1b 15893899 1
2026-03-10T08:56:15.800 INFO:tasks.workunit.client.0.vm05.stdout:7/887: symlink d18/d38/dc7/de3/l11f 0
2026-03-10T08:56:15.802 INFO:tasks.workunit.client.0.vm05.stdout:3/969: rmdir d9/d8f/d50/d5f/dd8/dec/d123/d86/d10e 39
2026-03-10T08:56:15.803 INFO:tasks.workunit.client.0.vm05.stdout:3/970: stat d9/d8f 0
2026-03-10T08:56:15.803 INFO:tasks.workunit.client.0.vm05.stdout:3/971: chown d9/d4d/d51/d64/d89/dc2/cc5 38 1
2026-03-10T08:56:15.814 INFO:tasks.workunit.client.0.vm05.stdout:9/898: link d6/df6/f116 d6/d27/d10a/f12e 0
2026-03-10T08:56:15.823 INFO:tasks.workunit.client.0.vm05.stdout:4/911: creat d0/f125 x:0 0 0
2026-03-10T08:56:15.827 INFO:tasks.workunit.client.0.vm05.stdout:7/888: truncate d18/d38/d43/d5c/f5f 933429 0
2026-03-10T08:56:15.839 INFO:tasks.workunit.client.0.vm05.stdout:6/920: symlink d4/d8d/l13a 0
2026-03-10T08:56:15.842 INFO:tasks.workunit.client.0.vm05.stdout:0/938: rename df/d1f/d85/d19/c30 to df/d1f/d85/d19/d47/d84/c126 0
2026-03-10T08:56:15.850 INFO:tasks.workunit.client.0.vm05.stdout:9/899: creat d6/d12/d3a/de5/dd4/d113/f12f x:0 0 0
2026-03-10T08:56:15.853 INFO:tasks.workunit.client.0.vm05.stdout:2/872: truncate d0/f16 2908543 0
2026-03-10T08:56:15.855 INFO:tasks.workunit.client.0.vm05.stdout:7/889: rmdir d18/d38/dc7/de3/d9c/de1 39
2026-03-10T08:56:15.872 INFO:tasks.workunit.client.0.vm05.stdout:9/900: fdatasync d6/d19/d2c/f54 0
2026-03-10T08:56:15.877 INFO:tasks.workunit.client.0.vm05.stdout:3/972: dwrite d9/d4d/f52 [4194304,4194304] 0
2026-03-10T08:56:15.881 INFO:tasks.workunit.client.0.vm05.stdout:5/878: write d5/d48/f93 [329498,105126] 0
2026-03-10T08:56:15.893 INFO:tasks.workunit.client.0.vm05.stdout:2/873: fsync d0/f56 0
2026-03-10T08:56:15.896 INFO:tasks.workunit.client.0.vm05.stdout:4/912: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/l126 0
2026-03-10T08:56:15.905 INFO:tasks.workunit.client.0.vm05.stdout:7/890: creat d18/d66/d25/d2e/dd9/d111/f120 x:0 0 0
2026-03-10T08:56:15.906 INFO:tasks.workunit.client.0.vm05.stdout:0/939: write df/d1f/d85/d19/d39/f61 [218921,103518] 0
2026-03-10T08:56:15.908 INFO:tasks.workunit.client.0.vm05.stdout:8/905: getdents d2/dd/d2c/d2e 0
2026-03-10T08:56:15.914 INFO:tasks.workunit.client.0.vm05.stdout:6/921: fdatasync d4/d2d/fa2 0
2026-03-10T08:56:15.919 INFO:tasks.workunit.client.0.vm05.stdout:9/901: dread - d6/d19/d2a/f9b zero size
2026-03-10T08:56:15.919 INFO:tasks.workunit.client.0.vm05.stdout:9/902: write d6/d15/d3c/f6b [3704249,94340] 0
2026-03-10T08:56:15.920 INFO:tasks.workunit.client.0.vm05.stdout:9/903: stat d6/d15/fb4 0
2026-03-10T08:56:15.944 INFO:tasks.workunit.client.0.vm05.stdout:0/940: creat df/d1f/d85/d2b/f127 x:0 0 0
2026-03-10T08:56:15.956 INFO:tasks.workunit.client.0.vm05.stdout:4/913: write d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/ff2 [299214,25194] 0
2026-03-10T08:56:15.962 INFO:tasks.workunit.client.0.vm05.stdout:8/906: dread d2/dd/d2c/d2e/d31/d3e/fe3 [0,4194304] 0
2026-03-10T08:56:15.967 INFO:tasks.workunit.client.0.vm05.stdout:6/922: rmdir d4/d2c 39
2026-03-10T08:56:15.975 INFO:tasks.workunit.client.0.vm05.stdout:9/904: creat d6/d15/d35/f130 x:0 0 0
2026-03-10T08:56:15.976 INFO:tasks.workunit.client.0.vm05.stdout:3/973: mkdir d9/d8f/d50/d12a 0
2026-03-10T08:56:16.008 INFO:tasks.workunit.client.0.vm05.stdout:2/874: symlink d0/d9/d1e/d20/d21/l106 0
2026-03-10T08:56:16.010 INFO:tasks.workunit.client.0.vm05.stdout:7/891: symlink d18/d66/d78/l121 0
2026-03-10T08:56:16.015 INFO:tasks.workunit.client.0.vm05.stdout:0/941: fdatasync df/d59/f57 0
2026-03-10T08:56:16.019 INFO:tasks.workunit.client.0.vm05.stdout:2/875: dread d0/d9/d1e/d20/d21/d45/d4b/f9c [0,4194304] 0
2026-03-10T08:56:16.020 INFO:tasks.workunit.client.0.vm05.stdout:4/914: symlink d0/d78/l127 0
2026-03-10T08:56:16.026 INFO:tasks.workunit.client.0.vm05.stdout:3/974: mkdir d9/d8f/d50/d5f/dd8/dd9/d12b 0
2026-03-10T08:56:16.026 INFO:tasks.workunit.client.0.vm05.stdout:3/975: chown d9/d2b/de7/df1/ff7 45240 1
2026-03-10T08:56:16.032 INFO:tasks.workunit.client.0.vm05.stdout:4/915: dread d0/d2e/d42/d45/d4a/d36/dbe/d49/faf [0,4194304] 0
2026-03-10T08:56:16.040 INFO:tasks.workunit.client.0.vm05.stdout:0/942: rename df/d1f/d85/d19/d47/d84/d8a/d110 to df/d1f/dcd/de6/d128 0
2026-03-10T08:56:16.043 INFO:tasks.workunit.client.0.vm05.stdout:9/905: mkdir d6/d15/d3c/d4b/d131 0
2026-03-10T08:56:16.045 INFO:tasks.workunit.client.0.vm05.stdout:5/879: getdents d5/df/dbb/d108 0
2026-03-10T08:56:16.053 INFO:tasks.workunit.client.0.vm05.stdout:2/876: mknod d0/c107 0
2026-03-10T08:56:16.060 INFO:tasks.workunit.client.0.vm05.stdout:8/907: write d2/dd/d2c/d2e/d31/d4f/da3/ffe [501083,125669] 0
2026-03-10T08:56:16.064 INFO:tasks.workunit.client.0.vm05.stdout:6/923: mknod d4/d2c/d84/db6/c13b 0
2026-03-10T08:56:16.065 INFO:tasks.workunit.client.0.vm05.stdout:9/906: mkdir d6/d19/d21/d132 0
2026-03-10T08:56:16.069 INFO:tasks.workunit.client.0.vm05.stdout:9/907: dwrite d6/d12/d3a/fdc [0,4194304] 0
2026-03-10T08:56:16.072 INFO:tasks.workunit.client.0.vm05.stdout:7/892: creat d18/d38/dc7/de3/d74/deb/f122 x:0 0 0
2026-03-10T08:56:16.079 INFO:tasks.workunit.client.0.vm05.stdout:2/877: truncate d0/d9/d1e/d20/d21/f41 5223293 0
2026-03-10T08:56:16.080 INFO:tasks.workunit.client.0.vm05.stdout:8/908: truncate d2/fa 9036279 0
2026-03-10T08:56:16.085 INFO:tasks.workunit.client.0.vm05.stdout:5/880: creat d5/d86/d21/d71/d12c/dcd/d109/f13d x:0 0 0
2026-03-10T08:56:16.090 INFO:tasks.workunit.client.0.vm05.stdout:3/976: truncate d9/d2b/de7/df1/d43/d6e/f109 905613 0
2026-03-10T08:56:16.090 INFO:tasks.workunit.client.0.vm05.stdout:4/916: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/f128 x:0 0 0
2026-03-10T08:56:16.092 INFO:tasks.workunit.client.0.vm05.stdout:0/943: unlink df/d1f/d85/d2b/d27/d32/d4e/d87/l111 0
2026-03-10T08:56:16.092 INFO:tasks.workunit.client.0.vm05.stdout:2/878: dread - d0/d9/d7f/d8f/fab zero size
2026-03-10T08:56:16.095 INFO:tasks.workunit.client.0.vm05.stdout:6/924: creat d4/d7/d10/d15/d1b/f13c x:0 0 0
2026-03-10T08:56:16.096 INFO:tasks.workunit.client.0.vm05.stdout:8/909: truncate d2/dfc/f101 216245 0
2026-03-10T08:56:16.101 INFO:tasks.workunit.client.0.vm05.stdout:8/910: dwrite d2/dd/d2c/d2e/d31/d4f/d80/de2/f136 [0,4194304] 0
2026-03-10T08:56:16.112 INFO:tasks.workunit.client.0.vm05.stdout:4/917: rename d0/d2e/d42/d45/d4a/d36/dbe/d49/d4f to d0/d2e/d42/d45/d4a/d36/d37/d114/d129 0
2026-03-10T08:56:16.116 INFO:tasks.workunit.client.0.vm05.stdout:0/944: unlink df/d59/fdf 0
2026-03-10T08:56:16.126 INFO:tasks.workunit.client.0.vm05.stdout:0/945: dwrite df/d1f/d85/d19/d5b/f72 [4194304,4194304] 0
2026-03-10T08:56:16.139 INFO:tasks.workunit.client.0.vm05.stdout:9/908: getdents d6/d12/d3a/de5/dd4 0
2026-03-10T08:56:16.139 INFO:tasks.workunit.client.0.vm05.stdout:9/909: write d6/d12/d3a/d48/fa5 [3486195,82787] 0
2026-03-10T08:56:16.146 INFO:tasks.workunit.client.0.vm05.stdout:3/977: write d9/f29 [3074455,37303] 0
2026-03-10T08:56:16.148 INFO:tasks.workunit.client.0.vm05.stdout:7/893: truncate d18/f95 1913096 0
2026-03-10T08:56:16.149 INFO:tasks.workunit.client.0.vm05.stdout:2/879: mknod d0/d9/d1e/d20/d21/d45/c108 0
2026-03-10T08:56:16.159 INFO:tasks.workunit.client.0.vm05.stdout:9/910: dread d6/d19/d2c/d58/fc9 [0,4194304] 0
2026-03-10T08:56:16.160 INFO:tasks.workunit.client.0.vm05.stdout:6/925: dwrite d4/d7/d10/f12 [0,4194304] 0
2026-03-10T08:56:16.167 INFO:tasks.workunit.client.0.vm05.stdout:9/911: dwrite d6/d19/d2c/d58/f6c [0,4194304] 0
2026-03-10T08:56:16.177 INFO:tasks.workunit.client.0.vm05.stdout:5/881: truncate d5/d86/d21/f5a 142091 0
2026-03-10T08:56:16.178 INFO:tasks.workunit.client.0.vm05.stdout:5/882: dread - d5/df/d37/dd2/d76/faf zero size
2026-03-10T08:56:16.181 INFO:tasks.workunit.client.0.vm05.stdout:8/911: write d2/dd/d2c/d2e/d31/d4f/d80/fd2 [47119,66089] 0
2026-03-10T08:56:16.185 INFO:tasks.workunit.client.0.vm05.stdout:4/918: dwrite d0/d78/f87 [0,4194304] 0
2026-03-10T08:56:16.193 INFO:tasks.workunit.client.0.vm05.stdout:4/919: dread - d0/d2e/d42/d45/d4a/d36/dbe/f11a zero size
2026-03-10T08:56:16.193 INFO:tasks.workunit.client.0.vm05.stdout:4/920: chown d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/c6d 69851 1
2026-03-10T08:56:16.193 INFO:tasks.workunit.client.0.vm05.stdout:4/921: dwrite d0/d2e/d42/d45/f11f [0,4194304] 0
2026-03-10T08:56:16.200 INFO:tasks.workunit.client.0.vm05.stdout:3/978: mkdir d9/d8f/d50/d5f/d12c 0
2026-03-10T08:56:16.212 INFO:tasks.workunit.client.0.vm05.stdout:7/894: write d18/fb1 [104955,81013] 0
2026-03-10T08:56:16.215 INFO:tasks.workunit.client.0.vm05.stdout:2/880: mkdir d0/d9/d1e/d20/d21/d45/d4b/d109 0
2026-03-10T08:56:16.217 INFO:tasks.workunit.client.0.vm05.stdout:5/883: chown d5/fd 4 1
2026-03-10T08:56:16.218 INFO:tasks.workunit.client.0.vm05.stdout:8/912: rename d2/dd/d2c/d2e/d31/d3e/f126 to d2/dd/d2c/d131/f13c 0
2026-03-10T08:56:16.219 INFO:tasks.workunit.client.0.vm05.stdout:8/913: read - d2/db/d1f/d67/d8d/f8f zero size
2026-03-10T08:56:16.224 INFO:tasks.workunit.client.0.vm05.stdout:6/926: symlink d4/l13d 0
2026-03-10T08:56:16.225 INFO:tasks.workunit.client.0.vm05.stdout:7/895: fsync d18/d38/dc7/de3/dc6/fda 0
2026-03-10T08:56:16.226 INFO:tasks.workunit.client.0.vm05.stdout:7/896: chown d18/d38 593 1
2026-03-10T08:56:16.229 INFO:tasks.workunit.client.0.vm05.stdout:5/884: fsync d5/d86/d21/d89/ff2 0
2026-03-10T08:56:16.229 INFO:tasks.workunit.client.0.vm05.stdout:5/885: stat d5/d86/d21/d71/d12c/ca3 0
2026-03-10T08:56:16.230 INFO:tasks.workunit.client.0.vm05.stdout:0/946: creat df/d1f/d85/d2b/d65/f129 x:0 0 0
2026-03-10T08:56:16.234 INFO:tasks.workunit.client.0.vm05.stdout:8/914: mknod d2/dd/d74/d78/c13d 0
2026-03-10T08:56:16.237 INFO:tasks.workunit.client.0.vm05.stdout:4/922: symlink d0/d2e/d42/d45/d4a/d36/dbe/d32/l12a 0
2026-03-10T08:56:16.237 INFO:tasks.workunit.client.0.vm05.stdout:8/915: write d2/db/d28/d99/f12e [209058,115114] 0
2026-03-10T08:56:16.241 INFO:tasks.workunit.client.0.vm05.stdout:9/912: truncate d6/d12/d3a/da2/ffe 1381094 0
2026-03-10T08:56:16.242 INFO:tasks.workunit.client.0.vm05.stdout:9/913: chown d6/d15/d3c/d4b/d90/d93/l9e 318 1
2026-03-10T08:56:16.247 INFO:tasks.workunit.client.0.vm05.stdout:0/947: dread df/d1f/d85/d2b/d65/d6e/d96/f7e [0,4194304] 0
2026-03-10T08:56:16.257 INFO:tasks.workunit.client.0.vm05.stdout:5/886: truncate d5/d86/d39/ffd 316255 0
2026-03-10T08:56:16.259 INFO:tasks.workunit.client.0.vm05.stdout:6/927: mkdir d4/d2c/d13e 0
2026-03-10T08:56:16.259 INFO:tasks.workunit.client.0.vm05.stdout:9/914: creat d6/d15/d3c/d4b/d90/f133 x:0 0 0
2026-03-10T08:56:16.264 INFO:tasks.workunit.client.0.vm05.stdout:0/948: read df/d1f/d85/d2b/d65/d6e/feb [62464,65799] 0
2026-03-10T08:56:16.275 INFO:tasks.workunit.client.0.vm05.stdout:9/915: dread d6/d12/f14 [0,4194304] 0
2026-03-10T08:56:16.281 INFO:tasks.workunit.client.0.vm05.stdout:2/881: link d0/d9/d1e/d20/d21/d45/d4b/fa7 d0/d9/d1e/d20/d21/d8a/f10a 0
2026-03-10T08:56:16.283 INFO:tasks.workunit.client.0.vm05.stdout:8/916: write d2/dd/f26 [5261795,78443] 0
2026-03-10T08:56:16.285 INFO:tasks.workunit.client.0.vm05.stdout:4/923: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/f3e [4194304,4194304] 0
2026-03-10T08:56:16.288 INFO:tasks.workunit.client.0.vm05.stdout:5/887: fsync d5/d86/d24/d2c/d41/fad 0
2026-03-10T08:56:16.294 INFO:tasks.workunit.client.0.vm05.stdout:7/897: getdents d18 0
2026-03-10T08:56:16.297 INFO:tasks.workunit.client.0.vm05.stdout:6/928: rename d4/d8d to d4/d7/d10/d15/d1b/dfc/d13f 0
2026-03-10T08:56:16.297 INFO:tasks.workunit.client.0.vm05.stdout:8/917: mknod d2/db/d28/d100/c13e 0
2026-03-10T08:56:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:15 vm08.local ceph-mon[57559]: Standby manager daemon vm05.rxwgjc started
2026-03-10T08:56:16.324 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:15 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.rxwgjc/crt"}]: dispatch
2026-03-10T08:56:16.328 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:15 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T08:56:16.328 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:15 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.rxwgjc/key"}]: dispatch
2026-03-10T08:56:16.328 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:15 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T08:56:16.329 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:15 vm08.local ceph-mon[57559]: mgrmap e24: vm08.rpongu(active, since 27s), standbys: vm05.rxwgjc
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:6/929: dread d4/d7/d10/d15/f17 [0,4194304] 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:7/898: creat d18/f123 x:0 0 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/918: fsync d2/dd/d2c/d2e/f5a 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:6/930: unlink d4/d7/d10/d1a/d8c/ff9 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/919: rmdir d2/dd/d74 39
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:9/916: getdents d6/d19/d2a/d8d 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/920: creat d2/db/d28/f13f x:0 0 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/921: fdatasync d2/db/d28/d99/fe7 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/922: getdents d2/dd/d2c/d2e/d31/d3e 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/923: creat d2/dd/d2c/d2e/d31/d4f/d7b/d9e/f140 x:0 0 0
2026-03-10T08:56:16.329 INFO:tasks.workunit.client.0.vm05.stdout:8/924: symlink d2/db/d28/l141 0
2026-03-10T08:56:16.338 INFO:tasks.workunit.client.0.vm05.stdout:4/924: dread d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/faa [0,4194304] 0
2026-03-10T08:56:16.338 INFO:tasks.workunit.client.0.vm05.stdout:0/949: sync
2026-03-10T08:56:16.339 INFO:tasks.workunit.client.0.vm05.stdout:0/950: chown df/d1f/dcd/de6/f103 512 1
2026-03-10T08:56:16.346 INFO:tasks.workunit.client.0.vm05.stdout:3/979: write d9/d2b/de7/df1/d43/d6e/f109 [705663,56774] 0
2026-03-10T08:56:16.346 INFO:tasks.workunit.client.0.vm05.stdout:2/882: write d0/f56 [1837730,47277] 0
2026-03-10T08:56:16.347 INFO:tasks.workunit.client.0.vm05.stdout:6/931: dread d4/d7/d10/dc3/f125 [0,4194304] 0
2026-03-10T08:56:16.353 INFO:tasks.workunit.client.0.vm05.stdout:0/951: truncate df/dd8/d90/fc3 26477 0
2026-03-10T08:56:16.357 INFO:tasks.workunit.client.0.vm05.stdout:2/883: truncate d0/d9/d1e/d20/d21/d45/d4b/f97 346696 0
2026-03-10T08:56:16.358 INFO:tasks.workunit.client.0.vm05.stdout:5/888: write d5/df/d37/d68/fe5 [5041050,78521] 0
2026-03-10T08:56:16.358 INFO:tasks.workunit.client.0.vm05.stdout:5/889: chown d5/d86/d24/d129 399647857 1
2026-03-10T08:56:16.362 INFO:tasks.workunit.client.0.vm05.stdout:4/925: truncate d0/d2e/d42/d45/d4a/f47 3342434 0
2026-03-10T08:56:16.364 INFO:tasks.workunit.client.0.vm05.stdout:0/952: truncate df/d1f/d85/fd4 1016302 0
2026-03-10T08:56:16.367 INFO:tasks.workunit.client.0.vm05.stdout:0/953: dwrite df/d1f/d95/fe7 [0,4194304] 0
2026-03-10T08:56:16.372 INFO:tasks.workunit.client.0.vm05.stdout:2/884: symlink d0/d9/d7f/d8f/d7a/dec/l10b 0
2026-03-10T08:56:16.375 INFO:tasks.workunit.client.0.vm05.stdout:5/890: rmdir d5/d86/d39 39
2026-03-10T08:56:16.377 INFO:tasks.workunit.client.0.vm05.stdout:5/891: truncate d5/d86/d24/d2c/d41/d74/da9/f13a 532180 0
2026-03-10T08:56:16.385 INFO:tasks.workunit.client.0.vm05.stdout:9/917: dwrite d6/d19/d21/f10d [0,4194304] 0
2026-03-10T08:56:16.385 INFO:tasks.workunit.client.0.vm05.stdout:0/954: dread df/dd8/d67/d7b/ffa [0,4194304] 0
2026-03-10T08:56:16.388 INFO:tasks.workunit.client.0.vm05.stdout:9/918: truncate d6/d12/d3a/de5/dd4/d113/f12f 465765 0
2026-03-10T08:56:16.404 INFO:tasks.workunit.client.0.vm05.stdout:8/925: dwrite d2/dd/d2c/d2e/d31/d4f/d7b/d9e/fab [4194304,4194304] 0
2026-03-10T08:56:16.418 INFO:tasks.workunit.client.0.vm05.stdout:9/919: symlink d6/d15/d104/l134 0
2026-03-10T08:56:16.423 INFO:tasks.workunit.client.0.vm05.stdout:3/980: dwrite d9/d2b/d2f/f5d [0,4194304] 0
2026-03-10T08:56:16.435 INFO:tasks.workunit.client.0.vm05.stdout:6/932: write d4/d7/f34 [3726640,81725] 0
2026-03-10T08:56:16.440 INFO:tasks.workunit.client.0.vm05.stdout:9/920: truncate d6/d19/d21/fb7 1184311 0
2026-03-10T08:56:16.440 INFO:tasks.workunit.client.0.vm05.stdout:8/926: creat d2/db/da4/f142 x:0 0 0
2026-03-10T08:56:16.441 INFO:tasks.workunit.client.0.vm05.stdout:9/921: dread - d6/d19/d2a/f9b zero size
2026-03-10T08:56:16.447 INFO:tasks.workunit.client.0.vm05.stdout:3/981: dread d9/d2b/d53/f5a [4194304,4194304] 0
2026-03-10T08:56:16.447 INFO:tasks.workunit.client.0.vm05.stdout:3/982: fdatasync d9/d2b/de7/d102/f117 0
2026-03-10T08:56:16.451 INFO:tasks.workunit.client.0.vm05.stdout:7/899: mknod d18/d66/dff/c124 0
2026-03-10T08:56:16.451 INFO:tasks.workunit.client.0.vm05.stdout:4/926: write d0/d2e/d71/fd9 [559449,8898] 0
2026-03-10T08:56:16.453 INFO:tasks.workunit.client.0.vm05.stdout:5/892: creat d5/d86/d24/d84/f13e x:0 0 0
2026-03-10T08:56:16.456 INFO:tasks.workunit.client.0.vm05.stdout:0/955: creat df/d1f/d85/d2b/f12a x:0 0 0
2026-03-10T08:56:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:15 vm05.local ceph-mon[49713]: Standby manager daemon vm05.rxwgjc started
2026-03-10T08:56:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:15 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.rxwgjc/crt"}]: dispatch
2026-03-10T08:56:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:15 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T08:56:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:15 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.rxwgjc/key"}]: dispatch
2026-03-10T08:56:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:15 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T08:56:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:15 vm05.local ceph-mon[49713]: mgrmap e24: vm08.rpongu(active, since 27s), standbys: vm05.rxwgjc
2026-03-10T08:56:16.468 INFO:tasks.workunit.client.0.vm05.stdout:8/927: write d2/db/f22 [5202200,112477] 0
2026-03-10T08:56:16.479 INFO:tasks.workunit.client.0.vm05.stdout:4/927: truncate d0/fb 3108775 0
2026-03-10T08:56:16.479 INFO:tasks.workunit.client.0.vm05.stdout:2/885: link d0/cca d0/d9/d1e/d20/d21/d45/c10c 0
2026-03-10T08:56:16.480 INFO:tasks.workunit.client.0.vm05.stdout:5/893: rename d5/d86/d21/d71/d12c/dcd to d5/d86/d24/d84/df7/d137/dc4/d13f 0
2026-03-10T08:56:16.483 INFO:tasks.workunit.client.0.vm05.stdout:6/933: mkdir d4/d2d/d51/d140 0
2026-03-10T08:56:16.483 INFO:tasks.workunit.client.0.vm05.stdout:0/956: sync
2026-03-10T08:56:16.484 INFO:tasks.workunit.client.0.vm05.stdout:6/934: write d4/d2c/d84/db6/dc6/fc9 [183684,42131] 0
2026-03-10T08:56:16.488 INFO:tasks.workunit.client.0.vm05.stdout:8/928: write d2/dd/d2c/d131/f13c [3542575,108088] 0
2026-03-10T08:56:16.495 INFO:tasks.workunit.client.0.vm05.stdout:7/900: mknod d18/d38/dc7/de3/d53/c125 0
2026-03-10T08:56:16.499 INFO:tasks.workunit.client.0.vm05.stdout:4/928: read d0/d78/fbc [95074,64445] 0
2026-03-10T08:56:16.499 INFO:tasks.workunit.client.0.vm05.stdout:4/929: readlink d0/d2c/d6a/l77 0
2026-03-10T08:56:16.500 INFO:tasks.workunit.client.0.vm05.stdout:4/930: dread - d0/d2e/d42/d45/d4a/d36/dbe/f10b zero size
2026-03-10T08:56:16.500 INFO:tasks.workunit.client.0.vm05.stdout:4/931: fsync d0/f1 0
2026-03-10T08:56:16.501
INFO:tasks.workunit.client.0.vm05.stdout:4/932: dread - d0/d2e/d42/d45/d4a/d36/dbe/f10b zero size 2026-03-10T08:56:16.503 INFO:tasks.workunit.client.0.vm05.stdout:2/886: creat d0/d55/db8/dcc/dd9/dcb/f10d x:0 0 0 2026-03-10T08:56:16.506 INFO:tasks.workunit.client.0.vm05.stdout:5/894: truncate d5/d86/d21/f36 1536817 0 2026-03-10T08:56:16.513 INFO:tasks.workunit.client.0.vm05.stdout:8/929: mknod d2/dd/d2c/d2e/d31/d4f/d80/de2/dea/c143 0 2026-03-10T08:56:16.517 INFO:tasks.workunit.client.0.vm05.stdout:3/983: rmdir d9/d8f/d50/d5f/d12c 0 2026-03-10T08:56:16.523 INFO:tasks.workunit.client.0.vm05.stdout:6/935: dwrite d4/d7/fab [0,4194304] 0 2026-03-10T08:56:16.534 INFO:tasks.workunit.client.0.vm05.stdout:5/895: stat d5/f33 0 2026-03-10T08:56:16.537 INFO:tasks.workunit.client.0.vm05.stdout:9/922: getdents d6/d15/d37/de8 0 2026-03-10T08:56:16.540 INFO:tasks.workunit.client.0.vm05.stdout:3/984: rmdir d9/d2b/de7/df1/d43/d6e 39 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:6/936: fdatasync d4/d2c/d84/fe2 0 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:4/933: mkdir d0/d2e/d12b 0 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:8/930: mknod d2/dd/d2c/d2e/d31/d4f/c144 0 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:8/931: chown d2/dd/d2c/d2e/d31/d3e/ldc 1791761 1 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:8/932: chown d2/dd/d2c/d2e/d31/db4 1240814 1 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:8/933: chown d2/dd/d2c/d2e/d31/d4f/d80 5294326 1 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:8/934: read d2/dd/d2c/f30 [915316,94516] 0 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:8/935: dread - d2/db/d1f/f130 zero size 2026-03-10T08:56:16.551 INFO:tasks.workunit.client.0.vm05.stdout:9/923: rename d6/d12/d3a/d48/fa5 to d6/d12/d3a/da2/f135 0 2026-03-10T08:56:16.552 INFO:tasks.workunit.client.0.vm05.stdout:8/936: write d2/db/d28/d99/f12e 
[943033,901] 0 2026-03-10T08:56:16.552 INFO:tasks.workunit.client.0.vm05.stdout:8/937: write d2/db/d28/fae [395661,107443] 0 2026-03-10T08:56:16.555 INFO:tasks.workunit.client.0.vm05.stdout:9/924: read f4 [10387,109979] 0 2026-03-10T08:56:16.557 INFO:tasks.workunit.client.0.vm05.stdout:7/901: creat d18/d38/dc7/de3/d9c/f126 x:0 0 0 2026-03-10T08:56:16.560 INFO:tasks.workunit.client.0.vm05.stdout:5/896: write d5/df/d37/d68/fb3 [553129,69397] 0 2026-03-10T08:56:16.566 INFO:tasks.workunit.client.0.vm05.stdout:4/934: readlink d0/d2e/d42/d45/d4a/d36/d37/l52 0 2026-03-10T08:56:16.567 INFO:tasks.workunit.client.0.vm05.stdout:2/887: link d0/cca d0/d55/db8/dcc/dd9/dcb/df0/c10e 0 2026-03-10T08:56:16.568 INFO:tasks.workunit.client.0.vm05.stdout:0/957: creat df/d1f/d85/d2b/d27/d32/d4e/f12b x:0 0 0 2026-03-10T08:56:16.570 INFO:tasks.workunit.client.0.vm05.stdout:6/937: write d4/d2c/dc8/f109 [1031009,13309] 0 2026-03-10T08:56:16.571 INFO:tasks.workunit.client.0.vm05.stdout:5/897: dread d5/d86/d24/d2c/d41/d74/fa8 [0,4194304] 0 2026-03-10T08:56:16.571 INFO:tasks.workunit.client.0.vm05.stdout:5/898: chown d5/df/d37/d68/d12d/fdc 20 1 2026-03-10T08:56:16.579 INFO:tasks.workunit.client.0.vm05.stdout:3/985: unlink d9/d4d/c84 0 2026-03-10T08:56:16.582 INFO:tasks.workunit.client.0.vm05.stdout:3/986: dread d9/d2b/d2f/f5d [0,4194304] 0 2026-03-10T08:56:16.587 INFO:tasks.workunit.client.0.vm05.stdout:3/987: dwrite d9/d4d/d51/f11c [0,4194304] 0 2026-03-10T08:56:16.595 INFO:tasks.workunit.client.0.vm05.stdout:9/925: rmdir d6/d12 39 2026-03-10T08:56:16.595 INFO:tasks.workunit.client.0.vm05.stdout:4/935: creat d0/d2c/f12c x:0 0 0 2026-03-10T08:56:16.600 INFO:tasks.workunit.client.0.vm05.stdout:4/936: read d0/d2c/d6a/fd8 [1637,59584] 0 2026-03-10T08:56:16.612 INFO:tasks.workunit.client.0.vm05.stdout:8/938: dread - d2/dd/d74/f125 zero size 2026-03-10T08:56:16.613 INFO:tasks.workunit.client.0.vm05.stdout:9/926: symlink d6/d15/d3c/d4b/d82/l136 0 2026-03-10T08:56:16.613 
INFO:tasks.workunit.client.0.vm05.stdout:9/927: stat d6/d19/d2a/d8d/ld2 0 2026-03-10T08:56:16.614 INFO:tasks.workunit.client.0.vm05.stdout:9/928: fsync d6/d19/d2a/f87 0 2026-03-10T08:56:16.614 INFO:tasks.workunit.client.0.vm05.stdout:9/929: dread - d6/d15/f11c zero size 2026-03-10T08:56:16.619 INFO:tasks.workunit.client.0.vm05.stdout:9/930: dread - d6/df6/f116 zero size 2026-03-10T08:56:16.619 INFO:tasks.workunit.client.0.vm05.stdout:9/931: stat d6/d19/d21/f10d 0 2026-03-10T08:56:16.619 INFO:tasks.workunit.client.0.vm05.stdout:9/932: dwrite d6/d15/d104/f111 [0,4194304] 0 2026-03-10T08:56:16.628 INFO:tasks.workunit.client.0.vm05.stdout:0/958: mkdir df/d1f/d85/d19/d47/d10e/d12c 0 2026-03-10T08:56:16.629 INFO:tasks.workunit.client.0.vm05.stdout:2/888: write d0/d9/f19 [2546637,102188] 0 2026-03-10T08:56:16.630 INFO:tasks.workunit.client.0.vm05.stdout:2/889: chown d0/d9/d1e/d20/d21/d45/d4b/d75/cf7 1048 1 2026-03-10T08:56:16.631 INFO:tasks.workunit.client.0.vm05.stdout:2/890: write d0/d9/d1e/d20/d24/ff6 [646223,106174] 0 2026-03-10T08:56:16.634 INFO:tasks.workunit.client.0.vm05.stdout:4/937: creat d0/d2c/d6a/dc9/f12d x:0 0 0 2026-03-10T08:56:16.637 INFO:tasks.workunit.client.0.vm05.stdout:6/938: dwrite d4/d2d/f2f [0,4194304] 0 2026-03-10T08:56:16.646 INFO:tasks.workunit.client.0.vm05.stdout:9/933: symlink d6/d15/d3c/d4b/d90/d93/l137 0 2026-03-10T08:56:16.661 INFO:tasks.workunit.client.0.vm05.stdout:2/891: unlink d0/c107 0 2026-03-10T08:56:16.667 INFO:tasks.workunit.client.0.vm05.stdout:6/939: mkdir d4/d7/d141 0 2026-03-10T08:56:16.681 INFO:tasks.workunit.client.0.vm05.stdout:4/938: mkdir d0/d2e/d71/d12e 0 2026-03-10T08:56:16.687 INFO:tasks.workunit.client.0.vm05.stdout:9/934: getdents d6/d15/d37/de8 0 2026-03-10T08:56:16.691 INFO:tasks.workunit.client.0.vm05.stdout:2/892: creat d0/d9/d89/f10f x:0 0 0 2026-03-10T08:56:16.692 INFO:tasks.workunit.client.0.vm05.stdout:7/902: write d18/d38/d43/d5c/f5f [1530317,35809] 0 2026-03-10T08:56:16.696 
INFO:tasks.workunit.client.0.vm05.stdout:7/903: dwrite d18/d38/fca [0,4194304] 0 2026-03-10T08:56:16.700 INFO:tasks.workunit.client.0.vm05.stdout:5/899: write d5/d86/d39/f77 [4566856,57247] 0 2026-03-10T08:56:16.700 INFO:tasks.workunit.client.0.vm05.stdout:8/939: write d2/dd/d74/f125 [571086,27011] 0 2026-03-10T08:56:16.701 INFO:tasks.workunit.client.0.vm05.stdout:3/988: link d9/c25 d9/d8f/d50/d5f/dd8/c12d 0 2026-03-10T08:56:16.701 INFO:tasks.workunit.client.0.vm05.stdout:0/959: write f5 [3929404,16500] 0 2026-03-10T08:56:16.703 INFO:tasks.workunit.client.0.vm05.stdout:0/960: fdatasync df/d1f/d85/d19/d39/d4d/fe3 0 2026-03-10T08:56:16.703 INFO:tasks.workunit.client.0.vm05.stdout:0/961: chown df/d1f/dcd/de6 6 1 2026-03-10T08:56:16.708 INFO:tasks.workunit.client.0.vm05.stdout:4/939: rename d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/f121 to d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dd7/f12f 0 2026-03-10T08:56:16.708 INFO:tasks.workunit.client.0.vm05.stdout:6/940: creat d4/d2c/d12d/f142 x:0 0 0 2026-03-10T08:56:16.709 INFO:tasks.workunit.client.0.vm05.stdout:9/935: symlink d6/d15/d37/de8/l138 0 2026-03-10T08:56:16.710 INFO:tasks.workunit.client.0.vm05.stdout:9/936: chown d6/d15/d3c/d4b/d90/d93 5728 1 2026-03-10T08:56:16.715 INFO:tasks.workunit.client.0.vm05.stdout:5/900: mknod d5/df/d37/d68/c140 0 2026-03-10T08:56:16.728 INFO:tasks.workunit.client.0.vm05.stdout:9/937: chown d6/d19/ff5 30 1 2026-03-10T08:56:16.731 INFO:tasks.workunit.client.0.vm05.stdout:2/893: symlink d0/d9/d1e/l110 0 2026-03-10T08:56:16.735 INFO:tasks.workunit.client.0.vm05.stdout:5/901: creat d5/df/d37/d68/db6/f141 x:0 0 0 2026-03-10T08:56:16.742 INFO:tasks.workunit.client.0.vm05.stdout:6/941: dread d4/d7/d10/d15/f2a [0,4194304] 0 2026-03-10T08:56:16.742 INFO:tasks.workunit.client.0.vm05.stdout:3/989: mkdir d9/d8f/d50/d5f/dd8/dec/d123/d86/d10e/d12e 0 2026-03-10T08:56:16.744 INFO:tasks.workunit.client.0.vm05.stdout:3/990: truncate d9/d2b/de7/df1/dd6/d107/f113 731893 0 2026-03-10T08:56:16.745 
INFO:tasks.workunit.client.0.vm05.stdout:8/940: dread d2/db/f19 [4194304,4194304] 0 2026-03-10T08:56:16.752 INFO:tasks.workunit.client.0.vm05.stdout:9/938: unlink d6/d27/d10a/cfb 0 2026-03-10T08:56:16.753 INFO:tasks.workunit.client.0.vm05.stdout:9/939: chown d6/cbf 78 1 2026-03-10T08:56:16.756 INFO:tasks.workunit.client.0.vm05.stdout:0/962: dwrite df/d1f/d85/fb5 [4194304,4194304] 0 2026-03-10T08:56:16.757 INFO:tasks.workunit.client.0.vm05.stdout:0/963: chown df/dd8/fce 1673 1 2026-03-10T08:56:16.757 INFO:tasks.workunit.client.0.vm05.stdout:0/964: stat df/d1f/d85/d2b/f127 0 2026-03-10T08:56:16.765 INFO:tasks.workunit.client.0.vm05.stdout:7/904: mkdir d18/d1b/df0/d127 0 2026-03-10T08:56:16.765 INFO:tasks.workunit.client.0.vm05.stdout:4/940: write d0/d2e/d42/d45/d4a/d36/dbe/d32/f72 [3772968,78627] 0 2026-03-10T08:56:16.770 INFO:tasks.workunit.client.0.vm05.stdout:6/942: creat d4/d2c/d84/f143 x:0 0 0 2026-03-10T08:56:16.779 INFO:tasks.workunit.client.0.vm05.stdout:3/991: dread d9/d4d/d51/faa [0,4194304] 0 2026-03-10T08:56:16.784 INFO:tasks.workunit.client.0.vm05.stdout:9/940: dwrite d6/d12/d3a/fdc [0,4194304] 0 2026-03-10T08:56:16.800 INFO:tasks.workunit.client.0.vm05.stdout:7/905: rename d18/d38/d43/laa to d18/d66/d25/d2e/d2f/d6d/dc1/dd4/de9/l128 0 2026-03-10T08:56:16.805 INFO:tasks.workunit.client.0.vm05.stdout:4/941: read - d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/fa7 zero size 2026-03-10T08:56:16.812 INFO:tasks.workunit.client.0.vm05.stdout:6/943: truncate d4/d7/d10/d15/d1b/fcd 201024 0 2026-03-10T08:56:16.812 INFO:tasks.workunit.client.0.vm05.stdout:5/902: dwrite d5/d86/d39/fd3 [0,4194304] 0 2026-03-10T08:56:16.815 INFO:tasks.workunit.client.0.vm05.stdout:5/903: chown d5/df/l1d 809 1 2026-03-10T08:56:16.818 INFO:tasks.workunit.client.0.vm05.stdout:0/965: mkdir df/d1f/d85/d2b/d27/dff/d12d 0 2026-03-10T08:56:16.822 INFO:tasks.workunit.client.0.vm05.stdout:4/942: rmdir d0/dfe 39 2026-03-10T08:56:16.823 INFO:tasks.workunit.client.0.vm05.stdout:4/943: readlink 
d0/d1d/l9a 0 2026-03-10T08:56:16.825 INFO:tasks.workunit.client.0.vm05.stdout:2/894: link d0/d9/d7f/d8f/d7e/l99 d0/d9/d7f/l111 0 2026-03-10T08:56:16.827 INFO:tasks.workunit.client.0.vm05.stdout:8/941: link d2/db/d47/l8b d2/dd/d2c/d2e/l145 0 2026-03-10T08:56:16.836 INFO:tasks.workunit.client.0.vm05.stdout:0/966: dread df/d1f/d85/d19/d55/fa9 [0,4194304] 0 2026-03-10T08:56:16.836 INFO:tasks.workunit.client.0.vm05.stdout:3/992: mkdir d9/d8f/d50/d5f/dd8/dd9/d12b/d12f 0 2026-03-10T08:56:16.836 INFO:tasks.workunit.client.0.vm05.stdout:3/993: truncate d9/d2b/de7/df1/ff7 862911 0 2026-03-10T08:56:16.836 INFO:tasks.workunit.client.0.vm05.stdout:6/944: rmdir d4/d7/d10/d15/d20 39 2026-03-10T08:56:16.842 INFO:tasks.workunit.client.0.vm05.stdout:8/942: creat d2/dd/d74/f146 x:0 0 0 2026-03-10T08:56:16.842 INFO:tasks.workunit.client.0.vm05.stdout:0/967: creat df/d59/f12e x:0 0 0 2026-03-10T08:56:16.844 INFO:tasks.workunit.client.0.vm05.stdout:9/941: sync 2026-03-10T08:56:16.848 INFO:tasks.workunit.client.0.vm05.stdout:5/904: truncate d5/d86/d39/ffd 86148 0 2026-03-10T08:56:16.854 INFO:tasks.workunit.client.0.vm05.stdout:4/944: dread d0/d2e/d42/d45/d4a/f47 [0,4194304] 0 2026-03-10T08:56:16.854 INFO:tasks.workunit.client.0.vm05.stdout:3/994: creat d9/d8f/d50/d5f/dd8/dec/d123/d86/d10e/d12e/f130 x:0 0 0 2026-03-10T08:56:16.856 INFO:tasks.workunit.client.0.vm05.stdout:2/895: dread d0/f4 [0,4194304] 0 2026-03-10T08:56:16.863 INFO:tasks.workunit.client.0.vm05.stdout:4/945: creat d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dd7/f130 x:0 0 0 2026-03-10T08:56:16.866 INFO:tasks.workunit.client.0.vm05.stdout:6/945: rename d4/d2c/d84/d4a/c95 to d4/d7/d10/c144 0 2026-03-10T08:56:16.869 INFO:tasks.workunit.client.0.vm05.stdout:6/946: dwrite d4/d2c/dc8/f133 [0,4194304] 0 2026-03-10T08:56:16.879 INFO:tasks.workunit.client.0.vm05.stdout:2/896: mknod d0/d9/d1e/c112 0 2026-03-10T08:56:16.883 INFO:tasks.workunit.client.0.vm05.stdout:4/946: rmdir d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b 39 
2026-03-10T08:56:16.889 INFO:tasks.workunit.client.0.vm05.stdout:6/947: rmdir d4/d7/d10/d1a 39 2026-03-10T08:56:16.896 INFO:tasks.workunit.client.0.vm05.stdout:2/897: unlink d0/d9/d7f/d8f/f54 0 2026-03-10T08:56:16.901 INFO:tasks.workunit.client.0.vm05.stdout:5/905: getdents d5/d86/d24 0 2026-03-10T08:56:16.903 INFO:tasks.workunit.client.0.vm05.stdout:6/948: fdatasync d4/fc 0 2026-03-10T08:56:16.905 INFO:tasks.workunit.client.0.vm05.stdout:7/906: truncate d18/d38/dc7/de3/d9c/fce 214653 0 2026-03-10T08:56:16.908 INFO:tasks.workunit.client.0.vm05.stdout:2/898: sync 2026-03-10T08:56:16.910 INFO:tasks.workunit.client.0.vm05.stdout:4/947: link d0/d2e/dca/df3/f11d d0/d2e/d42/f131 0 2026-03-10T08:56:16.918 INFO:tasks.workunit.client.0.vm05.stdout:9/942: write d6/d19/d2a/d4a/fd1 [859329,26540] 0 2026-03-10T08:56:16.919 INFO:tasks.workunit.client.0.vm05.stdout:0/968: dwrite df/dd8/d67/d7b/ffa [4194304,4194304] 0 2026-03-10T08:56:16.920 INFO:tasks.workunit.client.0.vm05.stdout:5/906: dread d5/df/d37/dd2/f94 [0,4194304] 0 2026-03-10T08:56:16.924 INFO:tasks.workunit.client.0.vm05.stdout:3/995: write d9/d2b/f40 [4334982,21242] 0 2026-03-10T08:56:16.927 INFO:tasks.workunit.client.0.vm05.stdout:6/949: stat d4/d7/d10/d1a/f25 0 2026-03-10T08:56:16.934 INFO:tasks.workunit.client.0.vm05.stdout:8/943: rename d2/db/d28/d99/f11d to d2/dd/d2c/d2e/d31/d3e/f147 0 2026-03-10T08:56:16.940 INFO:tasks.workunit.client.0.vm05.stdout:9/943: dwrite d6/d12/d3a/da2/f135 [0,4194304] 0 2026-03-10T08:56:17.008 INFO:tasks.workunit.client.0.vm05.stdout:6/950: symlink d4/d2d/d51/d62/d11d/l145 0 2026-03-10T08:56:17.008 INFO:tasks.workunit.client.0.vm05.stdout:7/907: symlink d18/d1b/l129 0 2026-03-10T08:56:17.022 INFO:tasks.workunit.client.0.vm05.stdout:5/907: mknod d5/d86/d24/d2c/c142 0 2026-03-10T08:56:17.033 INFO:tasks.workunit.client.0.vm05.stdout:7/908: symlink d18/d38/d43/d5c/daf/l12a 0 2026-03-10T08:56:17.033 INFO:tasks.workunit.client.0.vm05.stdout:7/909: chown d18/d38/d43/d6e/f9b 5542 1 
2026-03-10T08:56:17.035 INFO:tasks.workunit.client.0.vm05.stdout:8/944: write d2/dd/d2c/d2e/d31/d3e/d5d/d9d/f133 [761969,60357] 0 2026-03-10T08:56:17.041 INFO:tasks.workunit.client.0.vm05.stdout:2/899: link d0/f36 d0/d55/db8/f113 0 2026-03-10T08:56:17.051 INFO:tasks.workunit.client.0.vm05.stdout:4/948: fsync d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/f70 0 2026-03-10T08:56:17.054 INFO:tasks.workunit.client.0.vm05.stdout:0/969: rmdir df/d1f/d85/d2b/d27/dff/d12d 0 2026-03-10T08:56:17.057 INFO:tasks.workunit.client.0.vm05.stdout:3/996: creat d9/d2b/f131 x:0 0 0 2026-03-10T08:56:17.065 INFO:tasks.workunit.client.0.vm05.stdout:9/944: write d6/d12/d3a/d9c/fb6 [5520698,37233] 0 2026-03-10T08:56:17.078 INFO:tasks.workunit.client.0.vm05.stdout:2/900: truncate d0/d9/d1e/d20/d21/d45/d4b/f9c 1720362 0 2026-03-10T08:56:17.085 INFO:tasks.workunit.client.0.vm05.stdout:9/945: sync 2026-03-10T08:56:17.092 INFO:tasks.workunit.client.0.vm05.stdout:6/951: truncate d4/d2c/dc8/f109 331665 0 2026-03-10T08:56:17.098 INFO:tasks.workunit.client.0.vm05.stdout:6/952: dwrite d4/d7/f34 [0,4194304] 0 2026-03-10T08:56:17.100 INFO:tasks.workunit.client.0.vm05.stdout:8/945: dwrite d2/dd/d2c/d2e/d31/d3e/d5d/d9d/fdd [0,4194304] 0 2026-03-10T08:56:17.102 INFO:tasks.workunit.client.0.vm05.stdout:6/953: readlink d4/d7/d10/d15/d1b/d22/le3 0 2026-03-10T08:56:17.113 INFO:tasks.workunit.client.0.vm05.stdout:2/901: symlink d0/d9/d1e/d20/d21/d45/d4b/d75/l114 0 2026-03-10T08:56:17.114 INFO:tasks.workunit.client.0.vm05.stdout:4/949: dwrite d0/d2e/d42/d45/d4a/d36/d37/d114/d129/f51 [0,4194304] 0 2026-03-10T08:56:17.122 INFO:tasks.workunit.client.0.vm05.stdout:9/946: readlink d6/l10 0 2026-03-10T08:56:17.123 INFO:tasks.workunit.client.0.vm05.stdout:9/947: truncate d6/d12/d3a/de5/dd4/d113/f12f 735614 0 2026-03-10T08:56:17.123 INFO:tasks.workunit.client.0.vm05.stdout:9/948: dread - d6/d12/d3a/fdd zero size 2026-03-10T08:56:17.124 INFO:tasks.workunit.client.0.vm05.stdout:9/949: chown d6/d12/d3a/de5/f47 2811 1 
2026-03-10T08:56:17.141 INFO:tasks.workunit.client.0.vm05.stdout:6/954: symlink d4/d7/d10/dc3/l146 0 2026-03-10T08:56:17.146 INFO:tasks.workunit.client.0.vm05.stdout:4/950: rmdir d0/d2e/d71 39 2026-03-10T08:56:17.160 INFO:tasks.workunit.client.0.vm05.stdout:4/951: truncate d0/d2e/dca/f11b 699505 0 2026-03-10T08:56:17.162 INFO:tasks.workunit.client.0.vm05.stdout:0/970: link df/d1f/d85/d19/d55/l92 df/d1f/d85/d19/d47/d84/dae/db3/l12f 0 2026-03-10T08:56:17.162 INFO:tasks.workunit.client.0.vm05.stdout:8/946: link d2/dd/d2c/d2e/d31/d3e/d5d/d9d/fdd d2/dd/d2c/d2e/d31/d4f/d80/f148 0 2026-03-10T08:56:17.162 INFO:tasks.workunit.client.0.vm05.stdout:5/908: getdents d5/d86/d21 0 2026-03-10T08:56:17.162 INFO:tasks.workunit.client.0.vm05.stdout:5/909: chown d5/d86/d39 29 1 2026-03-10T08:56:17.162 INFO:tasks.workunit.client.0.vm05.stdout:7/910: rename d18/d1b/c20 to d18/d66/c12b 0 2026-03-10T08:56:17.167 INFO:tasks.workunit.client.0.vm05.stdout:0/971: dread df/d1f/d85/d19/d5b/f78 [0,4194304] 0 2026-03-10T08:56:17.170 INFO:tasks.workunit.client.0.vm05.stdout:5/910: symlink d5/df/d37/d68/d12d/l143 0 2026-03-10T08:56:17.171 INFO:tasks.workunit.client.0.vm05.stdout:5/911: chown d5/d86/c29 1454647281 1 2026-03-10T08:56:17.172 INFO:tasks.workunit.client.0.vm05.stdout:2/902: rename d0/d55/db8/dcc/dd9/lf9 to d0/d55/db8/dcc/dd9/dcb/l115 0 2026-03-10T08:56:17.172 INFO:tasks.workunit.client.0.vm05.stdout:2/903: chown d0/d9/lfb 3368 1 2026-03-10T08:56:17.173 INFO:tasks.workunit.client.0.vm05.stdout:2/904: chown d0/d9/d1e/d20/d21/d45/d4b 20317 1 2026-03-10T08:56:17.175 INFO:tasks.workunit.client.0.vm05.stdout:2/905: dread d0/f4 [0,4194304] 0 2026-03-10T08:56:17.186 INFO:tasks.workunit.client.0.vm05.stdout:3/997: write d9/d8f/d50/d5f/dd8/dec/d123/ff6 [307726,94420] 0 2026-03-10T08:56:17.186 INFO:tasks.workunit.client.0.vm05.stdout:3/998: dread - d9/d8f/fdc zero size 2026-03-10T08:56:17.188 INFO:tasks.workunit.client.0.vm05.stdout:9/950: write d6/d15/d3c/d4b/f76 [1367367,77636] 0 
2026-03-10T08:56:17.192 INFO:tasks.workunit.client.0.vm05.stdout:4/952: write d0/d2e/f108 [595346,53243] 0 2026-03-10T08:56:17.200 INFO:tasks.workunit.client.0.vm05.stdout:7/911: dwrite d18/d66/d25/d2e/d2f/fd3 [0,4194304] 0 2026-03-10T08:56:17.202 INFO:tasks.workunit.client.0.vm05.stdout:0/972: write df/f1a [322334,11291] 0 2026-03-10T08:56:17.211 INFO:tasks.workunit.client.0.vm05.stdout:6/955: creat d4/d2c/f147 x:0 0 0 2026-03-10T08:56:17.214 INFO:tasks.workunit.client.0.vm05.stdout:2/906: fsync d0/d55/dde/dac/faf 0 2026-03-10T08:56:17.219 INFO:tasks.workunit.client.0.vm05.stdout:2/907: dwrite d0/d9/d89/f10f [0,4194304] 0 2026-03-10T08:56:17.234 INFO:tasks.workunit.client.0.vm05.stdout:0/973: mkdir df/d1f/d85/d2b/d27/d32/d4e/d87/d130 0 2026-03-10T08:56:17.235 INFO:tasks.workunit.client.0.vm05.stdout:0/974: read df/d1f/d85/d19/d47/d84/d8a/f93 [908569,63986] 0 2026-03-10T08:56:17.248 INFO:tasks.workunit.client.0.vm05.stdout:6/956: rename d4/d2c/d84/db6/f139 to d4/d2d/d51/d62/f148 0 2026-03-10T08:56:17.248 INFO:tasks.workunit.client.0.vm05.stdout:6/957: fsync d4/d7/d10/d15/fc5 0 2026-03-10T08:56:17.248 INFO:tasks.workunit.client.0.vm05.stdout:5/912: creat d5/df/dbb/d108/d13b/f144 x:0 0 0 2026-03-10T08:56:17.248 INFO:tasks.workunit.client.0.vm05.stdout:2/908: creat d0/d55/dd4/f116 x:0 0 0 2026-03-10T08:56:17.248 INFO:tasks.workunit.client.0.vm05.stdout:2/909: chown d0/d9/d7f/d8f/d7a 30244301 1 2026-03-10T08:56:17.251 INFO:tasks.workunit.client.0.vm05.stdout:8/947: getdents d2/dd/d2c/d2e/d31/d3e/d5d/d9d 0 2026-03-10T08:56:17.262 INFO:tasks.workunit.client.0.vm05.stdout:2/910: fsync d0/d9/f12 0 2026-03-10T08:56:17.270 INFO:tasks.workunit.client.0.vm05.stdout:2/911: dread d0/d9/d1e/d20/d21/f3d [0,4194304] 0 2026-03-10T08:56:17.272 INFO:tasks.workunit.client.0.vm05.stdout:4/953: creat d0/d2e/d42/d45/d4a/d36/d37/f132 x:0 0 0 2026-03-10T08:56:17.275 INFO:tasks.workunit.client.0.vm05.stdout:0/975: mkdir df/d1f/d85/d2b/d27/d32/d4e/d87/d130/d131 0 2026-03-10T08:56:17.276 
INFO:tasks.workunit.client.0.vm05.stdout:0/976: stat df/d1f/d85 0 2026-03-10T08:56:17.277 INFO:tasks.workunit.client.0.vm05.stdout:6/958: mknod d4/c149 0 2026-03-10T08:56:17.282 INFO:tasks.workunit.client.0.vm05.stdout:3/999: link d9/d2b/de7/ff2 d9/d4d/f132 0 2026-03-10T08:56:17.284 INFO:tasks.workunit.client.0.vm05.stdout:7/912: dwrite d18/d38/d43/d6e/f76 [0,4194304] 0 2026-03-10T08:56:17.286 INFO:tasks.workunit.client.0.vm05.stdout:7/913: write d18/d38/d43/ff8 [1139974,69984] 0 2026-03-10T08:56:17.289 INFO:tasks.workunit.client.0.vm05.stdout:5/913: write d5/df/d37/dd2/d76/fb5 [1033756,91853] 0 2026-03-10T08:56:17.297 INFO:tasks.workunit.client.0.vm05.stdout:9/951: getdents d6/d15/d3c/d4b 0 2026-03-10T08:56:17.299 INFO:tasks.workunit.client.0.vm05.stdout:4/954: rename d0/d2e/f108 to d0/d2e/d12b/f133 0 2026-03-10T08:56:17.312 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:17 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:17.312 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:17 vm08.local ceph-mon[57559]: pgmap v17: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 42 MiB/s rd, 102 MiB/s wr, 287 op/s 2026-03-10T08:56:17.313 INFO:tasks.workunit.client.0.vm05.stdout:8/948: truncate d2/dd/d2c/d2e/d31/d3e/f73 523838 0 2026-03-10T08:56:17.313 INFO:tasks.workunit.client.0.vm05.stdout:4/955: write d0/d2e/d42/d45/d4a/d36/dbe/f119 [8750606,28197] 0 2026-03-10T08:56:17.313 INFO:tasks.workunit.client.0.vm05.stdout:8/949: write d2/dd/d2c/d2e/d31/d4f/d80/de2/f136 [3630107,96224] 0 2026-03-10T08:56:17.313 INFO:tasks.workunit.client.0.vm05.stdout:4/956: dwrite d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/fc4 [0,4194304] 0 2026-03-10T08:56:17.313 INFO:tasks.workunit.client.0.vm05.stdout:6/959: unlink d4/d7/f14 0 2026-03-10T08:56:17.314 INFO:tasks.workunit.client.0.vm05.stdout:0/977: dread 
df/d1f/d85/d2b/f3b [0,4194304] 0 2026-03-10T08:56:17.318 INFO:tasks.workunit.client.0.vm05.stdout:9/952: fdatasync d6/d15/d3c/d4b/d82/f117 0 2026-03-10T08:56:17.319 INFO:tasks.workunit.client.0.vm05.stdout:9/953: dread - d6/d19/d2a/d8d/fed zero size 2026-03-10T08:56:17.323 INFO:tasks.workunit.client.0.vm05.stdout:7/914: rename d18/d38/dc7/de3/d53/lbf to d18/d38/d43/d5c/daf/l12c 0 2026-03-10T08:56:17.328 INFO:tasks.workunit.client.0.vm05.stdout:4/957: truncate d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/f60 1309908 0 2026-03-10T08:56:17.336 INFO:tasks.workunit.client.0.vm05.stdout:4/958: chown d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/lce 87 1 2026-03-10T08:56:17.336 INFO:tasks.workunit.client.0.vm05.stdout:5/914: fsync d5/fc0 0 2026-03-10T08:56:17.340 INFO:tasks.workunit.client.0.vm05.stdout:8/950: mkdir d2/dd/d149 0 2026-03-10T08:56:17.343 INFO:tasks.workunit.client.0.vm05.stdout:2/912: write d0/d9/d7f/d8f/f63 [402510,55590] 0 2026-03-10T08:56:17.345 INFO:tasks.workunit.client.0.vm05.stdout:2/913: dread d0/d9/fd6 [0,4194304] 0 2026-03-10T08:56:17.357 INFO:tasks.workunit.client.0.vm05.stdout:6/960: creat d4/d7/d141/f14a x:0 0 0 2026-03-10T08:56:17.361 INFO:tasks.workunit.client.0.vm05.stdout:6/961: dwrite d4/d2d/d51/d62/d113/d11b/fb3 [4194304,4194304] 0 2026-03-10T08:56:17.372 INFO:tasks.workunit.client.0.vm05.stdout:4/959: mkdir d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/d134 0 2026-03-10T08:56:17.375 INFO:tasks.workunit.client.0.vm05.stdout:5/915: mknod d5/d86/d24/d84/df7/d137/c145 0 2026-03-10T08:56:17.377 INFO:tasks.workunit.client.0.vm05.stdout:7/915: fsync d18/f112 0 2026-03-10T08:56:17.381 INFO:tasks.workunit.client.0.vm05.stdout:4/960: dread d0/d2e/d42/d45/d4a/d36/dbe/f119 [0,4194304] 0 2026-03-10T08:56:17.388 INFO:tasks.workunit.client.0.vm05.stdout:2/914: truncate d0/d9/d7f/d8f/d7e/fa4 1460795 0 2026-03-10T08:56:17.388 INFO:tasks.workunit.client.0.vm05.stdout:0/978: mknod df/d1f/d85/d2b/d65/c132 0 2026-03-10T08:56:17.398 
INFO:tasks.workunit.client.0.vm05.stdout:4/961: unlink d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/cab 0 2026-03-10T08:56:17.400 INFO:tasks.workunit.client.0.vm05.stdout:2/915: rename d0/d9/d1e/d20/d21/d45 to d0/d55/dd4/d117 0 2026-03-10T08:56:17.411 INFO:tasks.workunit.client.0.vm05.stdout:6/962: dwrite d4/d2c/dc8/ffd [0,4194304] 0 2026-03-10T08:56:17.424 INFO:tasks.workunit.client.0.vm05.stdout:0/979: symlink df/d1f/d85/d19/d47/d10e/l133 0 2026-03-10T08:56:17.425 INFO:tasks.workunit.client.0.vm05.stdout:0/980: chown df/d1f/d95/c10c 2668583 1 2026-03-10T08:56:17.426 INFO:tasks.workunit.client.0.vm05.stdout:0/981: stat df/d1f/d85/d2b/d27/d32/d4e/d87 0 2026-03-10T08:56:17.426 INFO:tasks.workunit.client.0.vm05.stdout:8/951: truncate d2/dd/d74/f125 532215 0 2026-03-10T08:56:17.427 INFO:tasks.workunit.client.0.vm05.stdout:0/982: dread - df/d1f/d85/d19/d39/f42 zero size 2026-03-10T08:56:17.427 INFO:tasks.workunit.client.0.vm05.stdout:0/983: stat df/d1f/d85/d19/d47/d84/dae 0 2026-03-10T08:56:17.428 INFO:tasks.workunit.client.0.vm05.stdout:0/984: chown df/d1f/d95/c10c 359 1 2026-03-10T08:56:17.429 INFO:tasks.workunit.client.0.vm05.stdout:5/916: creat d5/d86/d24/d129/f146 x:0 0 0 2026-03-10T08:56:17.430 INFO:tasks.workunit.client.0.vm05.stdout:5/917: dread - d5/df/dbb/d108/f10a zero size 2026-03-10T08:56:17.431 INFO:tasks.workunit.client.0.vm05.stdout:5/918: write d5/d86/d24/f136 [376164,56053] 0 2026-03-10T08:56:17.437 INFO:tasks.workunit.client.0.vm05.stdout:9/954: link d6/d15/d35/c45 d6/d19/d2c/d84/d118/c139 0 2026-03-10T08:56:17.442 INFO:tasks.workunit.client.0.vm05.stdout:5/919: dread d5/d86/d24/fc2 [0,4194304] 0 2026-03-10T08:56:17.456 INFO:tasks.workunit.client.0.vm05.stdout:7/916: truncate d18/d38/d43/d6e/f76 3380795 0 2026-03-10T08:56:17.460 INFO:tasks.workunit.client.0.vm05.stdout:6/963: rmdir d4/d2d/d51/d62 39 2026-03-10T08:56:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:17 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' 
entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:17 vm05.local ceph-mon[49713]: pgmap v17: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 42 MiB/s rd, 102 MiB/s wr, 287 op/s 2026-03-10T08:56:17.462 INFO:tasks.workunit.client.0.vm05.stdout:8/952: mkdir d2/dd/d74/d14a 0 2026-03-10T08:56:17.463 INFO:tasks.workunit.client.0.vm05.stdout:0/985: creat df/d1f/d85/d19/d55/f134 x:0 0 0 2026-03-10T08:56:17.466 INFO:tasks.workunit.client.0.vm05.stdout:9/955: truncate d6/d15/fc1 374832 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:6/964: mkdir d4/d7/d10/d15/d1b/dfc/d13f/d14b 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:8/953: creat d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/f14b x:0 0 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:9/956: mknod d6/d27/c13a 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:6/965: creat d4/d2d/d51/d62/da9/f14c x:0 0 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:8/954: readlink d2/lc1 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:9/957: symlink d6/d15/d3c/d4b/d82/de9/l13b 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:0/986: getdents df/de8 0 2026-03-10T08:56:17.485 INFO:tasks.workunit.client.0.vm05.stdout:0/987: stat df/dd8/d67/d7b/l81 0 2026-03-10T08:56:17.487 INFO:tasks.workunit.client.0.vm05.stdout:5/920: sync 2026-03-10T08:56:17.488 INFO:tasks.workunit.client.0.vm05.stdout:6/966: creat d4/d2d/d5f/f14d x:0 0 0 2026-03-10T08:56:17.500 INFO:tasks.workunit.client.0.vm05.stdout:2/916: truncate d0/d9/d1e/d20/f71 3308475 0 2026-03-10T08:56:17.500 INFO:tasks.workunit.client.0.vm05.stdout:7/917: write d18/d38/dc7/de3/d9c/dac/ff6 [395960,69321] 0 2026-03-10T08:56:17.500 INFO:tasks.workunit.client.0.vm05.stdout:4/962: write d0/d2e/d71/d7c/f10e 
[598877,11501] 0 2026-03-10T08:56:17.501 INFO:tasks.workunit.client.0.vm05.stdout:2/917: chown d0/cca 206753 1 2026-03-10T08:56:17.507 INFO:tasks.workunit.client.0.vm05.stdout:2/918: dwrite d0/d9/d1e/d20/d24/ff6 [0,4194304] 0 2026-03-10T08:56:17.509 INFO:tasks.workunit.client.0.vm05.stdout:8/955: write d2/dd/feb [739693,35397] 0 2026-03-10T08:56:17.511 INFO:tasks.workunit.client.0.vm05.stdout:6/967: creat d4/d2c/d84/d4a/dd5/f14e x:0 0 0 2026-03-10T08:56:17.515 INFO:tasks.workunit.client.0.vm05.stdout:4/963: mknod d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d115/c135 0 2026-03-10T08:56:17.518 INFO:tasks.workunit.client.0.vm05.stdout:9/958: dwrite d6/d15/d3c/d4b/d82/f114 [0,4194304] 0 2026-03-10T08:56:17.519 INFO:tasks.workunit.client.0.vm05.stdout:9/959: write d6/df6/f121 [1007391,128527] 0 2026-03-10T08:56:17.525 INFO:tasks.workunit.client.0.vm05.stdout:7/918: truncate d18/d66/d25/d2e/f48 1943982 0 2026-03-10T08:56:17.527 INFO:tasks.workunit.client.0.vm05.stdout:0/988: write df/d1f/d85/d2b/d65/d6e/feb [4885669,126188] 0 2026-03-10T08:56:17.528 INFO:tasks.workunit.client.0.vm05.stdout:0/989: write df/d1f/d85/d2b/d27/d32/f10f [353490,14633] 0 2026-03-10T08:56:17.531 INFO:tasks.workunit.client.0.vm05.stdout:2/919: unlink d0/d9/f1d 0 2026-03-10T08:56:17.531 INFO:tasks.workunit.client.0.vm05.stdout:5/921: mknod d5/d86/d24/d84/c147 0 2026-03-10T08:56:17.535 INFO:tasks.workunit.client.0.vm05.stdout:4/964: mknod d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/d107/c136 0 2026-03-10T08:56:17.536 INFO:tasks.workunit.client.0.vm05.stdout:9/960: mkdir d6/d12/d3a/da2/d13c 0 2026-03-10T08:56:17.541 INFO:tasks.workunit.client.0.vm05.stdout:0/990: unlink df/d1f/d85/d2b/d27/d32/d4e/d87/f8d 0 2026-03-10T08:56:17.543 INFO:tasks.workunit.client.0.vm05.stdout:0/991: fsync df/d1f/d85/d19/d47/d84/dae/fc9 0 2026-03-10T08:56:17.543 INFO:tasks.workunit.client.0.vm05.stdout:5/922: truncate d5/df/dbb/fd0 4042519 0 2026-03-10T08:56:17.544 INFO:tasks.workunit.client.0.vm05.stdout:2/920: rename 
d0/d9/d7f/d8f/d6d/f7b to d0/d9/d7f/d8f/d7e/f118 0 2026-03-10T08:56:17.546 INFO:tasks.workunit.client.0.vm05.stdout:9/961: chown d6/d27/cc2 582597 1 2026-03-10T08:56:17.548 INFO:tasks.workunit.client.0.vm05.stdout:8/956: dread d2/db/f1b [0,4194304] 0 2026-03-10T08:56:17.548 INFO:tasks.workunit.client.0.vm05.stdout:5/923: mknod d5/d86/d21/c148 0 2026-03-10T08:56:17.550 INFO:tasks.workunit.client.0.vm05.stdout:6/968: creat d4/d7/d10/d1a/f14f x:0 0 0 2026-03-10T08:56:17.551 INFO:tasks.workunit.client.0.vm05.stdout:4/965: mkdir d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dae/df5/d137 0 2026-03-10T08:56:17.559 INFO:tasks.workunit.client.0.vm05.stdout:5/924: dread d5/df/d37/d68/fb3 [0,4194304] 0 2026-03-10T08:56:17.559 INFO:tasks.workunit.client.0.vm05.stdout:6/969: creat d4/d7/d10/d111/f150 x:0 0 0 2026-03-10T08:56:17.559 INFO:tasks.workunit.client.0.vm05.stdout:4/966: mknod d0/d2e/d71/d7c/c138 0 2026-03-10T08:56:17.559 INFO:tasks.workunit.client.0.vm05.stdout:2/921: rename d0/d55/dd4/d117/d4b/d70 to d0/d9/d1e/d20/d24/d119 0 2026-03-10T08:56:17.559 INFO:tasks.workunit.client.0.vm05.stdout:9/962: creat d6/d12/d3a/f13d x:0 0 0 2026-03-10T08:56:17.562 INFO:tasks.workunit.client.0.vm05.stdout:2/922: readlink d0/d55/dd4/d117/d4b/le0 0 2026-03-10T08:56:17.563 INFO:tasks.workunit.client.0.vm05.stdout:6/970: write d4/d7/d10/dc3/f125 [1645610,9238] 0 2026-03-10T08:56:17.563 INFO:tasks.workunit.client.0.vm05.stdout:8/957: dread d2/dd/d2c/d2e/f7d [0,4194304] 0 2026-03-10T08:56:17.569 INFO:tasks.workunit.client.0.vm05.stdout:0/992: sync 2026-03-10T08:56:17.569 INFO:tasks.workunit.client.0.vm05.stdout:4/967: sync 2026-03-10T08:56:17.569 INFO:tasks.workunit.client.0.vm05.stdout:4/968: fdatasync d0/d2c/f2f 0 2026-03-10T08:56:17.571 INFO:tasks.workunit.client.0.vm05.stdout:9/963: rename d6/d19/d2a/d8d/ld2 to d6/d19/d21/d132/l13e 0 2026-03-10T08:56:17.575 INFO:tasks.workunit.client.0.vm05.stdout:9/964: sync 2026-03-10T08:56:17.586 INFO:tasks.workunit.client.0.vm05.stdout:7/919: dwrite 
d18/d38/d43/d5c/fa1 [0,4194304] 0 2026-03-10T08:56:17.587 INFO:tasks.workunit.client.0.vm05.stdout:5/925: getdents d5/d86/d24/d2c/d41/d74/da9 0 2026-03-10T08:56:17.600 INFO:tasks.workunit.client.0.vm05.stdout:2/923: dread d0/d55/dd4/d117/d4b/d75/fc9 [0,4194304] 0 2026-03-10T08:56:17.601 INFO:tasks.workunit.client.0.vm05.stdout:4/969: fdatasync d0/d2e/d42/d45/d4a/d36/f88 0 2026-03-10T08:56:17.603 INFO:tasks.workunit.client.0.vm05.stdout:6/971: getdents d4/d7/d10/d15/d1b/d124 0 2026-03-10T08:56:17.603 INFO:tasks.workunit.client.0.vm05.stdout:6/972: write d4/d2c/f147 [649135,59012] 0 2026-03-10T08:56:17.608 INFO:tasks.workunit.client.0.vm05.stdout:7/920: creat d18/d66/d25/d2e/d2f/d6d/f12d x:0 0 0 2026-03-10T08:56:17.612 INFO:tasks.workunit.client.0.vm05.stdout:0/993: creat df/d1f/dcd/f135 x:0 0 0 2026-03-10T08:56:17.612 INFO:tasks.workunit.client.0.vm05.stdout:6/973: read d4/d2c/d84/d4a/f76 [3247207,32311] 0 2026-03-10T08:56:17.613 INFO:tasks.workunit.client.0.vm05.stdout:6/974: fdatasync d4/d2c/d84/d4a/dd5/f14e 0 2026-03-10T08:56:17.614 INFO:tasks.workunit.client.0.vm05.stdout:7/921: creat d18/d66/d25/d2e/de7/f12e x:0 0 0 2026-03-10T08:56:17.616 INFO:tasks.workunit.client.0.vm05.stdout:0/994: fdatasync df/dd8/d67/f106 0 2026-03-10T08:56:17.617 INFO:tasks.workunit.client.0.vm05.stdout:6/975: dwrite d4/d2c/d84/d4a/f12e [0,4194304] 0 2026-03-10T08:56:17.635 INFO:tasks.workunit.client.0.vm05.stdout:2/924: rename d0/d9/d1e/l6a to d0/l11a 0 2026-03-10T08:56:17.636 INFO:tasks.workunit.client.0.vm05.stdout:8/958: truncate d2/dd/d2c/d2e/d31/d3e/d5d/d9d/fdd 167847 0 2026-03-10T08:56:17.638 INFO:tasks.workunit.client.0.vm05.stdout:8/959: read d2/dd/d2c/f30 [195334,67702] 0 2026-03-10T08:56:17.640 INFO:tasks.workunit.client.0.vm05.stdout:7/922: fsync d18/d38/dc7/de3/d9c/dac/f35 0 2026-03-10T08:56:17.646 INFO:tasks.workunit.client.0.vm05.stdout:5/926: getdents d5/d86/d21/d71 0 2026-03-10T08:56:17.648 INFO:tasks.workunit.client.0.vm05.stdout:2/925: dread - d0/d9/d89/fba zero size 
2026-03-10T08:56:17.652 INFO:tasks.workunit.client.0.vm05.stdout:8/960: truncate d2/db/d1f/f53 2789281 0 2026-03-10T08:56:17.658 INFO:tasks.workunit.client.0.vm05.stdout:9/965: dwrite d6/f4e [0,4194304] 0 2026-03-10T08:56:17.661 INFO:tasks.workunit.client.0.vm05.stdout:5/927: mkdir d5/d86/d24/d84/df7/d149 0 2026-03-10T08:56:17.664 INFO:tasks.workunit.client.0.vm05.stdout:0/995: dwrite df/d1f/d85/d19/d39/f6f [0,4194304] 0 2026-03-10T08:56:17.665 INFO:tasks.workunit.client.0.vm05.stdout:8/961: sync 2026-03-10T08:56:17.666 INFO:tasks.workunit.client.0.vm05.stdout:0/996: readlink df/dd8/d67/lc5 0 2026-03-10T08:56:17.667 INFO:tasks.workunit.client.0.vm05.stdout:6/976: dwrite d4/d2d/d5f/f11a [0,4194304] 0 2026-03-10T08:56:17.686 INFO:tasks.workunit.client.0.vm05.stdout:4/970: rename d0/d1d/ce7 to d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/d66/d79/c139 0 2026-03-10T08:56:17.688 INFO:tasks.workunit.client.0.vm05.stdout:2/926: symlink d0/d9/d7f/l11b 0 2026-03-10T08:56:17.694 INFO:tasks.workunit.client.0.vm05.stdout:5/928: creat d5/d86/d21/d71/f14a x:0 0 0 2026-03-10T08:56:17.695 INFO:tasks.workunit.client.0.vm05.stdout:5/929: stat d5/df/dbb/d108/f10a 0 2026-03-10T08:56:17.696 INFO:tasks.workunit.client.0.vm05.stdout:6/977: readlink d4/d2d/d5f/l6e 0 2026-03-10T08:56:17.703 INFO:tasks.workunit.client.0.vm05.stdout:6/978: read d4/d2d/d51/d87/da5/fe5 [1152717,39496] 0 2026-03-10T08:56:17.704 INFO:tasks.workunit.client.0.vm05.stdout:8/962: mkdir d2/dd/d2c/d14c 0 2026-03-10T08:56:17.705 INFO:tasks.workunit.client.0.vm05.stdout:8/963: chown d2/dd/d2c/d2e/d31/d4f/da3/ffe 4240822 1 2026-03-10T08:56:17.706 INFO:tasks.workunit.client.0.vm05.stdout:8/964: fsync d2/db/d28/d99/df3/f104 0 2026-03-10T08:56:17.708 INFO:tasks.workunit.client.0.vm05.stdout:7/923: rename d18/d66/d25/d2e/de7/l11a to d18/d38/l12f 0 2026-03-10T08:56:17.709 INFO:tasks.workunit.client.0.vm05.stdout:2/927: creat d0/d9/f11c x:0 0 0 2026-03-10T08:56:17.712 INFO:tasks.workunit.client.0.vm05.stdout:8/965: mkdir 
d2/dd/d2c/d2e/d31/d3e/dde/d63/d14d 0 2026-03-10T08:56:17.712 INFO:tasks.workunit.client.0.vm05.stdout:8/966: chown d2/db/f19 4632 1 2026-03-10T08:56:17.714 INFO:tasks.workunit.client.0.vm05.stdout:2/928: creat d0/d9/d7f/d8f/d6d/f11d x:0 0 0 2026-03-10T08:56:17.714 INFO:tasks.workunit.client.0.vm05.stdout:5/930: link d5/d86/f1a d5/df/d37/f14b 0 2026-03-10T08:56:17.715 INFO:tasks.workunit.client.0.vm05.stdout:8/967: write d2/dd/d2c/d2e/d31/d3e/f95 [1659893,48041] 0 2026-03-10T08:56:17.717 INFO:tasks.workunit.client.0.vm05.stdout:6/979: creat d4/d2d/d51/f151 x:0 0 0 2026-03-10T08:56:17.720 INFO:tasks.workunit.client.0.vm05.stdout:2/929: fsync d0/d55/db8/f113 0 2026-03-10T08:56:17.722 INFO:tasks.workunit.client.0.vm05.stdout:6/980: mknod d4/d7/d10/d1a/c152 0 2026-03-10T08:56:17.722 INFO:tasks.workunit.client.0.vm05.stdout:6/981: dread - d4/d7/d10/d1a/f14f zero size 2026-03-10T08:56:17.723 INFO:tasks.workunit.client.0.vm05.stdout:6/982: fdatasync d4/d7/d10/d1a/d8c/f10e 0 2026-03-10T08:56:17.724 INFO:tasks.workunit.client.0.vm05.stdout:2/930: read d0/d9/d1e/d20/d24/f29 [2151809,2235] 0 2026-03-10T08:56:17.724 INFO:tasks.workunit.client.0.vm05.stdout:6/983: truncate d4/d7/d10/d1a/f14f 654589 0 2026-03-10T08:56:17.727 INFO:tasks.workunit.client.0.vm05.stdout:6/984: dread d4/d2d/d5f/f11a [0,4194304] 0 2026-03-10T08:56:17.728 INFO:tasks.workunit.client.0.vm05.stdout:6/985: dread - d4/d2d/d51/d62/f148 zero size 2026-03-10T08:56:17.728 INFO:tasks.workunit.client.0.vm05.stdout:2/931: mkdir d0/d9/d1e/d20/d21/d8a/d11e 0 2026-03-10T08:56:17.729 INFO:tasks.workunit.client.0.vm05.stdout:6/986: stat d4/d2d/d51/d62/da9/fe4 0 2026-03-10T08:56:17.730 INFO:tasks.workunit.client.0.vm05.stdout:6/987: symlink d4/d2c/d84/db6/dc6/d128/l153 0 2026-03-10T08:56:17.731 INFO:tasks.workunit.client.0.vm05.stdout:2/932: mkdir d0/d55/db8/d105/d11f 0 2026-03-10T08:56:17.732 INFO:tasks.workunit.client.0.vm05.stdout:6/988: creat d4/d2c/d13e/f154 x:0 0 0 2026-03-10T08:56:17.733 
INFO:tasks.workunit.client.0.vm05.stdout:2/933: truncate d0/d9/d1e/d20/f71 2456390 0 2026-03-10T08:56:17.735 INFO:tasks.workunit.client.0.vm05.stdout:8/968: sync 2026-03-10T08:56:17.736 INFO:tasks.workunit.client.0.vm05.stdout:6/989: unlink d4/d2c/d84/db6/dc6/d128/f12a 0 2026-03-10T08:56:17.737 INFO:tasks.workunit.client.0.vm05.stdout:6/990: dread - d4/d2d/d51/d87/da5/de9/f118 zero size 2026-03-10T08:56:17.738 INFO:tasks.workunit.client.0.vm05.stdout:8/969: readlink d2/dd/d2c/d2e/l145 0 2026-03-10T08:56:17.741 INFO:tasks.workunit.client.0.vm05.stdout:6/991: stat d4/d92/db0/f115 0 2026-03-10T08:56:17.741 INFO:tasks.workunit.client.0.vm05.stdout:8/970: mknod d2/dd/d2c/d2e/d31/d4f/d7b/c14e 0 2026-03-10T08:56:17.741 INFO:tasks.workunit.client.0.vm05.stdout:6/992: truncate d4/d2d/d51/d62/da9/fe4 4143543 0 2026-03-10T08:56:17.752 INFO:tasks.workunit.client.0.vm05.stdout:4/971: write d0/d2e/d71/d7c/fe6 [482769,99871] 0 2026-03-10T08:56:17.752 INFO:tasks.workunit.client.0.vm05.stdout:9/966: dwrite d6/d15/f25 [0,4194304] 0 2026-03-10T08:56:17.754 INFO:tasks.workunit.client.0.vm05.stdout:0/997: truncate df/d59/f3f 3838614 0 2026-03-10T08:56:17.765 INFO:tasks.workunit.client.0.vm05.stdout:7/924: write d18/d66/d25/d2e/f49 [949157,9480] 0 2026-03-10T08:56:17.769 INFO:tasks.workunit.client.0.vm05.stdout:6/993: rename d4/d7/d10/d15/f2a to d4/d2c/d84/d4a/dd5/f155 0 2026-03-10T08:56:17.769 INFO:tasks.workunit.client.0.vm05.stdout:6/994: dread - d4/d7/d10/d15/d1b/f13c zero size 2026-03-10T08:56:17.772 INFO:tasks.workunit.client.0.vm05.stdout:5/931: truncate d5/df/d37/dd2/d76/fb5 4079044 0 2026-03-10T08:56:17.786 INFO:tasks.workunit.client.0.vm05.stdout:5/932: fdatasync d5/d86/d24/d2c/d41/d74/fb1 0 2026-03-10T08:56:17.789 INFO:tasks.workunit.client.0.vm05.stdout:4/972: creat d0/dfe/de2/f13a x:0 0 0 2026-03-10T08:56:17.793 INFO:tasks.workunit.client.0.vm05.stdout:2/934: dwrite d0/d9/d1e/f59 [0,4194304] 0 2026-03-10T08:56:17.796 INFO:tasks.workunit.client.0.vm05.stdout:5/933: rmdir 
d5/df/d37/d68/d12d 39 2026-03-10T08:56:17.802 INFO:tasks.workunit.client.0.vm05.stdout:4/973: rename d0/d2c/d6a/l77 to d0/dfe/l13b 0 2026-03-10T08:56:17.807 INFO:tasks.workunit.client.0.vm05.stdout:2/935: creat d0/d9/d7f/d8f/f120 x:0 0 0 2026-03-10T08:56:17.808 INFO:tasks.workunit.client.0.vm05.stdout:2/936: dread - d0/d55/db8/dcc/dd9/dcb/f10d zero size 2026-03-10T08:56:17.808 INFO:tasks.workunit.client.0.vm05.stdout:2/937: write d0/d9/d7f/d8f/f120 [874294,61611] 0 2026-03-10T08:56:17.817 INFO:tasks.workunit.client.0.vm05.stdout:6/995: rmdir d4/d2c/d84/d4a 39 2026-03-10T08:56:17.820 INFO:tasks.workunit.client.0.vm05.stdout:6/996: dwrite d4/d2c/dc8/f133 [0,4194304] 0 2026-03-10T08:56:17.829 INFO:tasks.workunit.client.0.vm05.stdout:8/971: truncate d2/db/d47/f51 1302360 0 2026-03-10T08:56:17.829 INFO:tasks.workunit.client.0.vm05.stdout:6/997: readlink d4/d7/d10/d1a/d1f/l60 0 2026-03-10T08:56:17.831 INFO:tasks.workunit.client.0.vm05.stdout:2/938: sync 2026-03-10T08:56:17.846 INFO:tasks.workunit.client.0.vm05.stdout:9/967: dwrite d6/d12/d3a/f5e [0,4194304] 0 2026-03-10T08:56:17.847 INFO:tasks.workunit.client.0.vm05.stdout:6/998: truncate d4/d7/d10/d15/d1b/d22/f56 1080255 0 2026-03-10T08:56:17.848 INFO:tasks.workunit.client.0.vm05.stdout:0/998: write df/d1f/d85/d19/d39/f42 [309887,102557] 0 2026-03-10T08:56:17.861 INFO:tasks.workunit.client.0.vm05.stdout:2/939: mkdir d0/d55/db8/dcc/d121 0 2026-03-10T08:56:17.862 INFO:tasks.workunit.client.0.vm05.stdout:7/925: write d18/d66/d25/d2e/de7/f85 [4388970,92264] 0 2026-03-10T08:56:17.868 INFO:tasks.workunit.client.0.vm05.stdout:2/940: dread d0/d55/f60 [0,4194304] 0 2026-03-10T08:56:17.878 INFO:tasks.workunit.client.0.vm05.stdout:0/999: truncate df/f12 1682167 0 2026-03-10T08:56:17.878 INFO:tasks.workunit.client.0.vm05.stdout:7/926: truncate d18/d1b/f50 454574 0 2026-03-10T08:56:17.884 INFO:tasks.workunit.client.0.vm05.stdout:7/927: dread d18/f112 [0,4194304] 0 2026-03-10T08:56:17.884 
INFO:tasks.workunit.client.0.vm05.stdout:4/974: rename d0/d2e/c35 to d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/c13c 0 2026-03-10T08:56:17.886 INFO:tasks.workunit.client.0.vm05.stdout:6/999: dread d4/d2d/d51/f10b [0,4194304] 0 2026-03-10T08:56:17.890 INFO:tasks.workunit.client.0.vm05.stdout:5/934: write d5/d86/d24/f51 [601708,104723] 0 2026-03-10T08:56:17.906 INFO:tasks.workunit.client.0.vm05.stdout:8/972: dwrite d2/dd/d2c/d2e/d31/d3e/d5d/d9d/fdd [0,4194304] 0 2026-03-10T08:56:17.925 INFO:tasks.workunit.client.0.vm05.stdout:2/941: write d0/d9/d7f/d8f/f7d [5040855,49881] 0 2026-03-10T08:56:17.935 INFO:tasks.workunit.client.0.vm05.stdout:9/968: write d6/d12/db2/fba [880392,75534] 0 2026-03-10T08:56:17.935 INFO:tasks.workunit.client.0.vm05.stdout:9/969: read - d6/d15/d3c/d4b/d82/f105 zero size 2026-03-10T08:56:17.950 INFO:tasks.workunit.client.0.vm05.stdout:8/973: dread d2/dd/d2c/d2e/d31/f111 [0,4194304] 0 2026-03-10T08:56:17.962 INFO:tasks.workunit.client.0.vm05.stdout:7/928: write d18/d66/f70 [2734492,124836] 0 2026-03-10T08:56:17.963 INFO:tasks.workunit.client.0.vm05.stdout:7/929: chown d18/d38/dc7/de3/d9c/de8/f114 133 1 2026-03-10T08:56:17.969 INFO:tasks.workunit.client.0.vm05.stdout:4/975: truncate d0/d2e/d42/d45/d4a/d36/dbe/ffb 3199329 0 2026-03-10T08:56:18.026 INFO:tasks.workunit.client.0.vm05.stdout:5/935: link d5/d86/d24/d2c/lb7 d5/df/d37/dc8/d100/l14c 0 2026-03-10T08:56:18.028 INFO:tasks.workunit.client.0.vm05.stdout:2/942: creat d0/d9/d1e/d20/d21/d8a/d11e/f122 x:0 0 0 2026-03-10T08:56:18.032 INFO:tasks.workunit.client.0.vm05.stdout:5/936: truncate d5/d86/d24/d2c/fd8 890440 0 2026-03-10T08:56:18.034 INFO:tasks.workunit.client.0.vm05.stdout:2/943: creat d0/d55/dd4/def/df2/f123 x:0 0 0 2026-03-10T08:56:18.035 INFO:tasks.workunit.client.0.vm05.stdout:7/930: mknod d18/d38/dc7/de3/d74/deb/c130 0 2026-03-10T08:56:18.041 INFO:tasks.workunit.client.0.vm05.stdout:5/937: creat d5/d86/d24/d84/df7/d137/d95/dac/f14d x:0 0 0 2026-03-10T08:56:18.042 
INFO:tasks.workunit.client.0.vm05.stdout:2/944: dread - d0/d9/d1e/d20/d21/fe6 zero size 2026-03-10T08:56:18.050 INFO:tasks.workunit.client.0.vm05.stdout:9/970: getdents d6/d15 0 2026-03-10T08:56:18.053 INFO:tasks.workunit.client.0.vm05.stdout:9/971: dwrite d6/d12/d3a/d9c/fb6 [0,4194304] 0 2026-03-10T08:56:18.072 INFO:tasks.workunit.client.0.vm05.stdout:8/974: rename d2/db/l32 to d2/dd/d2c/d2e/l14f 0 2026-03-10T08:56:18.076 INFO:tasks.workunit.client.0.vm05.stdout:2/945: mkdir d0/d55/dde/d124 0 2026-03-10T08:56:18.081 INFO:tasks.workunit.client.0.vm05.stdout:4/976: getdents d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dd7/ddc/de4 0 2026-03-10T08:56:18.086 INFO:tasks.workunit.client.0.vm05.stdout:9/972: readlink d6/d12/d3a/d48/l81 0 2026-03-10T08:56:18.087 INFO:tasks.workunit.client.0.vm05.stdout:7/931: write d18/d66/d78/fa6 [548715,22580] 0 2026-03-10T08:56:18.090 INFO:tasks.workunit.client.0.vm05.stdout:8/975: unlink d2/db/d1f/d67/f94 0 2026-03-10T08:56:18.093 INFO:tasks.workunit.client.0.vm05.stdout:5/938: rename d5/d86/d24/d84/df7/d137/dc4/f115 to d5/dcf/f14e 0 2026-03-10T08:56:18.093 INFO:tasks.workunit.client.0.vm05.stdout:5/939: stat d5/d86/d24/d84/df7/d137/f83 0 2026-03-10T08:56:18.094 INFO:tasks.workunit.client.0.vm05.stdout:5/940: write d5/df/d37/d68/fe5 [3932224,62610] 0 2026-03-10T08:56:18.098 INFO:tasks.workunit.client.0.vm05.stdout:4/977: symlink d0/d2e/d42/d45/d4a/d36/dbe/d49/d58/l13d 0 2026-03-10T08:56:18.101 INFO:tasks.workunit.client.0.vm05.stdout:9/973: rmdir d6/d15/d3c/d4b/d82 39 2026-03-10T08:56:18.102 INFO:tasks.workunit.client.0.vm05.stdout:7/932: read d18/f1d [1349,27125] 0 2026-03-10T08:56:18.106 INFO:tasks.workunit.client.0.vm05.stdout:5/941: creat d5/d86/d24/d2c/d41/d74/f14f x:0 0 0 2026-03-10T08:56:18.107 INFO:tasks.workunit.client.0.vm05.stdout:4/978: mknod d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dae/dee/c13e 0 2026-03-10T08:56:18.109 INFO:tasks.workunit.client.0.vm05.stdout:9/974: chown d6/d15/d35/fc0 43867 1 2026-03-10T08:56:18.109 
INFO:tasks.workunit.client.0.vm05.stdout:9/975: read d6/d15/d35/fc0 [919276,12737] 0 2026-03-10T08:56:18.110 INFO:tasks.workunit.client.0.vm05.stdout:7/933: rmdir d18/d66/d25/de5 39 2026-03-10T08:56:18.113 INFO:tasks.workunit.client.0.vm05.stdout:8/976: mkdir d2/dd/d2c/d2e/d31/db4/d150 0 2026-03-10T08:56:18.113 INFO:tasks.workunit.client.0.vm05.stdout:2/946: link d0/d9/d1e/d20/d21/d8a/d92/ld8 d0/d55/dd4/d117/d4b/d75/l125 0 2026-03-10T08:56:18.114 INFO:tasks.workunit.client.0.vm05.stdout:2/947: dread - d0/d55/db8/f88 zero size 2026-03-10T08:56:18.117 INFO:tasks.workunit.client.0.vm05.stdout:9/976: creat d6/d12/d3a/da2/f13f x:0 0 0 2026-03-10T08:56:18.118 INFO:tasks.workunit.client.0.vm05.stdout:7/934: chown d18/d38/dc7/de3/d9c/dfd/d11c/c109 267521911 1 2026-03-10T08:56:18.120 INFO:tasks.workunit.client.0.vm05.stdout:8/977: creat d2/db/d28/d100/f151 x:0 0 0 2026-03-10T08:56:18.125 INFO:tasks.workunit.client.0.vm05.stdout:9/977: read d6/d12/f34 [476546,87954] 0 2026-03-10T08:56:18.125 INFO:tasks.workunit.client.0.vm05.stdout:7/935: creat d18/d66/d25/d2e/d2f/d6d/dc1/f131 x:0 0 0 2026-03-10T08:56:18.127 INFO:tasks.workunit.client.0.vm05.stdout:2/948: mkdir d0/d9/d1e/d126 0 2026-03-10T08:56:18.128 INFO:tasks.workunit.client.0.vm05.stdout:8/978: symlink d2/db/d28/d99/l152 0 2026-03-10T08:56:18.132 INFO:tasks.workunit.client.0.vm05.stdout:2/949: dwrite d0/d9/d7f/d8f/d7a/fa1 [0,4194304] 0 2026-03-10T08:56:18.142 INFO:tasks.workunit.client.0.vm05.stdout:5/942: creat d5/df/d37/f150 x:0 0 0 2026-03-10T08:56:18.142 INFO:tasks.workunit.client.0.vm05.stdout:5/943: readlink d5/d86/d24/lf0 0 2026-03-10T08:56:18.148 INFO:tasks.workunit.client.0.vm05.stdout:4/979: truncate d0/d1d/f89 357694 0 2026-03-10T08:56:18.151 INFO:tasks.workunit.client.0.vm05.stdout:8/979: read d2/dd/d2c/d2e/d31/d3e/dde/d63/f6c [2572897,11643] 0 2026-03-10T08:56:18.151 INFO:tasks.workunit.client.0.vm05.stdout:8/980: stat d2/dd/d2c/d2e/d31/d3e/dde/l52 0 2026-03-10T08:56:18.158 
INFO:tasks.workunit.client.0.vm05.stdout:8/981: symlink d2/db/d28/d99/l153 0 2026-03-10T08:56:18.160 INFO:tasks.workunit.client.0.vm05.stdout:5/944: mkdir d5/d86/d21/d71/d12c/dfe/d151 0 2026-03-10T08:56:18.161 INFO:tasks.workunit.client.0.vm05.stdout:5/945: chown d5/d86/d24/d84/df7/d137/cd6 105938 1 2026-03-10T08:56:18.163 INFO:tasks.workunit.client.0.vm05.stdout:7/936: link d18/d38/dc7/de3/d74/ce4 d18/d38/dc7/de3/d9c/dfd/d11c/c132 0 2026-03-10T08:56:18.164 INFO:tasks.workunit.client.0.vm05.stdout:4/980: link d0/d2e/d71/d7c/f10e d0/d78/f13f 0 2026-03-10T08:56:18.165 INFO:tasks.workunit.client.0.vm05.stdout:8/982: fsync d2/db/d47/fd1 0 2026-03-10T08:56:18.168 INFO:tasks.workunit.client.0.vm05.stdout:2/950: link d0/d9/d1e/d20/fc5 d0/d9/d7f/d8f/d7a/f127 0 2026-03-10T08:56:18.171 INFO:tasks.workunit.client.0.vm05.stdout:7/937: creat d18/d1b/df0/f133 x:0 0 0 2026-03-10T08:56:18.177 INFO:tasks.workunit.client.0.vm05.stdout:4/981: fdatasync d0/d2e/d42/d45/fcc 0 2026-03-10T08:56:18.177 INFO:tasks.workunit.client.0.vm05.stdout:4/982: chown d0/d2e/d42/f131 31753953 1 2026-03-10T08:56:18.183 INFO:tasks.workunit.client.0.vm05.stdout:9/978: dwrite d6/d15/f4f [0,4194304] 0 2026-03-10T08:56:18.185 INFO:tasks.workunit.client.0.vm05.stdout:9/979: dread - d6/d12/d3a/da2/f13f zero size 2026-03-10T08:56:18.189 INFO:tasks.workunit.client.0.vm05.stdout:5/946: dread d5/d86/d24/d2c/d41/f87 [0,4194304] 0 2026-03-10T08:56:18.194 INFO:tasks.workunit.client.0.vm05.stdout:7/938: mknod d18/d66/d25/d2e/d2f/d6d/dc1/dd4/de9/c134 0 2026-03-10T08:56:18.201 INFO:tasks.workunit.client.0.vm05.stdout:2/951: unlink d0/d9/d1e/d20/d24/fb3 0 2026-03-10T08:56:18.202 INFO:tasks.workunit.client.0.vm05.stdout:2/952: chown d0/d9/d1e/d20/f8b 153 1 2026-03-10T08:56:18.210 INFO:tasks.workunit.client.0.vm05.stdout:8/983: link d2/ce8 d2/dd/d149/c154 0 2026-03-10T08:56:18.212 INFO:tasks.workunit.client.0.vm05.stdout:2/953: creat d0/d9/d7f/db4/f128 x:0 0 0 2026-03-10T08:56:18.213 
INFO:tasks.workunit.client.0.vm05.stdout:2/954: truncate d0/d55/dd4/def/df2/f123 287129 0 2026-03-10T08:56:18.217 INFO:tasks.workunit.client.0.vm05.stdout:7/939: creat d18/d66/d25/d2e/dd9/d111/d113/f135 x:0 0 0 2026-03-10T08:56:18.252 INFO:tasks.workunit.client.0.vm05.stdout:7/940: dread - d18/d66/d25/d2e/d2f/fd8 zero size 2026-03-10T08:56:18.252 INFO:tasks.workunit.client.0.vm05.stdout:7/941: write d18/d66/d25/d2e/de7/f12e [536473,87294] 0 2026-03-10T08:56:18.252 INFO:tasks.workunit.client.0.vm05.stdout:2/955: truncate d0/d9/d1e/d20/d21/f41 6085173 0 2026-03-10T08:56:18.252 INFO:tasks.workunit.client.0.vm05.stdout:7/942: chown d18/d66/l96 4703 1 2026-03-10T08:56:18.252 INFO:tasks.workunit.client.0.vm05.stdout:8/984: creat d2/dd/d2c/f155 x:0 0 0 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:7/943: fsync f15 0 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:8/985: dread - d2/dd/d2c/da5/f110 zero size 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:7/944: symlink d18/d66/d25/de5/l136 0 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:8/986: dwrite d2/dd/d2c/d2e/d31/d3e/dde/d63/fc8 [0,4194304] 0 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:7/945: symlink d18/d38/dc7/de3/d9c/de1/l137 0 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:8/987: dwrite d2/dd/d2c/d2e/d31/d3e/d5d/d9d/f133 [0,4194304] 0 2026-03-10T08:56:18.253 INFO:tasks.workunit.client.0.vm05.stdout:2/956: read d0/f16 [1479332,44304] 0 2026-03-10T08:56:18.255 INFO:tasks.workunit.client.0.vm05.stdout:2/957: creat d0/d55/db8/f129 x:0 0 0 2026-03-10T08:56:18.261 INFO:tasks.workunit.client.0.vm05.stdout:2/958: symlink d0/d55/dd4/def/df2/l12a 0 2026-03-10T08:56:18.262 INFO:tasks.workunit.client.0.vm05.stdout:2/959: link d0/d9/d1e/d20/d24/d119/f98 d0/d9/d1e/d20/d24/d119/f12b 0 2026-03-10T08:56:18.262 INFO:tasks.workunit.client.0.vm05.stdout:2/960: link d0/d55/da2/fa5 d0/d9/d1e/d20/d24/f12c 0 
2026-03-10T08:56:18.263 INFO:tasks.workunit.client.0.vm05.stdout:2/961: creat d0/d55/d9f/f12d x:0 0 0 2026-03-10T08:56:18.269 INFO:tasks.workunit.client.0.vm05.stdout:2/962: creat d0/d9/d1e/d20/d24/f12e x:0 0 0 2026-03-10T08:56:18.269 INFO:tasks.workunit.client.0.vm05.stdout:2/963: mkdir d0/d9/d12f 0 2026-03-10T08:56:18.269 INFO:tasks.workunit.client.0.vm05.stdout:2/964: unlink d0/d55/db8/dcc/dd9/f103 0 2026-03-10T08:56:18.271 INFO:tasks.workunit.client.0.vm05.stdout:2/965: rename d0/d9/d7f/d8f/d7a/cbd to d0/d9/d1e/d20/d21/d8a/c130 0 2026-03-10T08:56:18.272 INFO:tasks.workunit.client.0.vm05.stdout:2/966: dread - d0/d9/d7f/d8f/d6d/f9a zero size 2026-03-10T08:56:18.273 INFO:tasks.workunit.client.0.vm05.stdout:2/967: mknod d0/d9/d89/c131 0 2026-03-10T08:56:18.274 INFO:tasks.workunit.client.0.vm05.stdout:9/980: sync 2026-03-10T08:56:18.277 INFO:tasks.workunit.client.0.vm05.stdout:2/968: creat d0/d55/dde/dac/f132 x:0 0 0 2026-03-10T08:56:18.279 INFO:tasks.workunit.client.0.vm05.stdout:9/981: creat d6/d12/d3a/de5/dd4/d113/f140 x:0 0 0 2026-03-10T08:56:18.280 INFO:tasks.workunit.client.0.vm05.stdout:2/969: mkdir d0/d9/d7f/d133 0 2026-03-10T08:56:18.282 INFO:tasks.workunit.client.0.vm05.stdout:2/970: creat d0/d9/d7f/d8f/d6d/dda/f134 x:0 0 0 2026-03-10T08:56:18.283 INFO:tasks.workunit.client.0.vm05.stdout:9/982: rmdir d6/d15/d3c/d128/d12a 0 2026-03-10T08:56:18.284 INFO:tasks.workunit.client.0.vm05.stdout:2/971: creat d0/d9/d7f/d8f/d7e/f135 x:0 0 0 2026-03-10T08:56:18.284 INFO:tasks.workunit.client.0.vm05.stdout:2/972: chown d0/d55/dd4/def/df2 1440839298 1 2026-03-10T08:56:18.287 INFO:tasks.workunit.client.0.vm05.stdout:2/973: link d0/d55/db8/f129 d0/d9/d89/f136 0 2026-03-10T08:56:18.289 INFO:tasks.workunit.client.0.vm05.stdout:2/974: creat d0/d55/db8/dcc/dd9/dcb/df0/f137 x:0 0 0 2026-03-10T08:56:18.299 INFO:tasks.workunit.client.0.vm05.stdout:5/947: dwrite d5/df/d37/d68/fae [0,4194304] 0 2026-03-10T08:56:18.313 INFO:tasks.workunit.client.0.vm05.stdout:4/983: dwrite 
d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/f11e [0,4194304] 0 2026-03-10T08:56:18.317 INFO:tasks.workunit.client.0.vm05.stdout:5/948: rename d5/df/d37/d68/d12d/dcc/c10f to d5/d86/d21/d71/d12c/dfe/c152 0 2026-03-10T08:56:18.320 INFO:tasks.workunit.client.0.vm05.stdout:4/984: truncate d0/d2e/d42/d45/d4a/d36/dbe/d32/f9e 482963 0 2026-03-10T08:56:18.329 INFO:tasks.workunit.client.0.vm05.stdout:5/949: fsync d5/df/dbb/f4a 0 2026-03-10T08:56:18.330 INFO:tasks.workunit.client.0.vm05.stdout:5/950: dread - d5/df/d37/dd2/f132 zero size 2026-03-10T08:56:18.332 INFO:tasks.workunit.client.0.vm05.stdout:7/946: rmdir d18/d66/d25/d2e/dd9/d111 39 2026-03-10T08:56:18.337 INFO:tasks.workunit.client.0.vm05.stdout:4/985: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/f140 x:0 0 0 2026-03-10T08:56:18.341 INFO:tasks.workunit.client.0.vm05.stdout:5/951: mknod d5/df/d37/d68/d12d/dcc/c153 0 2026-03-10T08:56:18.342 INFO:tasks.workunit.client.0.vm05.stdout:5/952: write d5/d86/d24/d2c/d41/d74/f117 [779849,80174] 0 2026-03-10T08:56:18.347 INFO:tasks.workunit.client.0.vm05.stdout:7/947: chown d18/d38/l12f 11102 1 2026-03-10T08:56:18.352 INFO:tasks.workunit.client.0.vm05.stdout:4/986: rename d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/c6d to d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dd7/c141 0 2026-03-10T08:56:18.355 INFO:tasks.workunit.client.0.vm05.stdout:4/987: dwrite d0/d2c/f2f [0,4194304] 0 2026-03-10T08:56:18.362 INFO:tasks.workunit.client.0.vm05.stdout:8/988: write d2/dd/d2c/d2e/d31/d3e/dde/f11f [595827,85469] 0 2026-03-10T08:56:18.363 INFO:tasks.workunit.client.0.vm05.stdout:8/989: fsync d2/dd/d2c/f155 0 2026-03-10T08:56:18.363 INFO:tasks.workunit.client.0.vm05.stdout:8/990: chown d2/dd/c135 22911661 1 2026-03-10T08:56:18.364 INFO:tasks.workunit.client.0.vm05.stdout:8/991: chown d2/dd/d2c/d14c 150118 1 2026-03-10T08:56:18.365 INFO:tasks.workunit.client.0.vm05.stdout:8/992: chown d2/dd/d2c/d2e/d31/d3e/dde/d63/l7a 13716093 1 2026-03-10T08:56:18.371 INFO:tasks.workunit.client.0.vm05.stdout:8/993: 
dwrite d2/dd/d2c/d2e/d31/d3e/dde/d63/daf/f14b [0,4194304] 0 2026-03-10T08:56:18.377 INFO:tasks.workunit.client.0.vm05.stdout:4/988: mkdir d0/dfe/de2/d142 0 2026-03-10T08:56:18.388 INFO:tasks.workunit.client.0.vm05.stdout:8/994: creat d2/dd/d2c/d2e/d31/d4f/f156 x:0 0 0 2026-03-10T08:56:18.391 INFO:tasks.workunit.client.0.vm05.stdout:4/989: creat d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/f143 x:0 0 0 2026-03-10T08:56:18.396 INFO:tasks.workunit.client.0.vm05.stdout:9/983: write d6/d12/f74 [365110,121449] 0 2026-03-10T08:56:18.396 INFO:tasks.workunit.client.0.vm05.stdout:8/995: mknod d2/dd/d2c/d2e/d31/d4f/d7b/d9e/df8/c157 0 2026-03-10T08:56:18.396 INFO:tasks.workunit.client.0.vm05.stdout:9/984: rmdir d6/d15/d37/de8 39 2026-03-10T08:56:18.398 INFO:tasks.workunit.client.0.vm05.stdout:8/996: mkdir d2/d45/d158 0 2026-03-10T08:56:18.399 INFO:tasks.workunit.client.0.vm05.stdout:8/997: stat d2/dd/d2c/d2e/d31/d4f/d80 0 2026-03-10T08:56:18.399 INFO:tasks.workunit.client.0.vm05.stdout:2/975: write d0/d55/db8/f88 [227718,112632] 0 2026-03-10T08:56:18.402 INFO:tasks.workunit.client.0.vm05.stdout:4/990: symlink d0/d2e/d42/d45/d4a/d36/d37/d114/d129/l144 0 2026-03-10T08:56:18.407 INFO:tasks.workunit.client.0.vm05.stdout:9/985: dread - d6/d15/d3c/ff7 zero size 2026-03-10T08:56:18.408 INFO:tasks.workunit.client.0.vm05.stdout:8/998: symlink d2/db/d1f/l159 0 2026-03-10T08:56:18.410 INFO:tasks.workunit.client.0.vm05.stdout:2/976: rmdir d0/d55/db8/dcc/dd9/dcb/df0 39 2026-03-10T08:56:18.412 INFO:tasks.workunit.client.0.vm05.stdout:4/991: creat d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/dd7/ddc/f145 x:0 0 0 2026-03-10T08:56:18.413 INFO:tasks.workunit.client.0.vm05.stdout:4/992: chown d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/cb3 4 1 2026-03-10T08:56:18.416 INFO:tasks.workunit.client.0.vm05.stdout:2/977: readlink d0/d9/d7f/l111 0 2026-03-10T08:56:18.422 INFO:tasks.workunit.client.0.vm05.stdout:8/999: fdatasync d2/dfc/f121 0 2026-03-10T08:56:18.424 INFO:tasks.workunit.client.0.vm05.stdout:2/978: mknod 
d0/d55/dd4/d117/d4b/d75/c138 0 2026-03-10T08:56:18.425 INFO:tasks.workunit.client.0.vm05.stdout:7/948: write d18/d38/dc7/de3/dc6/fda [600927,60476] 0 2026-03-10T08:56:18.426 INFO:tasks.workunit.client.0.vm05.stdout:5/953: dwrite d5/df/d37/f73 [0,4194304] 0 2026-03-10T08:56:18.429 INFO:tasks.workunit.client.0.vm05.stdout:4/993: symlink d0/d2e/l146 0 2026-03-10T08:56:18.445 INFO:tasks.workunit.client.0.vm05.stdout:2/979: rename d0/d9/d7f/d8f/d6d to d0/d9/d1e/d20/d21/d8a/d11e/d139 0 2026-03-10T08:56:18.448 INFO:tasks.workunit.client.0.vm05.stdout:9/986: write d6/d12/d3a/de5/f91 [640286,94595] 0 2026-03-10T08:56:18.452 INFO:tasks.workunit.client.0.vm05.stdout:7/949: mkdir d18/d66/d138 0 2026-03-10T08:56:18.456 INFO:tasks.workunit.client.0.vm05.stdout:7/950: stat d18/d66/d25/d2e/de7/l98 0 2026-03-10T08:56:18.456 INFO:tasks.workunit.client.0.vm05.stdout:7/951: write d18/d66/d25/d2e/f49 [1608410,108132] 0 2026-03-10T08:56:18.456 INFO:tasks.workunit.client.0.vm05.stdout:5/954: rename d5/d86/d24/d2c/d41/d11f/f12a to d5/d86/d24/d84/df7/d137/d95/dac/dc6/f154 0 2026-03-10T08:56:18.461 INFO:tasks.workunit.client.0.vm05.stdout:4/994: write d0/d2e/d42/d45/d4a/d36/dbe/d32/d41/d67/d7b/faa [89755,6656] 0 2026-03-10T08:56:18.466 INFO:tasks.workunit.client.0.vm05.stdout:9/987: dread d6/d19/d2a/d4a/d8c/fa7 [0,4194304] 0 2026-03-10T08:56:18.469 INFO:tasks.workunit.client.0.vm05.stdout:9/988: dwrite d6/d15/fb4 [4194304,4194304] 0 2026-03-10T08:56:18.477 INFO:tasks.workunit.client.0.vm05.stdout:2/980: mkdir d0/d9/d1e/d126/d13a 0 2026-03-10T08:56:18.477 INFO:tasks.workunit.client.0.vm05.stdout:4/995: read d0/dfe/feb [2477247,11366] 0 2026-03-10T08:56:18.478 INFO:tasks.workunit.client.0.vm05.stdout:2/981: write d0/d55/d9f/f12d [49890,46992] 0 2026-03-10T08:56:18.481 INFO:tasks.workunit.client.0.vm05.stdout:4/996: fdatasync d0/d2c/f12c 0 2026-03-10T08:56:18.563 INFO:tasks.workunit.client.0.vm05.stdout:7/952: rename d18/d38/d43/d5c/ced to d18/d66/d138/c139 0 2026-03-10T08:56:18.575 
INFO:tasks.workunit.client.0.vm05.stdout:5/955: write d5/df/dbb/f4a [5531180,47691] 0 2026-03-10T08:56:18.579 INFO:tasks.workunit.client.0.vm05.stdout:5/956: dwrite d5/f23 [4194304,4194304] 0 2026-03-10T08:56:18.592 INFO:tasks.workunit.client.0.vm05.stdout:9/989: creat d6/d15/d3c/d4b/d90/d93/f141 x:0 0 0 2026-03-10T08:56:18.608 INFO:tasks.workunit.client.0.vm05.stdout:9/990: fsync d6/d15/d37/de8/fef 0 2026-03-10T08:56:18.613 INFO:tasks.workunit.client.0.vm05.stdout:9/991: chown d6/d19/d2a/dbc/cc8 1079050 1 2026-03-10T08:56:18.661 INFO:tasks.workunit.client.0.vm05.stdout:9/992: mkdir d6/d15/d3c/d4b/d82/d142 0 2026-03-10T08:56:18.663 INFO:tasks.workunit.client.0.vm05.stdout:9/993: creat d6/d15/d3c/d4b/d90/f143 x:0 0 0 2026-03-10T08:56:18.696 INFO:tasks.workunit.client.0.vm05.stdout:2/982: dwrite d0/f30 [0,4194304] 0 2026-03-10T08:56:18.701 INFO:tasks.workunit.client.0.vm05.stdout:4/997: write d0/d2e/d42/d45/d4a/d36/d37/d114/d129/d5b/f70 [2145463,90157] 0 2026-03-10T08:56:18.703 INFO:tasks.workunit.client.0.vm05.stdout:4/998: fsync d0/d2e/d42/d45/d4a/d36/d37/d114/d129/f51 0 2026-03-10T08:56:18.705 INFO:tasks.workunit.client.0.vm05.stdout:2/983: creat d0/d55/dd4/d117/f13b x:0 0 0 2026-03-10T08:56:18.708 INFO:tasks.workunit.client.0.vm05.stdout:2/984: rename d0/d9/d7f/l111 to d0/d9/d89/l13c 0 2026-03-10T08:56:18.711 INFO:tasks.workunit.client.0.vm05.stdout:2/985: dwrite d0/d55/dd4/d117/f68 [0,4194304] 0 2026-03-10T08:56:18.765 INFO:tasks.workunit.client.0.vm05.stdout:7/953: creat d18/d38/f13a x:0 0 0 2026-03-10T08:56:18.766 INFO:tasks.workunit.client.0.vm05.stdout:7/954: read d18/d38/d43/d5c/fcc [312291,48777] 0 2026-03-10T08:56:18.773 INFO:tasks.workunit.client.0.vm05.stdout:5/957: rmdir d5/df 39 2026-03-10T08:56:18.775 INFO:tasks.workunit.client.0.vm05.stdout:5/958: write d5/df/dbb/d108/d13b/f144 [847422,105513] 0 2026-03-10T08:56:18.779 INFO:tasks.workunit.client.0.vm05.stdout:5/959: mknod d5/d86/d24/d84/df7/d137/dc4/d13f/c155 0 2026-03-10T08:56:18.783 
INFO:tasks.workunit.client.0.vm05.stdout:5/960: dwrite d5/df/dbb/f4a [4194304,4194304] 0 2026-03-10T08:56:18.792 INFO:tasks.workunit.client.0.vm05.stdout:5/961: dread d5/d86/d24/d2c/d41/d74/f9f [0,4194304] 0 2026-03-10T08:56:18.793 INFO:tasks.workunit.client.0.vm05.stdout:9/994: write d6/ff8 [317783,65784] 0 2026-03-10T08:56:18.799 INFO:tasks.workunit.client.0.vm05.stdout:5/962: mknod d5/d86/d21/d71/d12c/d60/c156 0 2026-03-10T08:56:18.799 INFO:tasks.workunit.client.0.vm05.stdout:9/995: rmdir d6/d19 39 2026-03-10T08:56:18.803 INFO:tasks.workunit.client.0.vm05.stdout:9/996: dwrite d6/d15/d3c/d4b/f76 [0,4194304] 0 2026-03-10T08:56:18.805 INFO:tasks.workunit.client.0.vm05.stdout:9/997: readlink d6/d15/d3c/d4b/d82/l136 0 2026-03-10T08:56:18.811 INFO:tasks.workunit.client.0.vm05.stdout:9/998: getdents d6/d19 0 2026-03-10T08:56:18.864 INFO:tasks.workunit.client.0.vm05.stdout:7/955: creat d18/d66/d25/f13b x:0 0 0 2026-03-10T08:56:18.865 INFO:tasks.workunit.client.0.vm05.stdout:7/956: symlink d18/d66/d78/dc3/l13c 0 2026-03-10T08:56:18.866 INFO:tasks.workunit.client.0.vm05.stdout:7/957: symlink d18/d66/d25/d2e/de7/l13d 0 2026-03-10T08:56:18.868 INFO:tasks.workunit.client.0.vm05.stdout:7/958: unlink d18/d38/dc7/de3/d9c/dac/f4c 0 2026-03-10T08:56:18.871 INFO:tasks.workunit.client.0.vm05.stdout:7/959: rename d18/d1b/l129 to d18/d66/dff/l13e 0 2026-03-10T08:56:18.873 INFO:tasks.workunit.client.0.vm05.stdout:7/960: write d18/d66/d25/d2e/dd9/d111/d113/f135 [363285,90894] 0 2026-03-10T08:56:18.882 INFO:tasks.workunit.client.0.vm05.stdout:4/999: write d0/fe [1782127,112821] 0 2026-03-10T08:56:18.889 INFO:tasks.workunit.client.0.vm05.stdout:2/986: write d0/d9/d1e/f39 [3973391,41028] 0 2026-03-10T08:56:18.894 INFO:tasks.workunit.client.0.vm05.stdout:2/987: rename d0/d55/dd4/d117/d4b/f97 to d0/d9/d7f/d8f/f13d 0 2026-03-10T08:56:18.896 INFO:tasks.workunit.client.0.vm05.stdout:2/988: creat d0/d9/d7f/db4/f13e x:0 0 0 2026-03-10T08:56:18.952 INFO:tasks.workunit.client.0.vm05.stdout:7/961: 
sync 2026-03-10T08:56:18.955 INFO:tasks.workunit.client.0.vm05.stdout:7/962: fdatasync d18/d38/d43/d5c/fa7 0 2026-03-10T08:56:18.962 INFO:tasks.workunit.client.0.vm05.stdout:7/963: dwrite d18/d66/d25/d2e/d2f/d6d/dc1/fee [0,4194304] 0 2026-03-10T08:56:18.969 INFO:tasks.workunit.client.0.vm05.stdout:7/964: unlink f15 0 2026-03-10T08:56:18.974 INFO:tasks.workunit.client.0.vm05.stdout:7/965: dwrite d18/d66/d25/d2e/d2f/d6d/dc1/fee [0,4194304] 0 2026-03-10T08:56:18.977 INFO:tasks.workunit.client.0.vm05.stdout:7/966: chown d18/f1d 483180883 1 2026-03-10T08:56:18.979 INFO:tasks.workunit.client.0.vm05.stdout:5/963: write d5/df/d37/f123 [260974,127267] 0 2026-03-10T08:56:18.983 INFO:tasks.workunit.client.0.vm05.stdout:9/999: dwrite d6/d15/f86 [0,4194304] 0 2026-03-10T08:56:18.987 INFO:tasks.workunit.client.0.vm05.stdout:7/967: unlink cc 0 2026-03-10T08:56:18.996 INFO:tasks.workunit.client.0.vm05.stdout:7/968: chown d18/d66/d25/d2e/d2f/f11e 2412895 1 2026-03-10T08:56:19.005 INFO:tasks.workunit.client.0.vm05.stdout:5/964: sync 2026-03-10T08:56:19.017 INFO:tasks.workunit.client.0.vm05.stdout:2/989: write d0/d55/dd4/def/df2/ff8 [437986,11972] 0 2026-03-10T08:56:19.018 INFO:tasks.workunit.client.0.vm05.stdout:2/990: read d0/d9/d89/f10f [259899,127961] 0 2026-03-10T08:56:19.023 INFO:tasks.workunit.client.0.vm05.stdout:2/991: symlink d0/d9/d1e/d126/d13a/l13f 0 2026-03-10T08:56:19.034 INFO:tasks.workunit.client.0.vm05.stdout:2/992: fdatasync d0/d55/fd7 0 2026-03-10T08:56:19.034 INFO:tasks.workunit.client.0.vm05.stdout:7/969: dread d18/d66/d25/d2e/d2f/fad [0,4194304] 0 2026-03-10T08:56:19.034 INFO:tasks.workunit.client.0.vm05.stdout:2/993: getdents d0/d55/dd4 0 2026-03-10T08:56:19.034 INFO:tasks.workunit.client.0.vm05.stdout:2/994: getdents d0/d55/da2 0 2026-03-10T08:56:19.044 INFO:tasks.workunit.client.0.vm05.stdout:2/995: sync 2026-03-10T08:56:19.079 INFO:tasks.workunit.client.0.vm05.stdout:5/965: write d5/d86/d24/d2c/d41/d74/da9/f11d [26618,113739] 0 2026-03-10T08:56:19.091 
INFO:tasks.workunit.client.0.vm05.stdout:7/970: write d18/d38/dc7/de3/f5a [3553786,80432] 0 2026-03-10T08:56:19.108 INFO:tasks.workunit.client.0.vm05.stdout:2/996: dwrite d0/d9/d7f/d8f/f66 [4194304,4194304] 0 2026-03-10T08:56:19.111 INFO:tasks.workunit.client.0.vm05.stdout:5/966: write d5/d86/d21/d89/fbd [2724881,35111] 0 2026-03-10T08:56:19.115 INFO:tasks.workunit.client.0.vm05.stdout:7/971: symlink d18/d66/d25/d2e/de7/l13f 0 2026-03-10T08:56:19.118 INFO:tasks.workunit.client.0.vm05.stdout:5/967: mknod d5/d86/c157 0 2026-03-10T08:56:19.124 INFO:tasks.workunit.client.0.vm05.stdout:7/972: chown d18/d66/d25/d2e/dd9/d111/d113/f135 12731400 1 2026-03-10T08:56:19.128 INFO:tasks.workunit.client.0.vm05.stdout:2/997: rename d0/d55/dd4/d117/d4b/f6b to d0/f140 0 2026-03-10T08:56:19.129 INFO:tasks.workunit.client.0.vm05.stdout:5/968: truncate d5/d86/d24/d2c/f79 819446 0 2026-03-10T08:56:19.130 INFO:tasks.workunit.client.0.vm05.stdout:5/969: chown d5/df/d37/dd2/f132 34636061 1 2026-03-10T08:56:19.136 INFO:tasks.workunit.client.0.vm05.stdout:2/998: mkdir d0/d55/dde/d124/d141 0 2026-03-10T08:56:19.136 INFO:tasks.workunit.client.0.vm05.stdout:7/973: link d18/d66/d25/d2e/d2f/d6d/dc1/le2 d18/d38/dc7/de3/l140 0 2026-03-10T08:56:19.138 INFO:tasks.workunit.client.0.vm05.stdout:2/999: stat d0/d55/dd4/d117/d4b/d75/l125 0 2026-03-10T08:56:19.139 INFO:tasks.workunit.client.0.vm05.stdout:5/970: creat d5/df/d37/d68/f158 x:0 0 0 2026-03-10T08:56:19.140 INFO:tasks.workunit.client.0.vm05.stdout:7/974: creat d18/d38/dc7/de3/f141 x:0 0 0 2026-03-10T08:56:19.142 INFO:tasks.workunit.client.0.vm05.stdout:7/975: mkdir d18/d38/dc7/de3/dc6/d142 0 2026-03-10T08:56:19.144 INFO:tasks.workunit.client.0.vm05.stdout:5/971: symlink d5/l159 0 2026-03-10T08:56:19.147 INFO:tasks.workunit.client.0.vm05.stdout:5/972: creat d5/d86/d39/f15a x:0 0 0 2026-03-10T08:56:19.148 INFO:tasks.workunit.client.0.vm05.stdout:7/976: chown d18/d66/d138/c139 12923545 1 2026-03-10T08:56:19.150 
INFO:tasks.workunit.client.0.vm05.stdout:7/977: fdatasync d18/d38/d43/d6e/f9a 0 2026-03-10T08:56:19.163 INFO:tasks.workunit.client.0.vm05.stdout:5/973: getdents d5/df 0 2026-03-10T08:56:19.173 INFO:tasks.workunit.client.0.vm05.stdout:5/974: unlink d5/d86/d24/d84/df7/d137/d95/le8 0 2026-03-10T08:56:19.173 INFO:tasks.workunit.client.0.vm05.stdout:5/975: chown d5/df/l5b 5 1 2026-03-10T08:56:19.176 INFO:tasks.workunit.client.0.vm05.stdout:5/976: mkdir d5/df/d37/dc8/d15b 0 2026-03-10T08:56:19.177 INFO:tasks.workunit.client.0.vm05.stdout:7/978: truncate d18/d66/d78/fa6 305683 0 2026-03-10T08:56:19.180 INFO:tasks.workunit.client.0.vm05.stdout:5/977: rename d5/df/dbb/l50 to d5/d86/d21/d89/l15c 0 2026-03-10T08:56:19.181 INFO:tasks.workunit.client.0.vm05.stdout:7/979: mkdir d18/d38/dc7/d143 0 2026-03-10T08:56:19.183 INFO:tasks.workunit.client.0.vm05.stdout:5/978: write d5/d48/f93 [1179941,99822] 0 2026-03-10T08:56:19.190 INFO:tasks.workunit.client.0.vm05.stdout:5/979: mkdir d5/d86/d24/d84/df7/d137/d95/dac/d15d 0 2026-03-10T08:56:19.192 INFO:tasks.workunit.client.0.vm05.stdout:7/980: dread d18/d66/f3f [0,4194304] 0 2026-03-10T08:56:19.194 INFO:tasks.workunit.client.0.vm05.stdout:5/980: creat d5/d86/d24/d84/df7/d137/dc4/d135/f15e x:0 0 0 2026-03-10T08:56:19.194 INFO:tasks.workunit.client.0.vm05.stdout:5/981: fsync d5/df/d37/f123 0 2026-03-10T08:56:19.206 INFO:tasks.workunit.client.0.vm05.stdout:5/982: mknod d5/d86/d24/d2c/d41/d11f/c15f 0 2026-03-10T08:56:19.208 INFO:tasks.workunit.client.0.vm05.stdout:7/981: write d18/d66/d78/fb8 [1095634,1823] 0 2026-03-10T08:56:19.211 INFO:tasks.workunit.client.0.vm05.stdout:5/983: creat d5/d86/d21/d71/d12c/d60/f160 x:0 0 0 2026-03-10T08:56:19.211 INFO:tasks.workunit.client.0.vm05.stdout:5/984: fsync d5/d86/d21/d71/f9a 0 2026-03-10T08:56:19.225 INFO:tasks.workunit.client.0.vm05.stdout:5/985: dread d5/d86/d39/fce [0,4194304] 0 2026-03-10T08:56:19.225 INFO:tasks.workunit.client.0.vm05.stdout:7/982: dwrite d18/d1b/f69 [4194304,4194304] 0 
2026-03-10T08:56:19.229 INFO:tasks.workunit.client.0.vm05.stdout:5/986: chown d5/df/d37/d68 0 1 2026-03-10T08:56:19.239 INFO:tasks.workunit.client.0.vm05.stdout:5/987: rename d5/d86/d24/d84/df7/d137/d95/dac/d15d to d5/df/d37/dd2/d76/dde/d161 0 2026-03-10T08:56:19.240 INFO:tasks.workunit.client.0.vm05.stdout:7/983: chown d18/d38/dc7/de3/d53 14267 1 2026-03-10T08:56:19.246 INFO:tasks.workunit.client.0.vm05.stdout:7/984: mkdir d18/d38/d144 0 2026-03-10T08:56:19.249 INFO:tasks.workunit.client.0.vm05.stdout:7/985: unlink d18/fb1 0 2026-03-10T08:56:19.266 INFO:tasks.workunit.client.0.vm05.stdout:5/988: write d5/d86/d21/fd9 [1497900,41038] 0 2026-03-10T08:56:19.269 INFO:tasks.workunit.client.0.vm05.stdout:7/986: dwrite d18/d66/d78/dc3/fd5 [0,4194304] 0 2026-03-10T08:56:19.279 INFO:tasks.workunit.client.0.vm05.stdout:5/989: mknod d5/dcf/c162 0 2026-03-10T08:56:19.283 INFO:tasks.workunit.client.0.vm05.stdout:7/987: link d18/d66/d25/d2e/d2f/fd8 d18/d38/dc7/de3/d9c/dac/f145 0 2026-03-10T08:56:19.284 INFO:tasks.workunit.client.0.vm05.stdout:7/988: fsync d18/d38/dc7/de3/f5a 0 2026-03-10T08:56:19.286 INFO:tasks.workunit.client.0.vm05.stdout:7/989: mkdir d18/d38/d43/d5c/daf/d146 0 2026-03-10T08:56:19.287 INFO:tasks.workunit.client.0.vm05.stdout:7/990: creat d18/d66/d25/d2e/d2f/d6d/dc1/f147 x:0 0 0 2026-03-10T08:56:19.293 INFO:tasks.workunit.client.0.vm05.stdout:7/991: write d18/d38/dc7/de3/dc6/ffe [2018188,23529] 0 2026-03-10T08:56:19.293 INFO:tasks.workunit.client.0.vm05.stdout:7/992: creat d18/d66/dff/f148 x:0 0 0 2026-03-10T08:56:19.293 INFO:tasks.workunit.client.0.vm05.stdout:7/993: unlink d18/d66/d25/d2e/de7/lf7 0 2026-03-10T08:56:19.294 INFO:tasks.workunit.client.0.vm05.stdout:7/994: chown d18/d66/lb4 121557272 1 2026-03-10T08:56:19.296 INFO:tasks.workunit.client.0.vm05.stdout:7/995: mknod d18/d66/d78/dc3/c149 0 2026-03-10T08:56:19.304 INFO:tasks.workunit.client.0.vm05.stdout:5/990: write d5/d86/d24/d2c/fab [803735,62138] 0 2026-03-10T08:56:19.308 
INFO:tasks.workunit.client.0.vm05.stdout:5/991: creat d5/df/d37/dc8/d15b/f163 x:0 0 0 2026-03-10T08:56:19.309 INFO:tasks.workunit.client.0.vm05.stdout:5/992: write d5/df/d37/d68/fae [193500,15021] 0 2026-03-10T08:56:19.316 INFO:tasks.workunit.client.0.vm05.stdout:5/993: mkdir d5/d164 0 2026-03-10T08:56:19.323 INFO:tasks.workunit.client.0.vm05.stdout:7/996: write d18/d38/dc7/de3/d9c/dac/f35 [1097863,32406] 0 2026-03-10T08:56:19.335 INFO:tasks.workunit.client.0.vm05.stdout:7/997: dread d18/d38/fca [0,4194304] 0 2026-03-10T08:56:19.336 INFO:tasks.workunit.client.0.vm05.stdout:7/998: fsync d18/d66/d78/fb8 0 2026-03-10T08:56:19.340 INFO:tasks.workunit.client.0.vm05.stdout:5/994: dwrite d5/d86/d24/d2c/d41/fe4 [0,4194304] 0 2026-03-10T08:56:19.342 INFO:tasks.workunit.client.0.vm05.stdout:5/995: chown d5/d86/d21/fd9 124355158 1 2026-03-10T08:56:19.346 INFO:tasks.workunit.client.0.vm05.stdout:7/999: dread d18/d38/dc7/de3/d9c/fce [0,4194304] 0 2026-03-10T08:56:19.349 INFO:tasks.workunit.client.0.vm05.stdout:5/996: rmdir d5/d86/d24/d2c/d41/d74/d133 39 2026-03-10T08:56:19.350 INFO:tasks.workunit.client.0.vm05.stdout:5/997: chown d5/df/d37/d68/d12d/l112 65045 1 2026-03-10T08:56:19.359 INFO:tasks.workunit.client.0.vm05.stdout:5/998: link d5/df/d37/dd2/d76/fa4 d5/d86/d24/d129/f165 0 2026-03-10T08:56:19.361 INFO:tasks.workunit.client.0.vm05.stdout:5/999: mkdir d5/df/d37/d68/d166 0 2026-03-10T08:56:19.365 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -rf -- ./tmp.xqItf6Q9xe 2026-03-10T08:56:19.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:19 vm05.local ceph-mon[49713]: pgmap v18: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 28 MiB/s rd, 72 MiB/s wr, 195 op/s 2026-03-10T08:56:20.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:19 vm08.local ceph-mon[57559]: pgmap v18: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 28 MiB/s rd, 72 MiB/s wr, 195 op/s 2026-03-10T08:56:20.803 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr fail", "who": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr fail", "who": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd='[{"prefix": "mgr fail", "who": "vm08.rpongu"}]': finished 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: mgrmap e25: vm05.rxwgjc(active, starting, since 0.00876936s) 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:56:20.803 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:56:20.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:20 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.24459 192.168.123.108:0/962629603' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr fail", "who": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd=[{"prefix": "mgr fail", "who": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local 
ceph-mon[49713]: from='mgr.24459 ' entity='mgr.vm08.rpongu' cmd='[{"prefix": "mgr fail", "who": "vm08.rpongu"}]': finished 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: mgrmap e25: vm05.rxwgjc(active, starting, since 0.00876936s) 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": 
"vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:56:20.819 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:20 vm05.local ceph-mon[49713]: 
from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: Manager daemon vm05.rxwgjc is now available 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:21.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:21 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: Manager daemon vm05.rxwgjc 
is now available 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:21.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:21 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:22.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:22 vm05.local ceph-mon[49713]: mgrmap e26: vm05.rxwgjc(active, since 1.10856s) 2026-03-10T08:56:22.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:22 vm05.local ceph-mon[49713]: pgmap v3: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail 2026-03-10T08:56:22.720 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.718+0000 7fc138f7c700 1 -- 192.168.123.105:0/1354023141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134075a40 msgr2=0x7fc134077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.718+0000 7fc138f7c700 1 --2- 192.168.123.105:0/1354023141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134075a40 0x7fc134077ed0 secure :-1 s=READY pgs=338 cs=0 l=1 rev1=1 crypto rx=0x7fc12c00d3f0 tx=0x7fc12c00d700 comp rx=0 tx=0).stop 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.718+0000 7fc138f7c700 1 -- 192.168.123.105:0/1354023141 shutdown_connections 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.718+0000 7fc138f7c700 1 --2- 192.168.123.105:0/1354023141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134075a40 0x7fc134077ed0 unknown :-1 s=CLOSED pgs=338 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.718+0000 7fc138f7c700 1 --2- 192.168.123.105:0/1354023141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc134072b50 0x7fc134072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.718+0000 7fc138f7c700 1 -- 192.168.123.105:0/1354023141 >> 192.168.123.105:0/1354023141 conn(0x7fc13406dae0 msgr2=0x7fc13406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 -- 192.168.123.105:0/1354023141 shutdown_connections 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 -- 192.168.123.105:0/1354023141 wait complete. 
2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 Processor -- start 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 -- start start 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 0x7fc134082f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1340834b0 0x7fc134083930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc13412e710 con 0x7fc134072b50 2026-03-10T08:56:22.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.719+0000 7fc138f7c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc13412e880 con 0x7fc1340834b0 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc13259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 0x7fc134082f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc13259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 0x7fc134082f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:42994/0 (socket says 192.168.123.105:42994) 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc13259c700 1 -- 192.168.123.105:0/2059932004 learned_addr learned my addr 192.168.123.105:0/2059932004 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc131d9b700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1340834b0 0x7fc134083930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc13259c700 1 -- 192.168.123.105:0/2059932004 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1340834b0 msgr2=0x7fc134083930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc13259c700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1340834b0 0x7fc134083930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.721+0000 7fc13259c700 1 -- 192.168.123.105:0/2059932004 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc12c007ed0 con 0x7fc134072b50 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.722+0000 7fc13259c700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 0x7fc134082f70 secure :-1 s=READY pgs=339 cs=0 l=1 rev1=1 crypto rx=0x7fc12400c8a0 tx=0x7fc12400cbb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.722+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc1240078c0 con 0x7fc134072b50 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.722+0000 7fc138f7c700 1 -- 192.168.123.105:0/2059932004 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc13412eb60 con 0x7fc134072b50 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.722+0000 7fc138f7c700 1 -- 192.168.123.105:0/2059932004 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc13412f0b0 con 0x7fc134072b50 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.723+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc12400f450 con 0x7fc134072b50 2026-03-10T08:56:22.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.723+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc12400e5c0 con 0x7fc134072b50 2026-03-10T08:56:22.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.724+0000 7fc138f7c700 1 -- 192.168.123.105:0/2059932004 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc114005320 con 0x7fc134072b50 2026-03-10T08:56:22.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.727+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 26) v1 ==== 50225+0+0 (secure 0 0 0) 0x7fc12400fa80 con 0x7fc134072b50 2026-03-10T08:56:22.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.727+0000 
7fc1237fe700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7fc11c03dd80 0x7fc11c040240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:22.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.728+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fc12401f2f0 con 0x7fc134072b50 2026-03-10T08:56:22.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.729+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fc124016780 con 0x7fc134072b50 2026-03-10T08:56:22.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.732+0000 7fc131d9b700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7fc11c03dd80 0x7fc11c040240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:22.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.741+0000 7fc131d9b700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7fc11c03dd80 0x7fc11c040240 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fc12c000f80 tx=0x7fc12c00db00 comp rx=0 tx=0).ready entity=mgr.14722 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:22.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.939+0000 7fc138f7c700 1 -- 192.168.123.105:0/2059932004 --> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc114000bf0 con 
0x7fc11c03dd80 2026-03-10T08:56:22.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.941+0000 7fc1237fe700 1 -- 192.168.123.105:0/2059932004 <== mgr.14722 v2:192.168.123.105:6800/2453972605 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7fc114000bf0 con 0x7fc11c03dd80 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.944+0000 7fc1217fa700 1 -- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7fc11c03dd80 msgr2=0x7fc11c040240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.944+0000 7fc1217fa700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7fc11c03dd80 0x7fc11c040240 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fc12c000f80 tx=0x7fc12c00db00 comp rx=0 tx=0).stop 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.944+0000 7fc1217fa700 1 -- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 msgr2=0x7fc134082f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.944+0000 7fc1217fa700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 0x7fc134082f70 secure :-1 s=READY pgs=339 cs=0 l=1 rev1=1 crypto rx=0x7fc12400c8a0 tx=0x7fc12400cbb0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 -- 192.168.123.105:0/2059932004 shutdown_connections 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 --2- 192.168.123.105:0/2059932004 >> 
[v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7fc11c03dd80 0x7fc11c040240 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc134072b50 0x7fc134082f70 unknown :-1 s=CLOSED pgs=339 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 --2- 192.168.123.105:0/2059932004 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1340834b0 0x7fc134083930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 -- 192.168.123.105:0/2059932004 >> 192.168.123.105:0/2059932004 conn(0x7fc13406dae0 msgr2=0x7fc13406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 -- 192.168.123.105:0/2059932004 shutdown_connections 2026-03-10T08:56:22.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:22.945+0000 7fc1217fa700 1 -- 192.168.123.105:0/2059932004 wait complete. 
2026-03-10T08:56:22.961 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:56:23.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:22 vm08.local ceph-mon[57559]: mgrmap e26: vm05.rxwgjc(active, since 1.10856s) 2026-03-10T08:56:23.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:22 vm08.local ceph-mon[57559]: pgmap v3: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail 2026-03-10T08:56:23.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 -- 192.168.123.105:0/165995113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c072b20 msgr2=0x7f275c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:23.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 --2- 192.168.123.105:0/165995113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c072b20 0x7f275c072f40 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f274c007780 tx=0x7f274c00c050 comp rx=0 tx=0).stop 2026-03-10T08:56:23.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 -- 192.168.123.105:0/165995113 shutdown_connections 2026-03-10T08:56:23.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 --2- 192.168.123.105:0/165995113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f275c075a10 0x7f275c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 --2- 192.168.123.105:0/165995113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c072b20 0x7f275c072f40 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 -- 192.168.123.105:0/165995113 
>> 192.168.123.105:0/165995113 conn(0x7f275c06daa0 msgr2=0x7f275c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.069+0000 7f275bfff700 1 -- 192.168.123.105:0/165995113 shutdown_connections 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 -- 192.168.123.105:0/165995113 wait complete. 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 Processor -- start 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 -- start start 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 0x7f275c0830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f275c0835e0 0x7f275c12e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f275c083af0 con 0x7f275c0835e0 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.070+0000 7f275bfff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f275c083c60 con 0x7f275c075a10 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 0x7f275c0830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 0x7f275c0830a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:54992/0 (socket says 192.168.123.105:54992) 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 -- 192.168.123.105:0/87441355 learned_addr learned my addr 192.168.123.105:0/87441355 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 -- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f275c0835e0 msgr2=0x7f275c12e3f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f275c0835e0 0x7f275c12e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 -- 192.168.123.105:0/87441355 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f274c007430 con 0x7f275c075a10 2026-03-10T08:56:23.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.071+0000 7f275affd700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 0x7f275c0830a0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f274c00afd0 tx=0x7f274c00c9b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:23.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.075+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f274c00f050 con 0x7f275c075a10 2026-03-10T08:56:23.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.075+0000 7f275bfff700 1 -- 192.168.123.105:0/87441355 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f275c12e930 con 0x7f275c075a10 2026-03-10T08:56:23.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.075+0000 7f275bfff700 1 -- 192.168.123.105:0/87441355 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f275c12ee80 con 0x7f275c075a10 2026-03-10T08:56:23.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.076+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f274c00cde0 con 0x7f275c075a10 2026-03-10T08:56:23.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.076+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f274c008740 con 0x7f275c075a10 2026-03-10T08:56:23.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.076+0000 7f275bfff700 1 -- 192.168.123.105:0/87441355 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2748005320 con 0x7f275c075a10 2026-03-10T08:56:23.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.077+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 26) v1 ==== 50225+0+0 (secure 0 0 0) 0x7f274c01a040 con 0x7f275c075a10 2026-03-10T08:56:23.077 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.077+0000 7f2743fff700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f274403ddd0 0x7f2744040290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:23.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.077+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f274c0535f0 con 0x7f275c075a10 2026-03-10T08:56:23.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.078+0000 7f275a7fc700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f274403ddd0 0x7f2744040290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:23.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.078+0000 7f275a7fc700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f274403ddd0 0x7f2744040290 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f275400c4d0 tx=0x7f275400b040 comp rx=0 tx=0).ready entity=mgr.14722 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:23.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.082+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f274c00a9f0 con 0x7f275c075a10 2026-03-10T08:56:23.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.325+0000 7f275bfff700 1 -- 192.168.123.105:0/87441355 --> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7f2748000bf0 con 0x7f274403ddd0 2026-03-10T08:56:23.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.331+0000 7f2743fff700 1 -- 192.168.123.105:0/87441355 <== mgr.14722 v2:192.168.123.105:6800/2453972605 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7f2748000bf0 con 0x7f274403ddd0 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.334+0000 7f2741ffb700 1 -- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f274403ddd0 msgr2=0x7f2744040290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.334+0000 7f2741ffb700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f274403ddd0 0x7f2744040290 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f275400c4d0 tx=0x7f275400b040 comp rx=0 tx=0).stop 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.334+0000 7f2741ffb700 1 -- 192.168.123.105:0/87441355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 msgr2=0x7f275c0830a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.334+0000 7f2741ffb700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 0x7f275c0830a0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f274c00afd0 tx=0x7f274c00c9b0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.338+0000 7f2741ffb700 1 -- 192.168.123.105:0/87441355 shutdown_connections 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.338+0000 7f2741ffb700 1 --2- 192.168.123.105:0/87441355 >> 
[v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f274403ddd0 0x7f2744040290 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.338+0000 7f2741ffb700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f275c075a10 0x7f275c0830a0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.338+0000 7f2741ffb700 1 --2- 192.168.123.105:0/87441355 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f275c0835e0 0x7f275c12e3f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.338+0000 7f2741ffb700 1 -- 192.168.123.105:0/87441355 >> 192.168.123.105:0/87441355 conn(0x7f275c06daa0 msgr2=0x7f275c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:23.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.338+0000 7f2741ffb700 1 -- 192.168.123.105:0/87441355 shutdown_connections 2026-03-10T08:56:23.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.339+0000 7f2741ffb700 1 -- 192.168.123.105:0/87441355 wait complete. 
2026-03-10T08:56:23.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.504+0000 7f23e1391700 1 -- 192.168.123.105:0/497925601 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a4800 msgr2=0x7f23d40a4c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:23.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.504+0000 7f23e1391700 1 --2- 192.168.123.105:0/497925601 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a4800 0x7f23d40a4c60 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f23dc066a30 tx=0x7f23dc067220 comp rx=0 tx=0).stop 2026-03-10T08:56:23.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- 192.168.123.105:0/497925601 shutdown_connections 2026-03-10T08:56:23.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 --2- 192.168.123.105:0/497925601 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a4800 0x7f23d40a4c60 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 --2- 192.168.123.105:0/497925601 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23d40a61c0 0x7f23d40a65e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- 192.168.123.105:0/497925601 >> 192.168.123.105:0/497925601 conn(0x7f23d40a0160 msgr2=0x7f23d40a25c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- 192.168.123.105:0/497925601 shutdown_connections 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- 192.168.123.105:0/497925601 wait 
complete. 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 Processor -- start 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- start start 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 0x7f23d40d0690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23d40d0bd0 0x7f23d4010ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23d40d10e0 con 0x7f23d40d0bd0 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.505+0000 7f23e1391700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f23d40d1250 con 0x7f23d40a61c0 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.506+0000 7f23dbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 0x7f23d40d0690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.506+0000 7f23dbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 0x7f23d40d0690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I 
am v2:192.168.123.105:55010/0 (socket says 192.168.123.105:55010) 2026-03-10T08:56:23.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.506+0000 7f23dbfff700 1 -- 192.168.123.105:0/908004883 learned_addr learned my addr 192.168.123.105:0/908004883 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:23.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.507+0000 7f23dbfff700 1 -- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23d40d0bd0 msgr2=0x7f23d4010ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:23.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.507+0000 7f23dbfff700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23d40d0bd0 0x7f23d4010ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:23.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.507+0000 7f23dbfff700 1 -- 192.168.123.105:0/908004883 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23dc067090 con 0x7f23d40a61c0 2026-03-10T08:56:23.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.507+0000 7f23dbfff700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 0x7f23d40d0690 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f23d000eb10 tx=0x7f23d000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:23.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.508+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23d0009970 con 0x7f23d40a61c0 2026-03-10T08:56:23.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.508+0000 7f23e1391700 1 -- 
192.168.123.105:0/908004883 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f23d4011470 con 0x7f23d40a61c0 2026-03-10T08:56:23.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.508+0000 7f23e1391700 1 -- 192.168.123.105:0/908004883 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f23d40119c0 con 0x7f23d40a61c0 2026-03-10T08:56:23.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.508+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f23d0004500 con 0x7f23d40a61c0 2026-03-10T08:56:23.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.508+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23d0010430 con 0x7f23d40a61c0 2026-03-10T08:56:23.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.509+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 26) v1 ==== 50225+0+0 (secure 0 0 0) 0x7f23d0010690 con 0x7f23d40a61c0 2026-03-10T08:56:23.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.510+0000 7f23d97fa700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f23cc03ddd0 0x7f23cc040290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:23.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.510+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f23d0014070 con 0x7f23d40a61c0 2026-03-10T08:56:23.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.512+0000 7f23e1391700 1 -- 192.168.123.105:0/908004883 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f23c0005320 con 0x7f23d40a61c0 2026-03-10T08:56:23.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.515+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f23d00109e0 con 0x7f23d40a61c0 2026-03-10T08:56:23.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.529+0000 7f23db7fe700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f23cc03ddd0 0x7f23cc040290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:23.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.533+0000 7f23db7fe700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f23cc03ddd0 0x7f23cc040290 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f23dc04ed60 tx=0x7f23dc070040 comp rx=0 tx=0).ready entity=mgr.14722 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:23.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.644+0000 7f23d97fa700 1 -- 192.168.123.105:0/908004883 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 27) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f23d000cca0 con 0x7f23d40a61c0 2026-03-10T08:56:23.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.737+0000 7f23e1391700 1 -- 192.168.123.105:0/908004883 --> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f23c0000bf0 con 0x7f23cc03ddd0 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.777+0000 7f23d97fa700 1 -- 
192.168.123.105:0/908004883 <== mgr.14722 v2:192.168.123.105:6800/2453972605 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f23c0000bf0 con 0x7f23cc03ddd0 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (4m) 16s ago 5m 23.9M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (5m) 16s ago 5m 8539k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (5m) 0s ago 5m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (5m) 16s ago 5m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 0s ago 5m 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (4m) 16s ago 5m 86.2M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (3m) 16s ago 3m 230M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (3m) 16s ago 3m 16.3M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (3m) 0s ago 3m 19.0M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (3m) 0s ago 3m 15.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running 
(20s) 16s ago 6m 28.4M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (44s) 0s ago 4m 144M - 19.2.3-678-ge911bdeb 654f31e6858e d0749942e44d 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (6m) 16s ago 6m 48.9M 2048M 18.2.1 5be31c24972a 4cb0e74c8584 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 0s ago 4m 46.1M 2048M 18.2.1 5be31c24972a bca448418226 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (5m) 16s ago 5m 14.1M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (4m) 0s ago 4m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (4m) 16s ago 4m 372M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (4m) 16s ago 4m 380M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (4m) 16s ago 4m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (4m) 0s ago 4m 442M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (4m) 0s ago 4m 406M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (3m) 0s ago 3m 353M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:56:23.778 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (23s) 16s ago 5m 34.8M - 2.43.0 a07b618ecd1d 1ad3428005d6 2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 
7f23c6ffd700 1 -- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f23cc03ddd0 msgr2=0x7f23cc040290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f23cc03ddd0 0x7f23cc040290 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f23dc04ed60 tx=0x7f23dc070040 comp rx=0 tx=0).stop
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 -- 192.168.123.105:0/908004883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 msgr2=0x7f23d40d0690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 0x7f23d40d0690 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f23d000eb10 tx=0x7f23d000eed0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 -- 192.168.123.105:0/908004883 shutdown_connections
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f23cc03ddd0 0x7f23cc040290 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f23d40a61c0 0x7f23d40d0690 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 --2- 192.168.123.105:0/908004883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23d40d0bd0 0x7f23d4010ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.780+0000 7f23c6ffd700 1 -- 192.168.123.105:0/908004883 >> 192.168.123.105:0/908004883 conn(0x7f23d40a0160 msgr2=0x7f23d40a24e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.781+0000 7f23c6ffd700 1 -- 192.168.123.105:0/908004883 shutdown_connections
2026-03-10T08:56:23.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.781+0000 7f23c6ffd700 1 -- 192.168.123.105:0/908004883 wait complete.
2026-03-10T08:56:23.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 -- 192.168.123.105:0/1665298219 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d010a700 msgr2=0x7f03d010cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/1665298219 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d010a700 0x7f03d010cb90 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f03c4009b00 tx=0x7f03c4009e10 comp rx=0 tx=0).stop
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 -- 192.168.123.105:0/1665298219 shutdown_connections
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/1665298219 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d010a700 0x7f03d010cb90 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/1665298219 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03d0107d90 0x7f03d010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 -- 192.168.123.105:0/1665298219 >> 192.168.123.105:0/1665298219 conn(0x7f03d006daa0 msgr2=0x7f03d006ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 -- 192.168.123.105:0/1665298219 shutdown_connections
2026-03-10T08:56:23.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.919+0000 7f03d4b1e700 1 -- 192.168.123.105:0/1665298219 wait complete.
2026-03-10T08:56:23.921 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:23 vm05.local ceph-mon[49713]: pgmap v4: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail
2026-03-10T08:56:23.921 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:23 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc'
2026-03-10T08:56:23.921 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:23 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc'
2026-03-10T08:56:23.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.921+0000 7f03d4b1e700 1 Processor -- start
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.921+0000 7f03d4b1e700 1 -- start start
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03d4b1e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 0x7f03d0116970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03d4b1e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03d010a700 0x7f03d0116eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03d4b1e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03d01174d0 con 0x7f03d010a700
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03d4b1e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03d01b3060 con 0x7f03d0107d90
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03cf7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 0x7f03d0116970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03cf7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 0x7f03d0116970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:55030/0 (socket says 192.168.123.105:55030)
2026-03-10T08:56:23.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03cf7fe700 1 -- 192.168.123.105:0/2161767562 learned_addr learned my addr 192.168.123.105:0/2161767562 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:56:23.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.922+0000 7f03cf7fe700 1 -- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03d010a700 msgr2=0x7f03d0116eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:23.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.923+0000 7f03cf7fe700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03d010a700 0x7f03d0116eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:23.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.923+0000 7f03cf7fe700 1 -- 192.168.123.105:0/2161767562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f03c40097e0 con 0x7f03d0107d90
2026-03-10T08:56:23.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.923+0000 7f03cf7fe700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 0x7f03d0116970 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f03c000b700 tx=0x7f03c000bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:56:23.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.923+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03c0010840 con 0x7f03d0107d90
2026-03-10T08:56:23.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.923+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f03d01b3260 con 0x7f03d0107d90
2026-03-10T08:56:23.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.923+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f03d01b3760 con 0x7f03d0107d90
2026-03-10T08:56:23.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.924+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f03c0010e80 con 0x7f03d0107d90
2026-03-10T08:56:23.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.924+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03c000d590 con 0x7f03d0107d90
2026-03-10T08:56:23.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.924+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 27) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f03c00109a0 con 0x7f03d0107d90
2026-03-10T08:56:23.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.925+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f03bc005320 con 0x7f03d0107d90
2026-03-10T08:56:23.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.925+0000 7f03ccff9700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f03b803dea0 0x7f03b8040360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:23.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.925+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f03c0052ba0 con 0x7f03d0107d90
2026-03-10T08:56:23.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.928+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f03c000f3e0 con 0x7f03d0107d90
2026-03-10T08:56:23.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.931+0000 7f03ceffd700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f03b803dea0 0x7f03b8040360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:56:23.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:23.933+0000 7f03ceffd700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f03b803dea0 0x7f03b8040360 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f03c4005fd0 tx=0x7f03c4009f90 comp rx=0 tx=0).ready entity=mgr.14722 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:56:24.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:23 vm08.local ceph-mon[57559]: pgmap v4: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail
2026-03-10T08:56:24.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:23 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc'
2026-03-10T08:56:24.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:23 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc'
2026-03-10T08:56:24.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.175+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f03bc005cc0 con 0x7f03d0107d90
2026-03-10T08:56:24.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.175+0000 7f03ccff9700 1 -- 192.168.123.105:0/2161767562 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f03c001e020 con 0x7f03d0107d90
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "mon": {
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": {
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "osd": {
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "mds": {
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "overall": {
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12,
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout: }
2026-03-10T08:56:24.178 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f03b803dea0 msgr2=0x7f03b8040360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f03b803dea0 0x7f03b8040360 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f03c4005fd0 tx=0x7f03c4009f90 comp rx=0 tx=0).stop
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 msgr2=0x7f03d0116970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 0x7f03d0116970 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f03c000b700 tx=0x7f03c000bac0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 shutdown_connections
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f03b803dea0 0x7f03b8040360 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03d0107d90 0x7f03d0116970 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 --2- 192.168.123.105:0/2161767562 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03d010a700 0x7f03d0116eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.181+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 >> 192.168.123.105:0/2161767562 conn(0x7f03d006daa0 msgr2=0x7f03d006ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:56:24.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.183+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 shutdown_connections
2026-03-10T08:56:24.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.183+0000 7f03d4b1e700 1 -- 192.168.123.105:0/2161767562 wait complete.
2026-03-10T08:56:24.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.274+0000 7f01381ba700 1 -- 192.168.123.105:0/3183848755 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130075a40 msgr2=0x7f0130077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.274+0000 7f01381ba700 1 --2- 192.168.123.105:0/3183848755 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130075a40 0x7f0130077ed0 secure :-1 s=READY pgs=340 cs=0 l=1 rev1=1 crypto rx=0x7f012800d3f0 tx=0x7f012800d700 comp rx=0 tx=0).stop
2026-03-10T08:56:24.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.274+0000 7f01381ba700 1 -- 192.168.123.105:0/3183848755 shutdown_connections
2026-03-10T08:56:24.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.274+0000 7f01381ba700 1 --2- 192.168.123.105:0/3183848755 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130075a40 0x7f0130077ed0 unknown :-1 s=CLOSED pgs=340 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.274+0000 7f01381ba700 1 --2- 192.168.123.105:0/3183848755 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130072b50 0x7f0130072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.274+0000 7f01381ba700 1 -- 192.168.123.105:0/3183848755 >> 192.168.123.105:0/3183848755 conn(0x7f013006dae0 msgr2=0x7f013006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 -- 192.168.123.105:0/3183848755 shutdown_connections
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 -- 192.168.123.105:0/3183848755 wait complete.
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 Processor -- start
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 -- start start
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130072b50 0x7f0130083100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 0x7f01301b30f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0130083b50 con 0x7f0130072b50
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.275+0000 7f01381ba700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0130083cc0 con 0x7f0130083640
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 0x7f01301b30f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 0x7f01301b30f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:55038/0 (socket says 192.168.123.105:55038)
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 -- 192.168.123.105:0/161521784 learned_addr learned my addr 192.168.123.105:0/161521784 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135f56700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130072b50 0x7f0130083100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 -- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130072b50 msgr2=0x7f0130083100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130072b50 0x7f0130083100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 -- 192.168.123.105:0/161521784 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0128007ed0 con 0x7f0130083640
2026-03-10T08:56:24.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0135755700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 0x7f01301b30f0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f0128010e90 tx=0x7f0128010f70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:56:24.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.276+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f01280186e0 con 0x7f0130083640
2026-03-10T08:56:24.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.277+0000 7f01381ba700 1 -- 192.168.123.105:0/161521784 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f01301b3690 con 0x7f0130083640
2026-03-10T08:56:24.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.277+0000 7f01381ba700 1 -- 192.168.123.105:0/161521784 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f01301b3b80 con 0x7f0130083640
2026-03-10T08:56:24.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.277+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0128018d20 con 0x7f0130083640
2026-03-10T08:56:24.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.278+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f012800b3d0 con 0x7f0130083640
2026-03-10T08:56:24.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.278+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 27) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f0128020070 con 0x7f0130083640
2026-03-10T08:56:24.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.279+0000 7f0126ffd700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f011c03de90 0x7f011c040350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:24.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.279+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f0128013070 con 0x7f0130083640
2026-03-10T08:56:24.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.280+0000 7f01381ba700 1 -- 192.168.123.105:0/161521784 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0114005320 con 0x7f0130083640
2026-03-10T08:56:24.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.284+0000 7f0135f56700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f011c03de90 0x7f011c040350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:56:24.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.285+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f0128027030 con 0x7f0130083640
2026-03-10T08:56:24.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.301+0000 7f0135f56700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f011c03de90 0x7f011c040350 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f012c009de0 tx=0x7f012c009450 comp rx=0 tx=0).ready entity=mgr.14722 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:56:24.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.469+0000 7f01381ba700 1 -- 192.168.123.105:0/161521784 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0114005cc0 con 0x7f0130083640
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.470+0000 7f0126ffd700 1 -- 192.168.123.105:0/161521784 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1827 (secure 0 0 0) 0x7f0128027020 con 0x7f0130083640
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:e12
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1)
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:root 0
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {}
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39
2026-03-10T08:56:24.470 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:in 0
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289}
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:failed
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:damaged
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:stopped
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3]
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:balancer
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons:
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:56:24.471 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.472+0000 7f0124ff9700 1 -- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f011c03de90 msgr2=0x7f011c040350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.472+0000 7f0124ff9700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f011c03de90 0x7f011c040350 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f012c009de0 tx=0x7f012c009450 comp rx=0 tx=0).stop
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 -- 192.168.123.105:0/161521784 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 msgr2=0x7f01301b30f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 0x7f01301b30f0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f0128010e90 tx=0x7f0128010f70 comp rx=0 tx=0).stop
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 -- 192.168.123.105:0/161521784 shutdown_connections
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f011c03de90 0x7f011c040350 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0130072b50 0x7f0130083100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 --2- 192.168.123.105:0/161521784 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0130083640 0x7f01301b30f0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 -- 192.168.123.105:0/161521784 >> 192.168.123.105:0/161521784 conn(0x7f013006dae0 msgr2=0x7f013006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 -- 192.168.123.105:0/161521784 shutdown_connections
2026-03-10T08:56:24.473 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.473+0000 7f0124ff9700 1 -- 192.168.123.105:0/161521784 wait complete.
2026-03-10T08:56:24.474 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12
2026-03-10T08:56:24.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 -- 192.168.123.105:0/3866118604 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00075a10 msgr2=0x7f6f00077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:56:24.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 --2- 192.168.123.105:0/3866118604 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00075a10 0x7f6f00077ea0 secure :-1 s=READY pgs=341 cs=0 l=1 rev1=1 crypto rx=0x7f6ef8009230 tx=0x7f6ef8009260 comp rx=0 tx=0).stop
2026-03-10T08:56:24.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 -- 192.168.123.105:0/3866118604 shutdown_connections
2026-03-10T08:56:24.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 --2- 192.168.123.105:0/3866118604 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00075a10 0x7f6f00077ea0 unknown :-1 s=CLOSED pgs=341 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 --2- 192.168.123.105:0/3866118604 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f00072b20 0x7f6f00072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:56:24.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 -- 192.168.123.105:0/3866118604 >> 192.168.123.105:0/3866118604 conn(0x7f6f0006daa0 msgr2=0x7f6f0006ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 -- 192.168.123.105:0/3866118604 shutdown_connections
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 -- 192.168.123.105:0/3866118604 wait complete.
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.553+0000 7f6f04920700 1 Processor -- start
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6f04920700 1 -- start start
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6f04920700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 0x7f6f00081520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6f04920700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f00081a60 0x7f6f0012e170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6f04920700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f00081f70 con 0x7f6f00072b20
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6f04920700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f000820e0 con 0x7f6f00081a60
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 0x7f6f00081520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 0x7f6f00081520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43066/0 (socket says 192.168.123.105:43066) 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 -- 192.168.123.105:0/2856039270 learned_addr learned my addr 192.168.123.105:0/2856039270 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efe59c700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f00081a60 0x7f6f0012e170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 -- 192.168.123.105:0/2856039270 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f00081a60 msgr2=0x7f6f0012e170 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f00081a60 0x7f6f0012e170 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 -- 192.168.123.105:0/2856039270 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ef8008ee0 con 0x7f6f00072b20 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6efed9d700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 0x7f6f00081520 secure :-1 s=READY pgs=342 cs=0 l=1 rev1=1 crypto rx=0x7f6ef0007c00 tx=0x7f6ef0007f10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ef0010040 con 0x7f6f00072b20 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.554+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f0012e710 con 0x7f6f00072b20 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.555+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f0012ec60 con 0x7f6f00072b20 2026-03-10T08:56:24.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.555+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ef0015470 con 0x7f6f00072b20 2026-03-10T08:56:24.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.555+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f6ef00145e0 con 0x7f6f00072b20 2026-03-10T08:56:24.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.557+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 27) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f6ef0014740 con 0x7f6f00072b20 2026-03-10T08:56:24.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.557+0000 7f6ee7fff700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f6ee803def0 0x7f6ee80403b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:24.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.557+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f6ef0054980 con 0x7f6f00072b20 2026-03-10T08:56:24.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.557+0000 7f6ee5ffb700 1 -- 192.168.123.105:0/2856039270 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6eec005320 con 0x7f6f00072b20 2026-03-10T08:56:24.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.558+0000 7f6efe59c700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f6ee803def0 0x7f6ee80403b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:24.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.558+0000 7f6efe59c700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f6ee803def0 0x7f6ee80403b0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f6ef8009200 tx=0x7f6ef8010d30 comp rx=0 tx=0).ready entity=mgr.14722 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:24.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.561+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f6ef0019080 con 0x7f6f00072b20 2026-03-10T08:56:24.721 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.720+0000 7f6ee5ffb700 1 -- 192.168.123.105:0/2856039270 --> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6eec000bf0 con 0x7f6ee803def0 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='client.14748 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='client.24541 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: mgrmap e27: vm05.rxwgjc(active, since 3s) 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='client.24545 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='mgr.14722 
192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/2161767562' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:24.721 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:24 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/161521784' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "mgr" 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "2/23 daemons upgraded", 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "message": "", 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:56:24.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.724+0000 7f6ee7fff700 1 -- 192.168.123.105:0/2856039270 
<== mgr.14722 v2:192.168.123.105:6800/2453972605 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7f6eec000bf0 con 0x7f6ee803def0 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f6ee803def0 msgr2=0x7f6ee80403b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f6ee803def0 0x7f6ee80403b0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f6ef8009200 tx=0x7f6ef8010d30 comp rx=0 tx=0).stop 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 msgr2=0x7f6f00081520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 0x7f6f00081520 secure :-1 s=READY pgs=342 cs=0 l=1 rev1=1 crypto rx=0x7f6ef0007c00 tx=0x7f6ef0007f10 comp rx=0 tx=0).stop 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 shutdown_connections 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f6ee803def0 0x7f6ee80403b0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.728 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6f00072b20 0x7f6f00081520 unknown :-1 s=CLOSED pgs=342 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 --2- 192.168.123.105:0/2856039270 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6f00081a60 0x7f6f0012e170 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.727+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 >> 192.168.123.105:0/2856039270 conn(0x7f6f0006daa0 msgr2=0x7f6f0006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.728+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 shutdown_connections 2026-03-10T08:56:24.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.728+0000 7f6f04920700 1 -- 192.168.123.105:0/2856039270 wait complete. 
2026-03-10T08:56:24.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 -- 192.168.123.105:0/976344872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0460075a40 msgr2=0x7f0460077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:24.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 --2- 192.168.123.105:0/976344872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0460075a40 0x7f0460077ed0 secure :-1 s=READY pgs=343 cs=0 l=1 rev1=1 crypto rx=0x7f045800d3e0 tx=0x7f045800d6f0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 -- 192.168.123.105:0/976344872 shutdown_connections 2026-03-10T08:56:24.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 --2- 192.168.123.105:0/976344872 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0460075a40 0x7f0460077ed0 unknown :-1 s=CLOSED pgs=343 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 --2- 192.168.123.105:0/976344872 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0460072b50 0x7f0460072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 -- 192.168.123.105:0/976344872 >> 192.168.123.105:0/976344872 conn(0x7f046006dae0 msgr2=0x7f046006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:24.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 -- 192.168.123.105:0/976344872 shutdown_connections 2026-03-10T08:56:24.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.836+0000 7f0467b82700 1 -- 192.168.123.105:0/976344872 wait 
complete. 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.837+0000 7f0467b82700 1 Processor -- start 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.837+0000 7f0467b82700 1 -- start start 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.837+0000 7f0467b82700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0460072b50 0x7f04600830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.837+0000 7f0467b82700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 0x7f046012e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.837+0000 7f0467b82700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0460083af0 con 0x7f04600835e0 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.837+0000 7f0467b82700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0460083c60 con 0x7f0460072b50 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 0x7f046012e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 0x7f046012e3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:43090/0 (socket says 192.168.123.105:43090) 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 -- 192.168.123.105:0/1772582729 learned_addr learned my addr 192.168.123.105:0/1772582729 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046591e700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0460072b50 0x7f04600830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 -- 192.168.123.105:0/1772582729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0460072b50 msgr2=0x7f04600830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0460072b50 0x7f04600830a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 -- 192.168.123.105:0/1772582729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f045800d090 con 0x7f04600835e0 2026-03-10T08:56:24.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.838+0000 7f046511d700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 0x7f046012e3f0 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f0458003fa0 tx=0x7f0458009bc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.839+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0458021070 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.839+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f046012e930 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.839+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f046012ee20 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.840+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f045800b5f0 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.840+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0458010040 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.840+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 27) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f045801db30 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.840+0000 7f0456ffd700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f044c03dea0 0x7f044c040360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:24.841 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.841+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f04580556f0 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.841+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0444005320 con 0x7f04600835e0 2026-03-10T08:56:24.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.841+0000 7f046591e700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f044c03dea0 0x7f044c040360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:24.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.842+0000 7f046591e700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f044c03dea0 0x7f044c040360 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f046012a360 tx=0x7f045c00c040 comp rx=0 tx=0).ready entity=mgr.14722 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:24.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:24.844+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f0458009e60 con 0x7f04600835e0 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='client.14748 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:56:24 vm08.local ceph-mon[57559]: from='client.24541 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: mgrmap e27: vm05.rxwgjc(active, since 3s) 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='client.24545 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/2161767562' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:24.960 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:24 vm08.local ceph-mon[57559]: from='client.? 
192.168.123.105:0/161521784' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:56:25.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.120+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f0444005190 con 0x7f04600835e0 2026-03-10T08:56:25.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.127+0000 7f0456ffd700 1 -- 192.168.123.105:0/1772582729 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f0458025370 con 0x7f04600835e0 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.131+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f044c03dea0 msgr2=0x7f044c040360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.131+0000 7f0467b82700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f044c03dea0 0x7f044c040360 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f046012a360 tx=0x7f045c00c040 comp rx=0 tx=0).stop 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 msgr2=0x7f046012e3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 0x7f046012e3f0 secure :-1 
s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f0458003fa0 tx=0x7f0458009bc0 comp rx=0 tx=0).stop 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 shutdown_connections 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:6800/2453972605,v1:192.168.123.105:6801/2453972605] conn(0x7f044c03dea0 0x7f044c040360 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0460072b50 0x7f04600830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 --2- 192.168.123.105:0/1772582729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f04600835e0 0x7f046012e3f0 unknown :-1 s=CLOSED pgs=344 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:25.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 >> 192.168.123.105:0/1772582729 conn(0x7f046006dae0 msgr2=0x7f046006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:25.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 shutdown_connections 2026-03-10T08:56:25.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:25.132+0000 7f0467b82700 1 -- 192.168.123.105:0/1772582729 wait complete. 
2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: pgmap v5: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='client.14760 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='client.? 192.168.123.105:0/1772582729' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: Standby manager daemon vm08.rpongu started 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='mgr.? 192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='mgr.? 
192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:25.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:25 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: pgmap v5: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='client.14760 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='client.? 192.168.123.105:0/1772582729' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: Standby manager daemon vm08.rpongu started 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='mgr.? 
192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.108:0/2353496561' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:26.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:25 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: [10/Mar/2026:08:56:25] ENGINE Bus STARTING 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: [10/Mar/2026:08:56:25] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: [10/Mar/2026:08:56:25] ENGINE Client ('192.168.123.105', 53596) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: [10/Mar/2026:08:56:25] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: [10/Mar/2026:08:56:25] ENGINE Bus STARTED 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: mgrmap e28: vm05.rxwgjc(active, since 5s), standbys: vm08.rpongu 2026-03-10T08:56:26.792 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:26.793 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:26.793 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:26 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:25] ENGINE Bus STARTING 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:25] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:25] ENGINE Client ('192.168.123.105', 53596) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:25] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:25] ENGINE Bus STARTED 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: mgrmap e28: vm05.rxwgjc(active, since 5s), standbys: vm08.rpongu 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:56:27.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:26 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:28.012 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:27 vm05.local ceph-mon[49713]: pgmap v6: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail 2026-03-10T08:56:28.220 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:27 vm08.local ceph-mon[57559]: pgmap v6: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.conf 
2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: pgmap v7: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 36 KiB/s rd, 488 KiB/s wr, 91 op/s 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:29 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:29.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: pgmap v7: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 36 KiB/s rd, 488 KiB/s wr, 91 op/s 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:29.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:29 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:30 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:30 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:30 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:30 vm05.local ceph-mon[49713]: Reconfiguring prometheus.vm05 (dependencies changed)... 
2026-03-10T08:56:30.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:30 vm05.local ceph-mon[49713]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T08:56:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:30 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:30 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:30 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:30 vm08.local ceph-mon[57559]: Reconfiguring prometheus.vm05 (dependencies changed)... 2026-03-10T08:56:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:30 vm08.local ceph-mon[57559]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T08:56:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:31 vm05.local ceph-mon[49713]: pgmap v8: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 28 KiB/s rd, 378 KiB/s wr, 70 op/s 2026-03-10T08:56:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:31 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:31 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:31 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:31 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:31.803 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:31 vm08.local ceph-mon[57559]: pgmap v8: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 28 KiB/s rd, 378 KiB/s wr, 70 op/s 2026-03-10T08:56:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:31 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:31 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:31 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:31 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:33.192 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:33.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: Upgrade: Updating mgr.vm08.rpongu 2026-03-10T08:56:33.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:33.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:33.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:56:33.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:33.193 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:32 vm08.local ceph-mon[57559]: Deploying daemon mgr.vm08.rpongu on vm08 2026-03-10T08:56:33.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T08:56:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: Upgrade: Updating mgr.vm08.rpongu 2026-03-10T08:56:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:56:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:56:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:32 vm05.local ceph-mon[49713]: Deploying daemon mgr.vm08.rpongu on vm08 2026-03-10T08:56:33.844 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:33 vm05.local ceph-mon[49713]: pgmap v9: 65 pgs: 65 active+clean; 2.9 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 29 KiB/s rd, 620 KiB/s wr, 103 op/s 2026-03-10T08:56:33.844 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:33 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:33.844 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:33 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:56:33.844 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:33 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:34.017 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:33 vm08.local ceph-mon[57559]: pgmap v9: 65 pgs: 65 active+clean; 2.9 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 29 KiB/s rd, 620 KiB/s wr, 103 op/s 2026-03-10T08:56:34.017 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:33 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:34.017 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:33 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:34.017 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:33 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: pgmap v10: 65 pgs: 65 active+clean; 2.9 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 26 KiB/s rd, 562 KiB/s wr, 94 op/s 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: from='mgr.14722 ' 
entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:35 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: pgmap v10: 65 pgs: 65 active+clean; 2.9 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 26 KiB/s rd, 562 KiB/s wr, 94 op/s 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:35 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:37.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:36 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:36 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: pgmap v11: 65 pgs: 65 active+clean; 2.9 GiB data, 11 GiB used, 109 GiB 
/ 120 GiB avail; 26 KiB/s rd, 562 KiB/s wr, 94 op/s 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: 
from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.rxwgjc"}]': finished 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.rpongu"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.rpongu"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.rpongu"}]': finished 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": 
"public_network"}]: dispatch 2026-03-10T08:56:37.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:37 vm05.local ceph-mon[49713]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:38.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: pgmap v11: 65 pgs: 65 active+clean; 2.9 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 26 KiB/s rd, 562 KiB/s wr, 94 op/s 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local 
ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.rxwgjc"}]': finished 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.rpongu"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.rpongu"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.rpongu"}]': finished 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:38.054 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:56:38.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:37 vm08.local ceph-mon[57559]: from='mgr.14722 192.168.123.105:0/374727733' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:38.596 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local systemd[1]: Stopping Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T08:56:38.596 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[49709]: 2026-03-10T08:56:38.549+0000 7f3359a03700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm05 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:56:38.596 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[49709]: 2026-03-10T08:56:38.549+0000 7f3359a03700 -1 mon.vm05@0(leader) e2 *** Got Signal Terminated *** 2026-03-10T08:56:38.857 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local podman[111507]: 2026-03-10 08:56:38.596037273 +0000 UTC m=+0.080936764 container died 4cb0e74c858492fe7a9e643719fa5d270c6249137a89ab206b1ffa2780d304e0 (image=quay.io/ceph/ceph:v18.2.1, 
name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, org.label-schema.schema-version=1.0, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, ceph=True, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T08:56:38.857 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local podman[111507]: 2026-03-10 08:56:38.62430767 +0000 UTC m=+0.109207162 container remove 4cb0e74c858492fe7a9e643719fa5d270c6249137a89ab206b1ffa2780d304e0 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_BRANCH=HEAD, RELEASE=HEAD) 2026-03-10T08:56:38.857 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local bash[111507]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05 2026-03-10T08:56:38.857 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service: Deactivated successfully. 2026-03-10T08:56:38.857 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local systemd[1]: Stopped Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T08:56:38.858 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service: Consumed 6.377s CPU time. 
2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:38 vm05.local systemd[1]: Starting Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local podman[111616]: 2026-03-10 08:56:39.125968033 +0000 UTC m=+0.032870177 container create cdc9176bec281ab9d1e08966187c6abbc4fba6e4bdaea6686cadac19f3f2f8b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local podman[111616]: 2026-03-10 08:56:39.178151754 +0000 UTC m=+0.085053898 container init cdc9176bec281ab9d1e08966187c6abbc4fba6e4bdaea6686cadac19f3f2f8b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.schema-version=1.0) 2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local podman[111616]: 2026-03-10 08:56:39.182815964 +0000 UTC m=+0.089718108 container start cdc9176bec281ab9d1e08966187c6abbc4fba6e4bdaea6686cadac19f3f2f8b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local bash[111616]: cdc9176bec281ab9d1e08966187c6abbc4fba6e4bdaea6686cadac19f3f2f8b2 2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local podman[111616]: 2026-03-10 08:56:39.114569241 +0000 UTC m=+0.021471385 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:56:39.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local systemd[1]: Started Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039. 
2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: pidfile_write: ignore empty --pid-file 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: load: jerasure load: lrc 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: RocksDB version: 7.9.2 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Git sha 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: DB SUMMARY 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: DB Session ID: I2SN8S5QWJL1B1927KEQ 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: CURRENT file: CURRENT 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: MANIFEST file: MANIFEST-000015 size: 775 Bytes 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm05/store.db 
dir, Total Num: 1, files: 000023.sst 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm05/store.db: 000021.log size: 2197575 ; 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.error_if_exists: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.create_if_missing: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.paranoid_checks: 1 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.env: 0x55e790c02dc0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.info_log: 0x55e79269b900 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.statistics: (nil) 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.use_fsync: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_log_file_size: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.allow_fallocate: 1 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.use_direct_reads: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.db_log_dir: 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local 
ceph-mon[111630]: rocksdb: Options.wal_dir: 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T08:56:39.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.write_buffer_manager: 0x55e79269f900 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T08:56:39.715 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.unordered_write: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.row_cache: None 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.wal_filter: None 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.avoid_flush_during_recovery: 0 
2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.two_write_queues: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.wal_compression: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.atomic_flush: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.log_readahead_size: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bgerror_resume_retry_interval: 
1000000 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_background_jobs: 2 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_background_compactions: -1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_subcompactions: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local 
ceph-mon[111630]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_open_files: -1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_background_flushes: -1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Compression algorithms supported: 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kZSTD supported: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kXpressCompression supported: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kBZip2Compression supported: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local 
ceph-mon[111630]: rocksdb: kLZ4Compression supported: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kZlibCompression supported: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: kSnappyCompression supported: 1 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T08:56:39.715 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000015 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.merge_operator: 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_filter: None 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e79269b580) 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_top_level_index_and_filter: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_type: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_index_type: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_shortening: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: checksum: 4 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: no_block_cache: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache: 0x55e7926be9b0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_name: BinnedLRUCache 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_options: 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: capacity : 536870912 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_shard_bits : 4 
2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: strict_capacity_limit : 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: high_pri_pool_ratio: 0.000 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_compressed: (nil) 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: persistent_cache: (nil) 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size: 4096 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size_deviation: 10 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_restart_interval: 16 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_block_restart_interval: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: metadata_block_size: 4096 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: partition_filters: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: use_delta_encoding: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: filter_policy: bloomfilter 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: whole_key_filtering: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: verify_compression: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: read_amp_bytes_per_bit: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: format_version: 5 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_index_compression: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_align: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_auto_readahead_size: 262144 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: prepopulate_block_cache: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout: initial_auto_readahead_size: 8192 2026-03-10T08:56:39.716 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression: NoCompression 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.num_levels: 7 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: 
Options.bottommost_compression_opts.level: 32767 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T08:56:39.716 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local 
ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.inplace_update_support: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.bloom_locality: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.max_successive_merges: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 
vm05.local ceph-mon[111630]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.ttl: 2592000 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.enable_blob_files: false 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.min_blob_size: 0 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T08:56:39.717 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 25, last_sequence is 8202, log_number is 21,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 21 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 21 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 05836416-2294-42f5-b375-8f9e69647089 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773132999323224, "job": 1, "event": "recovery_started", "wal_files": [21]} 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #21 mode 2 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:56:39 vm05.local ceph-mon[111630]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773132999349841, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 26, "file_size": 1983866, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8203, "largest_seqno": 8944, "table_properties": {"data_size": 1979345, "index_size": 2584, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 8413, "raw_average_key_size": 23, "raw_value_size": 1971326, "raw_average_value_size": 5537, "num_data_blocks": 125, "num_entries": 356, "num_filter_entries": 356, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773132999, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "05836416-2294-42f5-b375-8f9e69647089", "db_session_id": "I2SN8S5QWJL1B1927KEQ", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773132999350145, "job": 1, "event": "recovery_finished"} 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/version_set.cc:5047] Creating manifest 28 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm05/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e7926c0e00 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: DB pointer 0x55e7926d0000 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: starting mon.vm05 rank 0 at public addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] at bind addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon_data /var/lib/ceph/mon/ceph-vm05 fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** DB Stats ** 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 
2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats [default] ** 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L0 1/0 1.89 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 95.7 0.02 0.00 1 0.020 0 0 0.0 0.0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L6 1/0 6.09 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Sum 2/0 7.99 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 95.7 0.02 0.00 1 0.020 0 0 0.0 0.0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 95.7 0.02 0.00 1 0.020 0 0 0.0 0.0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats 
[default] ** 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 95.7 0.02 0.00 1 0.020 0 0 0.0 0.0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.1 total, 0.1 interval 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Flush(GB): cumulative 0.002, interval 0.002 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative compaction: 0.00 GB write, 37.69 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval compaction: 0.00 GB write, 37.69 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T08:56:39.718 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache BinnedLRUCache@0x55e7926be9b0#2 capacity: 512.00 MB usage: 46.20 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache entry stats(count,size,portion): DataBlock(3,18.23 KB,0.00347793%) FilterBlock(2,9.06 KB,0.00172853%) IndexBlock(2,18.91 KB,0.00360608%) Misc(1,0.00 KB,0%) 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???) 
e2 preinit fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).mds e12 new map 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).mds e12 print_map 2026-03-10T08:56:39.718 INFO:journalctl@ceph.mon.vm05.vm05.stdout: e12 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: legacy client fscid: 1 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Filesystem 'cephfs' (1) 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: fs_name cephfs 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: epoch 12 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: tableserver 0 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: root 0 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: session_timeout 60 2026-03-10T08:56:39.719 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: session_autoclose 300 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_file_size 1099511627776 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_xattr_size 65536 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: required_client_features {} 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: last_failure 0 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: last_failure_osd_epoch 39 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_mds 1 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: in 0 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: up {0=24289} 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: failed 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: damaged 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: stopped 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_pools [3] 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: metadata_pool 2 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: inline_data disabled 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: balancer 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: bal_rank_mask -1 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: standby_count_wanted 1 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: qdb_cluster leader: 0 members: 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 
[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Standby daemons: 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).osd e42 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-10T08:56:39.719 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:39 vm05.local ceph-mon[111630]: mon.vm05@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: pgmap v12: 65 pgs: 65 active+clean; 2.3 GiB data, 9.2 GiB used, 111 GiB / 120 GiB avail; 34 KiB/s rd, 925 KiB/s wr, 139 op/s 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: mon.vm05 calling monitor election 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: monmap epoch 2 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 
vm08.local ceph-mon[57559]: fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: last_changed 2026-03-10T08:51:26.309295+0000 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: created 2026-03-10T08:50:09.891602+0000 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: min_mon_release 18 (reef) 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: election_strategy: 1 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: mgrmap e28: vm05.rxwgjc(active, since 19s), standbys: vm08.rpongu 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: overall HEALTH_OK 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: mgrmap e29: vm05.rxwgjc(active, since 19s), standbys: vm08.rpongu 2026-03-10T08:56:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:40 vm08.local ceph-mon[57559]: from='mgr.14722 ' entity='' 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:56:40 vm05.local ceph-mon[111630]: pgmap v12: 65 pgs: 65 active+clean; 2.3 GiB data, 9.2 GiB used, 111 GiB / 120 GiB avail; 34 KiB/s rd, 925 KiB/s wr, 139 op/s 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: from='mgr.14768 192.168.123.108:0/3451109001' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: mon.vm05 calling monitor election 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: monmap epoch 2 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: last_changed 2026-03-10T08:51:26.309295+0000 2026-03-10T08:56:40.308 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: created 2026-03-10T08:50:09.891602+0000 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: min_mon_release 18 (reef) 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: election_strategy: 1 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: mgrmap e28: vm05.rxwgjc(active, since 19s), standbys: vm08.rpongu 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: overall HEALTH_OK 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: mgrmap e29: vm05.rxwgjc(active, since 19s), standbys: vm08.rpongu 2026-03-10T08:56:40.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:40 vm05.local ceph-mon[111630]: from='mgr.14722 ' entity='' 2026-03-10T08:56:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:44 vm08.local ceph-mon[57559]: from='mgr.? 
192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:56:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:44 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:56:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:44 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:56:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:44 vm08.local ceph-mon[57559]: from='mgr.? 192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:56:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:44 vm08.local ceph-mon[57559]: Standby manager daemon vm08.rpongu restarted 2026-03-10T08:56:45.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:44 vm08.local ceph-mon[57559]: Standby manager daemon vm08.rpongu started 2026-03-10T08:56:45.107 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:44 vm05.local ceph-mon[111630]: from='mgr.? 192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/crt"}]: dispatch 2026-03-10T08:56:45.108 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:44 vm05.local ceph-mon[111630]: from='mgr.? 192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T08:56:45.108 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:44 vm05.local ceph-mon[111630]: from='mgr.? 
192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.rpongu/key"}]: dispatch 2026-03-10T08:56:45.108 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:44 vm05.local ceph-mon[111630]: from='mgr.? 192.168.123.108:0/180457289' entity='mgr.vm08.rpongu' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T08:56:45.108 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:44 vm05.local ceph-mon[111630]: Standby manager daemon vm08.rpongu restarted 2026-03-10T08:56:45.108 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:44 vm05.local ceph-mon[111630]: Standby manager daemon vm08.rpongu started 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: mgrmap e30: vm05.rxwgjc(active, since 24s), standbys: vm08.rpongu 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: Active manager daemon vm05.rxwgjc restarted 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: Activating manager daemon vm05.rxwgjc 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: mgrmap e31: vm05.rxwgjc(active, starting, since 0.00806738s), standbys: vm08.rpongu 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 
2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 
1}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:56:46.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:45 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: mgrmap e30: vm05.rxwgjc(active, since 24s), standbys: vm08.rpongu 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: Active manager daemon vm05.rxwgjc restarted 2026-03-10T08:56:46.074 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: Activating manager daemon vm05.rxwgjc 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: mgrmap e31: vm05.rxwgjc(active, starting, since 0.00806738s), standbys: vm08.rpongu 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T08:56:46.074 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm05.rxwgjc", "id": "vm05.rxwgjc"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr metadata", "who": "vm08.rpongu", "id": "vm08.rpongu"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 
vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T08:56:46.074 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T08:56:47.019 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: Manager daemon vm05.rxwgjc is now available 2026-03-10T08:56:47.019 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:47.019 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:47.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:47.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:47.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:47.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:46 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:47.041 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T08:56:47.042 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: Manager daemon vm05.rxwgjc is now available 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/mirror_snapshot_schedule"}]: dispatch 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config 
rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:47.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:46 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.rxwgjc/trash_purge_schedule"}]: dispatch 2026-03-10T08:56:47.864 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:47 vm05.local ceph-mon[111630]: mgrmap e32: vm05.rxwgjc(active, since 1.07847s), standbys: vm08.rpongu 2026-03-10T08:56:47.864 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:47 vm05.local ceph-mon[111630]: pgmap v3: 65 pgs: 65 active+clean; 684 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-10T08:56:48.002 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:47 vm08.local ceph-mon[57559]: mgrmap e32: vm05.rxwgjc(active, since 1.07847s), standbys: vm08.rpongu 2026-03-10T08:56:48.002 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:47 vm08.local ceph-mon[57559]: pgmap v3: 65 pgs: 65 active+clean; 684 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-10T08:56:48.698 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T08:56:48.698 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-10T08:56:48.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:48 vm05.local ceph-mon[111630]: mgrmap e33: vm05.rxwgjc(active, since 2s), standbys: vm08.rpongu 2026-03-10T08:56:48.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:48 vm05.local ceph-mon[111630]: [10/Mar/2026:08:56:48] ENGINE Bus STARTING 2026-03-10T08:56:48.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:48 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:48.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:48 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:49.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:48 vm08.local ceph-mon[57559]: mgrmap e33: vm05.rxwgjc(active, since 2s), standbys: vm08.rpongu 2026-03-10T08:56:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:48 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:48] ENGINE Bus STARTING 2026-03-10T08:56:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:48 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:48 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: [10/Mar/2026:08:56:48] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: [10/Mar/2026:08:56:48] ENGINE Client ('192.168.123.105', 44324) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: [10/Mar/2026:08:56:48] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: [10/Mar/2026:08:56:48] ENGINE Bus STARTED 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local 
ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:50.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:50 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:48] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:48] ENGINE Client ('192.168.123.105', 44324) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:48] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: [10/Mar/2026:08:56:48] ENGINE Bus STARTED 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' 
entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:50 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: pgmap v5: 65 pgs: 65 active+clean; 684 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: mgrmap e34: vm05.rxwgjc(active, since 4s), standbys: vm08.rpongu 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:51 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:51.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: pgmap v5: 65 pgs: 65 active+clean; 684 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: mgrmap e34: vm05.rxwgjc(active, since 4s), standbys: vm08.rpongu 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": 
"osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:51.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:51 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:52.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.conf 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 
2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:52.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T08:56:53.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:52 vm08.local systemd[1]: Stopping Ceph mon.vm08 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T08:56:53.107 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:53.107 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:53.107 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: pgmap v6: 65 pgs: 65 active+clean; 684 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-10T08:56:53.107 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:53.107 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:56:53.403 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: Updating vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: Updating vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/config/ceph.client.admin.keyring 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: pgmap v6: 65 pgs: 65 active+clean; 684 MiB data, 4.6 GiB used, 115 GiB / 120 GiB avail 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: from='mgr.34104 ' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": 
"mon."}]: dispatch 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[57559]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08[57552]: 2026-03-10T08:56:53.107+0000 7fa1f4f6b700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm08 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08[57552]: 2026-03-10T08:56:53.107+0000 7fa1f4f6b700 -1 mon.vm08@1(peon) e2 *** Got Signal Terminated *** 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local podman[101203]: 2026-03-10 08:56:53.193359285 +0000 UTC m=+0.170672904 container died bca448418226372bdd96f0caa2aa7c76e8b8ef5852afe068b91b308ca3e55f2f (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0) 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local podman[101203]: 2026-03-10 08:56:53.214607236 +0000 UTC m=+0.191920855 container remove bca448418226372bdd96f0caa2aa7c76e8b8ef5852afe068b91b308ca3e55f2f (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, GIT_CLEAN=True, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD) 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local bash[101203]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm08.service: Deactivated successfully. 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local systemd[1]: Stopped Ceph mon.vm08 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T08:56:53.404 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm08.service: Consumed 3.859s CPU time. 
2026-03-10T08:56:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:56:53.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:53 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:56:53.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local systemd[1]: Starting Ceph mon.vm08 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local podman[101315]: 2026-03-10 08:56:53.817357073 +0000 UTC m=+0.053117766 container create 34546aa1422bdf812a785754331901e7c3c8a5f6e641aef0bf3d305d15f0cce6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local podman[101315]: 2026-03-10 08:56:53.863157452 +0000 UTC m=+0.098918134 container init 34546aa1422bdf812a785754331901e7c3c8a5f6e641aef0bf3d305d15f0cce6 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223) 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local podman[101315]: 2026-03-10 08:56:53.86704628 +0000 UTC m=+0.102806973 container start 34546aa1422bdf812a785754331901e7c3c8a5f6e641aef0bf3d305d15f0cce6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local bash[101315]: 
34546aa1422bdf812a785754331901e7c3c8a5f6e641aef0bf3d305d15f0cce6 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local podman[101315]: 2026-03-10 08:56:53.799471715 +0000 UTC m=+0.035232419 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local systemd[1]: Started Ceph mon.vm08 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: pidfile_write: ignore empty --pid-file 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: load: jerasure load: lrc 2026-03-10T08:56:54.082 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: RocksDB version: 7.9.2 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Git sha 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: DB SUMMARY 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: DB Session ID: 0W7YAWS703599HU22GNC 2026-03-10T08:56:54.083 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: CURRENT file: CURRENT 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: MANIFEST file: MANIFEST-000010 size: 668 Bytes 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm08/store.db dir, Total Num: 1, files: 000018.sst 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm08/store.db: 000016.log size: 6399074 ; 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.error_if_exists: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.create_if_missing: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.paranoid_checks: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.env: 0x55e8e6b16dc0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.info_log: 0x55e8e73e1900 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.statistics: (nil) 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.use_fsync: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_log_file_size: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.allow_fallocate: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: 
rocksdb: Options.use_direct_reads: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.db_log_dir: 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.wal_dir: 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: 
rocksdb: Options.write_buffer_manager: 0x55e8e73e5900 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.unordered_write: 0 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T08:56:54.083 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T08:56:54.084 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.row_cache: None 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.wal_filter: None 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.two_write_queues: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.wal_compression: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.atomic_flush: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.log_readahead_size: 0 2026-03-10T08:56:54.084 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_background_jobs: 2 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_background_compactions: -1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_subcompactions: 1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: 
Options.delayed_write_rate : 16777216 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_open_files: -1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_background_flushes: -1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Compression algorithms supported: 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local 
ceph-mon[101330]: rocksdb: kZSTD supported: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kXpressCompression supported: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kBZip2Compression supported: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kLZ4Compression supported: 1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kZlibCompression supported: 1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: kSnappyCompression supported: 1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000010 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.comparator: 
leveldb.BytewiseComparator 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.merge_operator: 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_filter: None 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e8e73e1580) 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks: 1 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T08:56:54.084 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_top_level_index_and_filter: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_type: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_index_type: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_shortening: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_hash_table_util_ratio: 0.750000 
2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: checksum: 4 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: no_block_cache: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache: 0x55e8e74049b0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_name: BinnedLRUCache 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_options: 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: capacity : 536870912 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_shard_bits : 4 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: strict_capacity_limit : 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: high_pri_pool_ratio: 0.000 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_compressed: (nil) 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: persistent_cache: (nil) 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size: 4096 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size_deviation: 10 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_restart_interval: 16 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_block_restart_interval: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_block_size: 4096 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: partition_filters: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: use_delta_encoding: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: filter_policy: bloomfilter 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: whole_key_filtering: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: verify_compression: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: read_amp_bytes_per_bit: 
0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: format_version: 5 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_index_compression: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_align: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_auto_readahead_size: 262144 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: prepopulate_block_cache: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: initial_auto_readahead_size: 8192 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression: NoCompression 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.num_levels: 7 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 
10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.window_bits: -14 
2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T08:56:54.085 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T08:56:54.086 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_sequential_skip_in_iterations: 8 
2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.inplace_update_support: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: 
Options.bloom_locality: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.max_successive_merges: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.ttl: 2592000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.enable_blob_files: false 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.min_blob_size: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: 
Options.blob_compression_type: NoCompression 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
2026-03-10T08:56:54.086 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 20, last_sequence is 8156, log_number is 16,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 16 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 16 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 12a5d0c8-1fb9-496b-8328-f41e14400618 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773133013923293, "job": 1, "event": "recovery_started", "wal_files": [16]} 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #16 mode 2 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773133013947869, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 21, "file_size": 3819344, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8161, "largest_seqno": 9391, "table_properties": {"data_size": 3812820, "index_size": 4070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 12853, "raw_average_key_size": 23, "raw_value_size": 3800606, "raw_average_value_size": 6973, "num_data_blocks": 193, "num_entries": 545, "num_filter_entries": 545, "num_deletions": 2, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773133013, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "12a5d0c8-1fb9-496b-8328-f41e14400618", "db_session_id": "0W7YAWS703599HU22GNC", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773133013948315, "job": 1, "event": "recovery_finished"} 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/version_set.cc:5047] Creating manifest 23 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm08/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e8e7406e00 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: DB pointer 0x55e8e7416000 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** DB Stats ** 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval stall: 
00:00:0.000 H:M:S, 0.0 percent 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L0 1/0 3.64 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 200.4 0.02 0.00 1 0.018 0 0 0.0 0.0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L6 1/0 6.09 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Sum 2/0 9.74 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 200.4 0.02 0.00 1 0.018 0 0 0.0 0.0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 200.4 0.02 0.00 1 0.018 0 0 0.0 0.0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 200.4 0.02 0.00 1 0.018 0 0 0.0 0.0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Flush(GB): cumulative 0.004, interval 0.004 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative compaction: 0.00 GB write, 74.43 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval compaction: 0.00 GB write, 74.43 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T08:56:54.087 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache BinnedLRUCache@0x55e8e74049b0#2 capacity: 512.00 MB usage: 5.61 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.9e-05 secs_since: 0 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,1.53 KB,0.000292063%) IndexBlock(1,4.08 KB,0.000777841%) Misc(1,0.00 KB,0%) 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: starting mon.vm08 rank 1 at public addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] at bind addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon_data /var/lib/ceph/mon/ceph-vm08 fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???) 
e2 preinit fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).mds e12 new map 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).mds e12 print_map 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: e12 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:56:54.087 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: legacy client fscid: 1 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Filesystem 'cephfs' (1) 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: fs_name cephfs 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: epoch 12 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: tableserver 0 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: root 0 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: session_timeout 60 2026-03-10T08:56:54.088 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: session_autoclose 300 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_file_size 1099511627776 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_xattr_size 65536 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: required_client_features {} 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: last_failure 0 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: last_failure_osd_epoch 39 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_mds 1 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: in 0 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: up {0=24289} 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: failed 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: damaged 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: stopped 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_pools [3] 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_pool 2 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: inline_data disabled 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: balancer 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: bal_rank_mask -1 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: standby_count_wanted 1 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: qdb_cluster leader: 0 members: 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Standby daemons: 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).osd e43 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-10T08:56:54.088 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:53 vm08.local ceph-mon[101330]: mon.vm08@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-10T08:56:55.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.235+0000 7fa0ddca7700 1 -- 192.168.123.105:0/1069248678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8072b50 msgr2=0x7fa0d8072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.235+0000 7fa0ddca7700 1 --2- 192.168.123.105:0/1069248678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8072b50 0x7fa0d8072f70 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa0c8009fd0 tx=0x7fa0c8009c50 comp rx=0 tx=0).stop 2026-03-10T08:56:55.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 -- 192.168.123.105:0/1069248678 shutdown_connections 2026-03-10T08:56:55.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 --2- 192.168.123.105:0/1069248678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0d8075a40 0x7fa0d8077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 --2- 192.168.123.105:0/1069248678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8072b50 0x7fa0d8072f70 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 -- 192.168.123.105:0/1069248678 >> 192.168.123.105:0/1069248678 conn(0x7fa0d806dae0 msgr2=0x7fa0d806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:55.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 -- 
192.168.123.105:0/1069248678 shutdown_connections 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 -- 192.168.123.105:0/1069248678 wait complete. 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 Processor -- start 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.237+0000 7fa0ddca7700 1 -- start start 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0ddca7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0d8072b50 0x7fa0d8082de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0ddca7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 0x7fa0d8083320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0ddca7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa0d8083940 con 0x7fa0d8075a40 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0ddca7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa0d8083a80 con 0x7fa0d8072b50 2026-03-10T08:56:55.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 0x7fa0d8083320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 0x7fa0d8083320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39618/0 (socket says 192.168.123.105:39618) 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 -- 192.168.123.105:0/99547555 learned_addr learned my addr 192.168.123.105:0/99547555 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 -- 192.168.123.105:0/99547555 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0d8072b50 msgr2=0x7fa0d8082de0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0d8072b50 0x7fa0d8082de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 -- 192.168.123.105:0/99547555 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0c80097e0 con 0x7fa0d8075a40 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.238+0000 7fa0d6ffd700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 0x7fa0d8083320 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fa0d0007f80 tx=0x7fa0d000d460 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:55.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.240+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mon.0 
v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0d0017070 con 0x7fa0d8075a40 2026-03-10T08:56:55.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.240+0000 7fa0ddca7700 1 -- 192.168.123.105:0/99547555 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0d812e580 con 0x7fa0d8075a40 2026-03-10T08:56:55.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.240+0000 7fa0ddca7700 1 -- 192.168.123.105:0/99547555 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa0d812ead0 con 0x7fa0d8075a40 2026-03-10T08:56:55.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.242+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa0d000f040 con 0x7fa0d8075a40 2026-03-10T08:56:55.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.243+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0d0013600 con 0x7fa0d8075a40 2026-03-10T08:56:55.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.243+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa0d0013820 con 0x7fa0d8075a40 2026-03-10T08:56:55.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.243+0000 7fa0d4ff9700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0c0077be0 0x7fa0c007a0a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.243+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 
6222+0+0 (secure 0 0 0) 0x7fa0d009a7f0 con 0x7fa0d8075a40 2026-03-10T08:56:55.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.243+0000 7fa0d77fe700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0c0077be0 0x7fa0c007a0a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.244+0000 7fa0ddca7700 1 -- 192.168.123.105:0/99547555 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa0c4005320 con 0x7fa0d8075a40 2026-03-10T08:56:55.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.244+0000 7fa0d77fe700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0c0077be0 0x7fa0c007a0a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fa0c8005210 tx=0x7fa0c80058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:55.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.248+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa0d0062e30 con 0x7fa0d8075a40 2026-03-10T08:56:55.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.415+0000 7fa0ddca7700 1 -- 192.168.123.105:0/99547555 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa0c4000bf0 con 0x7fa0c0077be0 2026-03-10T08:56:55.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.417+0000 7fa0d4ff9700 1 -- 192.168.123.105:0/99547555 <== mgr.34104 
v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7fa0c4000bf0 con 0x7fa0c0077be0 2026-03-10T08:56:55.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.420+0000 7fa0be7fc700 1 -- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0c0077be0 msgr2=0x7fa0c007a0a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.421+0000 7fa0be7fc700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0c0077be0 0x7fa0c007a0a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fa0c8005210 tx=0x7fa0c80058e0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.421+0000 7fa0be7fc700 1 -- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 msgr2=0x7fa0d8083320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.421+0000 7fa0be7fc700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 0x7fa0d8083320 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fa0d0007f80 tx=0x7fa0d000d460 comp rx=0 tx=0).stop 2026-03-10T08:56:55.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.422+0000 7fa0be7fc700 1 -- 192.168.123.105:0/99547555 shutdown_connections 2026-03-10T08:56:55.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.422+0000 7fa0be7fc700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0c0077be0 0x7fa0c007a0a0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.422 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.422+0000 7fa0be7fc700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0d8072b50 0x7fa0d8082de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.422+0000 7fa0be7fc700 1 --2- 192.168.123.105:0/99547555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0d8075a40 0x7fa0d8083320 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.422+0000 7fa0be7fc700 1 -- 192.168.123.105:0/99547555 >> 192.168.123.105:0/99547555 conn(0x7fa0d806dae0 msgr2=0x7fa0d806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:55.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.422+0000 7fa0be7fc700 1 -- 192.168.123.105:0/99547555 shutdown_connections 2026-03-10T08:56:55.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.423+0000 7fa0be7fc700 1 -- 192.168.123.105:0/99547555 wait complete. 
2026-03-10T08:56:55.438 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:56:55.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 -- 192.168.123.105:0/540780140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 msgr2=0x7f96c810a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 --2- 192.168.123.105:0/540780140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c810a1c0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f96b8009b00 tx=0x7f96b8009e10 comp rx=0 tx=0).stop 2026-03-10T08:56:55.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 -- 192.168.123.105:0/540780140 shutdown_connections 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 --2- 192.168.123.105:0/540780140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96c810a700 0x7f96c810cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 --2- 192.168.123.105:0/540780140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c810a1c0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 -- 192.168.123.105:0/540780140 >> 192.168.123.105:0/540780140 conn(0x7f96c806dda0 msgr2=0x7f96c8070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 -- 192.168.123.105:0/540780140 shutdown_connections 2026-03-10T08:56:55.552 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.551+0000 7f96ccace700 1 -- 192.168.123.105:0/540780140 wait complete. 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96ccace700 1 Processor -- start 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96ccace700 1 -- start start 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96ccace700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c8116a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96ccace700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96c810a700 0x7f96c8116fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96ccace700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96c81175e0 con 0x7f96c8107d90 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96ccace700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96c81b3330 con 0x7f96c810a700 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96c659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c8116a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96c659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c8116a80 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39624/0 (socket says 192.168.123.105:39624) 2026-03-10T08:56:55.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96c659c700 1 -- 192.168.123.105:0/1357505853 learned_addr learned my addr 192.168.123.105:0/1357505853 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:55.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96c659c700 1 -- 192.168.123.105:0/1357505853 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96c810a700 msgr2=0x7f96c8116fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96c659c700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96c810a700 0x7f96c8116fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.552+0000 7f96c659c700 1 -- 192.168.123.105:0/1357505853 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96b80097e0 con 0x7f96c8107d90 2026-03-10T08:56:55.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.553+0000 7f96c659c700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c8116a80 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f96b80048c0 tx=0x7f96b80049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:55.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.557+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f96b801d070 con 
0x7f96c8107d90 2026-03-10T08:56:55.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.557+0000 7f96ccace700 1 -- 192.168.123.105:0/1357505853 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96c81b34d0 con 0x7f96c8107d90 2026-03-10T08:56:55.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.557+0000 7f96ccace700 1 -- 192.168.123.105:0/1357505853 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96c81b3970 con 0x7f96c8107d90 2026-03-10T08:56:55.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.558+0000 7f96ccace700 1 -- 192.168.123.105:0/1357505853 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f96c8110c60 con 0x7f96c8107d90 2026-03-10T08:56:55.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.561+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f96b800bc50 con 0x7f96c8107d90 2026-03-10T08:56:55.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.561+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f96b800e680 con 0x7f96c8107d90 2026-03-10T08:56:55.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.562+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f96b800f460 con 0x7f96c8107d90 2026-03-10T08:56:55.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.562+0000 7f96bffff700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f96b4077780 0x7f96b4079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T08:56:55.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.562+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f96b809b750 con 0x7f96c8107d90 2026-03-10T08:56:55.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.563+0000 7f96bdbff700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f96b4077780 0x7f96b4079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.564+0000 7f96bdbff700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f96b4077780 0x7f96b4079c40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f96b0005fd0 tx=0x7f96b00094b0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:55.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.568+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f96b8063e40 con 0x7f96c8107d90 2026-03-10T08:56:55.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.785+0000 7f96ccace700 1 -- 192.168.123.105:0/1357505853 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f96c80611d0 con 0x7f96b4077780 2026-03-10T08:56:55.788 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.788+0000 7f96bffff700 1 -- 192.168.123.105:0/1357505853 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 
(secure 0 0 0) 0x7f96c80611d0 con 0x7f96b4077780 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 -- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f96b4077780 msgr2=0x7f96b4079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f96b4077780 0x7f96b4079c40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f96b0005fd0 tx=0x7f96b00094b0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 -- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 msgr2=0x7f96c8116a80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c8116a80 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f96b80048c0 tx=0x7f96b80049a0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 -- 192.168.123.105:0/1357505853 shutdown_connections 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f96b4077780 0x7f96b4079c40 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 --2- 192.168.123.105:0/1357505853 
>> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96c8107d90 0x7f96c8116a80 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 --2- 192.168.123.105:0/1357505853 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96c810a700 0x7f96c8116fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 -- 192.168.123.105:0/1357505853 >> 192.168.123.105:0/1357505853 conn(0x7f96c806dda0 msgr2=0x7f96c810c110 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:55.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 -- 192.168.123.105:0/1357505853 shutdown_connections 2026-03-10T08:56:55.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.792+0000 7f96bd3fe700 1 -- 192.168.123.105:0/1357505853 wait complete. 
2026-03-10T08:56:55.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.890+0000 7fedff3db700 1 -- 192.168.123.105:0/2911733989 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8072b50 msgr2=0x7fedf8072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:55.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.890+0000 7fedff3db700 1 --2- 192.168.123.105:0/2911733989 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8072b50 0x7fedf8072f70 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fedf400bc70 tx=0x7fedf400bf80 comp rx=0 tx=0).stop 2026-03-10T08:56:55.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.891+0000 7fedff3db700 1 -- 192.168.123.105:0/2911733989 shutdown_connections 2026-03-10T08:56:55.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.891+0000 7fedff3db700 1 --2- 192.168.123.105:0/2911733989 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.891+0000 7fedff3db700 1 --2- 192.168.123.105:0/2911733989 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8072b50 0x7fedf8072f70 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.891+0000 7fedff3db700 1 -- 192.168.123.105:0/2911733989 >> 192.168.123.105:0/2911733989 conn(0x7fedf806dae0 msgr2=0x7fedf806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:55.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.891+0000 7fedff3db700 1 -- 192.168.123.105:0/2911733989 shutdown_connections 2026-03-10T08:56:55.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.891+0000 7fedff3db700 1 -- 192.168.123.105:0/2911733989 
wait complete. 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: mon.vm05 calling monitor election 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: mon.vm08 calling monitor election 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: monmap epoch 3 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:55.893 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: last_changed 2026-03-10T08:56:54.708592+0000 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: created 2026-03-10T08:50:09.891602+0000 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: min_mon_release 19 (squid) 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: election_strategy: 1 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 
2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: mgrmap e34: vm05.rxwgjc(active, since 9s), standbys: vm08.rpongu 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: overall HEALTH_OK 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:55.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:55 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:55.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedff3db700 1 Processor -- start 2026-03-10T08:56:55.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedff3db700 1 -- start start 2026-03-10T08:56:55.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedff3db700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8083120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.897 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedff3db700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 0x7fedf812e470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedff3db700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fedf8083b70 con 0x7fedf8083660 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedff3db700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fedf8083ce0 con 0x7fedf8075a40 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedfc976700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 0x7fedf812e470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedfc976700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 0x7fedf812e470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39638/0 (socket says 192.168.123.105:39638) 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.892+0000 7fedfc976700 1 -- 192.168.123.105:0/3420760614 learned_addr learned my addr 192.168.123.105:0/3420760614 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.893+0000 7fedfd177700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8083120 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfd177700 1 -- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 msgr2=0x7fedf8083120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 13 2026-03-10T08:56:55.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfd177700 1 -- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 msgr2=0x7fedf8083120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T08:56:55.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfd177700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8083120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T08:56:55.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfd177700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8083120 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:56:55.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfc976700 1 -- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 msgr2=0x7fedf8083120 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:56:55.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfc976700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8083120 
unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:55.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.896+0000 7fedfc976700 1 -- 192.168.123.105:0/3420760614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fedf400b920 con 0x7fedf8083660 2026-03-10T08:56:55.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.897+0000 7fedfc976700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 0x7fedf812e470 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fedf000b330 tx=0x7fedf000b6f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:55.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.901+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fedf000f660 con 0x7fedf8083660 2026-03-10T08:56:55.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.901+0000 7fedff3db700 1 -- 192.168.123.105:0/3420760614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fedf812ea10 con 0x7fedf8083660 2026-03-10T08:56:55.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.901+0000 7fedff3db700 1 -- 192.168.123.105:0/3420760614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fedf812ef60 con 0x7fedf8083660 2026-03-10T08:56:55.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.902+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fedf000fca0 con 0x7fedf8083660 2026-03-10T08:56:55.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.902+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mon.0 
v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fedf000e3e0 con 0x7fedf8083660 2026-03-10T08:56:55.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.903+0000 7fedff3db700 1 -- 192.168.123.105:0/3420760614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feddc005320 con 0x7fedf8083660 2026-03-10T08:56:55.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.907+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fedf0018480 con 0x7fedf8083660 2026-03-10T08:56:55.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.908+0000 7fedee7fc700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fede4077b10 0x7fede4079fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:55.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.908+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fedf0013070 con 0x7fedf8083660 2026-03-10T08:56:55.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.908+0000 7fedfd177700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fede4077b10 0x7fede4079fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:55.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.909+0000 7fedfd177700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fede4077b10 0x7fede4079fd0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto 
rx=0x7fedf400b890 tx=0x7fedf400b800 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:55.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:55.909+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fedf0062f70 con 0x7fedf8083660 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: mon.vm05 calling monitor election 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: mon.vm08 calling monitor election 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: monmap epoch 3 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: fsid 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: last_changed 2026-03-10T08:56:54.708592+0000 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: created 2026-03-10T08:50:09.891602+0000 2026-03-10T08:56:56.053 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: min_mon_release 19 (squid) 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: election_strategy: 1 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: mgrmap e34: vm05.rxwgjc(active, since 9s), standbys: vm08.rpongu 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: overall HEALTH_OK 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:56.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:55 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:56:56.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.069+0000 7fedff3db700 1 -- 192.168.123.105:0/3420760614 --> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7feddc000bf0 con 0x7fede4077b10 2026-03-10T08:56:56.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.078+0000 7fedee7fc700 1 -- 192.168.123.105:0/3420760614 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7feddc000bf0 con 0x7fede4077b10 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (5m) 6s ago 6m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (6m) 6s ago 6m 8594k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (5m) 7s ago 5m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (6m) 6s ago 6m 7407k - 18.2.1 5be31c24972a f9c585addcea 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 7s ago 5m 7415k - 18.2.1 5be31c24972a f0b88fc7f552 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (5m) 6s ago 5m 88.2M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (4m) 6s ago 4m 243M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (4m) 6s ago 4m 17.3M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (4m) 7s ago 3m 19.6M - 18.2.1 
5be31c24972a b9e55b365719 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (4m) 7s ago 4m 15.6M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (52s) 6s ago 6m 586M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (23s) 7s ago 5m 490M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (16s) 6s ago 6m 40.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 starting - - - 2048M 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (6m) 6s ago 6m 14.4M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (5m) 7s ago 5m 16.0M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (5m) 6s ago 5m 350M 4096M 18.2.1 5be31c24972a 2a2aeea5e3d4 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (5m) 6s ago 5m 376M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (4m) 6s ago 4m 318M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (4m) 7s ago 4m 439M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (4m) 7s ago 4m 412M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (4m) 7s ago 4m 320M 4096M 18.2.1 5be31c24972a 21583bb58d82 
2026-03-10T08:56:56.080 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (25s) 6s ago 5m 49.5M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T08:56:56.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.084+0000 7fede3fff700 1 -- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fede4077b10 msgr2=0x7fede4079fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.084+0000 7fede3fff700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fede4077b10 0x7fede4079fd0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fedf400b890 tx=0x7fedf400b800 comp rx=0 tx=0).stop 2026-03-10T08:56:56.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.085+0000 7fede3fff700 1 -- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 msgr2=0x7fedf812e470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.085+0000 7fede3fff700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 0x7fedf812e470 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fedf000b330 tx=0x7fedf000b6f0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.085+0000 7fede3fff700 1 -- 192.168.123.105:0/3420760614 shutdown_connections 2026-03-10T08:56:56.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.086+0000 7fede3fff700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fede4077b10 0x7fede4079fd0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.086 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.086+0000 7fede3fff700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fedf8075a40 0x7fedf8083120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.086+0000 7fede3fff700 1 --2- 192.168.123.105:0/3420760614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fedf8083660 0x7fedf812e470 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.086+0000 7fede3fff700 1 -- 192.168.123.105:0/3420760614 >> 192.168.123.105:0/3420760614 conn(0x7fedf806dae0 msgr2=0x7fedf806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:56.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.086+0000 7fede3fff700 1 -- 192.168.123.105:0/3420760614 shutdown_connections 2026-03-10T08:56:56.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.086+0000 7fede3fff700 1 -- 192.168.123.105:0/3420760614 wait complete. 
2026-03-10T08:56:56.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.183+0000 7fe32bb91700 1 -- 192.168.123.105:0/3197876725 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324075a40 msgr2=0x7fe324077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.183+0000 7fe32bb91700 1 --2- 192.168.123.105:0/3197876725 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324075a40 0x7fe324077ed0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fe31c00d3f0 tx=0x7fe31c00d700 comp rx=0 tx=0).stop 2026-03-10T08:56:56.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.186+0000 7fe32bb91700 1 -- 192.168.123.105:0/3197876725 shutdown_connections 2026-03-10T08:56:56.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.186+0000 7fe32bb91700 1 --2- 192.168.123.105:0/3197876725 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324075a40 0x7fe324077ed0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.187+0000 7fe32bb91700 1 --2- 192.168.123.105:0/3197876725 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe324072b50 0x7fe324072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.188+0000 7fe32bb91700 1 -- 192.168.123.105:0/3197876725 >> 192.168.123.105:0/3197876725 conn(0x7fe32406dae0 msgr2=0x7fe32406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:56.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.188+0000 7fe32bb91700 1 -- 192.168.123.105:0/3197876725 shutdown_connections 2026-03-10T08:56:56.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.188+0000 7fe32bb91700 1 -- 192.168.123.105:0/3197876725 
wait complete. 2026-03-10T08:56:56.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.189+0000 7fe32bb91700 1 Processor -- start 2026-03-10T08:56:56.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.189+0000 7fe32bb91700 1 -- start start 2026-03-10T08:56:56.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.189+0000 7fe32bb91700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324072b50 0x7fe324082f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:56.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.189+0000 7fe32bb91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:56.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.189+0000 7fe32bb91700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe32412e710 con 0x7fe324072b50 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.189+0000 7fe32bb91700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe32412e880 con 0x7fe3240834b0 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:56708/0 (socket says 192.168.123.105:56708) 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 -- 192.168.123.105:0/3484103304 learned_addr learned my addr 192.168.123.105:0/3484103304 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32992d700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324072b50 0x7fe324082f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 -- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 msgr2=0x7fe324083930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 -- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 msgr2=0x7fe324083930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32912c700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32992d700 1 -- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 msgr2=0x7fe324083930 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32992d700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32992d700 1 -- 192.168.123.105:0/3484103304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe31c007ed0 con 0x7fe324072b50 2026-03-10T08:56:56.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.191+0000 7fe32992d700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324072b50 0x7fe324082f70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fe32000d8d0 tx=0x7fe32000dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:56.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.193+0000 7fe31affd700 1 -- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3200098e0 con 0x7fe324072b50 2026-03-10T08:56:56.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.193+0000 7fe31affd700 1 -- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe320010460 con 0x7fe324072b50 2026-03-10T08:56:56.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.193+0000 7fe31affd700 1 
-- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe32000b5d0 con 0x7fe324072b50 2026-03-10T08:56:56.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.193+0000 7fe32bb91700 1 -- 192.168.123.105:0/3484103304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe32412eb60 con 0x7fe324072b50 2026-03-10T08:56:56.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.193+0000 7fe32bb91700 1 -- 192.168.123.105:0/3484103304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe32412f0b0 con 0x7fe324072b50 2026-03-10T08:56:56.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.194+0000 7fe32bb91700 1 -- 192.168.123.105:0/3484103304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe32404ea90 con 0x7fe324072b50 2026-03-10T08:56:56.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.198+0000 7fe31affd700 1 -- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fe320009a90 con 0x7fe324072b50 2026-03-10T08:56:56.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.198+0000 7fe31affd700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe310077a00 0x7fe310079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:56.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.199+0000 7fe31affd700 1 -- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fe320099e00 con 0x7fe324072b50 2026-03-10T08:56:56.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.200+0000 7fe32912c700 1 --2- 
192.168.123.105:0/3484103304 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe310077a00 0x7fe310079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:56.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.200+0000 7fe32912c700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe310077a00 0x7fe310079ec0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fe31c0062a0 tx=0x7fe31c0061f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:56.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.202+0000 7fe31affd700 1 -- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe320062440 con 0x7fe324072b50 2026-03-10T08:56:56.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.419+0000 7fe32bb91700 1 -- 192.168.123.105:0/3484103304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe32412f9f0 con 0x7fe324072b50 2026-03-10T08:56:56.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.422+0000 7fe31affd700 1 -- 192.168.123.105:0/3484103304 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7fe320061b90 con 0x7fe324072b50 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: }, 
2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 10, 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:56:56.423 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.429+0000 7fe318ff9700 1 -- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe310077a00 msgr2=0x7fe310079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.429+0000 7fe318ff9700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe310077a00 0x7fe310079ec0 
secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fe31c0062a0 tx=0x7fe31c0061f0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.429+0000 7fe318ff9700 1 -- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324072b50 msgr2=0x7fe324082f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.429+0000 7fe318ff9700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324072b50 0x7fe324082f70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fe32000d8d0 tx=0x7fe32000dc90 comp rx=0 tx=0).stop 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 -- 192.168.123.105:0/3484103304 shutdown_connections 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe310077a00 0x7fe310079ec0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe324072b50 0x7fe324082f70 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 --2- 192.168.123.105:0/3484103304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe3240834b0 0x7fe324083930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 -- 
192.168.123.105:0/3484103304 >> 192.168.123.105:0/3484103304 conn(0x7fe32406dae0 msgr2=0x7fe32406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 -- 192.168.123.105:0/3484103304 shutdown_connections 2026-03-10T08:56:56.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.431+0000 7fe318ff9700 1 -- 192.168.123.105:0/3484103304 wait complete. 2026-03-10T08:56:56.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.533+0000 7fb2633d5700 1 -- 192.168.123.105:0/3527572036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c075a40 msgr2=0x7fb25c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.533+0000 7fb2633d5700 1 --2- 192.168.123.105:0/3527572036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c075a40 0x7fb25c077ed0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fb25400d3e0 tx=0x7fb25400d6f0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.533+0000 7fb2633d5700 1 -- 192.168.123.105:0/3527572036 shutdown_connections 2026-03-10T08:56:56.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.533+0000 7fb2633d5700 1 --2- 192.168.123.105:0/3527572036 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c075a40 0x7fb25c077ed0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.533+0000 7fb2633d5700 1 --2- 192.168.123.105:0/3527572036 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb25c072b50 0x7fb25c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.533+0000 
7fb2633d5700 1 -- 192.168.123.105:0/3527572036 >> 192.168.123.105:0/3527572036 conn(0x7fb25c06dae0 msgr2=0x7fb25c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:56.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.534+0000 7fb2633d5700 1 -- 192.168.123.105:0/3527572036 shutdown_connections 2026-03-10T08:56:56.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.534+0000 7fb2633d5700 1 -- 192.168.123.105:0/3527572036 wait complete. 2026-03-10T08:56:56.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.534+0000 7fb2633d5700 1 Processor -- start 2026-03-10T08:56:56.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.534+0000 7fb2633d5700 1 -- start start 2026-03-10T08:56:56.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb2633d5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb25c072b50 0x7fb25c0830b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:56.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb2633d5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 0x7fb25c12e490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:56.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb2633d5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb25c083a70 con 0x7fb25c0835f0 2026-03-10T08:56:56.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb2633d5700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb25c083be0 con 0x7fb25c072b50 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb260970700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 0x7fb25c12e490 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb260970700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 0x7fb25c12e490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39680/0 (socket says 192.168.123.105:39680) 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb260970700 1 -- 192.168.123.105:0/1156918371 learned_addr learned my addr 192.168.123.105:0/1156918371 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.535+0000 7fb261171700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb25c072b50 0x7fb25c0830b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.536+0000 7fb260970700 1 -- 192.168.123.105:0/1156918371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb25c072b50 msgr2=0x7fb25c0830b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.536+0000 7fb260970700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb25c072b50 0x7fb25c0830b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.536+0000 7fb260970700 1 -- 192.168.123.105:0/1156918371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb25400d090 con 0x7fb25c0835f0 2026-03-10T08:56:56.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.536+0000 7fb260970700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 0x7fb25c12e490 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb254003fa0 tx=0x7fb2540094f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:56.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.536+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb254010040 con 0x7fb25c0835f0 2026-03-10T08:56:56.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.536+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb25400b4c0 con 0x7fb25c0835f0 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.537+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb25401da30 con 0x7fb25c0835f0 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.537+0000 7fb2633d5700 1 -- 192.168.123.105:0/1156918371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb25c12e9d0 con 0x7fb25c0835f0 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.538+0000 7fb2633d5700 1 -- 192.168.123.105:0/1156918371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb25c12ee40 con 0x7fb25c0835f0 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.541+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb254004250 con 0x7fb25c0835f0 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.541+0000 7fb2527fc700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb2480777d0 0x7fb248079c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.542+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fb25409ae50 con 0x7fb25c0835f0 2026-03-10T08:56:56.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.542+0000 7fb2633d5700 1 -- 192.168.123.105:0/1156918371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb25c04ea90 con 0x7fb25c0835f0 2026-03-10T08:56:56.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.550+0000 7fb261171700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb2480777d0 0x7fb248079c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:56.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.550+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb2540634b0 con 0x7fb25c0835f0 2026-03-10T08:56:56.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.563+0000 7fb261171700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb2480777d0 0x7fb248079c90 secure :-1 
s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fb25c12a400 tx=0x7fb25800c040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:56.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.741+0000 7fb2633d5700 1 -- 192.168.123.105:0/1156918371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb25c12f120 con 0x7fb25c0835f0 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.742+0000 7fb2527fc700 1 -- 192.168.123.105:0/1156918371 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7fb254062c00 con 0x7fb25c0835f0 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:created 
2026-03-10T08:52:52.346264+0000 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:56:56.743 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:56:56.744 
INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:56.744 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:56:56.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.747+0000 7fb247fff700 1 -- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb2480777d0 msgr2=0x7fb248079c90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.747+0000 7fb247fff700 1 --2- 
192.168.123.105:0/1156918371 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb2480777d0 0x7fb248079c90 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fb25c12a400 tx=0x7fb25800c040 comp rx=0 tx=0).stop 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.747+0000 7fb247fff700 1 -- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 msgr2=0x7fb25c12e490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.747+0000 7fb247fff700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 0x7fb25c12e490 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb254003fa0 tx=0x7fb2540094f0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 -- 192.168.123.105:0/1156918371 shutdown_connections 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb2480777d0 0x7fb248079c90 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb25c072b50 0x7fb25c0830b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 --2- 192.168.123.105:0/1156918371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb25c0835f0 0x7fb25c12e490 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 -- 192.168.123.105:0/1156918371 >> 192.168.123.105:0/1156918371 conn(0x7fb25c06dae0 msgr2=0x7fb25c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:56.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 -- 192.168.123.105:0/1156918371 shutdown_connections 2026-03-10T08:56:56.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:56.748+0000 7fb247fff700 1 -- 192.168.123.105:0/1156918371 wait complete. 2026-03-10T08:56:56.753 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:56:57.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 -- 192.168.123.105:0/4214483914 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 msgr2=0x7f2e14075b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:57.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 --2- 192.168.123.105:0/4214483914 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e14075b60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f2e10009b00 tx=0x7f2e10009e10 comp rx=0 tx=0).stop 2026-03-10T08:56:57.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 -- 192.168.123.105:0/4214483914 shutdown_connections 2026-03-10T08:56:57.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 --2- 192.168.123.105:0/4214483914 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e14076e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 --2- 192.168.123.105:0/4214483914 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f2e14075740 0x7f2e14075b60 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 -- 192.168.123.105:0/4214483914 >> 192.168.123.105:0/4214483914 conn(0x7f2e140fe6c0 msgr2=0x7f2e14100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 -- 192.168.123.105:0/4214483914 shutdown_connections 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.402+0000 7f2e1aed3700 1 -- 192.168.123.105:0/4214483914 wait complete. 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.403+0000 7f2e1aed3700 1 Processor -- start 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.403+0000 7f2e1aed3700 1 -- start start 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.403+0000 7f2e1aed3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e1419cde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.403+0000 7f2e1aed3700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e1419d320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.403+0000 7f2e1aed3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e1419d940 con 0x7f2e14075740 2026-03-10T08:56:57.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.403+0000 7f2e1aed3700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e1419da80 con 0x7f2e14076990 
2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e18c6f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e1419cde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e18c6f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e1419cde0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39694/0 (socket says 192.168.123.105:39694) 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e18c6f700 1 -- 192.168.123.105:0/3080602779 learned_addr learned my addr 192.168.123.105:0/3080602779 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e0bfff700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e1419d320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e0bfff700 1 -- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 msgr2=0x7f2e1419d320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e0bfff700 1 -- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 msgr2=0x7f2e1419d320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until 
read failed 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e0bfff700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e1419d320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T08:56:57.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.404+0000 7f2e0bfff700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e1419d320 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:56:57.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.407+0000 7f2e18c6f700 1 -- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 msgr2=0x7f2e1419d320 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:56:57.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.407+0000 7f2e18c6f700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e1419d320 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.407+0000 7f2e18c6f700 1 -- 192.168.123.105:0/3080602779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2e100097e0 con 0x7f2e14075740 2026-03-10T08:56:57.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.407+0000 7f2e18c6f700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e1419cde0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2e100048c0 tx=0x7f2e100049a0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.415+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e1001d070 con 0x7f2e14075740 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.415+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2e1000bc50 con 0x7f2e14075740 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.415+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e1000f830 con 0x7f2e14075740 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.415+0000 7f2e1aed3700 1 -- 192.168.123.105:0/3080602779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2e141a24d0 con 0x7f2e14075740 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.415+0000 7f2e1aed3700 1 -- 192.168.123.105:0/3080602779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2e141a29c0 con 0x7f2e14075740 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.416+0000 7f2e1aed3700 1 -- 192.168.123.105:0/3080602779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2e14066e80 con 0x7f2e14075740 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.416+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f2e10022b50 con 0x7f2e14075740 2026-03-10T08:56:57.417 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.417+0000 7f2e09ffb700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2dfc0779f0 0x7f2dfc079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:57.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.417+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f2e1009b320 con 0x7f2e14075740 2026-03-10T08:56:57.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.419+0000 7f2e0bfff700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2dfc0779f0 0x7f2dfc079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:57.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.419+0000 7f2e0bfff700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2dfc0779f0 0x7f2dfc079eb0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2e04005920 tx=0x7f2e0400b2c0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:57.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.422+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2e10063a10 con 0x7f2e14075740 2026-03-10T08:56:57.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.603+0000 7f2e1aed3700 1 -- 192.168.123.105:0/3080602779 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7f2e1410eaa0 con 0x7f2dfc0779f0 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.605+0000 7f2e09ffb700 1 -- 192.168.123.105:0/3080602779 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7f2e1410eaa0 con 0x7f2dfc0779f0 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "4/23 daemons upgraded", 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading mon daemons", 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:56:57.605 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 -- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2dfc0779f0 msgr2=0x7f2dfc079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 --2- 192.168.123.105:0/3080602779 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2dfc0779f0 0x7f2dfc079eb0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2e04005920 tx=0x7f2e0400b2c0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 -- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 msgr2=0x7f2e1419cde0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e1419cde0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2e100048c0 tx=0x7f2e100049a0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 -- 192.168.123.105:0/3080602779 shutdown_connections 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2dfc0779f0 0x7f2dfc079eb0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2e14075740 0x7f2e1419cde0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 --2- 192.168.123.105:0/3080602779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2e14076990 0x7f2e1419d320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 -- 192.168.123.105:0/3080602779 >> 192.168.123.105:0/3080602779 conn(0x7f2e140fe6c0 msgr2=0x7f2e1410d380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 -- 192.168.123.105:0/3080602779 shutdown_connections 2026-03-10T08:56:57.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.609+0000 7f2dfb7fe700 1 -- 192.168.123.105:0/3080602779 wait complete. 2026-03-10T08:56:57.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.711+0000 7f26d5c28700 1 -- 192.168.123.105:0/3534342238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0072b50 msgr2=0x7f26d0072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='client.34122 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: pgmap v8: 65 pgs: 65 active+clean; 287 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 426 KiB/s wr, 90 op/s 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='client.34126 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='client.34130 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:57.712 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3484103304' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:57.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:57 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1156918371' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:56:57.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.711+0000 7f26d5c28700 1 --2- 192.168.123.105:0/3534342238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0072b50 0x7f26d0072f70 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f26c000bc70 tx=0x7f26c000bf80 comp rx=0 tx=0).stop 2026-03-10T08:56:57.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.713+0000 7f26d5c28700 1 -- 192.168.123.105:0/3534342238 shutdown_connections 2026-03-10T08:56:57.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.713+0000 7f26d5c28700 1 --2- 192.168.123.105:0/3534342238 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.713+0000 7f26d5c28700 1 --2- 192.168.123.105:0/3534342238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0072b50 0x7f26d0072f70 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.713+0000 7f26d5c28700 1 -- 192.168.123.105:0/3534342238 >> 192.168.123.105:0/3534342238 conn(0x7f26d006dae0 msgr2=0x7f26d006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:56:57.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.714+0000 7f26d5c28700 1 -- 192.168.123.105:0/3534342238 shutdown_connections 2026-03-10T08:56:57.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.714+0000 7f26d5c28700 1 -- 192.168.123.105:0/3534342238 wait complete. 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.715+0000 7f26d5c28700 1 Processor -- start 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.715+0000 7f26d5c28700 1 -- start start 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.715+0000 7f26d5c28700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.715+0000 7f26d5c28700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 0x7f26d00838f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.715+0000 7f26d5c28700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26d012e6d0 con 0x7f26d0083470 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.715+0000 7f26d5c28700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26d012e840 con 0x7f26d0075a40 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.716+0000 7f26cf7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:57.723 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.716+0000 7f26cf7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:56776/0 (socket says 192.168.123.105:56776) 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.722+0000 7f26ceffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 0x7f26d00838f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.722+0000 7f26ceffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 0x7f26d00838f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39706/0 (socket says 192.168.123.105:39706) 2026-03-10T08:56:57.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.722+0000 7f26cf7fe700 1 -- 192.168.123.105:0/1007349194 learned_addr learned my addr 192.168.123.105:0/1007349194 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='client.34122 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: pgmap v8: 65 pgs: 65 active+clean; 287 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 426 KiB/s wr, 90 op/s 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='client.34126 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='client.34130 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3484103304' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:56:57.728 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:57 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/1156918371' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:56:57.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.729+0000 7f26cf7fe700 1 -- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 msgr2=0x7f26d0082f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 13 2026-03-10T08:56:57.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.729+0000 7f26cf7fe700 1 -- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 msgr2=0x7f26d0082f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T08:56:57.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.729+0000 7f26cf7fe700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T08:56:57.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.729+0000 7f26cf7fe700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T08:56:57.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.730+0000 7f26ceffd700 1 -- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 msgr2=0x7f26d0082f30 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:56:57.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.730+0000 7f26ceffd700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 
s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:57.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.730+0000 7f26ceffd700 1 -- 192.168.123.105:0/1007349194 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f26c000b920 con 0x7f26d0083470 2026-03-10T08:56:57.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.731+0000 7f26ceffd700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 0x7f26d00838f0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f26c800b300 tx=0x7f26c800b6c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:57.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.732+0000 7f26ccff9700 1 -- 192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26c8003bb0 con 0x7f26d0083470 2026-03-10T08:56:57.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.732+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f26d012eb20 con 0x7f26d0083470 2026-03-10T08:56:57.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.732+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f26d012f070 con 0x7f26d0083470 2026-03-10T08:56:57.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.733+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f26d007c8e0 con 0x7f26d0083470 2026-03-10T08:56:57.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.733+0000 7f26ccff9700 1 -- 
192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f26c8025080 con 0x7f26d0083470 2026-03-10T08:56:57.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.734+0000 7f26ccff9700 1 -- 192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26c80041e0 con 0x7f26d0083470 2026-03-10T08:56:57.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.736+0000 7f26ccff9700 1 -- 192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f26c8004340 con 0x7f26d0083470 2026-03-10T08:56:57.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.737+0000 7f26ccff9700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f26b8077a00 0x7f26b8079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:56:57.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.737+0000 7f26ccff9700 1 -- 192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f26c8013070 con 0x7f26d0083470 2026-03-10T08:56:57.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.738+0000 7f26cf7fe700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f26b8077a00 0x7f26b8079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:56:57.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.739+0000 7f26cf7fe700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f26b8077a00 0x7f26b8079ec0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto 
rx=0x7f26c0005c10 tx=0x7f26c0005b80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:56:57.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:57.739+0000 7f26ccff9700 1 -- 192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f26c8062a90 con 0x7f26d0083470 2026-03-10T08:56:58.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.022+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f26d004ea90 con 0x7f26d0083470 2026-03-10T08:56:58.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.025+0000 7f26ccff9700 1 -- 192.168.123.105:0/1007349194 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f26c80621e0 con 0x7f26d0083470 2026-03-10T08:56:58.026 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T08:56:58.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.028+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f26b8077a00 msgr2=0x7f26b8079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:58.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.028+0000 7f26d5c28700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f26b8077a00 0x7f26b8079ec0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f26c0005c10 tx=0x7f26c0005b80 comp rx=0 tx=0).stop 2026-03-10T08:56:58.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.028+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 msgr2=0x7f26d00838f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.028+0000 7f26d5c28700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 0x7f26d00838f0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f26c800b300 tx=0x7f26c800b6c0 comp rx=0 tx=0).stop 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.029+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 shutdown_connections 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.029+0000 7f26d5c28700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f26b8077a00 0x7f26b8079ec0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.029+0000 7f26d5c28700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f26d0075a40 0x7f26d0082f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.029+0000 7f26d5c28700 1 --2- 192.168.123.105:0/1007349194 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f26d0083470 0x7f26d00838f0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.029+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 >> 192.168.123.105:0/1007349194 conn(0x7f26d006dae0 msgr2=0x7f26d006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.030+0000 7f26d5c28700 1 -- 
192.168.123.105:0/1007349194 shutdown_connections 2026-03-10T08:56:58.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:56:58.030+0000 7f26d5c28700 1 -- 192.168.123.105:0/1007349194 wait complete. 2026-03-10T08:56:58.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:58 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:58.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:58 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:58.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:58 vm05.local ceph-mon[111630]: from='client.34142 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:58.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:58 vm05.local ceph-mon[111630]: pgmap v9: 65 pgs: 65 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 863 KiB/s wr, 196 op/s 2026-03-10T08:56:58.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:56:58 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1007349194' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:56:58.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:58 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:58.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:58 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:56:58.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:58 vm08.local ceph-mon[101330]: from='client.34142 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:56:58.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:58 vm08.local ceph-mon[101330]: pgmap v9: 65 pgs: 65 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 863 KiB/s wr, 196 op/s 2026-03-10T08:56:58.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:56:58 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/1007349194' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:57:00.215 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth caps", "entity": 
"client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T08:57:00.216 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.rxwgjc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:00.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T08:57:00.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:01.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:57:01 vm05.local ceph-mon[111630]: Reconfiguring mon.vm05 (monmap changed)...
2026-03-10T08:57:01.027 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: Reconfiguring daemon mon.vm05 on vm05
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: Reconfiguring mgr.vm05.rxwgjc (monmap changed)...
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: Reconfiguring daemon mgr.vm05.rxwgjc on vm05
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: pgmap v10: 65 pgs: 65 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 786 KiB/s wr, 178 op/s
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: Reconfiguring ceph-exporter.vm05 (monmap changed)...
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: Unable to update caps for client.ceph-exporter.vm05
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: Reconfiguring daemon ceph-exporter.vm05 on vm05
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T08:57:01.028 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:01.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Reconfiguring mon.vm05 (monmap changed)...
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Reconfiguring daemon mon.vm05 on vm05
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Reconfiguring mgr.vm05.rxwgjc (monmap changed)...
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Reconfiguring daemon mgr.vm05.rxwgjc on vm05
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: pgmap v10: 65 pgs: 65 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 786 KiB/s wr, 178 op/s
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Reconfiguring ceph-exporter.vm05 (monmap changed)...
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Unable to update caps for client.ceph-exporter.vm05
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: Reconfiguring daemon ceph-exporter.vm05 on vm05
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T08:57:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:02.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: Reconfiguring crash.vm05 (monmap changed)...
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: Reconfiguring daemon crash.vm05 on vm05
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: Reconfiguring osd.0 (monmap changed)...
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: Reconfiguring daemon osd.0 on vm05
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-10T08:57:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: Reconfiguring crash.vm05 (monmap changed)...
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: Reconfiguring daemon crash.vm05 on vm05
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: Reconfiguring osd.0 (monmap changed)...
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: Reconfiguring daemon osd.0 on vm05
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-10T08:57:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: Reconfiguring osd.1 (monmap changed)...
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: Reconfiguring daemon osd.1 on vm05
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: Reconfiguring osd.2 (monmap changed)...
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: Reconfiguring daemon osd.2 on vm05
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: pgmap v11: 65 pgs: 65 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 785 KiB/s wr, 178 op/s
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm08"}]: dispatch
2026-03-10T08:57:03.187 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: Reconfiguring osd.1 (monmap changed)...
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: Reconfiguring daemon osd.1 on vm05
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: Reconfiguring osd.2 (monmap changed)...
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: Reconfiguring daemon osd.2 on vm05
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: pgmap v11: 65 pgs: 65 active+clean; 290 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 785 KiB/s wr, 178 op/s
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm08"}]: dispatch
2026-03-10T08:57:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Reconfiguring mds.cephfs.vm05.bxdvbu (monmap changed)...
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Reconfiguring daemon mds.cephfs.vm05.bxdvbu on vm05
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Reconfiguring mds.cephfs.vm05.slhztf (monmap changed)...
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Reconfiguring daemon mds.cephfs.vm05.slhztf on vm05
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Reconfiguring ceph-exporter.vm08 (monmap changed)...
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Unable to update caps for client.ceph-exporter.vm08
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: Reconfiguring daemon ceph-exporter.vm08 on vm08
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T08:57:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Reconfiguring mds.cephfs.vm05.bxdvbu (monmap changed)...
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Reconfiguring daemon mds.cephfs.vm05.bxdvbu on vm05
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Reconfiguring mds.cephfs.vm05.slhztf (monmap changed)...
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Reconfiguring daemon mds.cephfs.vm05.slhztf on vm05
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Reconfiguring ceph-exporter.vm08 (monmap changed)...
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Unable to update caps for client.ceph-exporter.vm08
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: Reconfiguring daemon ceph-exporter.vm08 on vm08
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.rpongu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T08:57:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring crash.vm08 (monmap changed)...
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring daemon crash.vm08 on vm08
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: pgmap v12: 65 pgs: 65 active+clean; 292 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.1 MiB/s wr, 265 op/s
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring mgr.vm08.rpongu (monmap changed)...
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring daemon mgr.vm08.rpongu on vm08
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring mon.vm08 (monmap changed)...
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring daemon mon.vm08 on vm08
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring osd.3 (monmap changed)...
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring daemon osd.3 on vm08
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring osd.4 (monmap changed)...
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.304 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:05 vm08.local ceph-mon[101330]: Reconfiguring daemon osd.4 on vm08
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring crash.vm08 (monmap changed)...
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring daemon crash.vm08 on vm08
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: pgmap v12: 65 pgs: 65 active+clean; 292 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.1 MiB/s wr, 265 op/s
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring mgr.vm08.rpongu (monmap changed)...
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring daemon mgr.vm08.rpongu on vm08
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring mon.vm08 (monmap changed)...
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring daemon mon.vm08 on vm08
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring osd.3 (monmap changed)...
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring daemon osd.3 on vm08
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring osd.4 (monmap changed)...
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:05 vm05.local ceph-mon[111630]: Reconfiguring daemon osd.4 on vm08
2026-03-10T08:57:06.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:06.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:06.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: Reconfiguring osd.5 (monmap changed)...
2026-03-10T08:57:06.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-10T08:57:06.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:06.660 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: Reconfiguring daemon osd.5 on vm08
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: pgmap v13: 65 pgs: 65 active+clean; 292 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 783 KiB/s wr, 198 op/s
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: Reconfiguring mds.cephfs.vm08.xfzrbx (monmap changed)...
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:57:06.664 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:06 vm08.local ceph-mon[101330]: Reconfiguring daemon mds.cephfs.vm08.xfzrbx on vm08
2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: Reconfiguring osd.5 (monmap changed)...
2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: Reconfiguring daemon osd.5 on vm08 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: pgmap v13: 65 pgs: 65 active+clean; 292 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 783 KiB/s wr, 198 op/s 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: Reconfiguring mds.cephfs.vm08.xfzrbx (monmap changed)... 
2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:06.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:06 vm05.local ceph-mon[111630]: Reconfiguring daemon mds.cephfs.vm08.xfzrbx on vm08 2026-03-10T08:57:07.888 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: Reconfiguring mds.cephfs.vm08.ssijow (monmap changed)... 
2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: Reconfiguring daemon mds.cephfs.vm08.ssijow on vm08 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]': finished 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]: dispatch 2026-03-10T08:57:07.889 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]': finished 2026-03-10T08:57:07.985 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: Reconfiguring mds.cephfs.vm08.ssijow (monmap changed)... 
2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: Reconfiguring daemon mds.cephfs.vm08.ssijow on vm08 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]': finished 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]: dispatch 2026-03-10T08:57:08.004 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]': finished 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all mon 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: pgmap v14: 65 pgs: 65 active+clean; 295 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.2 MiB/s wr, 306 op/s 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: Upgrade: Updating crash.vm05 (1/2) 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth 
get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:09.169 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:09 vm05.local ceph-mon[111630]: Deploying daemon crash.vm05 on vm05 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all mon 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: pgmap v14: 65 pgs: 65 active+clean; 295 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.2 MiB/s wr, 306 op/s 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: Upgrade: Updating crash.vm05 (1/2) 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:09 vm08.local ceph-mon[101330]: Deploying daemon crash.vm05 on vm05 2026-03-10T08:57:10.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:10 vm08.local ceph-mon[101330]: pgmap v15: 65 pgs: 65 active+clean; 295 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 771 KiB/s wr, 194 op/s 2026-03-10T08:57:10.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:10 vm05.local ceph-mon[111630]: pgmap v15: 65 pgs: 65 active+clean; 295 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 771 KiB/s wr, 194 op/s 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: Upgrade: Updating crash.vm08 (2/2) 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:11 vm08.local ceph-mon[101330]: Deploying daemon crash.vm08 on vm08 2026-03-10T08:57:11.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: Upgrade: Updating crash.vm08 (2/2) 2026-03-10T08:57:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T08:57:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:11.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:11 vm05.local ceph-mon[111630]: Deploying daemon crash.vm08 on vm08 2026-03-10T08:57:12.454 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:12 vm05.local ceph-mon[111630]: pgmap v16: 65 pgs: 65 active+clean; 295 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 771 KiB/s wr, 194 op/s 2026-03-10T08:57:12.454 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:12.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:12 vm08.local ceph-mon[101330]: pgmap v16: 65 pgs: 65 active+clean; 295 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 771 KiB/s wr, 194 op/s 2026-03-10T08:57:12.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:57:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:12.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:12.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:12.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:15.099 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:15 vm05.local ceph-mon[111630]: pgmap v17: 65 pgs: 65 active+clean; 298 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.2 MiB/s wr, 293 op/s 2026-03-10T08:57:15.099 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.099 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.099 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.099 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.305 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:57:15 vm08.local ceph-mon[101330]: pgmap v17: 65 pgs: 65 active+clean; 298 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.2 MiB/s wr, 293 op/s 2026-03-10T08:57:15.305 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.305 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.305 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:15.305 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.444 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.444 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.444 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:16 vm05.local ceph-mon[111630]: pgmap v18: 65 pgs: 65 active+clean; 298 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 923 KiB/s wr, 206 op/s 2026-03-10T08:57:16.445 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.445 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.445 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:16 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:57:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:16 vm08.local ceph-mon[101330]: pgmap v18: 65 pgs: 65 active+clean; 298 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 923 KiB/s wr, 206 op/s 2026-03-10T08:57:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:57:17.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:17.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:57:17.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:17.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:17.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:57:17.803 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all crash 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]': finished 2026-03-10T08:57:18.549 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]': finished 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: Upgrade: osd.0 is safe to restart 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: pgmap v19: 65 pgs: 65 active+clean; 299 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.2 MiB/s wr, 328 op/s 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: Upgrade: Updating osd.0 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T08:57:18.549 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:18.549 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:18 vm05.local ceph-mon[111630]: Deploying daemon osd.0 on vm05 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all crash 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]': finished 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]': finished 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: Upgrade: osd.0 is safe to restart 2026-03-10T08:57:18.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: pgmap v19: 65 pgs: 65 active+clean; 299 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.2 MiB/s wr, 328 op/s 2026-03-10T08:57:18.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: Upgrade: Updating osd.0 2026-03-10T08:57:18.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:18.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T08:57:18.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:18.804 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:18 vm08.local ceph-mon[101330]: Deploying daemon osd.0 on vm05 2026-03-10T08:57:19.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local systemd[1]: Stopping Ceph osd.0 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T08:57:19.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[68780]: 2026-03-10T08:57:19.319+0000 7f5885d0b700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:57:19.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[68780]: 2026-03-10T08:57:19.319+0000 7f5885d0b700 -1 osd.0 43 *** Got signal Terminated *** 2026-03-10T08:57:19.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[68780]: 2026-03-10T08:57:19.319+0000 7f5885d0b700 -1 osd.0 43 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T08:57:19.886 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118900]: 2026-03-10 08:57:19.65911243 +0000 UTC m=+0.374782371 container died 2a2aeea5e3d44407284f1f983645fa37dab93d50ba98ba19118aff9949041788 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20240222, GIT_BRANCH=HEAD, org.label-schema.schema-version=1.0, GIT_CLEAN=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd) 2026-03-10T08:57:19.886 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118900]: 2026-03-10 08:57:19.687816382 +0000 UTC m=+0.403486323 container remove 2a2aeea5e3d44407284f1f983645fa37dab93d50ba98ba19118aff9949041788 
(image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, RELEASE=HEAD, org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1) 2026-03-10T08:57:19.886 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local bash[118900]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0 2026-03-10T08:57:19.887 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:19 vm05.local ceph-mon[111630]: osd.0 marked itself down and dead 2026-03-10T08:57:20.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:19 vm08.local ceph-mon[101330]: osd.0 marked itself down and dead 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118966]: 2026-03-10 08:57:19.886876817 +0000 UTC m=+0.025893442 container create c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, 
ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118966]: 2026-03-10 08:57:19.931924166 +0000 UTC m=+0.070940791 container init c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118966]: 2026-03-10 08:57:19.942679886 +0000 UTC m=+0.081696511 container start c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118966]: 2026-03-10 08:57:19.944062965 +0000 UTC m=+0.083079590 container attach c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:19 vm05.local podman[118966]: 2026-03-10 08:57:19.875796909 +0000 UTC m=+0.014813545 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local conmon[118976]: conmon c814d27db10d6f905fdf : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659.scope/container/memory.events 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local podman[118966]: 
2026-03-10 08:57:20.095235163 +0000 UTC m=+0.234251788 container died c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local podman[118966]: 2026-03-10 08:57:20.117956297 +0000 UTC m=+0.256972922 container remove c814d27db10d6f905fdf924f01fc41265e7f4aef86879b02710ebe0ad8da1659 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid) 
2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0.service: Deactivated successfully. 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0.service: Unit process 118976 (conmon) remains running after unit stopped. 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0.service: Unit process 118985 (podman) remains running after unit stopped. 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local systemd[1]: Stopped Ceph osd.0 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T08:57:20.154 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0.service: Consumed 31.831s CPU time, 553.7M memory peak. 2026-03-10T08:57:20.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local systemd[1]: Starting Ceph osd.0 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T08:57:20.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local podman[119066]: 2026-03-10 08:57:20.596794104 +0000 UTC m=+0.030572189 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:57:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:20 vm05.local ceph-mon[111630]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T08:57:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:20 vm05.local ceph-mon[111630]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T08:57:20.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:20 vm05.local ceph-mon[111630]: pgmap v21: 65 pgs: 9 stale+active+clean, 56 active+clean; 299 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 914 KiB/s wr, 265 op/s 2026-03-10T08:57:21.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:20 vm08.local ceph-mon[101330]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T08:57:21.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:20 vm08.local ceph-mon[101330]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T08:57:21.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:20 vm08.local ceph-mon[101330]: pgmap v21: 65 pgs: 9 stale+active+clean, 56 active+clean; 299 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 914 KiB/s wr, 265 op/s 2026-03-10T08:57:21.223 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:20 vm05.local podman[119066]: 2026-03-10 08:57:20.96723533 +0000 UTC m=+0.401013405 container create de77c1a06de3d7f36da9194ef18482e8d210940bef9db7f7bf1e8d00c88e8a24 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T08:57:21.223 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local podman[119066]: 2026-03-10 08:57:21.08525913 +0000 UTC m=+0.519037205 container init de77c1a06de3d7f36da9194ef18482e8d210940bef9db7f7bf1e8d00c88e8a24 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid) 2026-03-10T08:57:21.223 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local podman[119066]: 2026-03-10 08:57:21.088532065 +0000 UTC m=+0.522310140 container start de77c1a06de3d7f36da9194ef18482e8d210940bef9db7f7bf1e8d00c88e8a24 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate, org.label-schema.schema-version=1.0, 
CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS) 2026-03-10T08:57:21.223 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local podman[119066]: 2026-03-10 08:57:21.123007338 +0000 UTC m=+0.556785422 container attach de77c1a06de3d7f36da9194ef18482e8d210940bef9db7f7bf1e8d00c88e8a24 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-10T08:57:21.223 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:21.223 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local bash[119066]: Running command: /usr/bin/ceph-authtool 
--gen-print-key 2026-03-10T08:57:21.713 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:21.713 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local bash[119066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:22.037 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T08:57:22.037 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:22.037 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local bash[119066]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T08:57:22.037 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local bash[119066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:22.037 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:22.037 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:21 vm05.local bash[119066]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T08:57:22.037 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:21 vm05.local ceph-mon[111630]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T08:57:22.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:21 vm08.local ceph-mon[101330]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T08:57:22.346 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: 
/usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T08:57:22.346 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T08:57:22.346 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-1208ff5d-f4d4-471f-808c-d672021153bb/osd-block-0e25ea50-b19b-4e07-85f6-5d48c19d3a4f --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T08:57:22.346 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-1208ff5d-f4d4-471f-808c-d672021153bb/osd-block-0e25ea50-b19b-4e07-85f6-5d48c19d3a4f --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T08:57:22.660 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/ln -snf /dev/ceph-1208ff5d-f4d4-471f-808c-d672021153bb/osd-block-0e25ea50-b19b-4e07-85f6-5d48c19d3a4f /var/lib/ceph/osd/ceph-0/block 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: Running command: /usr/bin/ln -snf /dev/ceph-1208ff5d-f4d4-471f-808c-d672021153bb/osd-block-0e25ea50-b19b-4e07-85f6-5d48c19d3a4f /var/lib/ceph/osd/ceph-0/block 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 
08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate[119077]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119066]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local podman[119305]: 2026-03-10 08:57:22.432299115 +0000 UTC m=+0.026474781 container died de77c1a06de3d7f36da9194ef18482e8d210940bef9db7f7bf1e8d00c88e8a24 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate, ceph=True, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-10T08:57:22.661 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local podman[119305]: 2026-03-10 08:57:22.456900318 +0000 UTC m=+0.051075974 container remove de77c1a06de3d7f36da9194ef18482e8d210940bef9db7f7bf1e8d00c88e8a24 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, ceph=True) 2026-03-10T08:57:22.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local podman[119345]: 2026-03-10 08:57:22.66066619 +0000 UTC m=+0.024352419 container create 4f1dac46f59bb0a72b86bab4a176031592a98ff9eb67b738e0a1aa6f743eaba7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T08:57:22.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local podman[119345]: 2026-03-10 08:57:22.711002157 +0000 UTC m=+0.074688395 container init 4f1dac46f59bb0a72b86bab4a176031592a98ff9eb67b738e0a1aa6f743eaba7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T08:57:22.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local podman[119345]: 2026-03-10 08:57:22.720382021 +0000 UTC m=+0.084068250 container start 4f1dac46f59bb0a72b86bab4a176031592a98ff9eb67b738e0a1aa6f743eaba7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) 2026-03-10T08:57:22.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local bash[119345]: 4f1dac46f59bb0a72b86bab4a176031592a98ff9eb67b738e0a1aa6f743eaba7 2026-03-10T08:57:22.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local podman[119345]: 2026-03-10 08:57:22.652781682 +0000 UTC m=+0.016467922 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:57:22.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:22 vm05.local systemd[1]: Started Ceph osd.0 for 16587ed2-1c5e-11f1-90f6-35051361a039. 
2026-03-10T08:57:23.237 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-mon[111630]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 299 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 442 KiB/s wr, 183 op/s 2026-03-10T08:57:23.237 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:23.237 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:23.237 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:23.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:22 vm08.local ceph-mon[101330]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 299 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 442 KiB/s wr, 183 op/s 2026-03-10T08:57:23.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:23.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:23.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:23.805 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:23 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T08:57:23.620+0000 7f014135d740 -1 Falling back to public interface 2026-03-10T08:57:24.189 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:24 vm05.local ceph-mon[111630]: Health check failed: Degraded data redundancy: 6702/45033 objects degraded (14.882%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:24.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:24 vm08.local ceph-mon[101330]: Health check failed: Degraded data redundancy: 6702/45033 objects degraded (14.882%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:25.220 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:25 vm05.local ceph-mon[111630]: pgmap v24: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 301 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 929 KiB/s wr, 294 op/s; 6702/45033 objects degraded (14.882%) 2026-03-10T08:57:25.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:25 vm08.local ceph-mon[101330]: pgmap v24: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 301 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 929 KiB/s wr, 294 op/s; 6702/45033 objects degraded (14.882%) 2026-03-10T08:57:26.952 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:26.952 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:26.952 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:26 vm05.local ceph-mon[111630]: pgmap v25: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 301 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 10 KiB/s rd, 487 KiB/s wr, 111 op/s; 6702/45033 objects degraded (14.882%) 2026-03-10T08:57:26.952 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:26.952 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:27.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:27.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:27.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:26 vm08.local ceph-mon[101330]: pgmap v25: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 301 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 10 KiB/s rd, 487 KiB/s wr, 111 op/s; 6702/45033 objects degraded (14.882%) 2026-03-10T08:57:27.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:27.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 -- 192.168.123.105:0/1031892597 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa234075a40 msgr2=0x7fa234077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 --2- 192.168.123.105:0/1031892597 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa234075a40 0x7fa234077ed0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fa22c00d3f0 tx=0x7fa22c00d700 comp rx=0 tx=0).stop 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 -- 192.168.123.105:0/1031892597 
shutdown_connections 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 --2- 192.168.123.105:0/1031892597 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa234075a40 0x7fa234077ed0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 --2- 192.168.123.105:0/1031892597 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 0x7fa234072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 -- 192.168.123.105:0/1031892597 >> 192.168.123.105:0/1031892597 conn(0x7fa23406dae0 msgr2=0x7fa23406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 -- 192.168.123.105:0/1031892597 shutdown_connections 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 -- 192.168.123.105:0/1031892597 wait complete. 
2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 Processor -- start 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.201+0000 7fa23c555700 1 -- start start 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23c555700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 0x7fa234082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23c555700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa2340834a0 0x7fa234083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23c555700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa23412e700 con 0x7fa2340834a0 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23c555700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa23412e870 con 0x7fa234072b50 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23a2f1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 0x7fa234082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23a2f1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 0x7fa234082f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:49694/0 (socket says 192.168.123.105:49694) 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.202+0000 7fa23a2f1700 1 -- 192.168.123.105:0/1536833658 learned_addr learned my addr 192.168.123.105:0/1536833658 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.203+0000 7fa23a2f1700 1 -- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa2340834a0 msgr2=0x7fa234083920 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.203+0000 7fa23a2f1700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa2340834a0 0x7fa234083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.203+0000 7fa23a2f1700 1 -- 192.168.123.105:0/1536833658 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa22c007ed0 con 0x7fa234072b50 2026-03-10T08:57:28.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.203+0000 7fa23a2f1700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 0x7fa234082f60 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fa23000b770 tx=0x7fa23000bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:28.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.204+0000 7fa22b7fe700 1 -- 192.168.123.105:0/1536833658 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa23000f820 con 0x7fa234072b50 2026-03-10T08:57:28.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.204+0000 7fa22b7fe700 1 -- 
192.168.123.105:0/1536833658 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa23000fe60 con 0x7fa234072b50 2026-03-10T08:57:28.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.204+0000 7fa22b7fe700 1 -- 192.168.123.105:0/1536833658 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa23000d610 con 0x7fa234072b50 2026-03-10T08:57:28.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.204+0000 7fa23c555700 1 -- 192.168.123.105:0/1536833658 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa23412eb50 con 0x7fa234072b50 2026-03-10T08:57:28.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.204+0000 7fa23c555700 1 -- 192.168.123.105:0/1536833658 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa23412f0a0 con 0x7fa234072b50 2026-03-10T08:57:28.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.204+0000 7fa23c555700 1 -- 192.168.123.105:0/1536833658 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa23404ea90 con 0x7fa234072b50 2026-03-10T08:57:28.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.206+0000 7fa22b7fe700 1 -- 192.168.123.105:0/1536833658 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa23000d770 con 0x7fa234072b50 2026-03-10T08:57:28.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.206+0000 7fa22b7fe700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa220077a00 0x7fa220079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.206+0000 7fa22b7fe700 1 -- 
192.168.123.105:0/1536833658 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fa23009b040 con 0x7fa234072b50 2026-03-10T08:57:28.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.208+0000 7fa239af0700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa220077a00 0x7fa220079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.209+0000 7fa22b7fe700 1 -- 192.168.123.105:0/1536833658 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa230063680 con 0x7fa234072b50 2026-03-10T08:57:28.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.209+0000 7fa239af0700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa220077a00 0x7fa220079ec0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fa22c00d3f0 tx=0x7fa22c00db00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:28.428 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.427+0000 7fa23c555700 1 -- 192.168.123.105:0/1536833658 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa23412ed10 con 0x7fa220077a00 2026-03-10T08:57:28.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.433+0000 7fa22b7fe700 1 -- 192.168.123.105:0/1536833658 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fa23412ed10 con 0x7fa220077a00 2026-03-10T08:57:28.438 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 -- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa220077a00 msgr2=0x7fa220079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa220077a00 0x7fa220079ec0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fa22c00d3f0 tx=0x7fa22c00db00 comp rx=0 tx=0).stop 2026-03-10T08:57:28.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 -- 192.168.123.105:0/1536833658 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 msgr2=0x7fa234082f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa234072b50 0x7fa234082f60 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fa23000b770 tx=0x7fa23000bb30 comp rx=0 tx=0).stop 2026-03-10T08:57:28.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 -- 192.168.123.105:0/1536833658 shutdown_connections 2026-03-10T08:57:28.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa220077a00 0x7fa220079ec0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fa234072b50 0x7fa234082f60 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 --2- 192.168.123.105:0/1536833658 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa2340834a0 0x7fa234083920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.438+0000 7fa2297fa700 1 -- 192.168.123.105:0/1536833658 >> 192.168.123.105:0/1536833658 conn(0x7fa23406dae0 msgr2=0x7fa23406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:28.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.440+0000 7fa2297fa700 1 -- 192.168.123.105:0/1536833658 shutdown_connections 2026-03-10T08:57:28.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.440+0000 7fa2297fa700 1 -- 192.168.123.105:0/1536833658 wait complete. 
2026-03-10T08:57:28.464 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 -- 192.168.123.105:0/3472609985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0072b20 msgr2=0x7effe0072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 --2- 192.168.123.105:0/3472609985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0072b20 0x7effe0072f40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7effdc007780 tx=0x7effdc00c050 comp rx=0 tx=0).stop 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 -- 192.168.123.105:0/3472609985 shutdown_connections 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 --2- 192.168.123.105:0/3472609985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7effe0075a10 0x7effe0077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 --2- 192.168.123.105:0/3472609985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0072b20 0x7effe0072f40 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 -- 192.168.123.105:0/3472609985 >> 192.168.123.105:0/3472609985 conn(0x7effe006daa0 msgr2=0x7effe006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 -- 192.168.123.105:0/3472609985 shutdown_connections 2026-03-10T08:57:28.565 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.563+0000 7effe6e7a700 1 -- 192.168.123.105:0/3472609985 wait complete. 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe6e7a700 1 Processor -- start 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe6e7a700 1 -- start start 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe6e7a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 0x7effe0082e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe6e7a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7effe0083340 0x7effe00837c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe6e7a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effe012e5a0 con 0x7effe0083340 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe6e7a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7effe012e710 con 0x7effe0075a10 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe5e78700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 0x7effe0082e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe5e78700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 0x7effe0082e00 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:49712/0 (socket says 192.168.123.105:49712) 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe5e78700 1 -- 192.168.123.105:0/2233345574 learned_addr learned my addr 192.168.123.105:0/2233345574 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe5e78700 1 -- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7effe0083340 msgr2=0x7effe00837c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe5e78700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7effe0083340 0x7effe00837c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.564+0000 7effe5e78700 1 -- 192.168.123.105:0/2233345574 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7effdc007430 con 0x7effe0075a10 2026-03-10T08:57:28.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.565+0000 7effe5e78700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 0x7effe0082e00 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7effdc007750 tx=0x7effdc00da70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:28.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.565+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7effdc00f040 con 
0x7effe0075a10 2026-03-10T08:57:28.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.565+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7effdc008520 con 0x7effe0075a10 2026-03-10T08:57:28.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.565+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7effdc01eb00 con 0x7effe0075a10 2026-03-10T08:57:28.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.565+0000 7effe6e7a700 1 -- 192.168.123.105:0/2233345574 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7effe012e990 con 0x7effe0075a10 2026-03-10T08:57:28.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.565+0000 7effe6e7a700 1 -- 192.168.123.105:0/2233345574 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7effe012ee80 con 0x7effe0075a10 2026-03-10T08:57:28.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.567+0000 7effe6e7a700 1 -- 192.168.123.105:0/2233345574 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7effe004ea90 con 0x7effe0075a10 2026-03-10T08:57:28.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.568+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7effdc01a070 con 0x7effe0075a10 2026-03-10T08:57:28.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.568+0000 7effd6ffd700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7effcc077a40 0x7effcc079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T08:57:28.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.569+0000 7effe5677700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7effcc077a40 0x7effcc079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.569+0000 7effe5677700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7effcc077a40 0x7effcc079f00 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7effd800aab0 tx=0x7effd8009250 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:28.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.573+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7effdc067ee0 con 0x7effe0075a10 2026-03-10T08:57:28.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.573+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7effdc063900 con 0x7effe0075a10 2026-03-10T08:57:28.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.776+0000 7effe6e7a700 1 -- 192.168.123.105:0/2233345574 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7effe0077710 con 0x7effcc077a40 2026-03-10T08:57:28.778 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:28 vm05.local ceph-mon[111630]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.3 GiB used, 117 GiB / 120 GiB 
avail; 24 KiB/s rd, 968 KiB/s wr, 243 op/s; 6293/42123 objects degraded (14.940%) 2026-03-10T08:57:28.785 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.782+0000 7effd6ffd700 1 -- 192.168.123.105:0/2233345574 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7effe0077710 con 0x7effcc077a40 2026-03-10T08:57:28.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 -- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7effcc077a40 msgr2=0x7effcc079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7effcc077a40 0x7effcc079f00 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7effd800aab0 tx=0x7effd8009250 comp rx=0 tx=0).stop 2026-03-10T08:57:28.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 -- 192.168.123.105:0/2233345574 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 msgr2=0x7effe0082e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 0x7effe0082e00 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7effdc007750 tx=0x7effdc00da70 comp rx=0 tx=0).stop 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 -- 192.168.123.105:0/2233345574 shutdown_connections 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 --2- 192.168.123.105:0/2233345574 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7effcc077a40 0x7effcc079f00 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7effe0075a10 0x7effe0082e00 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 --2- 192.168.123.105:0/2233345574 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7effe0083340 0x7effe00837c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 -- 192.168.123.105:0/2233345574 >> 192.168.123.105:0/2233345574 conn(0x7effe006daa0 msgr2=0x7effe006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 -- 192.168.123.105:0/2233345574 shutdown_connections 2026-03-10T08:57:28.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.786+0000 7effd4ff9700 1 -- 192.168.123.105:0/2233345574 wait complete. 
2026-03-10T08:57:28.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2942716900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4075a40 msgr2=0x7fc5a4077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 --2- 192.168.123.105:0/2942716900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4075a40 0x7fc5a4077ed0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fc59c00d3f0 tx=0x7fc59c00d700 comp rx=0 tx=0).stop 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2942716900 shutdown_connections 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 --2- 192.168.123.105:0/2942716900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4075a40 0x7fc5a4077ed0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 --2- 192.168.123.105:0/2942716900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc5a4072b50 0x7fc5a4072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2942716900 >> 192.168.123.105:0/2942716900 conn(0x7fc5a406dae0 msgr2=0x7fc5a406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2942716900 shutdown_connections 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.888+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2942716900 
wait complete. 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5ab07e700 1 Processor -- start 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5ab07e700 1 -- start start 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5ab07e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 0x7fc5a4082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5ab07e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc5a40834a0 0x7fc5a4083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5ab07e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5a412e700 con 0x7fc5a4072b50 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5ab07e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5a412e870 con 0x7fc5a40834a0 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5a8e1a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 0x7fc5a4082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5a8e1a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 0x7fc5a4082f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:60066/0 (socket says 192.168.123.105:60066) 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.889+0000 7fc5a8e1a700 1 -- 192.168.123.105:0/2236842229 learned_addr learned my addr 192.168.123.105:0/2236842229 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.890+0000 7fc5a3fff700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc5a40834a0 0x7fc5a4083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.890+0000 7fc5a8e1a700 1 -- 192.168.123.105:0/2236842229 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc5a40834a0 msgr2=0x7fc5a4083920 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.890+0000 7fc5a8e1a700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc5a40834a0 0x7fc5a4083920 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.890+0000 7fc5a8e1a700 1 -- 192.168.123.105:0/2236842229 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc59c007ed0 con 0x7fc5a4072b50 2026-03-10T08:57:28.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.891+0000 7fc5a8e1a700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 0x7fc5a4082f60 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc59400b770 tx=0x7fc59400ba80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:57:28.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.891+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc594010840 con 0x7fc5a4072b50 2026-03-10T08:57:28.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.891+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2236842229 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5a412eb50 con 0x7fc5a4072b50 2026-03-10T08:57:28.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.891+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2236842229 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5a412f0a0 con 0x7fc5a4072b50 2026-03-10T08:57:28.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.892+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc594010e80 con 0x7fc5a4072b50 2026-03-10T08:57:28.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.892+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc59400d590 con 0x7fc5a4072b50 2026-03-10T08:57:28.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.892+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2236842229 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc590005320 con 0x7fc5a4072b50 2026-03-10T08:57:28.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.895+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc59400f870 con 0x7fc5a4072b50 2026-03-10T08:57:28.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.896+0000 
7fc5a1ffb700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc58c077b00 0x7fc58c079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:28.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.896+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fc594099af0 con 0x7fc5a4072b50 2026-03-10T08:57:28.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.897+0000 7fc5a3fff700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc58c077b00 0x7fc58c079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:28.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.898+0000 7fc5a3fff700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc58c077b00 0x7fc58c079fc0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fc59c000f80 tx=0x7fc59c00db00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:28.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:28.919+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc594062130 con 0x7fc5a4072b50 2026-03-10T08:57:29.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:28 vm08.local ceph-mon[101330]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 968 KiB/s wr, 243 op/s; 6293/42123 objects degraded (14.940%) 
2026-03-10T08:57:29.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.070+0000 7fc5ab07e700 1 -- 192.168.123.105:0/2236842229 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc590000bf0 con 0x7fc58c077b00 2026-03-10T08:57:29.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.088+0000 7fc5a1ffb700 1 -- 192.168.123.105:0/2236842229 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fc590000bf0 con 0x7fc58c077b00 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (5m) 3s ago 6m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (6m) 3s ago 6m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (6m) 14s ago 6m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (19s) 3s ago 6m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (16s) 14s ago 6m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (5m) 3s ago 6m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (4m) 3s ago 4m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (4m) 3s ago 4m 17.7M 
- 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (4m) 14s ago 4m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (4m) 14s ago 4m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (85s) 3s ago 7m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (56s) 14s ago 6m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (49s) 3s ago 7m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (35s) 14s ago 6m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (6m) 3s ago 6m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (6m) 14s ago 6m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (6s) 3s ago 5m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (5m) 3s ago 5m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (5m) 3s ago 5m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (5m) 14s ago 5m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:57:29.089 
INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (5m) 14s ago 5m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (4m) 14s ago 4m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:57:29.089 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (58s) 3s ago 6m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 -- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc58c077b00 msgr2=0x7fc58c079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc58c077b00 0x7fc58c079fc0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fc59c000f80 tx=0x7fc59c00db00 comp rx=0 tx=0).stop 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 -- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 msgr2=0x7fc5a4082f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 0x7fc5a4082f60 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc59400b770 tx=0x7fc59400ba80 comp rx=0 tx=0).stop 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 -- 192.168.123.105:0/2236842229 shutdown_connections 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 
7fc58b7fe700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc58c077b00 0x7fc58c079fc0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc5a4072b50 0x7fc5a4082f60 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 --2- 192.168.123.105:0/2236842229 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc5a40834a0 0x7fc5a4083920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 -- 192.168.123.105:0/2236842229 >> 192.168.123.105:0/2236842229 conn(0x7fc5a406dae0 msgr2=0x7fc5a406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 -- 192.168.123.105:0/2236842229 shutdown_connections 2026-03-10T08:57:29.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.091+0000 7fc58b7fe700 1 -- 192.168.123.105:0/2236842229 wait complete. 
2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 -- 192.168.123.105:0/2259078947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b010a700 msgr2=0x7f98b010cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 --2- 192.168.123.105:0/2259078947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b010a700 0x7f98b010cb90 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f98a800b3a0 tx=0x7f98a800b6b0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 -- 192.168.123.105:0/2259078947 shutdown_connections 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 --2- 192.168.123.105:0/2259078947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b010a700 0x7f98b010cb90 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 --2- 192.168.123.105:0/2259078947 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 -- 192.168.123.105:0/2259078947 >> 192.168.123.105:0/2259078947 conn(0x7f98b006daa0 msgr2=0x7f98b006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 -- 192.168.123.105:0/2259078947 shutdown_connections 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.187+0000 7f98b53fe700 1 -- 192.168.123.105:0/2259078947 
wait complete. 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98b53fe700 1 Processor -- start 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98b53fe700 1 -- start start 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98b53fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b0116b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98b53fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b0117090 0x7f98b011d0c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98b53fe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98b01175a0 con 0x7f98b0117090 2026-03-10T08:57:29.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98b53fe700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98b01176e0 con 0x7f98b0107d90 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98affff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b0116b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98affff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b0116b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:49756/0 (socket says 192.168.123.105:49756) 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.188+0000 7f98affff700 1 -- 192.168.123.105:0/3205288623 learned_addr learned my addr 192.168.123.105:0/3205288623 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 7f98affff700 1 -- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b0117090 msgr2=0x7f98b011d0c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 7f98affff700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b0117090 0x7f98b011d0c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 7f98affff700 1 -- 192.168.123.105:0/3205288623 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98a800b050 con 0x7f98b0107d90 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 7f98affff700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b0116b50 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f98a000d8d0 tx=0x7f98a000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98a0009940 con 0x7f98b0107d90 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 
7f98b53fe700 1 -- 192.168.123.105:0/3205288623 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f98b011d660 con 0x7f98b0107d90 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.189+0000 7f98b53fe700 1 -- 192.168.123.105:0/3205288623 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f98b011dbb0 con 0x7f98b0107d90 2026-03-10T08:57:29.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.190+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f98a0010460 con 0x7f98b0107d90 2026-03-10T08:57:29.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.190+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98a000f5d0 con 0x7f98b0107d90 2026-03-10T08:57:29.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.191+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f98a000f790 con 0x7f98b0107d90 2026-03-10T08:57:29.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.192+0000 7f98ad7fa700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9898077b10 0x7f9898079fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.192+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f98a009abb0 con 0x7f98b0107d90 2026-03-10T08:57:29.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.192+0000 7f98af7fe700 1 --2- 
192.168.123.105:0/3205288623 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9898077b10 0x7f9898079fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.193+0000 7f98b53fe700 1 -- 192.168.123.105:0/3205288623 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f989c005320 con 0x7f98b0107d90 2026-03-10T08:57:29.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.195+0000 7f98af7fe700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9898077b10 0x7f9898079fd0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f98a800bb30 tx=0x7f98a800bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:29.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.198+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98a0062b20 con 0x7f98b0107d90 2026-03-10T08:57:29.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.429+0000 7f98b53fe700 1 -- 192.168.123.105:0/3205288623 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f989c005cc0 con 0x7f98b0107d90 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.432+0000 7f98ad7fa700 1 -- 192.168.123.105:0/3205288623 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+693 (secure 0 0 0) 0x7f98a00202d0 con 0x7f98b0107d90 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:57:29.433 
INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:57:29.433 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.436+0000 7f9896ffd700 1 -- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9898077b10 msgr2=0x7f9898079fd0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9898077b10 0x7f9898079fd0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f98a800bb30 tx=0x7f98a800bf90 comp rx=0 tx=0).stop 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 -- 192.168.123.105:0/3205288623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 msgr2=0x7f98b0116b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b0116b50 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f98a000d8d0 tx=0x7f98a000dc90 comp rx=0 tx=0).stop 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 -- 192.168.123.105:0/3205288623 shutdown_connections 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9898077b10 0x7f9898079fd0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 --2- 192.168.123.105:0/3205288623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98b0107d90 0x7f98b0116b50 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 --2- 192.168.123.105:0/3205288623 
>> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98b0117090 0x7f98b011d0c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 -- 192.168.123.105:0/3205288623 >> 192.168.123.105:0/3205288623 conn(0x7f98b006daa0 msgr2=0x7f98b006e780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:29.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 -- 192.168.123.105:0/3205288623 shutdown_connections 2026-03-10T08:57:29.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.437+0000 7f9896ffd700 1 -- 192.168.123.105:0/3205288623 wait complete. 2026-03-10T08:57:29.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.535+0000 7fb21f7e8700 1 -- 192.168.123.105:0/4156244744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218075a40 msgr2=0x7fb218077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.535+0000 7fb21f7e8700 1 --2- 192.168.123.105:0/4156244744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218075a40 0x7fb218077ed0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb21000d3f0 tx=0x7fb21000d700 comp rx=0 tx=0).stop 2026-03-10T08:57:29.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.535+0000 7fb21f7e8700 1 -- 192.168.123.105:0/4156244744 shutdown_connections 2026-03-10T08:57:29.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.535+0000 7fb21f7e8700 1 --2- 192.168.123.105:0/4156244744 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218075a40 0x7fb218077ed0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.535+0000 7fb21f7e8700 1 --2- 
192.168.123.105:0/4156244744 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb218072b50 0x7fb218072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.535+0000 7fb21f7e8700 1 -- 192.168.123.105:0/4156244744 >> 192.168.123.105:0/4156244744 conn(0x7fb21806dae0 msgr2=0x7fb21806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:29.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 -- 192.168.123.105:0/4156244744 shutdown_connections 2026-03-10T08:57:29.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 -- 192.168.123.105:0/4156244744 wait complete. 2026-03-10T08:57:29.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 Processor -- start 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 -- start start 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218072b50 0x7fb218082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2180834a0 0x7fb218083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb21812e700 con 0x7fb218072b50 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.536+0000 7fb21f7e8700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb21812e870 con 0x7fb2180834a0 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21cd83700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2180834a0 0x7fb218083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21d584700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218072b50 0x7fb218082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21d584700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218072b50 0x7fb218082f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60106/0 (socket says 192.168.123.105:60106) 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21d584700 1 -- 192.168.123.105:0/2910871509 learned_addr learned my addr 192.168.123.105:0/2910871509 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:29.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21cd83700 1 -- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218072b50 msgr2=0x7fb218082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21cd83700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218072b50 0x7fb218082f60 
unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21cd83700 1 -- 192.168.123.105:0/2910871509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb210007ed0 con 0x7fb2180834a0 2026-03-10T08:57:29.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.537+0000 7fb21cd83700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2180834a0 0x7fb218083920 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb21000d3f0 tx=0x7fb210003c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:29.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.538+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb21001c070 con 0x7fb2180834a0 2026-03-10T08:57:29.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.539+0000 7fb21f7e8700 1 -- 192.168.123.105:0/2910871509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb21812eaf0 con 0x7fb2180834a0 2026-03-10T08:57:29.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.539+0000 7fb21f7e8700 1 -- 192.168.123.105:0/2910871509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb21812efe0 con 0x7fb2180834a0 2026-03-10T08:57:29.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.540+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb21000de50 con 0x7fb2180834a0 2026-03-10T08:57:29.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.540+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== 
mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb210017cc0 con 0x7fb2180834a0 2026-03-10T08:57:29.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.541+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb210004370 con 0x7fb2180834a0 2026-03-10T08:57:29.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.542+0000 7fb20e7fc700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb204077b00 0x7fb204079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.542+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fb210013070 con 0x7fb2180834a0 2026-03-10T08:57:29.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.543+0000 7fb21d584700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb204077b00 0x7fb204079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.543+0000 7fb21f7e8700 1 -- 192.168.123.105:0/2910871509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb1fc005320 con 0x7fb2180834a0 2026-03-10T08:57:29.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.544+0000 7fb21d584700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb204077b00 0x7fb204079fc0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto 
rx=0x7fb214009d30 tx=0x7fb2140094b0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:29.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.549+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb210064d70 con 0x7fb2180834a0 2026-03-10T08:57:29.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.734+0000 7fb21f7e8700 1 -- 192.168.123.105:0/2910871509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb1fc005cc0 con 0x7fb2180834a0 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:57:29.789 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:57:29.790 
INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:57:29.790 
INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:57:29.790 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.786+0000 7fb20e7fc700 1 -- 192.168.123.105:0/2910871509 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7fb2100644c0 con 0x7fb2180834a0 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.789+0000 7fb203fff700 1 -- 192.168.123.105:0/2910871509 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb204077b00 msgr2=0x7fb204079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.789+0000 7fb203fff700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb204077b00 0x7fb204079fc0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb214009d30 tx=0x7fb2140094b0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.789+0000 7fb203fff700 1 -- 192.168.123.105:0/2910871509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2180834a0 msgr2=0x7fb218083920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.789+0000 7fb203fff700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2180834a0 0x7fb218083920 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb21000d3f0 tx=0x7fb210003c30 comp rx=0 tx=0).stop 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.789+0000 7fb203fff700 1 -- 192.168.123.105:0/2910871509 shutdown_connections 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.790+0000 7fb203fff700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb204077b00 0x7fb204079fc0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.790+0000 7fb203fff700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb218072b50 0x7fb218082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.790+0000 7fb203fff700 1 --2- 192.168.123.105:0/2910871509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2180834a0 0x7fb218083920 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.790+0000 7fb203fff700 1 -- 192.168.123.105:0/2910871509 >> 192.168.123.105:0/2910871509 conn(0x7fb21806dae0 msgr2=0x7fb21806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.790+0000 7fb203fff700 1 -- 192.168.123.105:0/2910871509 shutdown_connections 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.790+0000 7fb203fff700 1 -- 192.168.123.105:0/2910871509 wait complete. 2026-03-10T08:57:29.791 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:57:29.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 -- 192.168.123.105:0/882446572 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14075a10 msgr2=0x7ffb14077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 --2- 192.168.123.105:0/882446572 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14075a10 0x7ffb14077ea0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7ffb0c00d3f0 tx=0x7ffb0c00d700 comp rx=0 tx=0).stop 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 -- 192.168.123.105:0/882446572 shutdown_connections 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 --2- 192.168.123.105:0/882446572 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14075a10 
0x7ffb14077ea0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 --2- 192.168.123.105:0/882446572 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14072b20 0x7ffb14072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 -- 192.168.123.105:0/882446572 >> 192.168.123.105:0/882446572 conn(0x7ffb1406daa0 msgr2=0x7ffb1406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 -- 192.168.123.105:0/882446572 shutdown_connections 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.890+0000 7ffb1ae61700 1 -- 192.168.123.105:0/882446572 wait complete. 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1ae61700 1 Processor -- start 2026-03-10T08:57:29.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1ae61700 1 -- start start 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1ae61700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14072b20 0x7ffb14083130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1ae61700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14083670 0x7ffb141b3120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1ae61700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7ffb14083b80 con 0x7ffb14072b20 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1ae61700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb14083cf0 con 0x7ffb14083670 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1965e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14083670 0x7ffb141b3120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1965e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14083670 0x7ffb141b3120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:49434/0 (socket says 192.168.123.105:49434) 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1965e700 1 -- 192.168.123.105:0/3913425294 learned_addr learned my addr 192.168.123.105:0/3913425294 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1965e700 1 -- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14072b20 msgr2=0x7ffb14083130 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1965e700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14072b20 0x7ffb14083130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:29.892 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.891+0000 7ffb1965e700 1 -- 192.168.123.105:0/3913425294 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffb0c007ed0 con 0x7ffb14083670 2026-03-10T08:57:29.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.892+0000 7ffb1965e700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14083670 0x7ffb141b3120 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7ffb0c007ba0 tx=0x7ffb0c019460 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:29.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.892+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb0c003dc0 con 0x7ffb14083670 2026-03-10T08:57:29.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.893+0000 7ffb1ae61700 1 -- 192.168.123.105:0/3913425294 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffb141b36c0 con 0x7ffb14083670 2026-03-10T08:57:29.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.893+0000 7ffb1ae61700 1 -- 192.168.123.105:0/3913425294 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffb141b3bb0 con 0x7ffb14083670 2026-03-10T08:57:29.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.894+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ffb0c01d070 con 0x7ffb14083670 2026-03-10T08:57:29.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.894+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb0c019700 con 
0x7ffb14083670 2026-03-10T08:57:29.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.895+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ffb0c019860 con 0x7ffb14083670 2026-03-10T08:57:29.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.897+0000 7ffb0affd700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ffb00077b10 0x7ffb00079fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:29.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.898+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7ffb0c013070 con 0x7ffb14083670 2026-03-10T08:57:29.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.898+0000 7ffb1ae61700 1 -- 192.168.123.105:0/3913425294 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffaf8005320 con 0x7ffb14083670 2026-03-10T08:57:29.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.900+0000 7ffb19e5f700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ffb00077b10 0x7ffb00079fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:29.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.900+0000 7ffb19e5f700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ffb00077b10 0x7ffb00079fd0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7ffb10005950 tx=0x7ffb100058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T08:57:29.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:29.903+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ffb0c06cd80 con 0x7ffb14083670 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='client.34154 -' entity='client.admin' 
cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:30.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:29 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/3205288623' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.083+0000 7ffb1ae61700 1 -- 192.168.123.105:0/3913425294 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ffaf8000bf0 con 0x7ffb00077b10 2026-03-10T08:57:30.083 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 08:57:29 vm05.local ceph-mon[111630]: from='client.34154 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:30.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:29 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3205288623' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:57:30.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.088+0000 7ffb0affd700 1 -- 192.168.123.105:0/3913425294 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ffaf8000bf0 con 0x7ffb00077b10 2026-03-10T08:57:30.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 -- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ffb00077b10 msgr2=0x7ffb00079fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:30.092 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ffb00077b10 0x7ffb00079fd0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7ffb10005950 tx=0x7ffb100058e0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 -- 192.168.123.105:0/3913425294 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14083670 msgr2=0x7ffb141b3120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:30.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb14083670 0x7ffb141b3120 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7ffb0c007ba0 tx=0x7ffb0c019460 comp rx=0 tx=0).stop 2026-03-10T08:57:30.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 -- 192.168.123.105:0/3913425294 shutdown_connections 2026-03-10T08:57:30.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ffb00077b10 0x7ffb00079fd0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb14072b20 0x7ffb14083130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.092+0000 7ffb08ff9700 1 --2- 192.168.123.105:0/3913425294 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7ffb14083670 0x7ffb141b3120 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.093+0000 7ffb08ff9700 1 -- 192.168.123.105:0/3913425294 >> 192.168.123.105:0/3913425294 conn(0x7ffb1406daa0 msgr2=0x7ffb1406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:30.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.093+0000 7ffb08ff9700 1 -- 192.168.123.105:0/3913425294 shutdown_connections 2026-03-10T08:57:30.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.093+0000 7ffb08ff9700 1 -- 192.168.123.105:0/3913425294 wait complete. 2026-03-10T08:57:30.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1283138490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8072b50 msgr2=0x7fa0c8072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:30.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1283138490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8072b50 0x7fa0c8072f70 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fa0b8009a60 tx=0x7fa0b8009d70 comp rx=0 tx=0).stop 2026-03-10T08:57:30.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1283138490 shutdown_connections 2026-03-10T08:57:30.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1283138490 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c8075a40 0x7fa0c8077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1283138490 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8072b50 0x7fa0c8072f70 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1283138490 >> 192.168.123.105:0/1283138490 conn(0x7fa0c806dae0 msgr2=0x7fa0c806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:30.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1283138490 shutdown_connections 2026-03-10T08:57:30.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.184+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1283138490 wait complete. 2026-03-10T08:57:30.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0cf01b700 1 Processor -- start 2026-03-10T08:57:30.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0cf01b700 1 -- start start 2026-03-10T08:57:30.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0cf01b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8075a40 0x7fa0c80831a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0cf01b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 0x7fa0c81b3220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0cf01b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa0c8083b90 con 0x7fa0c8075a40 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0cf01b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fa0c8083d00 con 0x7fa0c80836e0 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0c7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 0x7fa0c81b3220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0c7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 0x7fa0c81b3220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:49450/0 (socket says 192.168.123.105:49450) 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0c7fff700 1 -- 192.168.123.105:0/1243014895 learned_addr learned my addr 192.168.123.105:0/1243014895 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0c7fff700 1 -- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8075a40 msgr2=0x7fa0c80831a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0c7fff700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8075a40 0x7fa0c80831a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.185+0000 7fa0c7fff700 1 -- 192.168.123.105:0/1243014895 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0b8009710 con 0x7fa0c80836e0 2026-03-10T08:57:30.186 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.186+0000 7fa0c7fff700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 0x7fa0c81b3220 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fa0c000deb0 tx=0x7fa0c000df90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:30.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.186+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0c000cdf0 con 0x7fa0c80836e0 2026-03-10T08:57:30.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.186+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0c81b37c0 con 0x7fa0c80836e0 2026-03-10T08:57:30.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.186+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa0c81b3d10 con 0x7fa0c80836e0 2026-03-10T08:57:30.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.187+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa0c0012650 con 0x7fa0c80836e0 2026-03-10T08:57:30.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.187+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0c000f7f0 con 0x7fa0c80836e0 2026-03-10T08:57:30.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.188+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa0c0012170 con 0x7fa0c80836e0 
2026-03-10T08:57:30.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.191+0000 7fa0c5ffb700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0b0077b10 0x7fa0b0079fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:57:30.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.191+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(45..45 src has 1..45) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fa0c00a2830 con 0x7fa0c80836e0 2026-03-10T08:57:30.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.192+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa0b4005320 con 0x7fa0c80836e0 2026-03-10T08:57:30.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.192+0000 7fa0ccdb7700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0b0077b10 0x7fa0b0079fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:57:30.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.192+0000 7fa0ccdb7700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0b0077b10 0x7fa0b0079fd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fa0c81aeb60 tx=0x7fa0b800d750 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:57:30.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.195+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7fa0c006ae70 con 0x7fa0c80836e0 2026-03-10T08:57:30.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.405+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fa0b4005190 con 0x7fa0c80836e0 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.407+0000 7fa0c5ffb700 1 -- 192.168.123.105:0/1243014895 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+2108 (secure 0 0 0) 0x7fa0c006a5c0 con 0x7fa0c80836e0 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 osds down; Degraded data redundancy: 6293/42123 objects degraded (14.940%), 33 pgs degraded 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout: osd.0 (root=default,host=vm05) is down 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 6293/42123 objects degraded (14.940%), 33 pgs degraded 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout: pg 1.0 is active+undersized+degraded, acting [3,1] 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.0 is active+undersized+degraded, acting [3,1] 2026-03-10T08:57:30.407 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1 is active+undersized+degraded, acting [2,1] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.2 is active+undersized+degraded, acting [5,1] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.4 is active+undersized+degraded, acting [1,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.5 is active+undersized+degraded, acting [3,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.8 is 
active+undersized+degraded, acting [3,5] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.9 is active+undersized+degraded, acting [1,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.c is active+undersized+degraded, acting [2,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.e is active+undersized+degraded, acting [2,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.f is active+undersized+degraded, acting [4,5] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.10 is active+undersized+degraded, acting [2,1] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.13 is active+undersized+degraded, acting [4,2] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.15 is active+undersized+degraded, acting [1,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.19 is active+undersized+degraded, acting [4,2] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1b is active+undersized+degraded, acting [1,5] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1d is active+undersized+degraded, acting [3,5] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1e is active+undersized+degraded, acting [2,5] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1f is active+undersized+degraded, acting [3,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is active+undersized+degraded, acting [4,2] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.3 is active+undersized+degraded, acting [4,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.6 is active+undersized+degraded, acting [1,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is active+undersized+degraded, acting [1,4] 2026-03-10T08:57:30.408 
INFO:teuthology.orchestra.run.vm05.stdout: pg 3.c is active+undersized+degraded, acting [5,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.f is active+undersized+degraded, acting [5,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is active+undersized+degraded, acting [5,1] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is active+undersized+degraded, acting [3,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.12 is active+undersized+degraded, acting [3,1] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.15 is active+undersized+degraded, acting [3,4] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.17 is active+undersized+degraded, acting [5,2] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is active+undersized+degraded, acting [2,1] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1b is active+undersized+degraded, acting [4,3] 2026-03-10T08:57:30.408 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1f is active+undersized+degraded, acting [3,2] 2026-03-10T08:57:30.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0b0077b10 msgr2=0x7fa0b0079fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:30.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0b0077b10 0x7fa0b0079fd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fa0c81aeb60 tx=0x7fa0b800d750 comp rx=0 tx=0).stop 2026-03-10T08:57:30.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 msgr2=0x7fa0c81b3220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:57:30.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 0x7fa0c81b3220 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fa0c000deb0 tx=0x7fa0c000df90 comp rx=0 tx=0).stop 2026-03-10T08:57:30.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 shutdown_connections 2026-03-10T08:57:30.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa0b0077b10 0x7fa0b0079fd0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0c8075a40 0x7fa0c80831a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 --2- 192.168.123.105:0/1243014895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0c80836e0 0x7fa0c81b3220 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:57:30.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.410+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 >> 192.168.123.105:0/1243014895 conn(0x7fa0c806dae0 msgr2=0x7fa0c806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:57:30.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.411+0000 7fa0cf01b700 1 -- 
192.168.123.105:0/1243014895 shutdown_connections 2026-03-10T08:57:30.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:57:30.411+0000 7fa0cf01b700 1 -- 192.168.123.105:0/1243014895 wait complete. 2026-03-10T08:57:31.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T08:57:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 6293/42123 objects degraded (14.940%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: pgmap v27: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 794 KiB/s wr, 199 op/s; 6293/42123 objects degraded (14.940%) 2026-03-10T08:57:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2910871509' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:57:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: from='client.44117 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:31.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:30 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1243014895' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:57:31.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T08:57:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 6293/42123 objects degraded (14.940%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: pgmap v27: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 794 KiB/s wr, 199 op/s; 6293/42123 objects degraded (14.940%) 2026-03-10T08:57:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2910871509' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:57:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: from='client.44117 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:57:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:30 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1243014895' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:57:31.853 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:31 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T08:57:31.473+0000 7f014135d740 -1 osd.0 0 read_superblock omap replica is missing. 
2026-03-10T08:57:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:57:32.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:57:33.166 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:32 vm08.local ceph-mon[101330]: pgmap v28: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 735 KiB/s wr, 185 op/s; 6293/42123 objects degraded (14.940%) 2026-03-10T08:57:33.212 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:32 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T08:57:32.946+0000 7f014135d740 -1 osd.0 43 log_to_monitors true 2026-03-10T08:57:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:32 vm05.local ceph-mon[111630]: pgmap v28: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 735 KiB/s wr, 185 op/s; 6293/42123 objects degraded (14.940%) 2026-03-10T08:57:34.213 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:57:33 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T08:57:33.960+0000 7f01390f7640 -1 osd.0 43 set_numa_affinity unable to identify public interface '' numa 
node: (2) No such file or directory 2026-03-10T08:57:34.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:33 vm05.local ceph-mon[111630]: from='osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T08:57:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:33 vm08.local ceph-mon[101330]: from='osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T08:57:35.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:34 vm05.local ceph-mon[111630]: pgmap v29: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 289 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 971 KiB/s wr, 363 op/s; 5957/39807 objects degraded (14.965%) 2026-03-10T08:57:35.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:34 vm05.local ceph-mon[111630]: from='osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T08:57:35.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:34 vm05.local ceph-mon[111630]: osdmap e46: 6 total, 5 up, 6 in 2026-03-10T08:57:35.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:34 vm05.local ceph-mon[111630]: from='osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:57:35.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:34 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 5957/39807 objects degraded (14.965%), 34 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:35.302 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:34 vm08.local ceph-mon[101330]: pgmap v29: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 289 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 971 KiB/s wr, 363 op/s; 5957/39807 objects degraded (14.965%) 2026-03-10T08:57:35.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:34 vm08.local ceph-mon[101330]: from='osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T08:57:35.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:34 vm08.local ceph-mon[101330]: osdmap e46: 6 total, 5 up, 6 in 2026-03-10T08:57:35.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:34 vm08.local ceph-mon[101330]: from='osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T08:57:35.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:34 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 5957/39807 objects degraded (14.965%), 34 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:36.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:35 vm08.local ceph-mon[101330]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T08:57:36.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:35 vm08.local ceph-mon[101330]: osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983] boot 2026-03-10T08:57:36.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:35 vm08.local ceph-mon[101330]: osdmap e47: 6 total, 6 up, 6 in 2026-03-10T08:57:36.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", 
"id": 0}]: dispatch 2026-03-10T08:57:36.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:35 vm05.local ceph-mon[111630]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T08:57:36.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:35 vm05.local ceph-mon[111630]: osd.0 [v2:192.168.123.105:6802/1893517983,v1:192.168.123.105:6803/1893517983] boot 2026-03-10T08:57:36.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:35 vm05.local ceph-mon[111630]: osdmap e47: 6 total, 6 up, 6 in 2026-03-10T08:57:36.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T08:57:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:37 vm05.local ceph-mon[111630]: pgmap v32: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 289 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 464 KiB/s wr, 295 op/s; 5957/39807 objects degraded (14.965%) 2026-03-10T08:57:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:37 vm05.local ceph-mon[111630]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T08:57:37.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:37 vm08.local ceph-mon[101330]: pgmap v32: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 289 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 464 KiB/s wr, 295 op/s; 5957/39807 objects degraded (14.965%) 2026-03-10T08:57:37.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:37 vm08.local ceph-mon[101330]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T08:57:38.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:38 vm08.local ceph-mon[101330]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T08:57:38.640 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:38 vm05.local ceph-mon[111630]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T08:57:39.712 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:39 vm05.local ceph-mon[111630]: pgmap v35: 65 pgs: 14 remapped+peering, 1 active+recovering+degraded, 14 active+recovery_wait+degraded, 3 active+undersized+degraded, 33 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 2.0 MiB/s wr, 911 op/s; 149/38610 objects degraded (0.386%); 73 KiB/s, 9 keys/s, 1 objects/s recovering 2026-03-10T08:57:39.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:39 vm08.local ceph-mon[101330]: pgmap v35: 65 pgs: 14 remapped+peering, 1 active+recovering+degraded, 14 active+recovery_wait+degraded, 3 active+undersized+degraded, 33 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 2.0 MiB/s wr, 911 op/s; 149/38610 objects degraded (0.386%); 73 KiB/s, 9 keys/s, 1 objects/s recovering 2026-03-10T08:57:40.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:40 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 149/38610 objects degraded (0.386%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:40.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:40 vm08.local ceph-mon[101330]: pgmap v36: 65 pgs: 14 remapped+peering, 2 active+recovering+degraded, 14 active+recovery_wait+degraded, 35 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 1.4 MiB/s wr, 638 op/s; 141/38661 objects degraded (0.365%); 50 KiB/s, 26 keys/s, 4 objects/s recovering 2026-03-10T08:57:40.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:40 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 149/38610 objects degraded (0.386%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:40.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:40 vm05.local ceph-mon[111630]: pgmap v36: 65 pgs: 14 remapped+peering, 2 active+recovering+degraded, 14 active+recovery_wait+degraded, 35 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 1.4 
MiB/s wr, 638 op/s; 141/38661 objects degraded (0.365%); 50 KiB/s, 26 keys/s, 4 objects/s recovering 2026-03-10T08:57:42.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:42 vm08.local ceph-mon[101330]: pgmap v37: 65 pgs: 14 remapped+peering, 2 active+recovering+degraded, 14 active+recovery_wait+degraded, 35 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 10 KiB/s rd, 1.2 MiB/s wr, 544 op/s; 141/38661 objects degraded (0.365%); 43 KiB/s, 23 keys/s, 3 objects/s recovering 2026-03-10T08:57:42.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:42 vm05.local ceph-mon[111630]: pgmap v37: 65 pgs: 14 remapped+peering, 2 active+recovering+degraded, 14 active+recovery_wait+degraded, 35 active+clean; 292 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 10 KiB/s rd, 1.2 MiB/s wr, 544 op/s; 141/38661 objects degraded (0.365%); 43 KiB/s, 23 keys/s, 3 objects/s recovering 2026-03-10T08:57:45.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:44 vm05.local ceph-mon[111630]: pgmap v38: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 2 active+recovering+degraded, 11 active+recovery_wait+degraded, 38 active+clean; 279 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 44 KiB/s rd, 1.8 MiB/s wr, 822 op/s; 2058/32295 objects degraded (6.373%); 37 KiB/s, 53 keys/s, 7 objects/s recovering 2026-03-10T08:57:45.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:44 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:45.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:44 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 2058/32295 objects degraded (6.373%), 27 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:44 vm08.local ceph-mon[101330]: pgmap v38: 65 pgs: 14 
active+recovery_wait+undersized+degraded+remapped, 2 active+recovering+degraded, 11 active+recovery_wait+degraded, 38 active+clean; 279 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 44 KiB/s rd, 1.8 MiB/s wr, 822 op/s; 2058/32295 objects degraded (6.373%); 37 KiB/s, 53 keys/s, 7 objects/s recovering 2026-03-10T08:57:45.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:44 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:45.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:44 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 2058/32295 objects degraded (6.373%), 27 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:46.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:45 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:46.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:45 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T08:57:46.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:46.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:57:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:45 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:46.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:45 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T08:57:46.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:57:46.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:57:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:46 vm08.local ceph-mon[101330]: pgmap v39: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 11 active+recovery_wait+degraded, 39 active+clean; 279 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.5 MiB/s wr, 672 op/s; 2050/32295 objects degraded (6.348%); 30 KiB/s, 51 keys/s, 6 objects/s recovering 2026-03-10T08:57:47.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:46 vm05.local ceph-mon[111630]: pgmap v39: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 11 active+recovery_wait+degraded, 39 active+clean; 279 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.5 MiB/s wr, 672 op/s; 2050/32295 objects degraded (6.348%); 30 KiB/s, 51 keys/s, 6 objects/s recovering 2026-03-10T08:57:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:49 vm05.local ceph-mon[111630]: pgmap v40: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 6 active+recovery_wait+degraded, 44 active+clean; 269 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.1 MiB/s wr, 463 op/s; 2005/27156 objects degraded (7.383%); 385 KiB/s, 70 keys/s, 9 objects/s 
recovering 2026-03-10T08:57:49.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:49 vm08.local ceph-mon[101330]: pgmap v40: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 6 active+recovery_wait+degraded, 44 active+clean; 269 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.1 MiB/s wr, 463 op/s; 2005/27156 objects degraded (7.383%); 385 KiB/s, 70 keys/s, 9 objects/s recovering 2026-03-10T08:57:50.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:50 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 2005/27156 objects degraded (7.383%), 21 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:50.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:50 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 2005/27156 objects degraded (7.383%), 21 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:52.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:51 vm08.local ceph-mon[101330]: pgmap v41: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 6 active+recovery_wait+degraded, 44 active+clean; 265 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1023 KiB/s wr, 413 op/s; 2005/27159 objects degraded (7.382%); 341 KiB/s, 62 keys/s, 8 objects/s recovering 2026-03-10T08:57:52.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:51 vm05.local ceph-mon[111630]: pgmap v41: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 6 active+recovery_wait+degraded, 44 active+clean; 265 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1023 KiB/s wr, 413 op/s; 2005/27159 objects degraded (7.382%); 341 KiB/s, 62 keys/s, 8 objects/s recovering 2026-03-10T08:57:53.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:52 vm05.local ceph-mon[111630]: pgmap v42: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 6 
active+recovery_wait+degraded, 44 active+clean; 265 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1008 KiB/s wr, 407 op/s; 2005/27159 objects degraded (7.382%); 341 KiB/s, 52 keys/s, 7 objects/s recovering 2026-03-10T08:57:53.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:52 vm08.local ceph-mon[101330]: pgmap v42: 65 pgs: 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+degraded, 6 active+recovery_wait+degraded, 44 active+clean; 265 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1008 KiB/s wr, 407 op/s; 2005/27159 objects degraded (7.382%); 341 KiB/s, 52 keys/s, 7 objects/s recovering 2026-03-10T08:57:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:55 vm08.local ceph-mon[101330]: pgmap v43: 65 pgs: 1 active+recovering+degraded, 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 49 active+clean; 266 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 46 KiB/s rd, 1.4 MiB/s wr, 614 op/s; 1954/23280 objects degraded (8.393%); 341 KiB/s, 79 keys/s, 11 objects/s recovering 2026-03-10T08:57:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:55 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1954/23280 objects degraded (8.393%), 16 pgs degraded (PG_DEGRADED) 2026-03-10T08:57:55.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:55 vm05.local ceph-mon[111630]: pgmap v43: 65 pgs: 1 active+recovering+degraded, 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 49 active+clean; 266 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 46 KiB/s rd, 1.4 MiB/s wr, 614 op/s; 1954/23280 objects degraded (8.393%); 341 KiB/s, 79 keys/s, 11 objects/s recovering 2026-03-10T08:57:55.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:55 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1954/23280 objects degraded (8.393%), 16 pgs degraded (PG_DEGRADED) 
2026-03-10T08:57:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:57 vm05.local ceph-mon[111630]: pgmap v44: 65 pgs: 1 active+recovering+degraded, 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 49 active+clean; 266 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 913 KiB/s wr, 376 op/s; 1954/23226 objects degraded (8.413%); 341 KiB/s, 56 keys/s, 8 objects/s recovering 2026-03-10T08:57:57.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:57 vm08.local ceph-mon[101330]: pgmap v44: 65 pgs: 1 active+recovering+degraded, 14 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 49 active+clean; 266 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 913 KiB/s wr, 376 op/s; 1954/23226 objects degraded (8.413%); 341 KiB/s, 56 keys/s, 8 objects/s recovering 2026-03-10T08:57:58.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:58 vm08.local ceph-mon[101330]: pgmap v45: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 259 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.4 MiB/s wr, 521 op/s; 1869/19050 objects degraded (9.811%); 341 KiB/s, 51 keys/s, 11 objects/s recovering 2026-03-10T08:57:58.688 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:58 vm05.local ceph-mon[111630]: pgmap v45: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 259 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.4 MiB/s wr, 521 op/s; 1869/19050 objects degraded (9.811%); 341 KiB/s, 51 keys/s, 11 objects/s recovering 2026-03-10T08:57:59.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:57:59 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:57:59.802 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:57:59 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:58:00.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.534+0000 7f0c753fe700 1 -- 192.168.123.105:0/4206003820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 msgr2=0x7f0c70077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.534+0000 7f0c753fe700 1 --2- 192.168.123.105:0/4206003820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c70077ed0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f0c6800cd40 tx=0x7f0c6800a320 comp rx=0 tx=0).stop 2026-03-10T08:58:00.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.537+0000 7f0c753fe700 1 -- 192.168.123.105:0/4206003820 shutdown_connections 2026-03-10T08:58:00.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.537+0000 7f0c753fe700 1 --2- 192.168.123.105:0/4206003820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c70077ed0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.537+0000 7f0c753fe700 1 --2- 192.168.123.105:0/4206003820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c70072b50 0x7f0c70072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.537+0000 7f0c753fe700 1 -- 192.168.123.105:0/4206003820 >> 192.168.123.105:0/4206003820 conn(0x7f0c7006dae0 msgr2=0x7f0c7006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:00.538 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.537+0000 7f0c753fe700 1 -- 192.168.123.105:0/4206003820 shutdown_connections 2026-03-10T08:58:00.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.537+0000 7f0c753fe700 1 -- 192.168.123.105:0/4206003820 wait complete. 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c753fe700 1 Processor -- start 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c753fe700 1 -- start start 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c753fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c70072b50 0x7f0c70083a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c753fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c701b2ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c753fe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c701b30a0 con 0x7f0c70072b50 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c753fe700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c701b3210 con 0x7f0c70075a40 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c6e7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c701b2ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:00.539 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c6e7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c701b2ad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48426/0 (socket says 192.168.123.105:48426) 2026-03-10T08:58:00.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.538+0000 7f0c6e7fc700 1 -- 192.168.123.105:0/2642609003 learned_addr learned my addr 192.168.123.105:0/2642609003 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:00.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.540+0000 7f0c6effd700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c70072b50 0x7f0c70083a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:00.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.540+0000 7f0c6e7fc700 1 -- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c70072b50 msgr2=0x7f0c70083a60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.540+0000 7f0c6e7fc700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c70072b50 0x7f0c70083a60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.540+0000 7f0c6e7fc700 1 -- 192.168.123.105:0/2642609003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c6800c9f0 con 0x7f0c70075a40 2026-03-10T08:58:00.541 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.540+0000 7f0c6e7fc700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c701b2ad0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f0c6800bb40 tx=0x7f0c6800bc20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:00.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.541+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c6800dea0 con 0x7f0c70075a40 2026-03-10T08:58:00.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.542+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0c701b3490 con 0x7f0c70075a40 2026-03-10T08:58:00.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.542+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0c701b39e0 con 0x7f0c70075a40 2026-03-10T08:58:00.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.543+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0c68020070 con 0x7f0c70075a40 2026-03-10T08:58:00.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.543+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c6801c5d0 con 0x7f0c70075a40 2026-03-10T08:58:00.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.545+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f0c6800c040 con 0x7f0c70075a40 
2026-03-10T08:58:00.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.545+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0c5c005320 con 0x7f0c70075a40 2026-03-10T08:58:00.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.546+0000 7f0c57fff700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0c580801f0 0x7f0c580826b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:00.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.546+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7f0c68022070 con 0x7f0c70075a40 2026-03-10T08:58:00.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.547+0000 7f0c6effd700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0c580801f0 0x7f0c580826b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:00.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.547+0000 7f0c6effd700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0c580801f0 0x7f0c580826b0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f0c60005950 tx=0x7f0c600058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:00.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.553+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f0c680633c0 con 0x7f0c70075a40 2026-03-10T08:58:00.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:00 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:58:00.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:00 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T08:58:00.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:00 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1869/19050 objects degraded (9.811%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T08:58:00.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:00 vm05.local ceph-mon[111630]: pgmap v46: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 259 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 953 KiB/s wr, 356 op/s; 1869/19032 objects degraded (9.820%); 0 B/s, 27 keys/s, 7 objects/s recovering 2026-03-10T08:58:00.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.756+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0c5c000bf0 con 0x7f0c580801f0 2026-03-10T08:58:00.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.757+0000 7f0c57fff700 1 -- 192.168.123.105:0/2642609003 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0c5c000bf0 con 0x7f0c580801f0 2026-03-10T08:58:00.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.759+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0c580801f0 msgr2=0x7f0c580826b0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.759+0000 7f0c753fe700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0c580801f0 0x7f0c580826b0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f0c60005950 tx=0x7f0c600058e0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.759+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 msgr2=0x7f0c701b2ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.759+0000 7f0c753fe700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c701b2ad0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f0c6800bb40 tx=0x7f0c6800bc20 comp rx=0 tx=0).stop 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 shutdown_connections 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0c580801f0 0x7f0c580826b0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 --2- 192.168.123.105:0/2642609003 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c70072b50 0x7f0c70083a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 --2- 
192.168.123.105:0/2642609003 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c70075a40 0x7f0c701b2ad0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 >> 192.168.123.105:0/2642609003 conn(0x7f0c7006dae0 msgr2=0x7f0c7006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 shutdown_connections 2026-03-10T08:58:00.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.760+0000 7f0c753fe700 1 -- 192.168.123.105:0/2642609003 wait complete. 2026-03-10T08:58:00.769 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:58:00.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:00 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:58:00.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:00 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T08:58:00.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:00 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1869/19050 objects degraded (9.811%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T08:58:00.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:00 vm08.local ceph-mon[101330]: pgmap v46: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 259 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 953 KiB/s wr, 356 op/s; 1869/19032 objects degraded (9.820%); 0 B/s, 27 keys/s, 7 objects/s recovering 2026-03-10T08:58:00.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.833+0000 
7f56e9a86700 1 -- 192.168.123.105:0/3781623487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e4074dc0 msgr2=0x7f56e4073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.833+0000 7f56e9a86700 1 --2- 192.168.123.105:0/3781623487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e4074dc0 0x7f56e4073220 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f56cc009b50 tx=0x7f56cc009e60 comp rx=0 tx=0).stop 2026-03-10T08:58:00.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.835+0000 7f56e9a86700 1 -- 192.168.123.105:0/3781623487 shutdown_connections 2026-03-10T08:58:00.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.835+0000 7f56e9a86700 1 --2- 192.168.123.105:0/3781623487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e40737f0 0x7f56e4073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.835+0000 7f56e9a86700 1 --2- 192.168.123.105:0/3781623487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e4074dc0 0x7f56e4073220 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.835+0000 7f56e9a86700 1 -- 192.168.123.105:0/3781623487 >> 192.168.123.105:0/3781623487 conn(0x7f56e40fc4c0 msgr2=0x7f56e40fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:00.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.841+0000 7f56e9a86700 1 -- 192.168.123.105:0/3781623487 shutdown_connections 2026-03-10T08:58:00.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.842+0000 7f56e9a86700 1 -- 192.168.123.105:0/3781623487 wait complete. 
2026-03-10T08:58:00.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.842+0000 7f56e9a86700 1 Processor -- start 2026-03-10T08:58:00.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.844+0000 7f56e9a86700 1 -- start start 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.844+0000 7f56e9a86700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e40737f0 0x7f56e419ce40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.844+0000 7f56e9a86700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 0x7f56e419d380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.844+0000 7f56e9a86700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56e419d9a0 con 0x7f56e40737f0 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.844+0000 7f56e9a86700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56e419dae0 con 0x7f56e4074dc0 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 0x7f56e419d380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 0x7f56e419d380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:48458/0 (socket says 192.168.123.105:48458) 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 -- 192.168.123.105:0/103345680 learned_addr learned my addr 192.168.123.105:0/103345680 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 -- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e40737f0 msgr2=0x7f56e419ce40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e40737f0 0x7f56e419ce40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 -- 192.168.123.105:0/103345680 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56cc0097e0 con 0x7f56e4074dc0 2026-03-10T08:58:00.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e2ffd700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 0x7f56e419d380 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f56d4009fd0 tx=0x7f56d400eea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:00.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.845+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56d4009980 con 0x7f56e4074dc0 2026-03-10T08:58:00.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.846+0000 7f56e9a86700 1 -- 
192.168.123.105:0/103345680 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56e41a2590 con 0x7f56e4074dc0 2026-03-10T08:58:00.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.846+0000 7f56e9a86700 1 -- 192.168.123.105:0/103345680 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56e41a2ae0 con 0x7f56e4074dc0 2026-03-10T08:58:00.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.846+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f56d4004500 con 0x7f56e4074dc0 2026-03-10T08:58:00.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.847+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56d4010430 con 0x7f56e4074dc0 2026-03-10T08:58:00.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.847+0000 7f56e9a86700 1 -- 192.168.123.105:0/103345680 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56c4005320 con 0x7f56e4074dc0 2026-03-10T08:58:00.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.849+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f56d4003780 con 0x7f56e4074dc0 2026-03-10T08:58:00.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.849+0000 7f56e0ff9700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f56d0077a00 0x7f56d0079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:00.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.849+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== 
mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7f56d4014070 con 0x7f56e4074dc0 2026-03-10T08:58:00.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.850+0000 7f56e37fe700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f56d0077a00 0x7f56d0079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:00.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.850+0000 7f56e37fe700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f56d0077a00 0x7f56d0079ec0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f56cc009b20 tx=0x7f56cc005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:00.852 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.851+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f56d40627c0 con 0x7f56e4074dc0 2026-03-10T08:58:00.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.983+0000 7f56e9a86700 1 -- 192.168.123.105:0/103345680 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f56c4000bf0 con 0x7f56d0077a00 2026-03-10T08:58:00.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.985+0000 7f56e0ff9700 1 -- 192.168.123.105:0/103345680 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f56c4000bf0 con 0x7f56d0077a00 2026-03-10T08:58:00.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.991+0000 
7f56da7fc700 1 -- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f56d0077a00 msgr2=0x7f56d0079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.991+0000 7f56da7fc700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f56d0077a00 0x7f56d0079ec0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f56cc009b20 tx=0x7f56cc005fb0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.991+0000 7f56da7fc700 1 -- 192.168.123.105:0/103345680 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 msgr2=0x7f56e419d380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:00.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.991+0000 7f56da7fc700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 0x7f56e419d380 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f56d4009fd0 tx=0x7f56d400eea0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.992+0000 7f56da7fc700 1 -- 192.168.123.105:0/103345680 shutdown_connections 2026-03-10T08:58:00.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.992+0000 7f56da7fc700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f56d0077a00 0x7f56d0079ec0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.992+0000 7f56da7fc700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e40737f0 0x7f56e419ce40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.993+0000 7f56da7fc700 1 --2- 192.168.123.105:0/103345680 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f56e4074dc0 0x7f56e419d380 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:00.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.993+0000 7f56da7fc700 1 -- 192.168.123.105:0/103345680 >> 192.168.123.105:0/103345680 conn(0x7f56e40fc4c0 msgr2=0x7f56e41028d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:00.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.993+0000 7f56da7fc700 1 -- 192.168.123.105:0/103345680 shutdown_connections 2026-03-10T08:58:00.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:00.994+0000 7f56da7fc700 1 -- 192.168.123.105:0/103345680 wait complete. 2026-03-10T08:58:01.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.074+0000 7fa6f48a7700 1 -- 192.168.123.105:0/469298577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec075a40 msgr2=0x7fa6ec077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.074+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/469298577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec075a40 0x7fa6ec077ed0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fa6e800d3f0 tx=0x7fa6e800d700 comp rx=0 tx=0).stop 2026-03-10T08:58:01.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.075+0000 7fa6f48a7700 1 -- 192.168.123.105:0/469298577 shutdown_connections 2026-03-10T08:58:01.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.075+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/469298577 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec075a40 0x7fa6ec077ed0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.075+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/469298577 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec072b50 0x7fa6ec072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.075+0000 7fa6f48a7700 1 -- 192.168.123.105:0/469298577 >> 192.168.123.105:0/469298577 conn(0x7fa6ec06dae0 msgr2=0x7fa6ec06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:01.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.075+0000 7fa6f48a7700 1 -- 192.168.123.105:0/469298577 shutdown_connections 2026-03-10T08:58:01.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.075+0000 7fa6f48a7700 1 -- 192.168.123.105:0/469298577 wait complete. 2026-03-10T08:58:01.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f48a7700 1 Processor -- start 2026-03-10T08:58:01.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f48a7700 1 -- start start 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f48a7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec072b50 0x7fa6ec082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f48a7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 0x7fa6ec083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f48a7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6ec12e700 con 0x7fa6ec072b50 
2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f48a7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6ec12e870 con 0x7fa6ec0834a0 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f1e42700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 0x7fa6ec083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f1e42700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 0x7fa6ec083920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48486/0 (socket says 192.168.123.105:48486) 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f1e42700 1 -- 192.168.123.105:0/1458802212 learned_addr learned my addr 192.168.123.105:0/1458802212 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f1e42700 1 -- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec072b50 msgr2=0x7fa6ec082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f1e42700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec072b50 0x7fa6ec082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.076+0000 7fa6f1e42700 1 -- 192.168.123.105:0/1458802212 
--> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa6e8007ed0 con 0x7fa6ec0834a0 2026-03-10T08:58:01.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.077+0000 7fa6f1e42700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 0x7fa6ec083920 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fa6e8003c60 tx=0x7fa6e8003d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:01.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.078+0000 7fa6df7fe700 1 -- 192.168.123.105:0/1458802212 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6e801c070 con 0x7fa6ec0834a0 2026-03-10T08:58:01.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.078+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa6ec12eaf0 con 0x7fa6ec0834a0 2026-03-10T08:58:01.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.078+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa6ec12efe0 con 0x7fa6ec0834a0 2026-03-10T08:58:01.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.079+0000 7fa6df7fe700 1 -- 192.168.123.105:0/1458802212 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa6e800fb40 con 0x7fa6ec0834a0 2026-03-10T08:58:01.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.079+0000 7fa6df7fe700 1 -- 192.168.123.105:0/1458802212 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6e8017cc0 con 0x7fa6ec0834a0 2026-03-10T08:58:01.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.080+0000 7fa6df7fe700 1 -- 
192.168.123.105:0/1458802212 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa6e802a430 con 0x7fa6ec0834a0 2026-03-10T08:58:01.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.080+0000 7fa6df7fe700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa6d8077b00 0x7fa6d8079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.081+0000 7fa6f2643700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa6d8077b00 0x7fa6d8079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.081+0000 7fa6df7fe700 1 -- 192.168.123.105:0/1458802212 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7fa6e8013070 con 0x7fa6ec0834a0 2026-03-10T08:58:01.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.081+0000 7fa6f2643700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa6d8077b00 0x7fa6d8079fc0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa6e00098a0 tx=0x7fa6e0006d90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:01.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.082+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa6d0005320 con 0x7fa6ec0834a0 2026-03-10T08:58:01.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.085+0000 7fa6df7fe700 1 -- 
192.168.123.105:0/1458802212 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa6e8064f00 con 0x7fa6ec0834a0 2026-03-10T08:58:01.215 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.214+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa6d0000bf0 con 0x7fa6d8077b00 2026-03-10T08:58:01.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.221+0000 7fa6df7fe700 1 -- 192.168.123.105:0/1458802212 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fa6d0000bf0 con 0x7fa6d8077b00 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (6m) 35s ago 7m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (7m) 35s ago 7m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (6m) 47s ago 6m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (51s) 35s ago 7m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (49s) 47s ago 6m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (6m) 35s ago 6m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:58:01.222 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (5m) 35s ago 5m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (5m) 35s ago 5m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (5m) 47s ago 5m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (5m) 47s ago 5m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (117s) 35s ago 7m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (88s) 47s ago 6m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (82s) 35s ago 7m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (67s) 47s ago 6m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (7m) 35s ago 7m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (6m) 47s ago 6m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (38s) 35s ago 6m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (6m) 35s ago 6m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:58:01.222 
INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (5m) 35s ago 5m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (5m) 47s ago 5m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (5m) 47s ago 5m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (5m) 47s ago 5m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:58:01.222 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (90s) 35s ago 6m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa6d8077b00 msgr2=0x7fa6d8079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa6d8077b00 0x7fa6d8079fc0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa6e00098a0 tx=0x7fa6e0006d90 comp rx=0 tx=0).stop 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 msgr2=0x7fa6ec083920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 0x7fa6ec083920 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fa6e8003c60 
tx=0x7fa6e8003d40 comp rx=0 tx=0).stop 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 shutdown_connections 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa6d8077b00 0x7fa6d8079fc0 secure :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa6e00098a0 tx=0x7fa6e0006d90 comp rx=0 tx=0).stop 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6ec072b50 0x7fa6ec082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 --2- 192.168.123.105:0/1458802212 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6ec0834a0 0x7fa6ec083920 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 >> 192.168.123.105:0/1458802212 conn(0x7fa6ec06dae0 msgr2=0x7fa6ec06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:01.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.226+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 shutdown_connections 2026-03-10T08:58:01.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.227+0000 7fa6f48a7700 1 -- 192.168.123.105:0/1458802212 wait complete. 
2026-03-10T08:58:01.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 -- 192.168.123.105:0/1318927179 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd1410a700 msgr2=0x7fbd1410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 --2- 192.168.123.105:0/1318927179 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd1410a700 0x7fbd1410cb90 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbd0c00b3a0 tx=0x7fbd0c00b6b0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 -- 192.168.123.105:0/1318927179 shutdown_connections 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 --2- 192.168.123.105:0/1318927179 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd1410a700 0x7fbd1410cb90 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 --2- 192.168.123.105:0/1318927179 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbd14107d90 0x7fbd1410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 -- 192.168.123.105:0/1318927179 >> 192.168.123.105:0/1318927179 conn(0x7fbd1406dae0 msgr2=0x7fbd1406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 -- 192.168.123.105:0/1318927179 shutdown_connections 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.306+0000 7fbd18b70700 1 -- 192.168.123.105:0/1318927179 
wait complete. 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd18b70700 1 Processor -- start 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd18b70700 1 -- start start 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd18b70700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbd14107d90 0x7fbd141a54d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd18b70700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 0x7fbd14076fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd18b70700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd141a5f20 con 0x7fbd141a5a10 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd18b70700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd141a6090 con 0x7fbd14107d90 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd11d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 0x7fbd14076fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd11d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 0x7fbd14076fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:57668/0 (socket says 192.168.123.105:57668) 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd11d9b700 1 -- 192.168.123.105:0/991292951 learned_addr learned my addr 192.168.123.105:0/991292951 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.307+0000 7fbd1259c700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbd14107d90 0x7fbd141a54d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.308+0000 7fbd11d9b700 1 -- 192.168.123.105:0/991292951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbd14107d90 msgr2=0x7fbd141a54d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.308+0000 7fbd11d9b700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbd14107d90 0x7fbd141a54d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.308+0000 7fbd11d9b700 1 -- 192.168.123.105:0/991292951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd0c00b050 con 0x7fbd141a5a10 2026-03-10T08:58:01.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.308+0000 7fbd11d9b700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 0x7fbd14076fe0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fbd0c007b90 tx=0x7fbd0c0095a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:58:01.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.309+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd0c00e050 con 0x7fbd141a5a10 2026-03-10T08:58:01.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.309+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd14077520 con 0x7fbd141a5a10 2026-03-10T08:58:01.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.309+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd14077a10 con 0x7fbd141a5a10 2026-03-10T08:58:01.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.309+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd1404ea90 con 0x7fbd141a5a10 2026-03-10T08:58:01.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.309+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbd0c004730 con 0x7fbd141a5a10 2026-03-10T08:58:01.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.310+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd0c01bc10 con 0x7fbd141a5a10 2026-03-10T08:58:01.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.312+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fbd0c019040 con 0x7fbd141a5a10 2026-03-10T08:58:01.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.312+0000 
7fbd037fe700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbcfc077b00 0x7fbcfc079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.312+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7fbd0c09b740 con 0x7fbd141a5a10 2026-03-10T08:58:01.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.313+0000 7fbd1259c700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbcfc077b00 0x7fbcfc079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.314+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbd0c063be0 con 0x7fbd141a5a10 2026-03-10T08:58:01.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.315+0000 7fbd1259c700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbcfc077b00 0x7fbcfc079fc0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fbd141a6a30 tx=0x7fbd08009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:01.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.503+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fbd14077cf0 con 0x7fbd141a5a10 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout:{ 
2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:58:01.612 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:58:01.613 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.609+0000 7fbd037fe700 1 -- 192.168.123.105:0/991292951 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fbd0c063330 con 0x7fbd141a5a10 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbcfc077b00 msgr2=0x7fbcfc079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbcfc077b00 0x7fbcfc079fc0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fbd141a6a30 tx=0x7fbd08009380 comp rx=0 tx=0).stop 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 msgr2=0x7fbd14076fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 0x7fbd14076fe0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fbd0c007b90 tx=0x7fbd0c0095a0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 shutdown_connections 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbcfc077b00 0x7fbcfc079fc0 unknown 
:-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbd14107d90 0x7fbd141a54d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 --2- 192.168.123.105:0/991292951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbd141a5a10 0x7fbd14076fe0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.612+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 >> 192.168.123.105:0/991292951 conn(0x7fbd1406dae0 msgr2=0x7fbd1406e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.613+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 shutdown_connections 2026-03-10T08:58:01.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.613+0000 7fbd18b70700 1 -- 192.168.123.105:0/991292951 wait complete. 
2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 -- 192.168.123.105:0/760416528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c410a700 msgr2=0x7f21c410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 --2- 192.168.123.105:0/760416528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c410a700 0x7f21c410cb90 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f21bc00b3a0 tx=0x7f21bc00b6b0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 -- 192.168.123.105:0/760416528 shutdown_connections 2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 --2- 192.168.123.105:0/760416528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c410a700 0x7f21c410cb90 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 --2- 192.168.123.105:0/760416528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f21c4107d90 0x7f21c410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 -- 192.168.123.105:0/760416528 >> 192.168.123.105:0/760416528 conn(0x7f21c406dae0 msgr2=0x7f21c406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:01.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 -- 192.168.123.105:0/760416528 shutdown_connections 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 -- 192.168.123.105:0/760416528 wait 
complete. 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 Processor -- start 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.712+0000 7f21ca5c9700 1 -- start start 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21ca5c9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f21c4107d90 0x7f21c4116b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21ca5c9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 0x7f21c41a1ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21ca5c9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21c4117560 con 0x7f21c4107d90 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21ca5c9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21c41176a0 con 0x7f21c4117050 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 0x7f21c41a1ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 0x7f21c41a1ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I 
am v2:192.168.123.105:48524/0 (socket says 192.168.123.105:48524) 2026-03-10T08:58:01.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 -- 192.168.123.105:0/958576787 learned_addr learned my addr 192.168.123.105:0/958576787 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:01.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c3fff700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f21c4107d90 0x7f21c4116b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 -- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f21c4107d90 msgr2=0x7f21c4116b10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f21c4107d90 0x7f21c4116b10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 -- 192.168.123.105:0/958576787 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21bc00b050 con 0x7f21c4117050 2026-03-10T08:58:01.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.713+0000 7f21c37fe700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 0x7f21c41a1ea0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f21bc0077f0 tx=0x7f21bc0078d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:58:01.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.714+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f21bc00e040 con 0x7f21c4117050 2026-03-10T08:58:01.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.714+0000 7f21ca5c9700 1 -- 192.168.123.105:0/958576787 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f21c41a23e0 con 0x7f21c4117050 2026-03-10T08:58:01.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.714+0000 7f21ca5c9700 1 -- 192.168.123.105:0/958576787 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f21c41a2920 con 0x7f21c4117050 2026-03-10T08:58:01.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.715+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f21bc009440 con 0x7f21c4117050 2026-03-10T08:58:01.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.715+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f21bc01b650 con 0x7f21c4117050 2026-03-10T08:58:01.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.716+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f21bc019070 con 0x7f21c4117050 2026-03-10T08:58:01.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.716+0000 7f21c17fa700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f21ac077a40 0x7f21ac079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:01.718 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.718+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7f21bc09ba20 con 0x7f21c4117050 2026-03-10T08:58:01.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.719+0000 7f21ca5c9700 1 -- 192.168.123.105:0/958576787 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f21b0005320 con 0x7f21c4117050 2026-03-10T08:58:01.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.718+0000 7f21c3fff700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f21ac077a40 0x7f21ac079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:01.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.722+0000 7f21c3fff700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f21ac077a40 0x7f21ac079f00 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f21c4117b50 tx=0x7f21b4008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:01.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.722+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f21bc063e40 con 0x7f21c4117050 2026-03-10T08:58:01.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.919+0000 7f21ca5c9700 1 -- 192.168.123.105:0/958576787 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f21b0005cc0 con 0x7f21c4117050 2026-03-10T08:58:01.920 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:01 vm05.local ceph-mon[111630]: from='client.44127 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T08:58:01.920 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:01.920 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:01 vm05.local ceph-mon[111630]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T08:58:01.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.921+0000 7f21c17fa700 1 -- 192.168.123.105:0/958576787 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f21bc017070 con 0x7f21c4117050
2026-03-10T08:58:01.921 INFO:teuthology.orchestra.run.vm05.stdout:e12
2026-03-10T08:58:01.921 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1)
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:root 0
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {}
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:in 0
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289}
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:failed
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:damaged
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:stopped
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3]
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:balancer
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members:
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons:
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:58:01.922 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:58:01.923 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:58:01.923 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 -- 192.168.123.105:0/958576787 >>
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f21ac077a40 msgr2=0x7f21ac079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f21ac077a40 0x7f21ac079f00 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f21c4117b50 tx=0x7f21b4008040 comp rx=0 tx=0).stop 2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 -- 192.168.123.105:0/958576787 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 msgr2=0x7f21c41a1ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 0x7f21c41a1ea0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f21bc0077f0 tx=0x7f21bc0078d0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 -- 192.168.123.105:0/958576787 shutdown_connections 2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f21ac077a40 0x7f21ac079f00 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f21c4107d90 0x7f21c4116b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.925 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 --2- 192.168.123.105:0/958576787 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f21c4117050 0x7f21c41a1ea0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:01.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.924+0000 7f21aaffd700 1 -- 192.168.123.105:0/958576787 >> 192.168.123.105:0/958576787 conn(0x7f21c406dae0 msgr2=0x7f21c406e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:01.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.925+0000 7f21aaffd700 1 -- 192.168.123.105:0/958576787 shutdown_connections 2026-03-10T08:58:01.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:01.926+0000 7f21aaffd700 1 -- 192.168.123.105:0/958576787 wait complete. 2026-03-10T08:58:01.928 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:58:02.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 -- 192.168.123.105:0/2801172486 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058075a40 msgr2=0x7f4058077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 --2- 192.168.123.105:0/2801172486 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058075a40 0x7f4058077ed0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f405000d3f0 tx=0x7f405000d700 comp rx=0 tx=0).stop 2026-03-10T08:58:02.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 -- 192.168.123.105:0/2801172486 shutdown_connections 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 --2- 192.168.123.105:0/2801172486 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058075a40 0x7f4058077ed0 unknown 
:-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 --2- 192.168.123.105:0/2801172486 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 0x7f4058072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 -- 192.168.123.105:0/2801172486 >> 192.168.123.105:0/2801172486 conn(0x7f405806dae0 msgr2=0x7f405806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.019+0000 7f405ec56700 1 -- 192.168.123.105:0/2801172486 shutdown_connections 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 -- 192.168.123.105:0/2801172486 wait complete. 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 Processor -- start 2026-03-10T08:58:02.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 -- start start 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 0x7f4058082f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40580834b0 0x7f4058083930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f405812e710 con 0x7f40580834b0 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.020+0000 7f405ec56700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f405812e880 con 0x7f4058072b50 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f405c9f2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 0x7f4058082f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f405c9f2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 0x7f4058082f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48550/0 (socket says 192.168.123.105:48550) 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f405c9f2700 1 -- 192.168.123.105:0/3309436020 learned_addr learned my addr 192.168.123.105:0/3309436020 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f4057fff700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40580834b0 0x7f4058083930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f4057fff700 1 -- 192.168.123.105:0/3309436020 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 msgr2=0x7f4058082f70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.021 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f4057fff700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 0x7f4058082f70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f4057fff700 1 -- 192.168.123.105:0/3309436020 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4050007ed0 con 0x7f40580834b0 2026-03-10T08:58:02.022 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.021+0000 7f4057fff700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40580834b0 0x7f4058083930 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f405000d3f0 tx=0x7f4050003c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:02.022 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.022+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f405001c070 con 0x7f40580834b0 2026-03-10T08:58:02.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.022+0000 7f405ec56700 1 -- 192.168.123.105:0/3309436020 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f405812eb00 con 0x7f40580834b0 2026-03-10T08:58:02.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.022+0000 7f405ec56700 1 -- 192.168.123.105:0/3309436020 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f405812eff0 con 0x7f40580834b0 2026-03-10T08:58:02.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.023+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7f405000de50 con 0x7f40580834b0 2026-03-10T08:58:02.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.023+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4050017b40 con 0x7f40580834b0 2026-03-10T08:58:02.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.023+0000 7f405ec56700 1 -- 192.168.123.105:0/3309436020 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4044005320 con 0x7f40580834b0 2026-03-10T08:58:02.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.026+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f4050017420 con 0x7f40580834b0 2026-03-10T08:58:02.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.026+0000 7f4055ffb700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4040077a30 0x7f4040079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:02.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.032+0000 7f405c9f2700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4040077a30 0x7f4040079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:02.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.033+0000 7f405c9f2700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4040077a30 0x7f4040079ef0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4048005950 tx=0x7f404800b500 comp rx=0 tx=0).ready entity=mgr.34104 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:02.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.033+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7f4050013070 con 0x7f40580834b0 2026-03-10T08:58:02.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.036+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4050064290 con 0x7f40580834b0 2026-03-10T08:58:02.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:01 vm08.local ceph-mon[101330]: from='client.44127 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:02.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:58:02.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:01 vm08.local ceph-mon[101330]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:02.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.188+0000 7f405ec56700 1 -- 192.168.123.105:0/3309436020 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4044000bf0 con 0x7f4040077a30 2026-03-10T08:58:02.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.192+0000 7f4055ffb700 1 -- 192.168.123.105:0/3309436020 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f4044000bf0 con 0x7f4040077a30 
2026-03-10T08:58:02.192 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:58:02.193 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.196+0000 7f403f7fe700 1 -- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4040077a30 msgr2=0x7f4040079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.196+0000 7f403f7fe700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4040077a30 0x7f4040079ef0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4048005950 tx=0x7f404800b500 comp rx=0 tx=0).stop 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.196+0000 7f403f7fe700 1 -- 
192.168.123.105:0/3309436020 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40580834b0 msgr2=0x7f4058083930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40580834b0 0x7f4058083930 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f405000d3f0 tx=0x7f4050003c30 comp rx=0 tx=0).stop 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 -- 192.168.123.105:0/3309436020 shutdown_connections 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4040077a30 0x7f4040079ef0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058072b50 0x7f4058082f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 --2- 192.168.123.105:0/3309436020 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40580834b0 0x7f4058083930 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 -- 192.168.123.105:0/3309436020 >> 192.168.123.105:0/3309436020 conn(0x7f405806dae0 msgr2=0x7f405806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:02.197 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 -- 192.168.123.105:0/3309436020 shutdown_connections 2026-03-10T08:58:02.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.197+0000 7f403f7fe700 1 -- 192.168.123.105:0/3309436020 wait complete. 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 -- 192.168.123.105:0/217711712 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef4075a40 msgr2=0x7f2ef4077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/217711712 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef4075a40 0x7f2ef4077ed0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f2eec009230 tx=0x7f2eec009260 comp rx=0 tx=0).stop 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 -- 192.168.123.105:0/217711712 shutdown_connections 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/217711712 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef4075a40 0x7f2ef4077ed0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/217711712 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ef4072b50 0x7f2ef4072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 -- 192.168.123.105:0/217711712 >> 192.168.123.105:0/217711712 conn(0x7f2ef406dae0 msgr2=0x7f2ef406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.300+0000 7f2ef9aed700 1 -- 192.168.123.105:0/217711712 shutdown_connections 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 -- 192.168.123.105:0/217711712 wait complete. 2026-03-10T08:58:02.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 Processor -- start 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 -- start start 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ef4072b50 0x7f2ef40830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 0x7f2ef412e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ef4083af0 con 0x7f2ef4072b50 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef9aed700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ef4083c60 con 0x7f2ef40835e0 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 0x7f2ef412e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:02.302 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 0x7f2ef412e3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48576/0 (socket says 192.168.123.105:48576) 2026-03-10T08:58:02.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.301+0000 7f2ef2ffd700 1 -- 192.168.123.105:0/1006145683 learned_addr learned my addr 192.168.123.105:0/1006145683 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:02.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.303+0000 7f2ef37fe700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ef4072b50 0x7f2ef40830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:02.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.303+0000 7f2ef2ffd700 1 -- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ef4072b50 msgr2=0x7f2ef40830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.303+0000 7f2ef2ffd700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ef4072b50 0x7f2ef40830a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.303+0000 7f2ef2ffd700 1 -- 192.168.123.105:0/1006145683 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2eec008ee0 con 0x7f2ef40835e0 2026-03-10T08:58:02.303 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.303+0000 7f2ef2ffd700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 0x7f2ef412e3f0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f2eec003fa0 tx=0x7f2eec008e70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:02.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.303+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2eec01d070 con 0x7f2ef40835e0 2026-03-10T08:58:02.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.304+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ef412e930 con 0x7f2ef40835e0 2026-03-10T08:58:02.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.304+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ef412ee20 con 0x7f2ef40835e0 2026-03-10T08:58:02.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.304+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2eec003fd0 con 0x7f2ef40835e0 2026-03-10T08:58:02.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.304+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2eec00e650 con 0x7f2ef40835e0 2026-03-10T08:58:02.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.305+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f2ef404ea90 con 0x7f2ef40835e0 2026-03-10T08:58:02.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.307+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 34) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f2eec004250 con 0x7f2ef40835e0 2026-03-10T08:58:02.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.307+0000 7f2ef0ff9700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2edc077a40 0x7f2edc079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:02.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.307+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6628+0+0 (secure 0 0 0) 0x7f2eec012070 con 0x7f2ef40835e0 2026-03-10T08:58:02.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.309+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2eec063cf0 con 0x7f2ef40835e0 2026-03-10T08:58:02.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.309+0000 7f2ef37fe700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2edc077a40 0x7f2edc079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:02.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.328+0000 7f2ef37fe700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2edc077a40 0x7f2edc079f00 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f2ee400b3c0 tx=0x7f2ee400d040 comp rx=0 tx=0).ready entity=mgr.34104 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:02.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.503+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f2ef412f100 con 0x7f2ef40835e0 2026-03-10T08:58:02.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.504+0000 7f2ef0ff9700 1 -- 192.168.123.105:0/1006145683 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1288 (secure 0 0 0) 0x7f2eec026520 con 0x7f2ef40835e0 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 1869/19032 objects degraded (9.820%), 14 pgs degraded 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1869/19032 objects degraded (9.820%), 14 pgs degraded 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.f is active+recovering+undersized+degraded+remapped, acting [5,3] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 
2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.12 is active+recovery_wait+undersized+degraded+remapped, acting [1,3] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T08:58:02.505 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-10T08:58:02.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2edc077a40 msgr2=0x7f2edc079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2edc077a40 0x7f2edc079f00 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f2ee400b3c0 tx=0x7f2ee400d040 comp rx=0 tx=0).stop 2026-03-10T08:58:02.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 msgr2=0x7f2ef412e3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:02.507 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 0x7f2ef412e3f0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f2eec003fa0 tx=0x7f2eec008e70 comp rx=0 tx=0).stop 2026-03-10T08:58:02.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 shutdown_connections 2026-03-10T08:58:02.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2edc077a40 0x7f2edc079f00 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ef4072b50 0x7f2ef40830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 --2- 192.168.123.105:0/1006145683 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ef40835e0 0x7f2ef412e3f0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:02.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 >> 192.168.123.105:0/1006145683 conn(0x7f2ef406dae0 msgr2=0x7f2ef406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:02.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 -- 192.168.123.105:0/1006145683 shutdown_connections 2026-03-10T08:58:02.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:02.507+0000 7f2ef9aed700 1 -- 
192.168.123.105:0/1006145683 wait complete. 2026-03-10T08:58:02.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:02 vm05.local ceph-mon[111630]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:02.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:02 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/991292951' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:58:02.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:02 vm05.local ceph-mon[111630]: pgmap v47: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 259 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 951 KiB/s wr, 354 op/s; 1869/19032 objects degraded (9.820%); 0 B/s, 27 keys/s, 7 objects/s recovering 2026-03-10T08:58:02.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:02 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/958576787' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:58:02.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:02 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1006145683' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:58:03.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:02 vm08.local ceph-mon[101330]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:03.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:02 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/991292951' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:58:03.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:02 vm08.local ceph-mon[101330]: pgmap v47: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 259 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 951 KiB/s wr, 354 op/s; 1869/19032 objects degraded (9.820%); 0 B/s, 27 keys/s, 7 objects/s recovering 2026-03-10T08:58:03.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:02 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/958576787' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:58:03.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:02 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1006145683' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:58:03.879 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:03 vm05.local ceph-mon[111630]: from='client.34184 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:04.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:03 vm08.local ceph-mon[101330]: from='client.34184 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:05.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:04 vm05.local ceph-mon[111630]: pgmap v48: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 256 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 1.4 MiB/s wr, 462 op/s; 1868/16038 objects degraded (11.647%); 0 B/s, 27 keys/s, 11 objects/s recovering 2026-03-10T08:58:05.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:04 vm08.local ceph-mon[101330]: pgmap v48: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 256 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 1.4 MiB/s wr, 462 op/s; 1868/16038 objects degraded (11.647%); 0 B/s, 27 keys/s, 11 objects/s recovering 2026-03-10T08:58:06.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:06 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1869/19032 objects degraded (9.820%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T08:58:06.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:06 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1869/19032 objects degraded (9.820%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T08:58:07.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:07 vm08.local ceph-mon[101330]: pgmap v49: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 256 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 965 KiB/s wr, 255 op/s; 1868/16038 objects degraded (11.647%); 0 B/s, 1 keys/s, 7 objects/s recovering 2026-03-10T08:58:07.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:07 vm05.local ceph-mon[111630]: pgmap v49: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 256 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 965 KiB/s wr, 255 op/s; 1868/16038 objects degraded (11.647%); 0 B/s, 1 keys/s, 7 objects/s recovering 2026-03-10T08:58:09.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:09 vm08.local ceph-mon[101330]: pgmap v50: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 254 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.4 MiB/s wr, 382 op/s; 1828/12432 objects degraded (14.704%); 0 B/s, 1 keys/s, 10 objects/s 
recovering 2026-03-10T08:58:09.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:09 vm05.local ceph-mon[111630]: pgmap v50: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 254 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 36 KiB/s rd, 1.4 MiB/s wr, 382 op/s; 1828/12432 objects degraded (14.704%); 0 B/s, 1 keys/s, 10 objects/s recovering 2026-03-10T08:58:10.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:10 vm05.local ceph-mon[111630]: osdmap e50: 6 total, 6 up, 6 in 2026-03-10T08:58:10.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:10 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1828/12432 objects degraded (14.704%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T08:58:10.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:10 vm08.local ceph-mon[101330]: osdmap e50: 6 total, 6 up, 6 in 2026-03-10T08:58:10.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:10 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1828/12432 objects degraded (14.704%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T08:58:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:11 vm05.local ceph-mon[111630]: pgmap v52: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 254 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.0 MiB/s wr, 282 op/s; 1828/12432 objects degraded (14.704%); 0 B/s, 8 objects/s recovering 2026-03-10T08:58:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:11 vm05.local ceph-mon[111630]: osdmap e51: 6 total, 6 up, 6 in 2026-03-10T08:58:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:11 vm08.local ceph-mon[101330]: pgmap v52: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 254 MiB data, 3.0 GiB 
used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.0 MiB/s wr, 282 op/s; 1828/12432 objects degraded (14.704%); 0 B/s, 8 objects/s recovering 2026-03-10T08:58:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:11 vm08.local ceph-mon[101330]: osdmap e51: 6 total, 6 up, 6 in 2026-03-10T08:58:12.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:12 vm05.local ceph-mon[111630]: pgmap v54: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 254 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 672 KiB/s wr, 191 op/s; 1828/12432 objects degraded (14.704%); 0 B/s, 4 objects/s recovering 2026-03-10T08:58:12.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:12 vm08.local ceph-mon[101330]: pgmap v54: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 254 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 672 KiB/s wr, 191 op/s; 1828/12432 objects degraded (14.704%); 0 B/s, 4 objects/s recovering 2026-03-10T08:58:15.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:15 vm08.local ceph-mon[101330]: pgmap v55: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 245 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.3 MiB/s wr, 388 op/s; 1736/9147 objects degraded (18.979%); 0 B/s, 10 objects/s recovering 2026-03-10T08:58:15.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:58:15.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:15 vm05.local ceph-mon[111630]: pgmap v55: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 
active+clean; 245 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.3 MiB/s wr, 388 op/s; 1736/9147 objects degraded (18.979%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:15.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:16.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:16 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:16.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:16 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T08:58:16.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:16 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1736/9147 objects degraded (18.979%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:16.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:16 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:17.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:16 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T08:58:17.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:16 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1736/9147 objects degraded (18.979%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:17.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:17 vm08.local ceph-mon[101330]: pgmap v56: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 241 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 708 KiB/s wr, 196 op/s; 1736/9144 objects degraded (18.985%); 0 B/s, 5 objects/s recovering
2026-03-10T08:58:18.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:17 vm05.local ceph-mon[111630]: pgmap v56: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 241 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 708 KiB/s wr, 196 op/s; 1736/9144 objects degraded (18.985%); 0 B/s, 5 objects/s recovering
2026-03-10T08:58:19.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:18 vm08.local ceph-mon[101330]: pgmap v57: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 235 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.1 MiB/s wr, 335 op/s; 1736/6039 objects degraded (28.746%); 0 B/s, 9 objects/s recovering
2026-03-10T08:58:19.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:18 vm05.local ceph-mon[111630]: pgmap v57: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 235 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.1 MiB/s wr, 335 op/s; 1736/6039 objects degraded (28.746%); 0 B/s, 9 objects/s recovering
2026-03-10T08:58:19.267 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0...
2026-03-10T08:58:19.267 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-10T08:58:19.715 DEBUG:teuthology.parallel:result is None
2026-03-10T08:58:20.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:20 vm08.local ceph-mon[101330]: mgrmap e35: vm05.rxwgjc(active, since 93s), standbys: vm08.rpongu
2026-03-10T08:58:20.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:19 vm05.local ceph-mon[111630]: mgrmap e35: vm05.rxwgjc(active, since 93s), standbys: vm08.rpongu
2026-03-10T08:58:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:20 vm08.local ceph-mon[101330]: pgmap v58: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 235 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1003 KiB/s wr, 294 op/s; 1736/6039 objects degraded (28.746%); 0 B/s, 8 objects/s recovering
2026-03-10T08:58:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:20 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1736/6039 objects degraded (28.746%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:20 vm05.local ceph-mon[111630]: pgmap v58: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 235 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1003 KiB/s wr, 294 op/s; 1736/6039 objects degraded (28.746%); 0 B/s, 8 objects/s recovering
2026-03-10T08:58:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:20 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1736/6039 objects degraded (28.746%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:22.700 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1...
2026-03-10T08:58:22.700 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1
2026-03-10T08:58:23.137 DEBUG:teuthology.parallel:result is None
2026-03-10T08:58:23.137 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T08:58:23.173 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T08:58:23.173 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1
2026-03-10T08:58:23.215 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1
2026-03-10T08:58:23.215 DEBUG:teuthology.parallel:result is None
2026-03-10T08:58:23.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:23 vm05.local ceph-mon[111630]: pgmap v59: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 235 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 856 KiB/s wr, 251 op/s; 1736/6039 objects degraded (28.746%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:23.525 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:23 vm08.local ceph-mon[101330]: pgmap v59: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 235 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 856 KiB/s wr, 251 op/s; 1736/6039 objects degraded (28.746%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:25.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:25 vm05.local ceph-mon[111630]: pgmap v60: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 223 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.3 MiB/s wr, 386 op/s; 1701/2244 objects degraded (75.802%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:25.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:25 vm05.local ceph-mon[111630]: osdmap e52: 6 total, 6 up, 6 in
2026-03-10T08:58:25.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:25 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1701/2244 objects degraded (75.802%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:25.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:25 vm08.local ceph-mon[101330]: pgmap v60: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 223 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.3 MiB/s wr, 386 op/s; 1701/2244 objects degraded (75.802%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:25 vm08.local ceph-mon[101330]: osdmap e52: 6 total, 6 up, 6 in
2026-03-10T08:58:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:25 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1701/2244 objects degraded (75.802%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:26.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:26 vm05.local ceph-mon[111630]: osdmap e53: 6 total, 6 up, 6 in
2026-03-10T08:58:26.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:26 vm08.local ceph-mon[101330]: osdmap e53: 6 total, 6 up, 6 in
2026-03-10T08:58:27.572 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:27 vm05.local ceph-mon[111630]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 219 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 745 KiB/s wr, 211 op/s; 1701/2235 objects degraded (76.107%); 0 B/s, 5 objects/s recovering
2026-03-10T08:58:27.665 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:27 vm08.local ceph-mon[101330]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 219 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 745 KiB/s wr, 211 op/s; 1701/2235 objects degraded (76.107%); 0 B/s, 5 objects/s recovering
2026-03-10T08:58:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:28 vm05.local ceph-mon[111630]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 214 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 944 KiB/s wr, 305 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:28.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:28 vm08.local ceph-mon[101330]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 214 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 944 KiB/s wr, 305 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:29.367 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:58:29.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: pgmap v65: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 214 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 944 KiB/s wr, 305 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:30 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1597/231 objects degraded (691.342%), 12 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: pgmap v65: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 214 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 944 KiB/s wr, 305 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 10 objects/s recovering
2026-03-10T08:58:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:30 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1597/231 objects degraded (691.342%), 12 pgs degraded (PG_DEGRADED)
2026-03-10T08:58:32.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:32.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:32.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 -- 192.168.123.105:0/551080756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c101470 msgr2=0x7f8c7c101890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:58:32.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 --2- 192.168.123.105:0/551080756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c101470 0x7f8c7c101890 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f8c64009b50 tx=0x7f8c64009e60 comp rx=0 tx=0).stop
2026-03-10T08:58:32.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 -- 192.168.123.105:0/551080756 shutdown_connections
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 --2- 192.168.123.105:0/551080756 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c0ff460 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 --2- 192.168.123.105:0/551080756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c101470 0x7f8c7c101890 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 -- 192.168.123.105:0/551080756 >> 192.168.123.105:0/551080756 conn(0x7f8c7c0fab60 msgr2=0x7f8c7c0fcfc0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 -- 192.168.123.105:0/551080756 shutdown_connections
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.594+0000 7f8c81359700 1 -- 192.168.123.105:0/551080756 wait complete.
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.595+0000 7f8c81359700 1 Processor -- start
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.595+0000 7f8c81359700 1 -- start start
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.595+0000 7f8c81359700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c071d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.595+0000 7f8c81359700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c7c101470 0x7f8c7c072260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.595+0000 7f8c81359700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c7c0727a0 con 0x7f8c7c0ff000
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.595+0000 7f8c81359700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c7c072910 con 0x7f8c7c101470
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c071d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c071d20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46186/0 (socket says 192.168.123.105:46186)
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 -- 192.168.123.105:0/1319388145 learned_addr learned my addr 192.168.123.105:0/1319388145 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 -- 192.168.123.105:0/1319388145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c7c101470 msgr2=0x7f8c7c072260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c7c101470 0x7f8c7c072260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 -- 192.168.123.105:0/1319388145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c640097e0 con 0x7f8c7c0ff000
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c7affd700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c071d20 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f8c64004ce0 tx=0x7f8c64004f60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:58:32.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.596+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c6401d070 con 0x7f8c7c0ff000
2026-03-10T08:58:32.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.597+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c7c1a2510 con 0x7f8c7c0ff000
2026-03-10T08:58:32.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.597+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c7c1a2a00 con 0x7f8c7c0ff000
2026-03-10T08:58:32.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.598+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8c6400bb70 con 0x7f8c7c0ff000
2026-03-10T08:58:32.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.598+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c6400f700 con 0x7f8c7c0ff000
2026-03-10T08:58:32.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.598+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8c6400f860 con 0x7f8c7c0ff000
2026-03-10T08:58:32.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.599+0000 7f8c73fff700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8c68077910 0x7f8c68079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:58:32.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.599+0000 7f8c7a7fc700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8c68077910 0x7f8c68079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:58:32.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.599+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7f8c6409b170 con 0x7f8c7c0ff000
2026-03-10T08:58:32.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.599+0000 7f8c7a7fc700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8c68077910 0x7f8c68079dd0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f8c6c005950 tx=0x7f8c6c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:58:32.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.599+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8c5c005320 con 0x7f8c7c0ff000
2026-03-10T08:58:32.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.602+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8c64064a10 con 0x7f8c7c0ff000
2026-03-10T08:58:32.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.741+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8c5c000bf0 con 0x7f8c68077910
2026-03-10T08:58:32.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.743+0000 7f8c73fff700 1 -- 192.168.123.105:0/1319388145 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8c5c000bf0 con 0x7f8c68077910
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.745+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8c68077910 msgr2=0x7f8c68079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.745+0000 7f8c81359700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8c68077910 0x7f8c68079dd0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f8c6c005950 tx=0x7f8c6c0058e0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.745+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 msgr2=0x7f8c7c071d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.745+0000 7f8c81359700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c071d20 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f8c64004ce0 tx=0x7f8c64004f60 comp rx=0 tx=0).stop
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 shutdown_connections
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8c68077910 0x7f8c68079dd0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8c7c0ff000 0x7f8c7c071d20 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 --2- 192.168.123.105:0/1319388145 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8c7c101470 0x7f8c7c072260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 >> 192.168.123.105:0/1319388145 conn(0x7f8c7c0fab60 msgr2=0x7f8c7c108700 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:58:32.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 shutdown_connections
2026-03-10T08:58:32.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.746+0000 7f8c81359700 1 -- 192.168.123.105:0/1319388145 wait complete.
2026-03-10T08:58:32.757 INFO:teuthology.orchestra.run.vm05.stdout:true
2026-03-10T08:58:32.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.822+0000 7faf111c2700 1 -- 192.168.123.105:0/647069542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c101e90 msgr2=0x7faf0c1022f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.822+0000 7faf111c2700 1 --2- 192.168.123.105:0/647069542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c101e90 0x7faf0c1022f0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7faefc009b00 tx=0x7faefc009e10 comp rx=0 tx=0).stop
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.822+0000 7faf111c2700 1 -- 192.168.123.105:0/647069542 shutdown_connections
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.822+0000 7faf111c2700 1 --2- 192.168.123.105:0/647069542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c101e90 0x7faf0c1022f0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.822+0000 7faf111c2700 1 --2- 192.168.123.105:0/647069542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c100c90 0x7faf0c1010b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.822+0000 7faf111c2700 1 -- 192.168.123.105:0/647069542 >> 192.168.123.105:0/647069542 conn(0x7faf0c0fc210 msgr2=0x7faf0c0fe670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.823+0000 7faf111c2700 1 -- 192.168.123.105:0/647069542 shutdown_connections
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.823+0000 7faf111c2700 1 -- 192.168.123.105:0/647069542 wait complete.
2026-03-10T08:58:32.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.823+0000 7faf111c2700 1 Processor -- start
2026-03-10T08:58:32.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.823+0000 7faf111c2700 1 -- start start
2026-03-10T08:58:32.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf111c2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c100c90 0x7faf0c1943b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:58:32.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf111c2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 0x7faf0c1948f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:58:32.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf111c2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf0c194e80 con 0x7faf0c100c90
2026-03-10T08:58:32.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf111c2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf0c194fc0 con 0x7faf0c101e90
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf0b7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 0x7faf0c1948f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf0b7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 0x7faf0c1948f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57682/0 (socket says 192.168.123.105:57682)
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf0b7fe700 1 -- 192.168.123.105:0/21254224 learned_addr learned my addr 192.168.123.105:0/21254224 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.824+0000 7faf0b7fe700 1 -- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c100c90 msgr2=0x7faf0c1943b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf0bfff700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c100c90 0x7faf0c1943b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf0b7fe700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c100c90 0x7faf0c1943b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf0b7fe700 1 -- 192.168.123.105:0/21254224 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faefc0097e0 con 0x7faf0c101e90
2026-03-10T08:58:32.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf0bfff700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c100c90 0x7faf0c1943b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T08:58:32.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf0b7fe700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 0x7faf0c1948f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7faefc009fd0 tx=0x7faefc004ab0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:58:32.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faefc01d070 con 0x7faf0c101e90
2026-03-10T08:58:32.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faefc00bd10 con 0x7faf0c101e90
2026-03-10T08:58:32.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faefc00f850 con 0x7faf0c101e90
2026-03-10T08:58:32.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faf0c199a20 con 0x7faf0c101e90
2026-03-10T08:58:32.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.825+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faf0c199ee0 con 0x7faf0c101e90
2026-03-10T08:58:32.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.826+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faf0c04ea90 con 0x7faf0c101e90
2026-03-10T08:58:32.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.828+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faefc022b70 con 0x7faf0c101e90
2026-03-10T08:58:32.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.828+0000 7faf097fa700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7faef407bcd0 0x7faef407e190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:58:32.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.828+0000 7faf0bfff700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7faef407bcd0 0x7faef407e190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:58:32.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.828+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7faefc09bd30 con 0x7faf0c101e90
2026-03-10T08:58:32.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.829+0000 7faf0bfff700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7faef407bcd0 0x7faef407e190 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7faf00005fd0 tx=0x7faf00005e40 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:58:32.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.830+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faefc064430 con 0x7faf0c101e90 2026-03-10T08:58:32.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.978+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faf0c19a280 con 0x7faef407bcd0 2026-03-10T08:58:32.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.979+0000 7faf097fa700 1 -- 192.168.123.105:0/21254224 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7faf0c19a280 con 0x7faef407bcd0 2026-03-10T08:58:32.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7faef407bcd0 msgr2=0x7faef407e190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:32.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7faef407bcd0 0x7faef407e190 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7faf00005fd0 tx=0x7faf00005e40 comp rx=0 tx=0).stop 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 msgr2=0x7faf0c1948f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 0x7faf0c1948f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7faefc009fd0 
tx=0x7faefc004ab0 comp rx=0 tx=0).stop 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 shutdown_connections 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7faef407bcd0 0x7faef407e190 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf0c100c90 0x7faf0c1943b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.981+0000 7faf111c2700 1 --2- 192.168.123.105:0/21254224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0c101e90 0x7faf0c1948f0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.982+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 >> 192.168.123.105:0/21254224 conn(0x7faf0c0fc210 msgr2=0x7faf0c1050c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.982+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 shutdown_connections 2026-03-10T08:58:32.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:32.982+0000 7faf111c2700 1 -- 192.168.123.105:0/21254224 wait complete. 
2026-03-10T08:58:33.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.055+0000 7f405dd4b700 1 -- 192.168.123.105:0/2732349849 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058104320 msgr2=0x7f4058104780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.055+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2732349849 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058104320 0x7f4058104780 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f404c009b00 tx=0x7f404c009e10 comp rx=0 tx=0).stop 2026-03-10T08:58:33.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.056+0000 7f405dd4b700 1 -- 192.168.123.105:0/2732349849 shutdown_connections 2026-03-10T08:58:33.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.056+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2732349849 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058104320 0x7f4058104780 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.056+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2732349849 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4058103120 0x7f4058103540 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.056+0000 7f405dd4b700 1 -- 192.168.123.105:0/2732349849 >> 192.168.123.105:0/2732349849 conn(0x7f40580fe6c0 msgr2=0x7f4058100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.056+0000 7f405dd4b700 1 -- 192.168.123.105:0/2732349849 shutdown_connections 2026-03-10T08:58:33.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.056+0000 7f405dd4b700 1 -- 192.168.123.105:0/2732349849 
wait complete. 2026-03-10T08:58:33.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.057+0000 7f405dd4b700 1 Processor -- start 2026-03-10T08:58:33.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.057+0000 7f405dd4b700 1 -- start start 2026-03-10T08:58:33.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f405dd4b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 0x7f4058198c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f405dd4b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581991b0 0x7f405819e210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f405dd4b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4058199660 con 0x7f4058103120 2026-03-10T08:58:33.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f405dd4b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40581997d0 con 0x7f40581991b0 2026-03-10T08:58:33.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f40577fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 0x7f4058198c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f40577fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 0x7f4058198c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:46234/0 (socket says 192.168.123.105:46234) 2026-03-10T08:58:33.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.058+0000 7f40577fe700 1 -- 192.168.123.105:0/2058046999 learned_addr learned my addr 192.168.123.105:0/2058046999 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:33.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f40577fe700 1 -- 192.168.123.105:0/2058046999 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581991b0 msgr2=0x7f405819e210 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T08:58:33.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f40577fe700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581991b0 0x7f405819e210 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f40577fe700 1 -- 192.168.123.105:0/2058046999 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f404c0097e0 con 0x7f4058103120 2026-03-10T08:58:33.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f40577fe700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 0x7f4058198c70 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f404800eac0 tx=0x7f404800ee80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f4054ff9700 1 -- 192.168.123.105:0/2058046999 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4048009960 con 0x7f4058103120 2026-03-10T08:58:33.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f4054ff9700 1 -- 
192.168.123.105:0/2058046999 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4048004510 con 0x7f4058103120 2026-03-10T08:58:33.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f405819e7b0 con 0x7f4058103120 2026-03-10T08:58:33.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.059+0000 7f4054ff9700 1 -- 192.168.123.105:0/2058046999 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4048010450 con 0x7f4058103120 2026-03-10T08:58:33.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.060+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f405819ed00 con 0x7f4058103120 2026-03-10T08:58:33.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.062+0000 7f4054ff9700 1 -- 192.168.123.105:0/2058046999 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f40480106c0 con 0x7f4058103120 2026-03-10T08:58:33.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.063+0000 7f4054ff9700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f40400778c0 0x7f4040079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.064 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.063+0000 7f4056ffd700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f40400778c0 0x7f4040079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.064 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.064+0000 7f4054ff9700 1 -- 192.168.123.105:0/2058046999 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7f4048014070 con 0x7f4058103120 2026-03-10T08:58:33.064 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.064+0000 7f4056ffd700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f40400778c0 0x7f4040079d80 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f404c000c00 tx=0x7f404c005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.064 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.064+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4058066e80 con 0x7f4058103120 2026-03-10T08:58:33.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.068+0000 7f4054ff9700 1 -- 192.168.123.105:0/2058046999 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f40480637e0 con 0x7f4058103120 2026-03-10T08:58:33.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.195+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f405819f0a0 con 0x7f40400778c0 2026-03-10T08:58:33.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.200+0000 7f4054ff9700 1 -- 192.168.123.105:0/2058046999 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f405819f0a0 con 0x7f40400778c0 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST 
PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (6m) 67s ago 7m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (7m) 67s ago 7m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (7m) 79s ago 7m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (83s) 67s ago 7m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (81s) 79s ago 7m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (6m) 67s ago 7m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (5m) 67s ago 5m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (5m) 67s ago 5m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (5m) 79s ago 5m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (5m) 79s ago 5m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (2m) 67s ago 8m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (2m) 79s ago 7m 491M - 
19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (114s) 67s ago 8m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (99s) 79s ago 7m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T08:58:33.201 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (7m) 67s ago 7m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (7m) 79s ago 7m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (70s) 67s ago 6m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (6m) 67s ago 6m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (6m) 67s ago 6m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (6m) 79s ago 6m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (6m) 79s ago 6m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (6m) 79s ago 6m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:58:33.202 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 67s ago 7m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T08:58:33.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.203+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f40400778c0 msgr2=0x7f4040079d80 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.203+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f40400778c0 0x7f4040079d80 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f404c000c00 tx=0x7f404c005dc0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.203+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 msgr2=0x7f4058198c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.203+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 0x7f4058198c70 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f404800eac0 tx=0x7f404800ee80 comp rx=0 tx=0).stop 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.204+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 shutdown_connections 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.204+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f40400778c0 0x7f4040079d80 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.204+0000 7f405dd4b700 1 --2- 192.168.123.105:0/2058046999 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4058103120 0x7f4058198c70 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.204+0000 7f405dd4b700 1 --2- 
192.168.123.105:0/2058046999 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f40581991b0 0x7f405819e210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.204+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 >> 192.168.123.105:0/2058046999 conn(0x7f40580fe6c0 msgr2=0x7f4058107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.204+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 shutdown_connections 2026-03-10T08:58:33.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.205+0000 7f405dd4b700 1 -- 192.168.123.105:0/2058046999 wait complete. 2026-03-10T08:58:33.284 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:32 vm05.local ceph-mon[111630]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 214 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 9.7 KiB/s rd, 199 KiB/s wr, 94 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 5 objects/s recovering 2026-03-10T08:58:33.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.281+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/2278794875 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b4100160 msgr2=0x7fb0b41005e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.281+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/2278794875 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b4100160 0x7fb0b41005e0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fb0a4009b50 tx=0x7fb0a4009e60 comp rx=0 tx=0).stop 2026-03-10T08:58:33.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.283+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/2278794875 shutdown_connections 
2026-03-10T08:58:33.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.283+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/2278794875 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b4100160 0x7fb0b41005e0 secure :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fb0a4009b50 tx=0x7fb0a4009e60 comp rx=0 tx=0).stop 2026-03-10T08:58:33.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.283+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/2278794875 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b40ff800 0x7fb0b40ffc20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.283+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/2278794875 >> 192.168.123.105:0/2278794875 conn(0x7fb0b40fb360 msgr2=0x7fb0b40fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/2278794875 shutdown_connections 2026-03-10T08:58:33.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/2278794875 wait complete. 
2026-03-10T08:58:33.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 Processor -- start 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 -- start start 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b40ff800 0x7fb0b4194820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 0x7fb0b4197dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0b4195270 con 0x7fb0b40ff800 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.285+0000 7fb0b8ad5700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0b41953e0 con 0x7fb0b4194d60 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b40ff800 0x7fb0b4194820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 0x7fb0b4197dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 0x7fb0b4197dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57704/0 (socket says 192.168.123.105:57704) 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 -- 192.168.123.105:0/3165282459 learned_addr learned my addr 192.168.123.105:0/3165282459 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 -- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b40ff800 msgr2=0x7fb0b4194820 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b40ff800 0x7fb0b4194820 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 -- 192.168.123.105:0/3165282459 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb09c009710 con 0x7fb0b4194d60 2026-03-10T08:58:33.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b259c700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b40ff800 0x7fb0b4194820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T08:58:33.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.286+0000 7fb0b1d9b700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 0x7fb0b4197dc0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb0a4000c00 tx=0x7fb0a4005250 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.287+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb0a401d070 con 0x7fb0b4194d60 2026-03-10T08:58:33.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.287+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb0a40097e0 con 0x7fb0b4194d60 2026-03-10T08:58:33.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.287+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb0b4198660 con 0x7fb0b4194d60 2026-03-10T08:58:33.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.287+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb0a4004e80 con 0x7fb0b4194d60 2026-03-10T08:58:33.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.287+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb0a4021620 con 0x7fb0b4194d60 2026-03-10T08:58:33.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.288+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb0a400bc50 con 
0x7fb0b4194d60 2026-03-10T08:58:33.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.288+0000 7fb0ab7fe700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb0a00779e0 0x7fb0a0079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.289+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7fb0a409ad00 con 0x7fb0b4194d60 2026-03-10T08:58:33.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.290+0000 7fb0b259c700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb0a00779e0 0x7fb0a0079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.290+0000 7fb0b259c700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb0a00779e0 0x7fb0a0079ea0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fb09c00f790 tx=0x7fb09c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.292+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb094005320 con 0x7fb0b4194d60 2026-03-10T08:58:33.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.295+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7fb0a40634b0 con 0x7fb0b4194d60 2026-03-10T08:58:33.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:32 vm08.local ceph-mon[101330]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 214 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 9.7 KiB/s rd, 199 KiB/s wr, 94 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 5 objects/s recovering 2026-03-10T08:58:33.471 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.470+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fb094006200 con 0x7fb0b4194d60 2026-03-10T08:58:33.471 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.471+0000 7fb0ab7fe700 1 -- 192.168.123.105:0/3165282459 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fb0a4026070 con 0x7fb0b4194d60 2026-03-10T08:58:33.471 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:58:33.471 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T08:58:33.472 
INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:58:33.472 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.474+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb0a00779e0 msgr2=0x7fb0a0079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.474+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb0a00779e0 0x7fb0a0079ea0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fb09c00f790 tx=0x7fb09c009450 comp rx=0 tx=0).stop 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.474+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 msgr2=0x7fb0b4197dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.474+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 0x7fb0b4197dc0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb0a4000c00 tx=0x7fb0a4005250 comp rx=0 tx=0).stop 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 shutdown_connections 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb0a00779e0 0x7fb0a0079ea0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb0b40ff800 0x7fb0b4194820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 --2- 192.168.123.105:0/3165282459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb0b4194d60 0x7fb0b4197dc0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 >> 192.168.123.105:0/3165282459 conn(0x7fb0b40fb360 msgr2=0x7fb0b4103a20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 shutdown_connections 2026-03-10T08:58:33.475 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.475+0000 7fb0b8ad5700 1 -- 192.168.123.105:0/3165282459 wait complete. 2026-03-10T08:58:33.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.551+0000 7f3b2896e700 1 -- 192.168.123.105:0/2584792905 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 msgr2=0x7f3b20103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.551+0000 7f3b2896e700 1 --2- 192.168.123.105:0/2584792905 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20103560 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f3b18009b50 tx=0x7f3b18009e60 comp rx=0 tx=0).stop 2026-03-10T08:58:33.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.552+0000 7f3b2896e700 1 -- 192.168.123.105:0/2584792905 shutdown_connections 2026-03-10T08:58:33.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.552+0000 7f3b2896e700 1 --2- 192.168.123.105:0/2584792905 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 0x7f3b201047a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.552+0000 7f3b2896e700 1 --2- 192.168.123.105:0/2584792905 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20103560 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.552+0000 7f3b2896e700 1 -- 192.168.123.105:0/2584792905 >> 192.168.123.105:0/2584792905 conn(0x7f3b200fe6c0 msgr2=0x7f3b20100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.552+0000 7f3b2896e700 1 -- 192.168.123.105:0/2584792905 shutdown_connections 
2026-03-10T08:58:33.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.552+0000 7f3b2896e700 1 -- 192.168.123.105:0/2584792905 wait complete. 2026-03-10T08:58:33.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.553+0000 7f3b2896e700 1 Processor -- start 2026-03-10T08:58:33.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.553+0000 7f3b2896e700 1 -- start start 2026-03-10T08:58:33.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.553+0000 7f3b2896e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20071d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.553+0000 7f3b2896e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 0x7f3b20072290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.553+0000 7f3b2896e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b200727d0 con 0x7f3b20103140 2026-03-10T08:58:33.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.553+0000 7f3b2896e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b20072940 con 0x7f3b20104340 2026-03-10T08:58:33.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.554+0000 7f3b25f09700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 0x7f3b20072290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.554+0000 7f3b25f09700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 
0x7f3b20072290 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57716/0 (socket says 192.168.123.105:57716) 2026-03-10T08:58:33.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.554+0000 7f3b25f09700 1 -- 192.168.123.105:0/3909889632 learned_addr learned my addr 192.168.123.105:0/3909889632 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:33.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.555+0000 7f3b25f09700 1 -- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 msgr2=0x7f3b20071d50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.555+0000 7f3b2670a700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20071d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.555+0000 7f3b25f09700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20071d50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.555+0000 7f3b25f09700 1 -- 192.168.123.105:0/3909889632 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b180097e0 con 0x7f3b20104340 2026-03-10T08:58:33.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.556+0000 7f3b2670a700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20071d50 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:58:33.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.556+0000 7f3b25f09700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 0x7f3b20072290 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3b1400d8d0 tx=0x7f3b1400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.556+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b14009940 con 0x7f3b20104340 2026-03-10T08:58:33.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.556+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3b14010460 con 0x7f3b20104340 2026-03-10T08:58:33.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.556+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b1400f640 con 0x7f3b20104340 2026-03-10T08:58:33.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.557+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b201a24f0 con 0x7f3b20104340 2026-03-10T08:58:33.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.557+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b201a29e0 con 0x7f3b20104340 2026-03-10T08:58:33.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.558+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b20066e80 con 0x7f3b20104340 2026-03-10T08:58:33.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.559+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3b14009af0 con 0x7f3b20104340 2026-03-10T08:58:33.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.559+0000 7f3b137fe700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b0c0778c0 0x7f3b0c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.559+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7f3b14099c00 con 0x7f3b20104340 2026-03-10T08:58:33.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.560+0000 7f3b2670a700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b0c0778c0 0x7f3b0c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.560+0000 7f3b2670a700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b0c0778c0 0x7f3b0c079d80 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f3b18005950 tx=0x7f3b180058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.561+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== 
mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3b14061c30 con 0x7f3b20104340 2026-03-10T08:58:33.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.708+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3b201a2c80 con 0x7f3b20104340 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.709+0000 7f3b137fe700 1 -- 192.168.123.105:0/3909889632 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f3b14016790 con 0x7f3b20104340 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:created 
2026-03-10T08:52:52.346264+0000 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T08:58:33.710 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:58:33.711 
INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:58:33.711 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:58:33.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.713+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b0c0778c0 msgr2=0x7f3b0c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.713+0000 7f3b2896e700 1 --2- 
192.168.123.105:0/3909889632 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b0c0778c0 0x7f3b0c079d80 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f3b18005950 tx=0x7f3b180058e0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.713+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 msgr2=0x7f3b20072290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.713+0000 7f3b2896e700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 0x7f3b20072290 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3b1400d8d0 tx=0x7f3b1400dc90 comp rx=0 tx=0).stop 2026-03-10T08:58:33.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.714+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 shutdown_connections 2026-03-10T08:58:33.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.714+0000 7f3b2896e700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b0c0778c0 0x7f3b0c079d80 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.714+0000 7f3b2896e700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b20103140 0x7f3b20071d50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.714+0000 7f3b2896e700 1 --2- 192.168.123.105:0/3909889632 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b20104340 0x7f3b20072290 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T08:58:33.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.714+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 >> 192.168.123.105:0/3909889632 conn(0x7f3b200fe6c0 msgr2=0x7f3b20107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.714+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 shutdown_connections 2026-03-10T08:58:33.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.715+0000 7f3b2896e700 1 -- 192.168.123.105:0/3909889632 wait complete. 2026-03-10T08:58:33.716 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:58:33.788 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.787+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/2290404547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 msgr2=0x7fc4d40ff640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4d9483700 1 -- 192.168.123.105:0/2290404547 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc4d000ba40 con 0x7fc4d40ff220 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/2290404547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d40ff640 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fc4d0009b00 tx=0x7fc4d0009e10 comp rx=0 tx=0).stop 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/2290404547 shutdown_connections 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/2290404547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc4d4102200 0x7fc4d4102660 unknown :-1 
s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/2290404547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d40ff640 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/2290404547 >> 192.168.123.105:0/2290404547 conn(0x7fc4d40fab60 msgr2=0x7fc4d40fcfc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/2290404547 shutdown_connections 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.788+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/2290404547 wait complete. 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4dc6e9700 1 Processor -- start 2026-03-10T08:58:33.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4dc6e9700 1 -- start start 2026-03-10T08:58:33.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4dc6e9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d41945a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4dc6e9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc4d4102200 0x7fc4d4194ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4dc6e9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7fc4d41950e0 con 0x7fc4d40ff220 2026-03-10T08:58:33.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4dc6e9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4d4195250 con 0x7fc4d4102200 2026-03-10T08:58:33.790 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4da485700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d41945a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4da485700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d41945a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46286/0 (socket says 192.168.123.105:46286) 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4da485700 1 -- 192.168.123.105:0/342034435 learned_addr learned my addr 192.168.123.105:0/342034435 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.789+0000 7fc4d9c84700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc4d4102200 0x7fc4d4194ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4da485700 1 -- 192.168.123.105:0/342034435 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc4d4102200 msgr2=0x7fc4d4194ae0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.791 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4da485700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc4d4102200 0x7fc4d4194ae0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4da485700 1 -- 192.168.123.105:0/342034435 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4d00097e0 con 0x7fc4d40ff220 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4da485700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d41945a0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fc4d0000c00 tx=0x7fc4d000be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc4d0003a40 con 0x7fc4d40ff220 2026-03-10T08:58:33.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4d406a8f0 con 0x7fc4d40ff220 2026-03-10T08:58:33.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4d406ae40 con 0x7fc4d40ff220 2026-03-10T08:58:33.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 
1139+0+0 (secure 0 0 0) 0x7fc4d0003ba0 con 0x7fc4d40ff220 2026-03-10T08:58:33.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.790+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc4d0018650 con 0x7fc4d40ff220 2026-03-10T08:58:33.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.792+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc4d00187b0 con 0x7fc4d40ff220 2026-03-10T08:58:33.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.793+0000 7fc4cb7fe700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc4c0077910 0x7fc4c0079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:33.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.793+0000 7fc4d9c84700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc4c0077910 0x7fc4c0079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:33.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.794+0000 7fc4d9c84700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc4c0077910 0x7fc4c0079dd0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fc4d41a6ac0 tx=0x7fc4c400b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:33.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.794+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7fc4d009cf50 
con 0x7fc4d40ff220 2026-03-10T08:58:33.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.794+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc4b8005320 con 0x7fc4d40ff220 2026-03-10T08:58:33.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.797+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc4d00655d0 con 0x7fc4d40ff220 2026-03-10T08:58:33.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.937+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc4b8000bf0 con 0x7fc4c0077910 2026-03-10T08:58:33.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.940+0000 7fc4cb7fe700 1 -- 192.168.123.105:0/342034435 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fc4b8000bf0 con 0x7fc4c0077910 2026-03-10T08:58:33.940 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:58:33.940 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:58:33.940 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:58:33.940 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:58:33.940 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T08:58:33.941 
INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:58:33.941 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:58:33.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc4c0077910 msgr2=0x7fc4c0079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc4c0077910 0x7fc4c0079dd0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fc4d41a6ac0 tx=0x7fc4c400b410 comp rx=0 tx=0).stop 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 msgr2=0x7fc4d41945a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d41945a0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fc4d0000c00 tx=0x7fc4d000be30 comp rx=0 tx=0).stop 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 shutdown_connections 2026-03-10T08:58:33.945 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc4c0077910 0x7fc4c0079dd0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc4d40ff220 0x7fc4d41945a0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 --2- 192.168.123.105:0/342034435 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc4d4102200 0x7fc4d4194ae0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 >> 192.168.123.105:0/342034435 conn(0x7fc4d40fab60 msgr2=0x7fc4d40fcf30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 shutdown_connections 2026-03-10T08:58:33.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:33.944+0000 7fc4dc6e9700 1 -- 192.168.123.105:0/342034435 wait complete. 
2026-03-10T08:58:34.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.030+0000 7ff3062f0700 1 -- 192.168.123.105:0/2115065716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 msgr2=0x7ff300101440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:34.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.030+0000 7ff3062f0700 1 --2- 192.168.123.105:0/2115065716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300101440 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7ff2f0009b00 tx=0x7ff2f0009e10 comp rx=0 tx=0).stop 2026-03-10T08:58:34.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.031+0000 7ff3062f0700 1 -- 192.168.123.105:0/2115065716 shutdown_connections 2026-03-10T08:58:34.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.031+0000 7ff3062f0700 1 --2- 192.168.123.105:0/2115065716 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300101440 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.031+0000 7ff3062f0700 1 --2- 192.168.123.105:0/2115065716 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3000ffe60 0x7ff300100280 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.031 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.031+0000 7ff3062f0700 1 -- 192.168.123.105:0/2115065716 >> 192.168.123.105:0/2115065716 conn(0x7ff3000fb3c0 msgr2=0x7ff3000fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:34.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.031+0000 7ff3062f0700 1 -- 192.168.123.105:0/2115065716 shutdown_connections 2026-03-10T08:58:34.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.031+0000 7ff3062f0700 1 -- 192.168.123.105:0/2115065716 
wait complete. 2026-03-10T08:58:34.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.032+0000 7ff3062f0700 1 Processor -- start 2026-03-10T08:58:34.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.032+0000 7ff3062f0700 1 -- start start 2026-03-10T08:58:34.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.032+0000 7ff3062f0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3000ffe60 0x7ff3001945e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.032+0000 7ff3062f0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300194b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.032+0000 7ff3062f0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3001950d0 con 0x7ff300100fc0 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.032+0000 7ff3062f0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff300195240 con 0x7ff3000ffe60 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2fffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3000ffe60 0x7ff3001945e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2ff7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300194b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2ff7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300194b20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46300/0 (socket says 192.168.123.105:46300) 2026-03-10T08:58:34.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2ff7fe700 1 -- 192.168.123.105:0/3248308237 learned_addr learned my addr 192.168.123.105:0/3248308237 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:58:34.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2ff7fe700 1 -- 192.168.123.105:0/3248308237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3000ffe60 msgr2=0x7ff3001945e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:34.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2ff7fe700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3000ffe60 0x7ff3001945e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.033+0000 7ff2ff7fe700 1 -- 192.168.123.105:0/3248308237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2f00097e0 con 0x7ff300100fc0 2026-03-10T08:58:34.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.034+0000 7ff2ff7fe700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300194b20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7ff2f0009ad0 tx=0x7ff2f0004ca0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:34.035 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.034+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2f001d070 con 0x7ff300100fc0 2026-03-10T08:58:34.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.034+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff2f000bcb0 con 0x7ff300100fc0 2026-03-10T08:58:34.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.034+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2f000f7c0 con 0x7ff300100fc0 2026-03-10T08:58:34.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.034+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3001a24a0 con 0x7ff300100fc0 2026-03-10T08:58:34.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.034+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3001a2990 con 0x7ff300100fc0 2026-03-10T08:58:34.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.036+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff2f0022ae0 con 0x7ff300100fc0 2026-03-10T08:58:34.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.036+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff300066e80 con 0x7ff300100fc0 2026-03-10T08:58:34.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.036+0000 7ff2fd7fa700 1 --2- 
192.168.123.105:0/3248308237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff2e80778c0 0x7ff2e8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:58:34.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.036+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6570+0+0 (secure 0 0 0) 0x7ff2f009b380 con 0x7ff300100fc0 2026-03-10T08:58:34.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.039+0000 7ff2fffff700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff2e80778c0 0x7ff2e8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:58:34.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.040+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff2f0063b30 con 0x7ff300100fc0 2026-03-10T08:58:34.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.040+0000 7ff2fffff700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff2e80778c0 0x7ff2e8079d80 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7ff300195b40 tx=0x7ff2f400a380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:58:34.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:34 vm05.local ceph-mon[111630]: from='client.34192 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:34.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:34 vm05.local ceph-mon[111630]: from='client.44151 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:34.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:34 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3165282459' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:58:34.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:34 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3909889632' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.238+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff3001a2d60 con 0x7ff300100fc0 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.238+0000 7ff2fd7fa700 1 -- 192.168.123.105:0/3248308237 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1132 (secure 0 0 0) 0x7ff2f0063280 con 0x7ff300100fc0 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 1597/231 objects degraded (691.342%), 12 pgs degraded 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1597/231 objects degraded (691.342%), 12 pgs degraded 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.c is 
active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.12 is active+recovering+undersized+degraded+remapped, acting [1,3] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T08:58:34.239 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.241+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff2e80778c0 msgr2=0x7ff2e8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.241+0000 7ff3062f0700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff2e80778c0 0x7ff2e8079d80 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7ff300195b40 tx=0x7ff2f400a380 comp rx=0 tx=0).stop 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.241+0000 7ff3062f0700 1 -- 
192.168.123.105:0/3248308237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 msgr2=0x7ff300194b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.241+0000 7ff3062f0700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300194b20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7ff2f0009ad0 tx=0x7ff2f0004ca0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 shutdown_connections 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff2e80778c0 0x7ff2e8079d80 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3000ffe60 0x7ff3001945e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 --2- 192.168.123.105:0/3248308237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff300100fc0 0x7ff300194b20 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 >> 192.168.123.105:0/3248308237 conn(0x7ff3000fb3c0 msgr2=0x7ff300104280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:58:34.242 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 shutdown_connections 2026-03-10T08:58:34.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:58:34.242+0000 7ff3062f0700 1 -- 192.168.123.105:0/3248308237 wait complete. 2026-03-10T08:58:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:34 vm08.local ceph-mon[101330]: from='client.34192 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:34 vm08.local ceph-mon[101330]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:34 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3165282459' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:58:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:34 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/3909889632' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:58:35.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:35 vm05.local ceph-mon[111630]: from='client.34198 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:35.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:35 vm05.local ceph-mon[111630]: pgmap v67: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 8.0 KiB/s rd, 198 KiB/s wr, 77 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 9 objects/s recovering 2026-03-10T08:58:35.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:35 vm05.local ceph-mon[111630]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:35.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:35 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3248308237' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:58:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:35 vm08.local ceph-mon[101330]: from='client.34198 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:35 vm08.local ceph-mon[101330]: pgmap v67: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 8.0 KiB/s rd, 198 KiB/s wr, 77 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 9 objects/s recovering 2026-03-10T08:58:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:35 vm08.local ceph-mon[101330]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:58:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:35 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/3248308237' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:58:38.178 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:37 vm05.local ceph-mon[111630]: pgmap v68: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 7.3 KiB/s rd, 180 KiB/s wr, 70 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 8 objects/s recovering 2026-03-10T08:58:38.255 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:38 vm08.local ceph-mon[101330]: pgmap v68: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 7.3 KiB/s rd, 180 KiB/s wr, 70 op/s; 1597/231 objects degraded (691.342%); 0 B/s, 8 objects/s recovering 2026-03-10T08:58:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:39 vm05.local ceph-mon[111630]: pgmap v69: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 6.5 KiB/s rd, 161 KiB/s wr, 62 op/s; 1571/231 objects degraded (680.087%); 0 B/s, 10 objects/s recovering 2026-03-10T08:58:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:39 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1571/231 objects degraded (680.087%), 12 pgs degraded, 12 pgs undersized (PG_DEGRADED) 2026-03-10T08:58:39.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:39 vm08.local ceph-mon[101330]: pgmap v69: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 6.5 KiB/s rd, 161 KiB/s wr, 62 op/s; 1571/231 objects degraded (680.087%); 0 B/s, 10 objects/s recovering 
2026-03-10T08:58:39.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:39 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1571/231 objects degraded (680.087%), 12 pgs degraded, 12 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:40.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:40 vm05.local ceph-mon[111630]: pgmap v70: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s wr, 0 op/s; 1571/231 objects degraded (680.087%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:41.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:40 vm08.local ceph-mon[101330]: pgmap v70: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s wr, 0 op/s; 1571/231 objects degraded (680.087%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:43.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:43 vm08.local ceph-mon[101330]: pgmap v71: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s wr, 0 op/s; 1571/231 objects degraded (680.087%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:43.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:42 vm05.local ceph-mon[111630]: pgmap v71: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s wr, 0 op/s; 1571/231 objects degraded (680.087%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:44.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:44 vm08.local ceph-mon[101330]: osdmap e54: 6 total, 6 up, 6 in
2026-03-10T08:58:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:44 vm05.local ceph-mon[111630]: osdmap e54: 6 total, 6 up, 6 in
2026-03-10T08:58:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:45 vm08.local ceph-mon[101330]: pgmap v73: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1541/231 objects degraded (667.100%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:45 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1541/231 objects degraded (667.100%), 12 pgs degraded, 12 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:45 vm08.local ceph-mon[101330]: osdmap e55: 6 total, 6 up, 6 in
2026-03-10T08:58:45.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:45 vm05.local ceph-mon[111630]: pgmap v73: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1541/231 objects degraded (667.100%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:45 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1541/231 objects degraded (667.100%), 12 pgs degraded, 12 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:45 vm05.local ceph-mon[111630]: osdmap e55: 6 total, 6 up, 6 in
2026-03-10T08:58:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:46 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:46 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T08:58:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:46.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:46 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:58:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:46 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T08:58:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:58:47.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:47 vm08.local ceph-mon[101330]: pgmap v75: 65 pgs: 1 peering, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1540/231 objects degraded (666.667%)
2026-03-10T08:58:47.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:47 vm05.local ceph-mon[111630]: pgmap v75: 65 pgs: 1 peering, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1540/231 objects degraded (666.667%)
2026-03-10T08:58:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:48 vm08.local ceph-mon[101330]: pgmap v76: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 10 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%)
2026-03-10T08:58:48.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:48 vm05.local ceph-mon[111630]: pgmap v76: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 10 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%)
2026-03-10T08:58:51.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:50 vm05.local ceph-mon[111630]: pgmap v77: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:51.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:50 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1451/231 objects degraded (628.139%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:50 vm08.local ceph-mon[101330]: pgmap v77: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:50 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1451/231 objects degraded (628.139%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:53.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:52 vm05.local ceph-mon[111630]: pgmap v78: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 3 objects/s recovering
2026-03-10T08:58:53.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:52 vm08.local ceph-mon[101330]: pgmap v78: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 3 objects/s recovering
2026-03-10T08:58:55.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:54 vm05.local ceph-mon[111630]: pgmap v79: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:54 vm08.local ceph-mon[101330]: pgmap v79: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 7 objects/s recovering
2026-03-10T08:58:57.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:56 vm05.local ceph-mon[111630]: pgmap v80: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 18 objects/s recovering
2026-03-10T08:58:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:56 vm08.local ceph-mon[101330]: pgmap v80: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1451/231 objects degraded (628.139%); 0 B/s, 18 objects/s recovering
2026-03-10T08:58:58.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:58 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1413/231 objects degraded (611.688%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:58.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:58 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1413/231 objects degraded (611.688%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-10T08:58:59.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:58:59 vm05.local ceph-mon[111630]: pgmap v81: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1413/231 objects degraded (611.688%); 0 B/s, 21 objects/s recovering
2026-03-10T08:58:59.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:58:59 vm08.local ceph-mon[101330]: pgmap v81: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1413/231 objects degraded (611.688%); 0 B/s, 21 objects/s recovering
2026-03-10T08:59:00.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:00 vm05.local ceph-mon[111630]: osdmap e56: 6 total, 6 up, 6 in
2026-03-10T08:59:00.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:59:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:00 vm08.local ceph-mon[101330]: osdmap e56: 6 total, 6 up, 6 in
2026-03-10T08:59:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:59:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:01 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:59:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:01 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline)
2026-03-10T08:59:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:01 vm05.local ceph-mon[111630]: pgmap v83: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1413/231 objects degraded (611.688%); 0 B/s, 9 objects/s recovering
2026-03-10T08:59:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:01 vm05.local ceph-mon[111630]: osdmap e57: 6 total, 6 up, 6 in
2026-03-10T08:59:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:59:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:01 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T08:59:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:01 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline)
2026-03-10T08:59:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:01 vm08.local ceph-mon[101330]: pgmap v83: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1413/231 objects degraded (611.688%); 0 B/s, 9 objects/s recovering
2026-03-10T08:59:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:01 vm08.local ceph-mon[101330]: osdmap e57: 6 total, 6 up, 6 in
2026-03-10T08:59:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:59:03.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:03 vm05.local ceph-mon[111630]: pgmap v85: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1413/231 objects degraded (611.688%); 0 B/s, 6 objects/s recovering
2026-03-10T08:59:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:03 vm08.local ceph-mon[101330]: pgmap v85: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1413/231 objects degraded (611.688%); 0 B/s, 6 objects/s recovering
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.318+0000 7f80b3efb700 1 -- 192.168.123.105:0/1807920736 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac104320 msgr2=0x7f80ac106710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.318+0000 7f80b3efb700 1 --2- 192.168.123.105:0/1807920736 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac104320 0x7f80ac106710 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f80a8009b00 tx=0x7f80a8009e10 comp rx=0 tx=0).stop
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.318+0000 7f80b3efb700 1 -- 192.168.123.105:0/1807920736 shutdown_connections
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.318+0000 7f80b3efb700 1 --2- 192.168.123.105:0/1807920736 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac104320 0x7f80ac106710 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.318+0000 7f80b3efb700 1 --2- 192.168.123.105:0/1807920736 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac1019f0 0x7f80ac103de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.319+0000 7f80b3efb700 1 -- 192.168.123.105:0/1807920736 >> 192.168.123.105:0/1807920736 conn(0x7f80ac0fb3e0 msgr2=0x7f80ac0fd840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.319+0000 7f80b3efb700 1 -- 192.168.123.105:0/1807920736 shutdown_connections
2026-03-10T08:59:04.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.319+0000 7f80b3efb700 1 -- 192.168.123.105:0/1807920736 wait complete.
2026-03-10T08:59:04.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b3efb700 1 Processor -- start
2026-03-10T08:59:04.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b3efb700 1 -- start start
2026-03-10T08:59:04.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b3efb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac1019f0 0x7f80ac1967c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:59:04.320 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b3efb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 0x7f80ac196d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:59:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b1496700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 0x7f80ac196d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:59:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b1496700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 0x7f80ac196d00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53270/0 (socket says 192.168.123.105:53270)
2026-03-10T08:59:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b1496700 1 -- 192.168.123.105:0/3639404264 learned_addr learned my addr 192.168.123.105:0/3639404264 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:59:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.320+0000 7f80b3efb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80ac197320 con 0x7f80ac104320
2026-03-10T08:59:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80ac197460 con 0x7f80ac1019f0
2026-03-10T08:59:04.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80b1c97700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac1019f0 0x7f80ac1967c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80b1496700 1 -- 192.168.123.105:0/3639404264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac1019f0 msgr2=0x7f80ac1967c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80b1496700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac1019f0 0x7f80ac1967c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80b1496700 1 -- 192.168.123.105:0/3639404264 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f809c009710 con 0x7f80ac104320
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80b1496700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 0x7f80ac196d00 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f80a8006010 tx=0x7f80a80048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80a801d070 con 0x7f80ac104320
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f80a800bb40 con 0x7f80ac104320
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.321+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80a800f670 con 0x7f80ac104320
2026-03-10T08:59:04.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.322+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80a80097e0 con 0x7f80ac104320
2026-03-10T08:59:04.323 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.322+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80ac0ffb10 con 0x7f80ac104320
2026-03-10T08:59:04.323 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.323+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f80a8004d20 con 0x7f80ac104320
2026-03-10T08:59:04.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.323+0000 7f80a2ffd700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f809807bcd0 0x7f809807e190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:59:04.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.324+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7f80a80678d0 con 0x7f80ac104320
2026-03-10T08:59:04.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.324+0000 7f80b1c97700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f809807bcd0 0x7f809807e190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:59:04.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.324+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80ac190a40 con 0x7f80ac104320
2026-03-10T08:59:04.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.329+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f80a80632f0 con 0x7f80ac104320
2026-03-10T08:59:04.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.330+0000 7f80b1c97700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f809807bcd0 0x7f809807e190 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f809c011440 tx=0x7f809c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:59:04.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:04 vm05.local ceph-mon[111630]: pgmap v86: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1319/231 objects degraded (570.996%); 0 B/s, 9 objects/s recovering
2026-03-10T08:59:04.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:04 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1319/231 objects degraded (570.996%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-10T08:59:04.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.466+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f80ac0611d0 con 0x7f809807bcd0
2026-03-10T08:59:04.467 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.467+0000 7f80a2ffd700 1 -- 192.168.123.105:0/3639404264 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f80ac0611d0 con 0x7f809807bcd0
2026-03-10T08:59:04.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f809807bcd0 msgr2=0x7f809807e190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f809807bcd0 0x7f809807e190 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f809c011440 tx=0x7f809c009450 comp rx=0 tx=0).stop
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 msgr2=0x7f80ac196d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 0x7f80ac196d00 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f80a8006010 tx=0x7f80a80048c0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 shutdown_connections
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f809807bcd0 0x7f809807e190 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f80ac1019f0 0x7f80ac1967c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 --2- 192.168.123.105:0/3639404264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f80ac104320 0x7f80ac196d00 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 >> 192.168.123.105:0/3639404264 conn(0x7f80ac0fb3e0 msgr2=0x7f80ac0fd840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.469+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 shutdown_connections
2026-03-10T08:59:04.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.470+0000 7f80b3efb700 1 -- 192.168.123.105:0/3639404264 wait complete.
2026-03-10T08:59:04.480 INFO:teuthology.orchestra.run.vm05.stdout:true
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.541+0000 7fb7bdafb700 1 -- 192.168.123.105:0/1838897897 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b8101990 msgr2=0x7fb7b8103d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.541+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/1838897897 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b8101990 0x7fb7b8103d80 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fb7a0009b50 tx=0x7fb7a0009e60 comp rx=0 tx=0).stop
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.542+0000 7fb7bdafb700 1 -- 192.168.123.105:0/1838897897 shutdown_connections
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.542+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/1838897897 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b81042c0 0x7fb7b81066b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.542+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/1838897897 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b8101990 0x7fb7b8103d80 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.542+0000 7fb7bdafb700 1 -- 192.168.123.105:0/1838897897 >> 192.168.123.105:0/1838897897 conn(0x7fb7b80fb360 msgr2=0x7fb7b80fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.542+0000 7fb7bdafb700 1 -- 192.168.123.105:0/1838897897 shutdown_connections
2026-03-10T08:59:04.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.542+0000 7fb7bdafb700 1 -- 192.168.123.105:0/1838897897 wait complete.
2026-03-10T08:59:04.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7bdafb700 1 Processor -- start
2026-03-10T08:59:04.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7bdafb700 1 -- start start
2026-03-10T08:59:04.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7bdafb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b8101990 0x7fb7b810f8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:59:04.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7bdafb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 0x7fb7b810fe30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:59:04.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7bdafb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7b8110450 con 0x7fb7b81042c0
2026-03-10T08:59:04.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7bdafb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7b8110590 con 0x7fb7b8101990
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7b6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 0x7fb7b810fe30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7b6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 0x7fb7b810fe30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53288/0 (socket says 192.168.123.105:53288)
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7b6ffd700 1 -- 192.168.123.105:0/2448271663 learned_addr learned my addr 192.168.123.105:0/2448271663 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.543+0000 7fb7b77fe700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b8101990 0x7fb7b810f8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b6ffd700 1 -- 192.168.123.105:0/2448271663 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b8101990 msgr2=0x7fb7b810f8f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b6ffd700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b8101990 0x7fb7b810f8f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b6ffd700 1 -- 192.168.123.105:0/2448271663 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7a00097e0 con 0x7fb7b81042c0
2026-03-10T08:59:04.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b77fe700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b8101990 0x7fb7b810f8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed!
2026-03-10T08:59:04.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b6ffd700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 0x7fb7b810fe30 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb7a800eb10 tx=0x7fb7a800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T08:59:04.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7a800cca0 con 0x7fb7b81042c0
2026-03-10T08:59:04.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb7a800ce00 con 0x7fb7b81042c0
2026-03-10T08:59:04.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.544+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7b8113030 con 0x7fb7b81042c0
2026-03-10T08:59:04.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.545+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb7b8113580 con 0x7fb7b81042c0
2026-03-10T08:59:04.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.545+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7a80105e0 con 0x7fb7b81042c0
2026-03-10T08:59:04.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.545+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb7b8066e80 con 0x7fb7b81042c0
2026-03-10T08:59:04.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.547+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb7a8018700 con 0x7fb7b81042c0
2026-03-10T08:59:04.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.547+0000 7fb7b4ff9700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7a40778c0 0x7fb7a4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T08:59:04.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.547+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7fb7a8014070 con 0x7fb7b81042c0
2026-03-10T08:59:04.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.550+0000 7fb7b77fe700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7a40778c0 0x7fb7a4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T08:59:04.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.550+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb7a8062870 con 0x7fb7b81042c0
2026-03-10T08:59:04.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.550+0000 7fb7b77fe700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7a40778c0 0x7fb7a4079d80 secure
:-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb7a0006010 tx=0x7fb7a000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:04.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:04 vm08.local ceph-mon[101330]: pgmap v86: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1319/231 objects degraded (570.996%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:04.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:04 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1319/231 objects degraded (570.996%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:04.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.683+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb7b81131c0 con 0x7fb7a40778c0 2026-03-10T08:59:04.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.684+0000 7fb7b4ff9700 1 -- 192.168.123.105:0/2448271663 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fb7b81131c0 con 0x7fb7a40778c0 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.686+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7a40778c0 msgr2=0x7fb7a4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.686+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7a40778c0 0x7fb7a4079d80 secure :-1 s=READY 
pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb7a0006010 tx=0x7fb7a000b540 comp rx=0 tx=0).stop 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.686+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 msgr2=0x7fb7b810fe30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.686+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 0x7fb7b810fe30 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb7a800eb10 tx=0x7fb7a800eed0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 shutdown_connections 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7a40778c0 0x7fb7a4079d80 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7b8101990 0x7fb7b810f8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 --2- 192.168.123.105:0/2448271663 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7b81042c0 0x7fb7b810fe30 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 
>> 192.168.123.105:0/2448271663 conn(0x7fb7b80fb360 msgr2=0x7fb7b80fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:04.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 shutdown_connections 2026-03-10T08:59:04.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.687+0000 7fb7bdafb700 1 -- 192.168.123.105:0/2448271663 wait complete. 2026-03-10T08:59:04.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.757+0000 7f8626cf7700 1 -- 192.168.123.105:0/1720437511 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86200ff780 msgr2=0x7f86200ffba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.757+0000 7f8626cf7700 1 --2- 192.168.123.105:0/1720437511 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86200ff780 0x7f86200ffba0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f860c009b50 tx=0x7f860c009e60 comp rx=0 tx=0).stop 2026-03-10T08:59:04.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.760+0000 7f8626cf7700 1 -- 192.168.123.105:0/1720437511 shutdown_connections 2026-03-10T08:59:04.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.760+0000 7f8626cf7700 1 --2- 192.168.123.105:0/1720437511 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86201000e0 0x7f8620100560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.760+0000 7f8626cf7700 1 --2- 192.168.123.105:0/1720437511 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86200ff780 0x7f86200ffba0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.760+0000 7f8626cf7700 1 -- 
192.168.123.105:0/1720437511 >> 192.168.123.105:0/1720437511 conn(0x7f86200fb380 msgr2=0x7f86200fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:04.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.761+0000 7f8626cf7700 1 -- 192.168.123.105:0/1720437511 shutdown_connections 2026-03-10T08:59:04.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.761+0000 7f8626cf7700 1 -- 192.168.123.105:0/1720437511 wait complete. 2026-03-10T08:59:04.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.763+0000 7f8626cf7700 1 Processor -- start 2026-03-10T08:59:04.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.763+0000 7f8626cf7700 1 -- start start 2026-03-10T08:59:04.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.763+0000 7f8626cf7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86200ff780 0x7f8620198a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.763+0000 7f8626cf7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 0x7f8620198f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.763+0000 7f8626cf7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8620199590 con 0x7f86201000e0 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.763+0000 7f8626cf7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86201996d0 con 0x7f86200ff780 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 0x7f8620198f70 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 0x7f8620198f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53298/0 (socket says 192.168.123.105:53298) 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 -- 192.168.123.105:0/2704804627 learned_addr learned my addr 192.168.123.105:0/2704804627 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f8624a93700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86200ff780 0x7f8620198a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 -- 192.168.123.105:0/2704804627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86200ff780 msgr2=0x7f8620198a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86200ff780 0x7f8620198a30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 -- 192.168.123.105:0/2704804627 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f860c0097e0 con 0x7f86201000e0 2026-03-10T08:59:04.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.764+0000 7f861ffff700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 0x7f8620198f70 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f861400ed70 tx=0x7f861400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:04.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.765+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f861400cd70 con 0x7f86201000e0 2026-03-10T08:59:04.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.765+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8614010910 con 0x7f86201000e0 2026-03-10T08:59:04.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.765+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f862019e180 con 0x7f86201000e0 2026-03-10T08:59:04.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.765+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8614018980 con 0x7f86201000e0 2026-03-10T08:59:04.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.765+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8620101990 con 0x7f86201000e0 2026-03-10T08:59:04.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.767+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8614018ae0 con 0x7f86201000e0 2026-03-10T08:59:04.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.767+0000 7f861dffb700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8610077990 0x7f8610079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:04.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.767+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7f8614014070 con 0x7f86201000e0 2026-03-10T08:59:04.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.768+0000 7f8624a93700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8610077990 0x7f8610079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:04.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.768+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f862019e310 con 0x7f86201000e0 2026-03-10T08:59:04.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.771+0000 7f8624a93700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8610077990 0x7f8610079e50 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f860c006010 tx=0x7f860c00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:04.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.771+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f862019e310 con 0x7f86201000e0 2026-03-10T08:59:04.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.894+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f862019e310 con 0x7f8610077990 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.901+0000 7f861dffb700 1 -- 192.168.123.105:0/2704804627 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f862019e310 con 0x7f8610077990 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (7m) 99s ago 8m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (8m) 99s ago 8m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (7m) 110s ago 7m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (115s) 99s ago 8m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (112s) 110s ago 7m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T08:59:04.901 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (7m) 99s ago 8m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (6m) 
99s ago 6m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (6m) 99s ago 6m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (6m) 110s ago 6m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (6m) 110s ago 6m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (3m) 99s ago 8m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (2m) 110s ago 7m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 99s ago 8m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 110s ago 7m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (8m) 99s ago 8m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (7m) 110s ago 7m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (102s) 99s ago 7m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (7m) 99s ago 7m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (7m) 99s ago 7m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 
2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (6m) 110s ago 6m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (6m) 110s ago 6m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (6m) 110s ago 6m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:59:04.902 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 99s ago 7m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T08:59:04.904 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8610077990 msgr2=0x7f8610079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.904 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8610077990 0x7f8610079e50 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f860c006010 tx=0x7f860c00b540 comp rx=0 tx=0).stop 2026-03-10T08:59:04.904 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 msgr2=0x7f8620198f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.904 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 0x7f8620198f70 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f861400ed70 tx=0x7f861400c5b0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.904 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 shutdown_connections 2026-03-10T08:59:04.905 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8610077990 0x7f8610079e50 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.905 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.904+0000 7f8626cf7700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86200ff780 0x7f8620198a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.905 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.905+0000 7f8626cf7700 1 --2- 192.168.123.105:0/2704804627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86201000e0 0x7f8620198f70 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.905 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.905+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 >> 192.168.123.105:0/2704804627 conn(0x7f86200fb380 msgr2=0x7f8620107db0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:04.905 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.905+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 shutdown_connections 2026-03-10T08:59:04.905 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.905+0000 7f8626cf7700 1 -- 192.168.123.105:0/2704804627 wait complete. 
2026-03-10T08:59:04.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.978+0000 7f343797e700 1 -- 192.168.123.105:0/1193285924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430104320 msgr2=0x7f3430104780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.978+0000 7f343797e700 1 --2- 192.168.123.105:0/1193285924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430104320 0x7f3430104780 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f342c009b50 tx=0x7f342c009e60 comp rx=0 tx=0).stop 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.978+0000 7f343797e700 1 -- 192.168.123.105:0/1193285924 shutdown_connections 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.978+0000 7f343797e700 1 --2- 192.168.123.105:0/1193285924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430104320 0x7f3430104780 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.978+0000 7f343797e700 1 --2- 192.168.123.105:0/1193285924 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430103120 0x7f3430103540 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.978+0000 7f343797e700 1 -- 192.168.123.105:0/1193285924 >> 192.168.123.105:0/1193285924 conn(0x7f34300fe6c0 msgr2=0x7f3430100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.979+0000 7f343797e700 1 -- 192.168.123.105:0/1193285924 shutdown_connections 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.979+0000 7f343797e700 1 -- 192.168.123.105:0/1193285924 
wait complete. 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.979+0000 7f343797e700 1 Processor -- start 2026-03-10T08:59:04.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.979+0000 7f343797e700 1 -- start start 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.979+0000 7f343797e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430103120 0x7f3430198d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f343797e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 0x7f343019e2e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f343797e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3430199780 con 0x7f3430103120 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f343797e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34301998f0 con 0x7f3430199270 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f343571a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430103120 0x7f3430198d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f3434f19700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 0x7f343019e2e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f3434f19700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 0x7f343019e2e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36286/0 (socket says 192.168.123.105:36286) 2026-03-10T08:59:04.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f3434f19700 1 -- 192.168.123.105:0/1735171012 learned_addr learned my addr 192.168.123.105:0/1735171012 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.980+0000 7f3434f19700 1 -- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430103120 msgr2=0x7f3430198d30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f3434f19700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430103120 0x7f3430198d30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f3434f19700 1 -- 192.168.123.105:0/1735171012 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f342c0097e0 con 0x7f3430199270 2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f343571a700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430103120 0x7f3430198d30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f3434f19700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 0x7f343019e2e0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f342c006010 tx=0x7f342c005710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f342c01d070 con 0x7f3430199270 2026-03-10T08:59:04.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f343019e820 con 0x7f3430199270 2026-03-10T08:59:04.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.981+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f343019ecb0 con 0x7f3430199270 2026-03-10T08:59:04.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.982+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f342c004500 con 0x7f3430199270 2026-03-10T08:59:04.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.982+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f342c003db0 con 0x7f3430199270 2026-03-10T08:59:04.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.983+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f342c00f460 con 
0x7f3430199270 2026-03-10T08:59:04.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.983+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3430066e80 con 0x7f3430199270 2026-03-10T08:59:04.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.984+0000 7f34267fc700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f341c077870 0x7f341c079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:04.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.984+0000 7f343571a700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f341c077870 0x7f341c079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:04.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.984+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7f342c09b220 con 0x7f3430199270 2026-03-10T08:59:04.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.984+0000 7f343571a700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f341c077870 0x7f341c079d30 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f3420006fd0 tx=0x7f3420009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:04.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:04.986+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f342c0630d0 con 0x7f3430199270 2026-03-10T08:59:05.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.152+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f343019f080 con 0x7f3430199270 2026-03-10T08:59:05.155 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.155+0000 7f34267fc700 1 -- 192.168.123.105:0/1735171012 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f342c0630d0 con 0x7f3430199270 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 
(7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:59:05.156 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.158+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f341c077870 msgr2=0x7f341c079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.158+0000 7f343797e700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f341c077870 0x7f341c079d30 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f3420006fd0 tx=0x7f3420009380 comp rx=0 tx=0).stop 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 msgr2=0x7f343019e2e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 0x7f343019e2e0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f342c006010 tx=0x7f342c005710 comp rx=0 tx=0).stop 2026-03-10T08:59:05.159 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 shutdown_connections 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f341c077870 0x7f341c079d30 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3430103120 0x7f3430198d30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 --2- 192.168.123.105:0/1735171012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3430199270 0x7f343019e2e0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 >> 192.168.123.105:0/1735171012 conn(0x7f34300fe6c0 msgr2=0x7f3430107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 shutdown_connections 2026-03-10T08:59:05.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.159+0000 7f343797e700 1 -- 192.168.123.105:0/1735171012 wait complete. 
2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.229+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/1287786890 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 msgr2=0x7f8a78106770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.229+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/1287786890 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78106770 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f8a74009b00 tx=0x7f8a74009e10 comp rx=0 tx=0).stop 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.229+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/1287786890 shutdown_connections 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.229+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/1287786890 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78106770 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.229+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/1287786890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 0x7f8a78103e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.229+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/1287786890 >> 192.168.123.105:0/1287786890 conn(0x7f8a780fb380 msgr2=0x7f8a780fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/1287786890 shutdown_connections 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/1287786890 
wait complete. 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 Processor -- start 2026-03-10T08:59:05.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 -- start start 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 0x7f8a781967a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78196ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a78197300 con 0x7f8a78101a50 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.230+0000 7f8a7f6e9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a78197440 con 0x7f8a78104380 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7cc84700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78196ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7cc84700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78196ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:36306/0 (socket says 192.168.123.105:36306) 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7cc84700 1 -- 192.168.123.105:0/914353803 learned_addr learned my addr 192.168.123.105:0/914353803 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7d485700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 0x7f8a781967a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7cc84700 1 -- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 msgr2=0x7f8a781967a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7cc84700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 0x7f8a781967a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7cc84700 1 -- 192.168.123.105:0/914353803 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a68009710 con 0x7f8a78104380 2026-03-10T08:59:05.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.231+0000 7f8a7d485700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 0x7f8a781967a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T08:59:05.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a7cc84700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78196ce0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f8a7400bb50 tx=0x7f8a74005f00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:05.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8a7401d070 con 0x7f8a78104380 2026-03-10T08:59:05.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a740097e0 con 0x7f8a78104380 2026-03-10T08:59:05.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a780ffba0 con 0x7f8a78104380 2026-03-10T08:59:05.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8a74022470 con 0x7f8a78104380 2026-03-10T08:59:05.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8a781909e0 con 0x7f8a78104380 2026-03-10T08:59:05.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.232+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 
0 0) 0x7f8a7400f670 con 0x7f8a78104380 2026-03-10T08:59:05.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.233+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8a74022ae0 con 0x7f8a78104380 2026-03-10T08:59:05.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.234+0000 7f8a6e7fc700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8a64077870 0x7f8a64079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.234+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7f8a7409b8f0 con 0x7f8a78104380 2026-03-10T08:59:05.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.234+0000 7f8a7d485700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8a64077870 0x7f8a64079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.234+0000 7f8a7d485700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8a64077870 0x7f8a64079d30 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8a68011430 tx=0x7f8a68009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:05.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.236+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f8a74064100 con 0x7f8a78104380 2026-03-10T08:59:05.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.376+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8a78066e80 con 0x7f8a78104380 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.377+0000 7f8a6e7fc700 1 -- 192.168.123.105:0/914353803 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f8a74063850 con 0x7f8a78104380 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:05.377 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 
2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:59:05.378 
INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:05.378 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8a64077870 msgr2=0x7f8a64079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8a64077870 0x7f8a64079d30 secure :-1 s=READY 
pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f8a68011430 tx=0x7f8a68009450 comp rx=0 tx=0).stop 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 msgr2=0x7f8a78196ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78196ce0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f8a7400bb50 tx=0x7f8a74005f00 comp rx=0 tx=0).stop 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 shutdown_connections 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8a64077870 0x7f8a64079d30 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a78101a50 0x7f8a781967a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 --2- 192.168.123.105:0/914353803 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a78104380 0x7f8a78196ce0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 >> 
192.168.123.105:0/914353803 conn(0x7f8a780fb380 msgr2=0x7f8a780fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 shutdown_connections 2026-03-10T08:59:05.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.380+0000 7f8a7f6e9700 1 -- 192.168.123.105:0/914353803 wait complete. 2026-03-10T08:59:05.381 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:59:05.449 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:05 vm05.local ceph-mon[111630]: from='client.34216 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:05.449 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:05 vm05.local ceph-mon[111630]: from='client.34220 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:05.449 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:05 vm05.local ceph-mon[111630]: from='client.34224 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:05.449 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:05 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1735171012' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:05.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.446+0000 7fb3dddf0700 1 -- 192.168.123.105:0/3468573270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d81043e0 msgr2=0x7fb3d81067d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.446+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/3468573270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d81043e0 0x7fb3d81067d0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb3c8009b00 tx=0x7fb3c8009e10 comp rx=0 tx=0).stop 2026-03-10T08:59:05.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.450+0000 7fb3dddf0700 1 -- 192.168.123.105:0/3468573270 shutdown_connections 2026-03-10T08:59:05.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.450+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/3468573270 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d81043e0 0x7fb3d81067d0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.450+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/3468573270 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8103ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.450+0000 7fb3dddf0700 1 -- 192.168.123.105:0/3468573270 >> 192.168.123.105:0/3468573270 conn(0x7fb3d80fb3c0 msgr2=0x7fb3d80fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.451+0000 7fb3dddf0700 1 -- 192.168.123.105:0/3468573270 shutdown_connections 2026-03-10T08:59:05.452 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.451+0000 7fb3dddf0700 1 -- 192.168.123.105:0/3468573270 wait complete. 2026-03-10T08:59:05.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3dddf0700 1 Processor -- start 2026-03-10T08:59:05.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3dddf0700 1 -- start start 2026-03-10T08:59:05.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3dddf0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8071cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3dddf0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3d81043e0 0x7fb3d80721f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3dddf0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3d80727f0 con 0x7fb3d8101ab0 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3dddf0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3d8072960 con 0x7fb3d81043e0 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3d77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8071cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3d77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8071cb0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53338/0 (socket says 192.168.123.105:53338) 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.452+0000 7fb3d77fe700 1 -- 192.168.123.105:0/312984635 learned_addr learned my addr 192.168.123.105:0/312984635 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d6ffd700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3d81043e0 0x7fb3d80721f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d77fe700 1 -- 192.168.123.105:0/312984635 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3d81043e0 msgr2=0x7fb3d80721f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d77fe700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3d81043e0 0x7fb3d80721f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d77fe700 1 -- 192.168.123.105:0/312984635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3c80097e0 con 0x7fb3d8101ab0 2026-03-10T08:59:05.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d77fe700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8071cb0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto 
rx=0x7fb3c000d8d0 tx=0x7fb3c000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:05.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3c0009880 con 0x7fb3d8101ab0 2026-03-10T08:59:05.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.453+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3c0010460 con 0x7fb3d8101ab0 2026-03-10T08:59:05.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.454+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3c000f5d0 con 0x7fb3d8101ab0 2026-03-10T08:59:05.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.454+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3d80ffcb0 con 0x7fb3d8101ab0 2026-03-10T08:59:05.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.454+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3d8100180 con 0x7fb3d8101ab0 2026-03-10T08:59:05.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.455+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb3c00099e0 con 0x7fb3d8101ab0 2026-03-10T08:59:05.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.455+0000 7fb3d4ff9700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3c40777d0 0x7fb3c4079c90 unknown :-1 s=NONE pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.456+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7fb3c00996d0 con 0x7fb3d8101ab0 2026-03-10T08:59:05.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.456+0000 7fb3d6ffd700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3c40777d0 0x7fb3c4079c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.456+0000 7fb3d6ffd700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3c40777d0 0x7fb3c4079c90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fb3c800b5c0 tx=0x7fb3c8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:05.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.456+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3d804ea90 con 0x7fb3d8101ab0 2026-03-10T08:59:05.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.459+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb3c0061e30 con 0x7fb3d8101ab0 2026-03-10T08:59:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:05 vm08.local ceph-mon[101330]: from='client.34216 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 
2026-03-10T08:59:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:05 vm08.local ceph-mon[101330]: from='client.34220 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:05 vm08.local ceph-mon[101330]: from='client.34224 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:05 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1735171012' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:05.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.584+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb3d806b9a0 con 0x7fb3c40777d0 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.585+0000 7fb3d4ff9700 1 -- 192.168.123.105:0/312984635 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fb3d806b9a0 con 0x7fb3c40777d0 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 
2026-03-10T08:59:05.585 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T08:59:05.586 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:59:05.586 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T08:59:05.586 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T08:59:05.586 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:59:05.586 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:59:05.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.587+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3c40777d0 msgr2=0x7fb3c4079c90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.587+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3c40777d0 0x7fb3c4079c90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fb3c800b5c0 tx=0x7fb3c8005fb0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.587+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 msgr2=0x7fb3d8071cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.587+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8071cb0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb3c000d8d0 tx=0x7fb3c000dbe0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 shutdown_connections 
2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3c40777d0 0x7fb3c4079c90 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3d8101ab0 0x7fb3d8071cb0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 --2- 192.168.123.105:0/312984635 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3d81043e0 0x7fb3d80721f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 >> 192.168.123.105:0/312984635 conn(0x7fb3d80fb3c0 msgr2=0x7fb3d80fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 shutdown_connections 2026-03-10T08:59:05.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.588+0000 7fb3dddf0700 1 -- 192.168.123.105:0/312984635 wait complete. 
2026-03-10T08:59:05.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 -- 192.168.123.105:0/2818925149 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 msgr2=0x7fc744103d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 --2- 192.168.123.105:0/2818925149 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 0x7fc744103d80 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fc738009b00 tx=0x7fc738009e10 comp rx=0 tx=0).stop 2026-03-10T08:59:05.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 -- 192.168.123.105:0/2818925149 shutdown_connections 2026-03-10T08:59:05.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 --2- 192.168.123.105:0/2818925149 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc7441066b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 --2- 192.168.123.105:0/2818925149 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 0x7fc744103d80 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 -- 192.168.123.105:0/2818925149 >> 192.168.123.105:0/2818925149 conn(0x7fc7440fb380 msgr2=0x7fc7440fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 -- 192.168.123.105:0/2818925149 shutdown_connections 2026-03-10T08:59:05.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.653+0000 7fc74c545700 1 -- 192.168.123.105:0/2818925149 
wait complete. 2026-03-10T08:59:05.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc74c545700 1 Processor -- start 2026-03-10T08:59:05.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc74c545700 1 -- start start 2026-03-10T08:59:05.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc74c545700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 0x7fc744196760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc74c545700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc744196ca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc74c545700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7441972c0 con 0x7fc7441042c0 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc74c545700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc744197400 con 0x7fc744101990 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc749ae0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc744196ca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc749ae0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc744196ca0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:53358/0 (socket says 192.168.123.105:53358) 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.654+0000 7fc749ae0700 1 -- 192.168.123.105:0/305304549 learned_addr learned my addr 192.168.123.105:0/305304549 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc74a2e1700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 0x7fc744196760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc749ae0700 1 -- 192.168.123.105:0/305304549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 msgr2=0x7fc744196760 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc749ae0700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 0x7fc744196760 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc749ae0700 1 -- 192.168.123.105:0/305304549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc740009710 con 0x7fc7441042c0 2026-03-10T08:59:05.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc749ae0700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc744196ca0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc74000eee0 tx=0x7fc74000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:59:05.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc740009a70 con 0x7fc7441042c0 2026-03-10T08:59:05.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc740004500 con 0x7fc7441042c0 2026-03-10T08:59:05.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc740005350 con 0x7fc7441042c0 2026-03-10T08:59:05.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc7380097e0 con 0x7fc7441042c0 2026-03-10T08:59:05.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.655+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc74419c270 con 0x7fc7441042c0 2026-03-10T08:59:05.657 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.656+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc7441909e0 con 0x7fc7441042c0 2026-03-10T08:59:05.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.660+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc74001e030 con 0x7fc7441042c0 2026-03-10T08:59:05.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.660+0000 
7fc7377fe700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc730077990 0x7fc730079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:05.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.660+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(57..57 src has 1..57) v4 ==== 6512+0+0 (secure 0 0 0) 0x7fc740014070 con 0x7fc7441042c0 2026-03-10T08:59:05.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.661+0000 7fc74a2e1700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc730077990 0x7fc730079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:05.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.661+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc74009a460 con 0x7fc7441042c0 2026-03-10T08:59:05.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.661+0000 7fc74a2e1700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc730077990 0x7fc730079e50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fc738006010 tx=0x7fc73801a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.825+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fc744066e80 con 0x7fc7441042c0 2026-03-10T08:59:05.826 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.826+0000 7fc7377fe700 1 -- 192.168.123.105:0/305304549 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1583 (secure 0 0 0) 0x7fc740062900 con 0x7fc7441042c0 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 1319/231 objects degraded (570.996%), 11 pgs degraded, 11 pgs undersized 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1319/231 objects degraded (570.996%), 11 pgs degraded, 11 pgs undersized 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.6 is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.c is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,3] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.15 is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T08:59:05.826 
INFO:teuthology.orchestra.run.vm05.stdout: pg 3.17 is stuck undersized for 86s, current state active+recovering+undersized+degraded+remapped, last acting [2,5] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1b is stuck undersized for 86s, current state active+recovering+undersized+degraded+remapped, last acting [3,4] 2026-03-10T08:59:05.826 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1f is stuck undersized for 86s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3] 2026-03-10T08:59:05.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc730077990 msgr2=0x7fc730079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc730077990 0x7fc730079e50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fc738006010 tx=0x7fc73801a040 comp rx=0 tx=0).stop 2026-03-10T08:59:05.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 msgr2=0x7fc744196ca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc744196ca0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc74000eee0 
tx=0x7fc74000c5b0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 shutdown_connections 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc730077990 0x7fc730079e50 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc744101990 0x7fc744196760 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 --2- 192.168.123.105:0/305304549 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc7441042c0 0x7fc744196ca0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.828+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 >> 192.168.123.105:0/305304549 conn(0x7fc7440fb380 msgr2=0x7fc7440fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.829+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 shutdown_connections 2026-03-10T08:59:05.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:05.829+0000 7fc74c545700 1 -- 192.168.123.105:0/305304549 wait complete. 2026-03-10T08:59:06.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:06 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/914353803' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:59:06.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:06 vm08.local ceph-mon[101330]: from='client.34236 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:06.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:06 vm08.local ceph-mon[101330]: pgmap v87: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 4 objects/s recovering 2026-03-10T08:59:06.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:06 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/305304549' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:59:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:06 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/914353803' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:59:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:06 vm05.local ceph-mon[111630]: from='client.34236 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:06 vm05.local ceph-mon[111630]: pgmap v87: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 4 objects/s recovering 2026-03-10T08:59:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:06 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/305304549' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:59:09.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:08 vm08.local ceph-mon[101330]: pgmap v88: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 10 objects/s recovering 2026-03-10T08:59:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:08 vm05.local ceph-mon[111630]: pgmap v88: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 10 objects/s recovering 2026-03-10T08:59:11.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:10 vm05.local ceph-mon[111630]: pgmap v89: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 8 objects/s recovering 2026-03-10T08:59:11.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:10 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1304/231 objects degraded (564.502%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:11.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:10 vm08.local ceph-mon[101330]: pgmap v89: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 8 objects/s recovering 2026-03-10T08:59:11.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:10 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 
1304/231 objects degraded (564.502%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:13.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:12 vm05.local ceph-mon[111630]: pgmap v90: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:12 vm08.local ceph-mon[101330]: pgmap v90: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1304/231 objects degraded (564.502%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:14.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:14 vm05.local ceph-mon[111630]: pgmap v91: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1278/231 objects degraded (553.247%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:14.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:14 vm05.local ceph-mon[111630]: osdmap e58: 6 total, 6 up, 6 in 2026-03-10T08:59:14.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:14 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:15.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:14 vm08.local ceph-mon[101330]: pgmap v91: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1278/231 objects degraded (553.247%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:15.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:14 vm08.local ceph-mon[101330]: osdmap e58: 6 total, 6 up, 6 in 2026-03-10T08:59:15.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:14 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:15 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:15 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T08:59:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:15 vm05.local ceph-mon[111630]: osdmap e59: 6 total, 6 up, 6 in 2026-03-10T08:59:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:15 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1278/231 objects degraded (553.247%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:59:16.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:15 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:15 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T08:59:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:15 vm08.local ceph-mon[101330]: osdmap e59: 6 total, 6 up, 6 in 2026-03-10T08:59:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:15 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1278/231 objects degraded (553.247%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:59:17.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:16 vm05.local ceph-mon[111630]: pgmap v94: 65 pgs: 1 peering, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1259/231 objects degraded (545.022%) 2026-03-10T08:59:17.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:16 vm08.local ceph-mon[101330]: pgmap v94: 65 pgs: 1 peering, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1259/231 objects degraded (545.022%) 2026-03-10T08:59:19.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:18 vm05.local ceph-mon[111630]: pgmap v95: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 8 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%) 2026-03-10T08:59:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:18 vm08.local ceph-mon[101330]: pgmap v95: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 1 peering, 8 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%) 2026-03-10T08:59:21.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:20 vm05.local ceph-mon[111630]: pgmap v96: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:21.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:20 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1171/231 objects degraded (506.926%), 9 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:20 vm08.local ceph-mon[101330]: pgmap v96: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:20 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1171/231 objects degraded (506.926%), 9 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:23.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:22 vm05.local ceph-mon[111630]: pgmap v97: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:23.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:22 vm08.local ceph-mon[101330]: pgmap v97: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 
8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:25.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:24 vm05.local ceph-mon[111630]: pgmap v98: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:25.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:24 vm08.local ceph-mon[101330]: pgmap v98: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:27.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:26 vm05.local ceph-mon[111630]: pgmap v99: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 19 objects/s recovering 2026-03-10T08:59:27.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:26 vm08.local ceph-mon[101330]: pgmap v99: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1171/231 objects degraded (506.926%); 0 B/s, 19 objects/s recovering 2026-03-10T08:59:28.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:28 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1135/231 objects degraded (491.342%), 9 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
08:59:28 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1135/231 objects degraded (491.342%), 9 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:29 vm08.local ceph-mon[101330]: pgmap v100: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1135/231 objects degraded (491.342%); 0 B/s, 21 objects/s recovering 2026-03-10T08:59:29.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:29 vm05.local ceph-mon[111630]: pgmap v100: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1135/231 objects degraded (491.342%); 0 B/s, 21 objects/s recovering 2026-03-10T08:59:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:30 vm08.local ceph-mon[101330]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T08:59:30.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:59:30.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:59:30.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:59:30.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:59:30.462 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:30 vm05.local ceph-mon[111630]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T08:59:30.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:59:30.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:59:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:59:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T08:59:31.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: pgmap v102: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1135/231 objects degraded (491.342%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: osdmap e61: 6 total, 6 up, 6 in 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T08:59:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:59:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: pgmap v102: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1135/231 objects degraded (491.342%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: osdmap e61: 6 total, 6 up, 6 in 2026-03-10T08:59:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:59:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T08:59:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:59:33.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:33 vm05.local ceph-mon[111630]: pgmap v104: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1135/231 objects degraded (491.342%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:33 vm08.local ceph-mon[101330]: pgmap v104: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1135/231 objects degraded (491.342%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:34 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 1010/231 objects degraded (437.229%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:34.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:34 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 1010/231 objects degraded (437.229%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:35.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:35 vm05.local ceph-mon[111630]: pgmap v105: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB 
data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:35 vm08.local ceph-mon[101330]: pgmap v105: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:35.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.896+0000 7f2a140b7700 1 -- 192.168.123.105:0/2359232643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c101ab0 msgr2=0x7f2a0c103ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:35.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.896+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2359232643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c103ea0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f29fc009b50 tx=0x7f29fc009e60 comp rx=0 tx=0).stop 2026-03-10T08:59:35.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.897+0000 7f2a140b7700 1 -- 192.168.123.105:0/2359232643 shutdown_connections 2026-03-10T08:59:35.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.897+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2359232643 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c1043e0 0x7f2a0c1067d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:35.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.897+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2359232643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c103ea0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:35.898 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.897+0000 7f2a140b7700 1 -- 192.168.123.105:0/2359232643 >> 192.168.123.105:0/2359232643 conn(0x7f2a0c0fb3c0 msgr2=0x7f2a0c0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:35.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.897+0000 7f2a140b7700 1 -- 192.168.123.105:0/2359232643 shutdown_connections 2026-03-10T08:59:35.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.897+0000 7f2a140b7700 1 -- 192.168.123.105:0/2359232643 wait complete. 2026-03-10T08:59:35.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a140b7700 1 Processor -- start 2026-03-10T08:59:35.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a140b7700 1 -- start start 2026-03-10T08:59:35.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a140b7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c198a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:35.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a140b7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c1043e0 0x7f2a0c198f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a140b7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a0c199580 con 0x7f2a0c1043e0 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a140b7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a0c1996c0 con 0x7f2a0c101ab0 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a11e53700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c198a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a11652700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c1043e0 0x7f2a0c198f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a11e53700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c198a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:33800/0 (socket says 192.168.123.105:33800) 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.898+0000 7f2a11e53700 1 -- 192.168.123.105:0/2506620928 learned_addr learned my addr 192.168.123.105:0/2506620928 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a11e53700 1 -- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c1043e0 msgr2=0x7f2a0c198f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:35.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a11e53700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c1043e0 0x7f2a0c198f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:35.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a11e53700 1 -- 192.168.123.105:0/2506620928 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29fc0097e0 con 0x7f2a0c101ab0 2026-03-10T08:59:35.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a11652700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c1043e0 0x7f2a0c198f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T08:59:35.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a11e53700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c198a20 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f29fc004ce0 tx=0x7f29fc005740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:35.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f29fc01d070 con 0x7f2a0c101ab0 2026-03-10T08:59:35.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a0c19e110 con 0x7f2a0c101ab0 2026-03-10T08:59:35.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a0c19e600 con 0x7f2a0c101ab0 2026-03-10T08:59:35.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f29fc00bc30 con 0x7f2a0c101ab0 2026-03-10T08:59:35.901 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.899+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f29fc00f670 con 0x7f2a0c101ab0 2026-03-10T08:59:35.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.901+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f29fc00f7d0 con 0x7f2a0c101ab0 2026-03-10T08:59:35.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.901+0000 7f2a02ffd700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29f80778c0 0x7f29f8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:35.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.901+0000 7f2a11652700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29f80778c0 0x7f29f8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:35.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.902+0000 7f2a11652700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29f80778c0 0x7f29f8079d80 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f2a08005950 tx=0x7f2a0800b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:35.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.902+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f29fc09b0b0 con 0x7f2a0c101ab0 2026-03-10T08:59:35.903 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.903+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f29f0005320 con 0x7f2a0c101ab0 2026-03-10T08:59:35.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:35.906+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f29fc063850 con 0x7f2a0c101ab0 2026-03-10T08:59:36.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.033+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f29f0000bf0 con 0x7f29f80778c0 2026-03-10T08:59:36.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.034+0000 7f2a02ffd700 1 -- 192.168.123.105:0/2506620928 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f29f0000bf0 con 0x7f29f80778c0 2026-03-10T08:59:36.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29f80778c0 msgr2=0x7f29f8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29f80778c0 0x7f29f8079d80 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f2a08005950 tx=0x7f2a0800b410 comp rx=0 tx=0).stop 2026-03-10T08:59:36.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 
7f2a140b7700 1 -- 192.168.123.105:0/2506620928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 msgr2=0x7f2a0c198a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c198a20 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f29fc004ce0 tx=0x7f29fc005740 comp rx=0 tx=0).stop 2026-03-10T08:59:36.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 shutdown_connections 2026-03-10T08:59:36.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29f80778c0 0x7f29f8079d80 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2a0c101ab0 0x7f2a0c198a20 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 --2- 192.168.123.105:0/2506620928 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2a0c1043e0 0x7f2a0c198f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.036+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 >> 192.168.123.105:0/2506620928 conn(0x7f2a0c0fb3c0 msgr2=0x7f2a0c100130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.037 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.037+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 shutdown_connections 2026-03-10T08:59:36.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.037+0000 7f2a140b7700 1 -- 192.168.123.105:0/2506620928 wait complete. 2026-03-10T08:59:36.047 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.105+0000 7f7648ced700 1 -- 192.168.123.105:0/1528128104 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644102090 msgr2=0x7f7644102510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.105+0000 7f7648ced700 1 --2- 192.168.123.105:0/1528128104 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644102090 0x7f7644102510 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7634009b00 tx=0x7f7634009e10 comp rx=0 tx=0).stop 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.105+0000 7f7648ced700 1 -- 192.168.123.105:0/1528128104 shutdown_connections 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.105+0000 7f7648ced700 1 --2- 192.168.123.105:0/1528128104 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644102090 0x7f7644102510 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.105+0000 7f7648ced700 1 --2- 192.168.123.105:0/1528128104 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7644100f30 0x7f7644101350 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.105+0000 7f7648ced700 1 -- 192.168.123.105:0/1528128104 >> 192.168.123.105:0/1528128104 conn(0x7f76440fc4d0 
msgr2=0x7f76440fe910 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 -- 192.168.123.105:0/1528128104 shutdown_connections 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 -- 192.168.123.105:0/1528128104 wait complete. 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 Processor -- start 2026-03-10T08:59:36.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 -- start start 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 0x7f76441945a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7644102090 0x7f7644194ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7644195100 con 0x7f7644100f30 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f7648ced700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7644195240 con 0x7f7644102090 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.106+0000 7f764259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 0x7f76441945a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f764259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 0x7f76441945a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42496/0 (socket says 192.168.123.105:42496) 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f764259c700 1 -- 192.168.123.105:0/191057301 learned_addr learned my addr 192.168.123.105:0/191057301 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f7641d9b700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7644102090 0x7f7644194ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f764259c700 1 -- 192.168.123.105:0/191057301 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7644102090 msgr2=0x7f7644194ae0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f764259c700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7644102090 0x7f7644194ae0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f764259c700 1 -- 192.168.123.105:0/191057301 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f76340097e0 con 0x7f7644100f30 
2026-03-10T08:59:36.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f764259c700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 0x7f76441945a0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f762c00dc40 tx=0x7f762c00df50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f762c0098e0 con 0x7f7644100f30 2026-03-10T08:59:36.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f762c010460 con 0x7f7644100f30 2026-03-10T08:59:36.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7644199cf0 con 0x7f7644100f30 2026-03-10T08:59:36.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f762c00f5d0 con 0x7f7644100f30 2026-03-10T08:59:36.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.107+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f764419a1e0 con 0x7f7644100f30 2026-03-10T08:59:36.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.109+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f762c0105d0 con 
0x7f7644100f30 2026-03-10T08:59:36.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.109+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7644066e80 con 0x7f7644100f30 2026-03-10T08:59:36.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.109+0000 7f763b7fe700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f763007bdf0 0x7f763007e2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.109+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f762c099920 con 0x7f7644100f30 2026-03-10T08:59:36.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.109+0000 7f7641d9b700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f763007bdf0 0x7f763007e2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.111+0000 7f7641d9b700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f763007bdf0 0x7f763007e2b0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f763400b5c0 tx=0x7f7634005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.112+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f762c0619c0 con 0x7f7644100f30 2026-03-10T08:59:36.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.241+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7644106a70 con 0x7f763007bdf0 2026-03-10T08:59:36.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.242+0000 7f763b7fe700 1 -- 192.168.123.105:0/191057301 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f7644106a70 con 0x7f763007bdf0 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.244+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f763007bdf0 msgr2=0x7f763007e2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.244+0000 7f7648ced700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f763007bdf0 0x7f763007e2b0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f763400b5c0 tx=0x7f7634005fb0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 msgr2=0x7f76441945a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 0x7f76441945a0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f762c00dc40 tx=0x7f762c00df50 comp rx=0 tx=0).stop 
2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 shutdown_connections 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f763007bdf0 0x7f763007e2b0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7644100f30 0x7f76441945a0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 --2- 192.168.123.105:0/191057301 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7644102090 0x7f7644194ae0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 >> 192.168.123.105:0/191057301 conn(0x7f76440fc4d0 msgr2=0x7f7644105350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 shutdown_connections 2026-03-10T08:59:36.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.245+0000 7f7648ced700 1 -- 192.168.123.105:0/191057301 wait complete. 
2026-03-10T08:59:36.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.311+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3695086290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 msgr2=0x7f7e38106710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.311+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3695086290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38106710 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7e28009b00 tx=0x7f7e28009e10 comp rx=0 tx=0).stop 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.311+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3695086290 shutdown_connections 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.311+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3695086290 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38106710 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.311+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3695086290 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 0x7f7e38103de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.311+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3695086290 >> 192.168.123.105:0/3695086290 conn(0x7f7e380fb3c0 msgr2=0x7f7e380fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3695086290 shutdown_connections 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3695086290 
wait complete. 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 Processor -- start 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 -- start start 2026-03-10T08:59:36.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 0x7f7e381989b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38198ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e38199480 con 0x7f7e38104320 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.312+0000 7f7e3dfac700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e381995c0 con 0x7f7e381019f0 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e377fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 0x7f7e381989b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e377fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 0x7f7e381989b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:33832/0 (socket says 192.168.123.105:33832) 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e377fe700 1 -- 192.168.123.105:0/3541323820 learned_addr learned my addr 192.168.123.105:0/3541323820 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e36ffd700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38198ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e36ffd700 1 -- 192.168.123.105:0/3541323820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 msgr2=0x7f7e381989b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e36ffd700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 0x7f7e381989b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e36ffd700 1 -- 192.168.123.105:0/3541323820 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e280097e0 con 0x7f7e38104320 2026-03-10T08:59:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.313+0000 7f7e36ffd700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38198ef0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f7e28000c00 tx=0x7f7e28004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T08:59:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.314+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7e2801d070 con 0x7f7e38104320 2026-03-10T08:59:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.314+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7e3819e020 con 0x7f7e38104320 2026-03-10T08:59:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.314+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7e3819e4e0 con 0x7f7e38104320 2026-03-10T08:59:36.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.314+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7e380fcfd0 con 0x7f7e38104320 2026-03-10T08:59:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.314+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7e2800bc50 con 0x7f7e38104320 2026-03-10T08:59:36.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.314+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7e2800f700 con 0x7f7e38104320 2026-03-10T08:59:36.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.315+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7e2800f960 con 0x7f7e38104320 2026-03-10T08:59:36.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.316+0000 
7f7e34ff9700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7e240778c0 0x7f7e24079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.316+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f7e2809cb80 con 0x7f7e38104320 2026-03-10T08:59:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.318+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7e280653a0 con 0x7f7e38104320 2026-03-10T08:59:36.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.318+0000 7f7e377fe700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7e240778c0 0x7f7e24079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.318+0000 7f7e377fe700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7e240778c0 0x7f7e24079d80 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f7e2000ac80 tx=0x7f7e20005ef0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.436+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7e380611d0 con 0x7f7e240778c0 
2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.442+0000 7f7e34ff9700 1 -- 192.168.123.105:0/3541323820 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7f7e380611d0 con 0x7f7e240778c0 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (8m) 2m ago 8m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (8m) 2m ago 8m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (8m) 2m ago 8m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 2m ago 8m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (2m) 2m ago 8m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (8m) 2m ago 8m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (6m) 2m ago 6m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (6m) 2m ago 6m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (6m) 2m ago 6m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (6m) 2m ago 6m 16.2M - 18.2.1 
5be31c24972a d58d1e2e6ff3 2026-03-10T08:59:36.442 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (3m) 2m ago 9m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (3m) 2m ago 8m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 2m ago 9m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 2m ago 8m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (8m) 2m ago 8m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (8m) 2m ago 8m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 2m ago 7m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (7m) 2m ago 7m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (7m) 2m ago 7m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (7m) 2m ago 7m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (7m) 2m ago 7m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (7m) 2m ago 7m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T08:59:36.443 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 
*:9095 running (3m) 2m ago 8m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T08:59:36.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7e240778c0 msgr2=0x7f7e24079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7e240778c0 0x7f7e24079d80 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f7e2000ac80 tx=0x7f7e20005ef0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 msgr2=0x7f7e38198ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38198ef0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f7e28000c00 tx=0x7f7e28004ab0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 shutdown_connections 2026-03-10T08:59:36.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7e240778c0 0x7f7e24079d80 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 --2- 
192.168.123.105:0/3541323820 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e381019f0 0x7f7e381989b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 --2- 192.168.123.105:0/3541323820 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e38104320 0x7f7e38198ef0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 >> 192.168.123.105:0/3541323820 conn(0x7f7e380fb3c0 msgr2=0x7f7e38104f90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 shutdown_connections 2026-03-10T08:59:36.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.445+0000 7f7e3dfac700 1 -- 192.168.123.105:0/3541323820 wait complete. 
2026-03-10T08:59:36.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.509+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2994725053 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a341042c0 msgr2=0x7f4a341066b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.509+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2994725053 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a341042c0 0x7f4a341066b0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f4a30009a60 tx=0x7f4a30009d70 comp rx=0 tx=0).stop 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2994725053 shutdown_connections 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2994725053 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a341042c0 0x7f4a341066b0 secure :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f4a30009a60 tx=0x7f4a30009d70 comp rx=0 tx=0).stop 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2994725053 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 0x7f4a34103d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2994725053 >> 192.168.123.105:0/2994725053 conn(0x7f4a340fb380 msgr2=0x7f4a340fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2994725053 shutdown_connections 2026-03-10T08:59:36.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 -- 
192.168.123.105:0/2994725053 wait complete. 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.512+0000 7f4a3c0ed700 1 Processor -- start 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a3c0ed700 1 -- start start 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a3c0ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 0x7f4a34071f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a3c0ed700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 0x7f4a340ffce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a3c0ed700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a341a4300 con 0x7f4a34101990 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a3c0ed700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a341a4470 con 0x7f4a340724d0 2026-03-10T08:59:36.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a39e89700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 0x7f4a34071f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a39688700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 0x7f4a340ffce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a39688700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 0x7f4a340ffce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:33840/0 (socket says 192.168.123.105:33840) 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.513+0000 7f4a39688700 1 -- 192.168.123.105:0/2771943958 learned_addr learned my addr 192.168.123.105:0/2771943958 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a39688700 1 -- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 msgr2=0x7f4a34071f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a39688700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 0x7f4a34071f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a39688700 1 -- 192.168.123.105:0/2771943958 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a240097e0 con 0x7f4a340724d0 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a39e89700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 0x7f4a34071f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a39688700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 0x7f4a340ffce0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f4a30003960 tx=0x7f4a3000f690 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a3001d070 con 0x7f4a340724d0 2026-03-10T08:59:36.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a30009710 con 0x7f4a340724d0 2026-03-10T08:59:36.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.514+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a341a4a50 con 0x7f4a340724d0 2026-03-10T08:59:36.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.515+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a3000fc20 con 0x7f4a340724d0 2026-03-10T08:59:36.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.515+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a30017600 con 0x7f4a340724d0 2026-03-10T08:59:36.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.515+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4a30017760 con 
0x7f4a340724d0 2026-03-10T08:59:36.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.516+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a34066e80 con 0x7f4a340724d0 2026-03-10T08:59:36.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.516+0000 7f4a2affd700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4a20077990 0x7f4a20079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.516+0000 7f4a39e89700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4a20077990 0x7f4a20079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.517+0000 7f4a39e89700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4a20077990 0x7f4a20079e50 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4a24005fd0 tx=0x7f4a24009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.517+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f4a3009bc70 con 0x7f4a340724d0 2026-03-10T08:59:36.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.519+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f4a30064490 con 0x7f4a340724d0 2026-03-10T08:59:36.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.683+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f4a34101440 con 0x7f4a340724d0 2026-03-10T08:59:36.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.684+0000 7f4a2affd700 1 -- 192.168.123.105:0/2771943958 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f4a30063be0 con 0x7f4a340724d0 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 
(7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T08:59:36.685 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.686+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4a20077990 msgr2=0x7f4a20079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.686+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4a20077990 0x7f4a20079e50 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4a24005fd0 tx=0x7f4a24009500 comp rx=0 tx=0).stop 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 msgr2=0x7f4a340ffce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 0x7f4a340ffce0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f4a30003960 tx=0x7f4a3000f690 comp rx=0 tx=0).stop 2026-03-10T08:59:36.687 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 shutdown_connections 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4a20077990 0x7f4a20079e50 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a34101990 0x7f4a34071f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 --2- 192.168.123.105:0/2771943958 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a340724d0 0x7f4a340ffce0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 >> 192.168.123.105:0/2771943958 conn(0x7f4a340fb380 msgr2=0x7f4a340fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 shutdown_connections 2026-03-10T08:59:36.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.687+0000 7f4a3c0ed700 1 -- 192.168.123.105:0/2771943958 wait complete. 
2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.755+0000 7f168eece700 1 -- 192.168.123.105:0/2031572540 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1680097fd0 msgr2=0x7f16800983f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.755+0000 7f168eece700 1 --2- 192.168.123.105:0/2031572540 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1680097fd0 0x7f16800983f0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f167c009b50 tx=0x7f167c009e60 comp rx=0 tx=0).stop 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 -- 192.168.123.105:0/2031572540 shutdown_connections 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 --2- 192.168.123.105:0/2031572540 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f16800991d0 0x7f1680099630 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 --2- 192.168.123.105:0/2031572540 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1680097fd0 0x7f16800983f0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 -- 192.168.123.105:0/2031572540 >> 192.168.123.105:0/2031572540 conn(0x7f1680093550 msgr2=0x7f16800959b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 -- 192.168.123.105:0/2031572540 shutdown_connections 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 -- 192.168.123.105:0/2031572540 
wait complete. 2026-03-10T08:59:36.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 Processor -- start 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.756+0000 7f168eece700 1 -- start start 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168eece700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16800991d0 0x7f168012db90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168eece700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 0x7f1680133140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168eece700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f168012e5e0 con 0x7f16800991d0 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168eece700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f168012e750 con 0x7f168012e0d0 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168d6cb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 0x7f1680133140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168d6cb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 0x7f1680133140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:33846/0 (socket says 192.168.123.105:33846) 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168d6cb700 1 -- 192.168.123.105:0/2920673925 learned_addr learned my addr 192.168.123.105:0/2920673925 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168d6cb700 1 -- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16800991d0 msgr2=0x7f168012db90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168decc700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16800991d0 0x7f168012db90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168d6cb700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16800991d0 0x7f168012db90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168d6cb700 1 -- 192.168.123.105:0/2920673925 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f167c0097e0 con 0x7f168012e0d0 2026-03-10T08:59:36.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.757+0000 7f168decc700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16800991d0 0x7f168012db90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:59:36.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.758+0000 7f168d6cb700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 0x7f1680133140 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f168400eb10 tx=0x7f168400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.758+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f168400cca0 con 0x7f168012e0d0 2026-03-10T08:59:36.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.758+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f16801336e0 con 0x7f168012e0d0 2026-03-10T08:59:36.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.758+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f168400ce00 con 0x7f168012e0d0 2026-03-10T08:59:36.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.758+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1680133c30 con 0x7f168012e0d0 2026-03-10T08:59:36.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.759+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f16840105e0 con 0x7f168012e0d0 2026-03-10T08:59:36.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.759+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f1680005800 con 0x7f168012e0d0 2026-03-10T08:59:36.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.759+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1684010780 con 0x7f168012e0d0 2026-03-10T08:59:36.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.760+0000 7f167affd700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1674077870 0x7f1674079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.760+0000 7f168decc700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1674077870 0x7f1674079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.762+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f1684014070 con 0x7f168012e0d0 2026-03-10T08:59:36.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.762+0000 7f168decc700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1674077870 0x7f1674079d30 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f167c0053b0 tx=0x7f167c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.762+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f16840d09f0 con 0x7f168012e0d0 2026-03-10T08:59:36.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.910+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1680134630 con 0x7f168012e0d0 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.910+0000 7f167affd700 1 -- 192.168.123.105:0/2920673925 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f1684062860 con 0x7f168012e0d0 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T08:59:36.911 
INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T08:59:36.911 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T08:59:36.912 
INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:36.912 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T08:59:36.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.913+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1674077870 msgr2=0x7f1674079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.913+0000 7f168eece700 1 --2- 192.168.123.105:0/2920673925 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1674077870 0x7f1674079d30 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f167c0053b0 tx=0x7f167c0058e0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.914+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 msgr2=0x7f1680133140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.914+0000 7f168eece700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 0x7f1680133140 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f168400eb10 tx=0x7f168400eed0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.914+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 shutdown_connections 2026-03-10T08:59:36.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.914+0000 7f168eece700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1674077870 0x7f1674079d30 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.914+0000 7f168eece700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16800991d0 0x7f168012db90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.914+0000 7f168eece700 1 --2- 192.168.123.105:0/2920673925 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f168012e0d0 0x7f1680133140 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:59:36.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.915+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 >> 192.168.123.105:0/2920673925 conn(0x7f1680093550 msgr2=0x7f168009c400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.915+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 shutdown_connections 2026-03-10T08:59:36.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.915+0000 7f168eece700 1 -- 192.168.123.105:0/2920673925 wait complete. 2026-03-10T08:59:36.916 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T08:59:36.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 -- 192.168.123.105:0/3830096206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c103120 msgr2=0x7f349c103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:36.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 --2- 192.168.123.105:0/3830096206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c103120 0x7f349c103540 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f3484009b50 tx=0x7f3484009e60 comp rx=0 tx=0).stop 2026-03-10T08:59:36.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 -- 192.168.123.105:0/3830096206 shutdown_connections 2026-03-10T08:59:36.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 --2- 192.168.123.105:0/3830096206 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c104320 0x7f349c104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 --2- 192.168.123.105:0/3830096206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f349c103120 0x7f349c103540 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:36.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 -- 192.168.123.105:0/3830096206 >> 192.168.123.105:0/3830096206 conn(0x7f349c0fe6c0 msgr2=0x7f349c100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:36.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.983+0000 7f34a296c700 1 -- 192.168.123.105:0/3830096206 shutdown_connections 2026-03-10T08:59:36.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.984+0000 7f34a296c700 1 -- 192.168.123.105:0/3830096206 wait complete. 2026-03-10T08:59:36.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.984+0000 7f34a296c700 1 Processor -- start 2026-03-10T08:59:36.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.984+0000 7f34a296c700 1 -- start start 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f34a296c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c104320 0x7f349c198c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f34a296c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 0x7f349c19e230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f34a296c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f349c199680 con 0x7f349c104320 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f34a296c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f349c1997f0 con 0x7f349c1991d0 
2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349b7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 0x7f349c19e230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349b7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 0x7f349c19e230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:33866/0 (socket says 192.168.123.105:33866) 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349b7fe700 1 -- 192.168.123.105:0/1788877465 learned_addr learned my addr 192.168.123.105:0/1788877465 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349b7fe700 1 -- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c104320 msgr2=0x7f349c198c90 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T08:59:36.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349bfff700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c104320 0x7f349c198c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349b7fe700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c104320 0x7f349c198c90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349b7fe700 1 -- 192.168.123.105:0/1788877465 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34840097e0 con 0x7f349c1991d0 2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.985+0000 7f349bfff700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c104320 0x7f349c198c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.986+0000 7f349b7fe700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 0x7f349c19e230 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f348c00eb10 tx=0x7f348c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.986+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f348c00cca0 con 0x7f349c1991d0 2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.986+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f348c00ce00 con 0x7f349c1991d0 2026-03-10T08:59:36.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.986+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f349c19e7d0 con 0x7f349c1991d0 2026-03-10T08:59:36.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.986+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mon.1 
v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f348c018910 con 0x7f349c1991d0 2026-03-10T08:59:36.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.986+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f349c19ed20 con 0x7f349c1991d0 2026-03-10T08:59:36.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.987+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f349c066e80 con 0x7f349c1991d0 2026-03-10T08:59:36.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.988+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f348c018b50 con 0x7f349c1991d0 2026-03-10T08:59:36.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.988+0000 7f34997fa700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f34880778c0 0x7f3488079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:36.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.988+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f348c014070 con 0x7f349c1991d0 2026-03-10T08:59:36.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.989+0000 7f349bfff700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f34880778c0 0x7f3488079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:36.989 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.989+0000 7f349bfff700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f34880778c0 0x7f3488079d80 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f349c199b70 tx=0x7f34840058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:36.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:36.991+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f348c062730 con 0x7f349c1991d0 2026-03-10T08:59:37.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.114+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f349c108c70 con 0x7f34880778c0 2026-03-10T08:59:37.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.115+0000 7f34997fa700 1 -- 192.168.123.105:0/1788877465 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f349c108c70 con 0x7f34880778c0 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T08:59:37.117 
INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T08:59:37.117 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T08:59:37.120 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f34880778c0 msgr2=0x7f3488079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:37.120 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f34880778c0 0x7f3488079d80 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f349c199b70 tx=0x7f34840058e0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.120 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 msgr2=0x7f349c19e230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:37.120 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 0x7f349c19e230 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f348c00eb10 tx=0x7f348c00eed0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.120 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 -- 
192.168.123.105:0/1788877465 shutdown_connections 2026-03-10T08:59:37.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f34880778c0 0x7f3488079d80 secure :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f349c199b70 tx=0x7f34840058e0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f349c104320 0x7f349c198c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 --2- 192.168.123.105:0/1788877465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f349c1991d0 0x7f349c19e230 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 >> 192.168.123.105:0/1788877465 conn(0x7f349c0fe6c0 msgr2=0x7f349c107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:37.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 shutdown_connections 2026-03-10T08:59:37.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.120+0000 7f34a296c700 1 -- 192.168.123.105:0/1788877465 wait complete. 
2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.190+0000 7f9d70fb0700 1 -- 192.168.123.105:0/2006041729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c075740 msgr2=0x7f9d6c075b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.190+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/2006041729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c075740 0x7f9d6c075b60 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f9d5c009b50 tx=0x7f9d5c009e60 comp rx=0 tx=0).stop 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.191+0000 7f9d70fb0700 1 -- 192.168.123.105:0/2006041729 shutdown_connections 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.191+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/2006041729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c076990 0x7f9d6c076e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.191+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/2006041729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c075740 0x7f9d6c075b60 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.191+0000 7f9d70fb0700 1 -- 192.168.123.105:0/2006041729 >> 192.168.123.105:0/2006041729 conn(0x7f9d6c0fe6c0 msgr2=0x7f9d6c100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.191+0000 7f9d70fb0700 1 -- 192.168.123.105:0/2006041729 shutdown_connections 2026-03-10T08:59:37.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.191+0000 7f9d70fb0700 1 -- 192.168.123.105:0/2006041729 
wait complete. 2026-03-10T08:59:37.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d70fb0700 1 Processor -- start 2026-03-10T08:59:37.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d70fb0700 1 -- start start 2026-03-10T08:59:37.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d70fb0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c076990 0x7f9d6c19d080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d70fb0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 0x7f9d6c1a2630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d70fb0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d6c19dad0 con 0x7f9d6c076990 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d70fb0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d6c19dc40 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d63fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 0x7f9d6c1a2630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d63fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 0x7f9d6c1a2630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:33884/0 (socket says 192.168.123.105:33884) 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.192+0000 7f9d63fff700 1 -- 192.168.123.105:0/905536547 learned_addr learned my addr 192.168.123.105:0/905536547 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d6a59c700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c076990 0x7f9d6c19d080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d63fff700 1 -- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c076990 msgr2=0x7f9d6c19d080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d63fff700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c076990 0x7f9d6c19d080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d63fff700 1 -- 192.168.123.105:0/905536547 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d5c0097e0 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d6a59c700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c076990 0x7f9d6c19d080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T08:59:37.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d63fff700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 0x7f9d6c1a2630 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f9d5400d8d0 tx=0x7f9d5400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:37.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.193+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d54009940 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.194+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d6c1a2bd0 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.194+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d54010460 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.194+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d5400f5d0 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.194+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d6c1a3120 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.195+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f9d6c066e80 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.196+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9d54010a90 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.196+0000 7f9d637fe700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d4c0778c0 0x7f9d4c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T08:59:37.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.196+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6454+0+0 (secure 0 0 0) 0x7f9d54099f80 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.196+0000 7f9d6a59c700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d4c0778c0 0x7f9d4c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T08:59:37.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.197+0000 7f9d6a59c700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d4c0778c0 0x7f9d4c079d80 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f9d5c005950 tx=0x7f9d5c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T08:59:37.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.198+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f9d540626f0 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.355+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9d6c1a34c0 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.355+0000 7f9d637fe700 1 -- 192.168.123.105:0/905536547 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1219 (secure 0 0 0) 0x7f9d54067e60 con 0x7f9d6c19d5c0 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 1010/231 objects degraded (437.229%), 8 pgs degraded, 8 pgs undersized 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1010/231 objects degraded (437.229%), 8 pgs degraded, 8 pgs undersized 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.6 is stuck undersized for 118s, current state active+recovering+undersized+degraded+remapped, last acting [1,4] 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T08:59:37.356 
INFO:teuthology.orchestra.run.vm05.stdout: pg 3.15 is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-10T08:59:37.356 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1f is stuck undersized for 118s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3] 2026-03-10T08:59:37.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d4c0778c0 msgr2=0x7f9d4c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:37.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d4c0778c0 0x7f9d4c079d80 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f9d5c005950 tx=0x7f9d5c0058e0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 msgr2=0x7f9d6c1a2630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T08:59:37.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 0x7f9d6c1a2630 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f9d5400d8d0 tx=0x7f9d5400dc90 comp rx=0 tx=0).stop 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 
shutdown_connections 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d4c0778c0 0x7f9d4c079d80 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d6c076990 0x7f9d6c19d080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.358+0000 7f9d70fb0700 1 --2- 192.168.123.105:0/905536547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d6c19d5c0 0x7f9d6c1a2630 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.359+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 >> 192.168.123.105:0/905536547 conn(0x7f9d6c0fe6c0 msgr2=0x7f9d6c10d380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.359+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 shutdown_connections 2026-03-10T08:59:37.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T08:59:37.359+0000 7f9d70fb0700 1 -- 192.168.123.105:0/905536547 wait complete. 
2026-03-10T08:59:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:37 vm05.local ceph-mon[111630]: pgmap v106: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:37 vm05.local ceph-mon[111630]: from='client.44197 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:37 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2771943958' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:37 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2920673925' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:59:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:37 vm08.local ceph-mon[101330]: pgmap v106: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:37 vm08.local ceph-mon[101330]: from='client.44197 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:37.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:37 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2771943958' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T08:59:37.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:37 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/2920673925' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T08:59:38.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:38 vm05.local ceph-mon[111630]: from='client.34248 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:38.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:38 vm05.local ceph-mon[111630]: from='client.34252 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:38.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:38 vm05.local ceph-mon[111630]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:38.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:38 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/905536547' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:59:38.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:38 vm08.local ceph-mon[101330]: from='client.34248 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:38 vm08.local ceph-mon[101330]: from='client.34252 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:38 vm08.local ceph-mon[101330]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T08:59:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:38 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/905536547' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T08:59:39.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:39 vm08.local ceph-mon[101330]: pgmap v107: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 10 objects/s recovering 2026-03-10T08:59:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:39 vm05.local ceph-mon[111630]: pgmap v107: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 10 objects/s recovering 2026-03-10T08:59:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:41 vm05.local ceph-mon[111630]: pgmap v108: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:41 vm08.local ceph-mon[101330]: pgmap v108: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:43.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:43 vm05.local ceph-mon[111630]: pgmap v109: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:43.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:43 vm08.local ceph-mon[101330]: pgmap v109: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1010/231 objects degraded (437.229%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:44 vm08.local ceph-mon[101330]: pgmap v110: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:44.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:44 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 997/231 objects degraded (431.602%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:44.572 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:44 vm05.local ceph-mon[111630]: pgmap v110: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:44.572 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:44 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 997/231 objects degraded (431.602%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:45 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:45 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T08:59:45.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:45.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:45 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T08:59:45.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:45 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T08:59:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:46 vm08.local ceph-mon[101330]: pgmap v111: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:59:46.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:46 vm05.local ceph-mon[111630]: pgmap v111: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:46 vm05.local 
ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:59:47.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:47 vm08.local ceph-mon[101330]: osdmap e62: 6 total, 6 up, 6 in 2026-03-10T08:59:47.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:47 vm05.local ceph-mon[111630]: osdmap e62: 6 total, 6 up, 6 in 2026-03-10T08:59:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:48 vm08.local ceph-mon[101330]: osdmap e63: 6 total, 6 up, 6 in 2026-03-10T08:59:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:48 vm08.local ceph-mon[101330]: pgmap v114: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 919/231 objects degraded (397.835%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:48.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:48 vm05.local ceph-mon[111630]: osdmap e63: 6 total, 6 up, 6 in 2026-03-10T08:59:48.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:48 vm05.local ceph-mon[111630]: pgmap v114: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 919/231 objects degraded (397.835%); 0 B/s, 7 objects/s recovering 2026-03-10T08:59:50.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:49 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 884/231 objects degraded (382.684%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:50.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:49 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 884/231 objects degraded (382.684%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:51.212 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:50 vm05.local ceph-mon[111630]: pgmap v115: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:50 vm08.local ceph-mon[101330]: pgmap v115: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:53.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:52 vm05.local ceph-mon[111630]: pgmap v116: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:53.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:52 vm08.local ceph-mon[101330]: pgmap v116: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 5 objects/s recovering 2026-03-10T08:59:55.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:54 vm05.local ceph-mon[111630]: pgmap v117: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:54 vm08.local ceph-mon[101330]: pgmap v117: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 11 objects/s recovering 2026-03-10T08:59:57.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:56 vm05.local ceph-mon[111630]: pgmap v118: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:56 vm08.local ceph-mon[101330]: pgmap v118: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 884/231 objects degraded (382.684%); 0 B/s, 9 objects/s recovering 2026-03-10T08:59:58.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:57 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 854/231 objects degraded (369.697%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:58.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:57 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 854/231 objects degraded (369.697%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-10T08:59:59.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:59:58 vm08.local ceph-mon[101330]: pgmap v119: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 854/231 objects degraded (369.697%); 0 B/s, 12 objects/s recovering 2026-03-10T08:59:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:59:58 vm05.local ceph-mon[111630]: pgmap v119: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 854/231 objects degraded (369.697%); 0 B/s, 12 objects/s recovering 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pgmap v120: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 854/231 objects degraded (369.697%); 0 B/s, 10 objects/s recovering 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: osdmap e64: 6 total, 6 up, 6 in 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: Health detail: HEALTH_WARN Degraded data redundancy: 854/231 objects degraded (369.697%), 7 pgs degraded, 7 pgs undersized 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: [WRN] PG_DEGRADED: Degraded data redundancy: 854/231 objects degraded (369.697%), 7 pgs degraded, 7 pgs undersized 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pg 3.1 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pg 3.b is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pg 3.10 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local 
ceph-mon[101330]: pg 3.11 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pg 3.15 is stuck undersized for 2m, current state active+recovering+undersized+degraded+remapped, last acting [3,4] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pg 3.18 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: pg 3.1f is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3] 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T09:00:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pgmap v120: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 854/231 objects degraded (369.697%); 0 B/s, 10 objects/s recovering 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: osdmap e64: 6 total, 6 up, 6 in 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: Health detail: HEALTH_WARN Degraded data redundancy: 854/231 objects degraded (369.697%), 7 pgs degraded, 7 pgs undersized 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: [WRN] PG_DEGRADED: Degraded data redundancy: 854/231 objects degraded (369.697%), 7 pgs degraded, 7 pgs undersized 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.1 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.b is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T09:00:01.463 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.10 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.11 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.15 is stuck undersized for 2m, current state active+recovering+undersized+degraded+remapped, last acting [3,4] 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.18 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: pg 3.1f is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3] 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T09:00:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:00:02.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:01 vm08.local ceph-mon[101330]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T09:00:02.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:01 vm05.local ceph-mon[111630]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T09:00:03.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:02 vm08.local ceph-mon[101330]: pgmap v123: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 854/231 objects degraded (369.697%); 0 B/s, 6 objects/s recovering 2026-03-10T09:00:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:02 vm05.local ceph-mon[111630]: pgmap v123: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 854/231 objects degraded (369.697%); 0 B/s, 6 objects/s recovering 2026-03-10T09:00:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:04 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 756/231 objects degraded (327.273%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:04 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 756/231 objects degraded (327.273%), 6 pgs degraded, 6 pgs 
undersized (PG_DEGRADED) 2026-03-10T09:00:05.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:05 vm08.local ceph-mon[101330]: pgmap v124: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:05.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:05 vm05.local ceph-mon[111630]: pgmap v124: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:07.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:07 vm05.local ceph-mon[111630]: pgmap v125: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.432+0000 7f1738e01700 1 -- 192.168.123.105:0/1877150812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734103140 msgr2=0x7f1734103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.432+0000 7f1738e01700 1 --2- 192.168.123.105:0/1877150812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734103140 0x7f1734103560 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f1724009b00 tx=0x7f1724009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.433+0000 7f1738e01700 1 -- 192.168.123.105:0/1877150812 shutdown_connections 2026-03-10T09:00:07.434 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.433+0000 7f1738e01700 1 --2- 192.168.123.105:0/1877150812 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734104340 0x7f17341047a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.433+0000 7f1738e01700 1 --2- 192.168.123.105:0/1877150812 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734103140 0x7f1734103560 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.433+0000 7f1738e01700 1 -- 192.168.123.105:0/1877150812 >> 192.168.123.105:0/1877150812 conn(0x7f17340fe6c0 msgr2=0x7f1734100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.433+0000 7f1738e01700 1 -- 192.168.123.105:0/1877150812 shutdown_connections 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 -- 192.168.123.105:0/1877150812 wait complete. 
2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 Processor -- start 2026-03-10T09:00:07.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 -- start start 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734103140 0x7f1734078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 0x7f1734079080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17340755a0 con 0x7f1734104340 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.434+0000 7f1738e01700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1734075710 con 0x7f1734103140 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f172bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 0x7f1734079080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f172bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 0x7f1734079080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:45904/0 (socket says 192.168.123.105:45904) 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f172bfff700 1 -- 192.168.123.105:0/1237586906 learned_addr learned my addr 192.168.123.105:0/1237586906 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f173259c700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734103140 0x7f1734078b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f173259c700 1 -- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 msgr2=0x7f1734079080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f173259c700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 0x7f1734079080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f173259c700 1 -- 192.168.123.105:0/1237586906 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17240097e0 con 0x7f1734103140 2026-03-10T09:00:07.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f172bfff700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 0x7f1734079080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T09:00:07.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.435+0000 7f173259c700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734103140 0x7f1734078b40 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f1724000c00 tx=0x7f1724004a00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:07.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.436+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f172401d070 con 0x7f1734103140 2026-03-10T09:00:07.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.436+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1734075990 con 0x7f1734103140 2026-03-10T09:00:07.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.436+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1734075e80 con 0x7f1734103140 2026-03-10T09:00:07.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.436+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f172400bc50 con 0x7f1734103140 2026-03-10T09:00:07.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.436+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f172400f690 con 0x7f1734103140 2026-03-10T09:00:07.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.437+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f172400f7f0 con 
0x7f1734103140 2026-03-10T09:00:07.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.438+0000 7f172b7fe700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f17140778c0 0x7f1714079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.438+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7f172409c240 con 0x7f1734103140 2026-03-10T09:00:07.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.438+0000 7f172bfff700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f17140778c0 0x7f1714079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.438+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1718005320 con 0x7f1734103140 2026-03-10T09:00:07.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.439+0000 7f172bfff700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f17140778c0 0x7f1714079d80 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f171c00f4d0 tx=0x7f171c005f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:07.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.441+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f1724064a20 con 0x7f1734103140 2026-03-10T09:00:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:07 vm08.local ceph-mon[101330]: pgmap v125: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:07.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.570+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1718000bf0 con 0x7f17140778c0 2026-03-10T09:00:07.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.571+0000 7f172b7fe700 1 -- 192.168.123.105:0/1237586906 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f1718000bf0 con 0x7f17140778c0 2026-03-10T09:00:07.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f17140778c0 msgr2=0x7f1714079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f17140778c0 0x7f1714079d80 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f171c00f4d0 tx=0x7f171c005f90 comp rx=0 tx=0).stop 2026-03-10T09:00:07.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734103140 msgr2=0x7f1734078b40 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734103140 0x7f1734078b40 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f1724000c00 tx=0x7f1724004a00 comp rx=0 tx=0).stop 2026-03-10T09:00:07.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 shutdown_connections 2026-03-10T09:00:07.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f17140778c0 0x7f1714079d80 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1734103140 0x7f1734078b40 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.574+0000 7f1738e01700 1 --2- 192.168.123.105:0/1237586906 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1734104340 0x7f1734079080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.575+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 >> 192.168.123.105:0/1237586906 conn(0x7f17340fe6c0 msgr2=0x7f1734107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:07.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.575+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 shutdown_connections 2026-03-10T09:00:07.575 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.575+0000 7f1738e01700 1 -- 192.168.123.105:0/1237586906 wait complete. 2026-03-10T09:00:07.584 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.644+0000 7f69a589f700 1 -- 192.168.123.105:0/2056556111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a0101a90 msgr2=0x7f69a0103e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.644+0000 7f69a589f700 1 --2- 192.168.123.105:0/2056556111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a0101a90 0x7f69a0103e80 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f6988009b00 tx=0x7f6988009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 7f69a589f700 1 -- 192.168.123.105:0/2056556111 shutdown_connections 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 7f69a589f700 1 --2- 192.168.123.105:0/2056556111 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a01043c0 0x7f69a01067b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 7f69a589f700 1 --2- 192.168.123.105:0/2056556111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a0101a90 0x7f69a0103e80 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 7f69a589f700 1 -- 192.168.123.105:0/2056556111 >> 192.168.123.105:0/2056556111 conn(0x7f69a00fb3c0 msgr2=0x7f69a00fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 
7f69a589f700 1 -- 192.168.123.105:0/2056556111 shutdown_connections 2026-03-10T09:00:07.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 7f69a589f700 1 -- 192.168.123.105:0/2056556111 wait complete. 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.645+0000 7f69a589f700 1 Processor -- start 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f69a589f700 1 -- start start 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f69a589f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 0x7f69a01967e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f69a589f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a01043c0 0x7f69a0196d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f69a589f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69a0197340 con 0x7f69a01043c0 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f69a589f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69a0197480 con 0x7f69a0101a90 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699effd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 0x7f69a01967e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699effd700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 0x7f69a01967e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53424/0 (socket says 192.168.123.105:53424) 2026-03-10T09:00:07.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699effd700 1 -- 192.168.123.105:0/2809480478 learned_addr learned my addr 192.168.123.105:0/2809480478 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699e7fc700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a01043c0 0x7f69a0196d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699effd700 1 -- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a01043c0 msgr2=0x7f69a0196d20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699effd700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a01043c0 0x7f69a0196d20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.646+0000 7f699effd700 1 -- 192.168.123.105:0/2809480478 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f69880097e0 con 0x7f69a0101a90 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.647+0000 7f699e7fc700 1 --2- 192.168.123.105:0/2809480478 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a01043c0 0x7f69a0196d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.647+0000 7f699effd700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 0x7f69a01967e0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f6988005230 tx=0x7f69880056c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.647+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f698801d070 con 0x7f69a0101a90 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.647+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f69a019bed0 con 0x7f69a0101a90 2026-03-10T09:00:07.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.647+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f69a019c340 con 0x7f69a0101a90 2026-03-10T09:00:07.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.647+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f698800bc50 con 0x7f69a0101a90 2026-03-10T09:00:07.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.648+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f698800f7c0 con 0x7f69a0101a90 2026-03-10T09:00:07.649 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.648+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6988022470 con 0x7f69a0101a90 2026-03-10T09:00:07.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.648+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f69a0190a70 con 0x7f69a0101a90 2026-03-10T09:00:07.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.649+0000 7f69a489d700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f698c0776b0 0x7f698c079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.649+0000 7f699e7fc700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f698c0776b0 0x7f698c079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.649+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7f698809b1e0 con 0x7f69a0101a90 2026-03-10T09:00:07.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.649+0000 7f699e7fc700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f698c0776b0 0x7f698c079b70 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f6990005fd0 tx=0x7f6990005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:07.652 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.651+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f69880639c0 con 0x7f69a0101a90 2026-03-10T09:00:07.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.775+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f69a002d080 con 0x7f698c0776b0 2026-03-10T09:00:07.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.776+0000 7f69a489d700 1 -- 192.168.123.105:0/2809480478 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f69a002d080 con 0x7f698c0776b0 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f698c0776b0 msgr2=0x7f698c079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f698c0776b0 0x7f698c079b70 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f6990005fd0 tx=0x7f6990005dc0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 msgr2=0x7f69a01967e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 
1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 0x7f69a01967e0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f6988005230 tx=0x7f69880056c0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 shutdown_connections 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f698c0776b0 0x7f698c079b70 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f69a0101a90 0x7f69a01967e0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 --2- 192.168.123.105:0/2809480478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f69a01043c0 0x7f69a0196d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 >> 192.168.123.105:0/2809480478 conn(0x7f69a00fb3c0 msgr2=0x7f69a00fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:07.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 shutdown_connections 2026-03-10T09:00:07.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.779+0000 7f69a589f700 1 -- 192.168.123.105:0/2809480478 wait complete. 
2026-03-10T09:00:07.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 -- 192.168.123.105:0/2154837766 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b40ffe60 msgr2=0x7f95b4100280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 --2- 192.168.123.105:0/2154837766 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b40ffe60 0x7f95b4100280 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f95a8009b00 tx=0x7f95a8009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:07.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 -- 192.168.123.105:0/2154837766 shutdown_connections 2026-03-10T09:00:07.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 --2- 192.168.123.105:0/2154837766 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b4100fc0 0x7f95b4101440 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 --2- 192.168.123.105:0/2154837766 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b40ffe60 0x7f95b4100280 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 -- 192.168.123.105:0/2154837766 >> 192.168.123.105:0/2154837766 conn(0x7f95b40fb3c0 msgr2=0x7f95b40fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 -- 192.168.123.105:0/2154837766 shutdown_connections 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.846+0000 7f95ba539700 1 -- 192.168.123.105:0/2154837766 
wait complete. 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95ba539700 1 Processor -- start 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95ba539700 1 -- start start 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95ba539700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 0x7f95b419ce60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95ba539700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b4100fc0 0x7f95b419d3a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95ba539700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95b419d9c0 con 0x7f95b4100fc0 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95ba539700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95b419db00 con 0x7f95b40ffe60 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95b3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 0x7f95b419ce60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95b3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 0x7f95b419ce60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:53438/0 (socket says 192.168.123.105:53438) 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.847+0000 7f95b3fff700 1 -- 192.168.123.105:0/348056607 learned_addr learned my addr 192.168.123.105:0/348056607 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b37fe700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b4100fc0 0x7f95b419d3a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b3fff700 1 -- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b4100fc0 msgr2=0x7f95b419d3a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b3fff700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b4100fc0 0x7f95b419d3a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b3fff700 1 -- 192.168.123.105:0/348056607 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95a80097e0 con 0x7f95b40ffe60 2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b37fe700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b4100fc0 0x7f95b419d3a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T09:00:07.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b3fff700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 0x7f95b419ce60 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f95a8005850 tx=0x7f95a8004ab0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:07.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95a801d070 con 0x7f95b40ffe60 2026-03-10T09:00:07.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95b41a2550 con 0x7f95b40ffe60 2026-03-10T09:00:07.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.848+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95b41a2a40 con 0x7f95b40ffe60 2026-03-10T09:00:07.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.849+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f95a800bc50 con 0x7f95b40ffe60 2026-03-10T09:00:07.850 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.849+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95a80217e0 con 0x7f95b40ffe60 2026-03-10T09:00:07.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.850+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f95a802b430 con 
0x7f95b40ffe60 2026-03-10T09:00:07.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.850+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95b4066e80 con 0x7f95b40ffe60 2026-03-10T09:00:07.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.850+0000 7f95b17fa700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f959c077870 0x7f959c079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:07.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.850+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7f95a809b140 con 0x7f95b40ffe60 2026-03-10T09:00:07.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.851+0000 7f95b37fe700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f959c077870 0x7f959c079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:07.851 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.851+0000 7f95b37fe700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f959c077870 0x7f959c079d30 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f95a4007900 tx=0x7f95a4008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:07.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.853+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f95a80637f0 con 0x7f95b40ffe60 2026-03-10T09:00:07.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.970+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f95b41059a0 con 0x7f959c077870 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.975+0000 7f95b17fa700 1 -- 192.168.123.105:0/348056607 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7f95b41059a0 con 0x7f959c077870 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (8m) 2m ago 9m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (9m) 2m ago 9m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (8m) 2m ago 8m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 2m ago 9m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (2m) 2m ago 8m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (8m) 2m ago 9m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (7m) 2m ago 7m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T09:00:07.976 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (7m) 2m ago 7m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (7m) 2m ago 7m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (7m) 2m ago 7m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (4m) 2m ago 9m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (3m) 2m ago 8m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (3m) 2m ago 9m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 2m ago 8m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (9m) 2m ago 9m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (8m) 2m ago 8m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 2m ago 8m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (8m) 2m ago 8m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (8m) 2m ago 8m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (7m) 2m ago 
7m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (7m) 2m ago 7m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (7m) 2m ago 7m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T09:00:07.976 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (3m) 2m ago 8m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:00:07.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.978+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f959c077870 msgr2=0x7f959c079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.978+0000 7f95ba539700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f959c077870 0x7f959c079d30 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f95a4007900 tx=0x7f95a4008040 comp rx=0 tx=0).stop 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.978+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 msgr2=0x7f95b419ce60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.978+0000 7f95ba539700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 0x7f95b419ce60 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f95a8005850 tx=0x7f95a8004ab0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 shutdown_connections 2026-03-10T09:00:07.979 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f959c077870 0x7f959c079d30 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95b40ffe60 0x7f95b419ce60 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 --2- 192.168.123.105:0/348056607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95b4100fc0 0x7f95b419d3a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 >> 192.168.123.105:0/348056607 conn(0x7f95b40fb3c0 msgr2=0x7f95b4104280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 shutdown_connections 2026-03-10T09:00:07.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:07.979+0000 7f95ba539700 1 -- 192.168.123.105:0/348056607 wait complete. 
2026-03-10T09:00:08.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.046+0000 7f33e1b80700 1 -- 192.168.123.105:0/869795800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 msgr2=0x7f33dc0ffc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.046+0000 7f33e1b80700 1 --2- 192.168.123.105:0/869795800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc0ffc80 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f33c4009b00 tx=0x7f33c4009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:08.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.050+0000 7f33e1b80700 1 -- 192.168.123.105:0/869795800 shutdown_connections 2026-03-10T09:00:08.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.050+0000 7f33e1b80700 1 --2- 192.168.123.105:0/869795800 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f33dc1001c0 0x7f33dc100640 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.050+0000 7f33e1b80700 1 --2- 192.168.123.105:0/869795800 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc0ffc80 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.050+0000 7f33e1b80700 1 -- 192.168.123.105:0/869795800 >> 192.168.123.105:0/869795800 conn(0x7f33dc0fb3c0 msgr2=0x7f33dc0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.050+0000 7f33e1b80700 1 -- 192.168.123.105:0/869795800 shutdown_connections 2026-03-10T09:00:08.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.050+0000 7f33e1b80700 1 -- 192.168.123.105:0/869795800 wait 
complete. 2026-03-10T09:00:08.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33e1b80700 1 Processor -- start 2026-03-10T09:00:08.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33e1b80700 1 -- start start 2026-03-10T09:00:08.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33e1b80700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc1053e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33e1b80700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f33dc1001c0 0x7f33dc105920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33e1b80700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33dc101e40 con 0x7f33dc0ff860 2026-03-10T09:00:08.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33e1b80700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33dc101fb0 con 0x7f33dc1001c0 2026-03-10T09:00:08.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33db7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc1053e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33db7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc1053e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:45940/0 (socket says 192.168.123.105:45940) 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.051+0000 7f33db7fe700 1 -- 192.168.123.105:0/1937972537 learned_addr learned my addr 192.168.123.105:0/1937972537 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33db7fe700 1 -- 192.168.123.105:0/1937972537 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f33dc1001c0 msgr2=0x7f33dc105920 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33db7fe700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f33dc1001c0 0x7f33dc105920 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33db7fe700 1 -- 192.168.123.105:0/1937972537 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33c40097e0 con 0x7f33dc0ff860 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33db7fe700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc1053e0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f33c4004990 tx=0x7f33c4004a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f33c401d070 con 0x7f33dc0ff860 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33d8ff9700 1 -- 
192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f33c400bd90 con 0x7f33dc0ff860 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f33c400f980 con 0x7f33dc0ff860 2026-03-10T09:00:08.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33dc102230 con 0x7f33dc0ff860 2026-03-10T09:00:08.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.052+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33dc102720 con 0x7f33dc0ff860 2026-03-10T09:00:08.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.055+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f33c400fae0 con 0x7f33dc0ff860 2026-03-10T09:00:08.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.055+0000 7f33d8ff9700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33c8077990 0x7f33c8079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.055+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7f33c409c570 con 0x7f33dc0ff860 2026-03-10T09:00:08.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.056+0000 7f33daffd700 1 --2- 192.168.123.105:0/1937972537 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33c8077990 0x7f33c8079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.057+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f33dc066e80 con 0x7f33dc0ff860 2026-03-10T09:00:08.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.060+0000 7f33daffd700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33c8077990 0x7f33c8079e50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f33cc00a9b0 tx=0x7f33cc005c90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.060+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f33c4064d20 con 0x7f33dc0ff860 2026-03-10T09:00:08.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.219+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f33dc102a00 con 0x7f33dc0ff860 2026-03-10T09:00:08.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.220+0000 7f33d8ff9700 1 -- 192.168.123.105:0/1937972537 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f33c4027070 con 0x7f33dc0ff860 2026-03-10T09:00:08.220 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:00:08.220 
INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:00:08.221 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:00:08.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 -- 
192.168.123.105:0/1937972537 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33c8077990 msgr2=0x7f33c8079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33c8077990 0x7f33c8079e50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f33cc00a9b0 tx=0x7f33cc005c90 comp rx=0 tx=0).stop 2026-03-10T09:00:08.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 msgr2=0x7f33dc1053e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc1053e0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f33c4004990 tx=0x7f33c4004a70 comp rx=0 tx=0).stop 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 shutdown_connections 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33c8077990 0x7f33c8079e50 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f33dc0ff860 0x7f33dc1053e0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.223+0000 7f33e1b80700 1 --2- 192.168.123.105:0/1937972537 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f33dc1001c0 0x7f33dc105920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.224+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 >> 192.168.123.105:0/1937972537 conn(0x7f33dc0fb3c0 msgr2=0x7f33dc107e90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.224+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 shutdown_connections 2026-03-10T09:00:08.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.224+0000 7f33e1b80700 1 -- 192.168.123.105:0/1937972537 wait complete. 2026-03-10T09:00:08.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2557525799 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 msgr2=0x7f5e10101440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2557525799 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10101440 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f5e04009b00 tx=0x7f5e04009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2557525799 shutdown_connections 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2557525799 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10101440 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2557525799 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e100ffe60 0x7f5e10100280 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2557525799 >> 192.168.123.105:0/2557525799 conn(0x7f5e100fb3c0 msgr2=0x7f5e100fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2557525799 shutdown_connections 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.295+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2557525799 wait complete. 2026-03-10T09:00:08.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.296+0000 7f5e16d6d700 1 Processor -- start 2026-03-10T09:00:08.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.299+0000 7f5e16d6d700 1 -- start start 2026-03-10T09:00:08.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.299+0000 7f5e16d6d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e100ffe60 0x7f5e10196880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.299+0000 7f5e16d6d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10196dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.299+0000 7f5e16d6d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e101973e0 con 0x7f5e10100fc0 
2026-03-10T09:00:08.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.299+0000 7f5e16d6d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e10197520 con 0x7f5e100ffe60 2026-03-10T09:00:08.300 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.300+0000 7f5e0ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10196dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.300 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.300+0000 7f5e0ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10196dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45958/0 (socket says 192.168.123.105:45958) 2026-03-10T09:00:08.300 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.300+0000 7f5e0ffff700 1 -- 192.168.123.105:0/2029971338 learned_addr learned my addr 192.168.123.105:0/2029971338 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:08.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0ffff700 1 -- 192.168.123.105:0/2029971338 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e100ffe60 msgr2=0x7f5e10196880 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:00:08.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0ffff700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e100ffe60 0x7f5e10196880 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0ffff700 1 -- 192.168.123.105:0/2029971338 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5e040097e0 con 0x7f5e10100fc0 2026-03-10T09:00:08.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0ffff700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10196dc0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5e040052d0 tx=0x7f5e04004a80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0dffb700 1 -- 192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e0401d070 con 0x7f5e10100fc0 2026-03-10T09:00:08.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0dffb700 1 -- 192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5e04004500 con 0x7f5e10100fc0 2026-03-10T09:00:08.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.301+0000 7f5e0dffb700 1 -- 192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e04022470 con 0x7f5e10100fc0 2026-03-10T09:00:08.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.302+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5e1006a8f0 con 0x7f5e10100fc0 2026-03-10T09:00:08.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.302+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5e1006ad80 con 0x7f5e10100fc0 2026-03-10T09:00:08.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.303+0000 7f5e16d6d700 1 -- 
192.168.123.105:0/2029971338 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5e10066e80 con 0x7f5e10100fc0 2026-03-10T09:00:08.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.304+0000 7f5e0dffb700 1 -- 192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5e0400bc50 con 0x7f5e10100fc0 2026-03-10T09:00:08.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.304+0000 7f5e0dffb700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5e00077870 0x7f5e00079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.304+0000 7f5e0dffb700 1 -- 192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7f5e0409b7e0 con 0x7f5e10100fc0 2026-03-10T09:00:08.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.305+0000 7f5e14b09700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5e00077870 0x7f5e00079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.305+0000 7f5e14b09700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5e00077870 0x7f5e00079d30 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f5dfc009710 tx=0x7f5dfc006c60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.307+0000 7f5e0dffb700 1 -- 
192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5e04063f10 con 0x7f5e10100fc0 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.455+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f5e1006b060 con 0x7f5e10100fc0 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.456+0000 7f5e0dffb700 1 -- 192.168.123.105:0/2029971338 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f5e04063660 con 0x7f5e10100fc0 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:00:08.456 
INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:00:08.456 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 
2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:08.457 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:08.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5e00077870 msgr2=0x7f5e00079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.459 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5e00077870 0x7f5e00079d30 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f5dfc009710 tx=0x7f5dfc006c60 comp rx=0 tx=0).stop 2026-03-10T09:00:08.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 msgr2=0x7f5e10196dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e10100fc0 0x7f5e10196dc0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5e040052d0 tx=0x7f5e04004a80 comp rx=0 tx=0).stop 2026-03-10T09:00:08.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 shutdown_connections 2026-03-10T09:00:08.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5e00077870 0x7f5e00079d30 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e100ffe60 0x7f5e10196880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 --2- 192.168.123.105:0/2029971338 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f5e10100fc0 0x7f5e10196dc0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 >> 192.168.123.105:0/2029971338 conn(0x7f5e100fb3c0 msgr2=0x7f5e100fcf40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 shutdown_connections 2026-03-10T09:00:08.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.459+0000 7f5e16d6d700 1 -- 192.168.123.105:0/2029971338 wait complete. 2026-03-10T09:00:08.460 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.535+0000 7f62c0c28700 1 -- 192.168.123.105:0/3135661428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc103150 msgr2=0x7f62bc103570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.535+0000 7f62c0c28700 1 --2- 192.168.123.105:0/3135661428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc103150 0x7f62bc103570 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f62ac009b00 tx=0x7f62ac009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.535+0000 7f62c0c28700 1 -- 192.168.123.105:0/3135661428 shutdown_connections 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.535+0000 7f62c0c28700 1 --2- 192.168.123.105:0/3135661428 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc104350 0x7f62bc1047b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.535+0000 
7f62c0c28700 1 --2- 192.168.123.105:0/3135661428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc103150 0x7f62bc103570 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.536+0000 7f62c0c28700 1 -- 192.168.123.105:0/3135661428 >> 192.168.123.105:0/3135661428 conn(0x7f62bc0fe6d0 msgr2=0x7f62bc100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.536+0000 7f62c0c28700 1 -- 192.168.123.105:0/3135661428 shutdown_connections 2026-03-10T09:00:08.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.536+0000 7f62c0c28700 1 -- 192.168.123.105:0/3135661428 wait complete. 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.536+0000 7f62c0c28700 1 Processor -- start 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62c0c28700 1 -- start start 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62c0c28700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 0x7f62bc198a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62c0c28700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc104350 0x7f62bc198f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62c0c28700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f62bc199590 con 0x7f62bc104350 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62c0c28700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f62bc1996d0 con 0x7f62bc103150 2026-03-10T09:00:08.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62ba59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 0x7f62bc198a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62ba59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 0x7f62bc198a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53498/0 (socket says 192.168.123.105:53498) 2026-03-10T09:00:08.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62ba59c700 1 -- 192.168.123.105:0/139255843 learned_addr learned my addr 192.168.123.105:0/139255843 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:08.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.537+0000 7f62b1bff700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc104350 0x7f62bc198f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.538+0000 7f62ba59c700 1 -- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc104350 msgr2=0x7f62bc198f70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.538+0000 7f62ba59c700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f62bc104350 0x7f62bc198f70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.538+0000 7f62ba59c700 1 -- 192.168.123.105:0/139255843 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f62ac0097e0 con 0x7f62bc103150 2026-03-10T09:00:08.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.538+0000 7f62ba59c700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 0x7f62bc198a30 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f62ac004990 tx=0x7f62ac0049c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.539+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f62ac01d070 con 0x7f62bc103150 2026-03-10T09:00:08.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.539+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f62bc19e120 con 0x7f62bc103150 2026-03-10T09:00:08.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.539+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f62ac00bc50 con 0x7f62bc103150 2026-03-10T09:00:08.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.539+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f62ac00f810 con 0x7f62bc103150 2026-03-10T09:00:08.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.539+0000 7f62c0c28700 1 -- 
192.168.123.105:0/139255843 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f62bc19e690 con 0x7f62bc103150 2026-03-10T09:00:08.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.541+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f62ac00f970 con 0x7f62bc103150 2026-03-10T09:00:08.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.541+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f62bc066e80 con 0x7f62bc103150 2026-03-10T09:00:08.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.541+0000 7f62b3fff700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f62a8077870 0x7f62a8079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.541+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7f62ac09b5a0 con 0x7f62bc103150 2026-03-10T09:00:08.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.542+0000 7f62b1bff700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f62a8077870 0x7f62a8079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.542+0000 7f62b1bff700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f62a8077870 0x7f62a8079d30 secure :-1 s=READY pgs=68 
cs=0 l=1 rev1=1 crypto rx=0x7f62a4009c00 tx=0x7f62a4009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.546+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f62ac064bd0 con 0x7f62bc103150 2026-03-10T09:00:08.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.670+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f62bc108ca0 con 0x7f62a8077870 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.675+0000 7f62b3fff700 1 -- 192.168.123.105:0/139255843 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f62bc108ca0 con 0x7f62a8077870 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:00:08.676 
INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:00:08.676 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:00:08.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.678+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f62a8077870 msgr2=0x7f62a8079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.678+0000 7f62c0c28700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f62a8077870 0x7f62a8079d30 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f62a4009c00 tx=0x7f62a4009380 comp rx=0 tx=0).stop 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.678+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 msgr2=0x7f62bc198a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.678+0000 7f62c0c28700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 0x7f62bc198a30 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f62ac004990 tx=0x7f62ac0049c0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 shutdown_connections 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 --2- 192.168.123.105:0/139255843 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f62a8077870 0x7f62a8079d30 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f62bc103150 0x7f62bc198a30 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 --2- 192.168.123.105:0/139255843 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f62bc104350 0x7f62bc198f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 >> 192.168.123.105:0/139255843 conn(0x7f62bc0fe6d0 msgr2=0x7f62bc107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 shutdown_connections 2026-03-10T09:00:08.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.679+0000 7f62c0c28700 1 -- 192.168.123.105:0/139255843 wait complete. 
2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.749+0000 7fcb0de04700 1 -- 192.168.123.105:0/575984104 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08101710 msgr2=0x7fcb08103b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.749+0000 7fcb0de04700 1 --2- 192.168.123.105:0/575984104 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08101710 0x7fcb08103b00 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fcaf8009b00 tx=0x7fcaf8009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.749+0000 7fcb0de04700 1 -- 192.168.123.105:0/575984104 shutdown_connections 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.749+0000 7fcb0de04700 1 --2- 192.168.123.105:0/575984104 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08104040 0x7fcb08106430 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.749+0000 7fcb0de04700 1 --2- 192.168.123.105:0/575984104 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08101710 0x7fcb08103b00 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.749+0000 7fcb0de04700 1 -- 192.168.123.105:0/575984104 >> 192.168.123.105:0/575984104 conn(0x7fcb080fb0e0 msgr2=0x7fcb080fd560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 -- 192.168.123.105:0/575984104 shutdown_connections 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 -- 192.168.123.105:0/575984104 wait 
complete. 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 Processor -- start 2026-03-10T09:00:08.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 -- start start 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 0x7fcb08198790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08104040 0x7fcb08198cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb081992f0 con 0x7fcb08104040 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.750+0000 7fcb0de04700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb08199430 con 0x7fcb08101710 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb0ce02700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 0x7fcb08198790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb0ce02700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 0x7fcb08198790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I 
am v2:192.168.123.105:53514/0 (socket says 192.168.123.105:53514) 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb0ce02700 1 -- 192.168.123.105:0/1391308501 learned_addr learned my addr 192.168.123.105:0/1391308501 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb07fff700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08104040 0x7fcb08198cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb0ce02700 1 -- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08104040 msgr2=0x7fcb08198cd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb0ce02700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08104040 0x7fcb08198cd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb0ce02700 1 -- 192.168.123.105:0/1391308501 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcaf80097e0 con 0x7fcb08101710 2026-03-10T09:00:08.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.751+0000 7fcb07fff700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08104040 0x7fcb08198cd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T09:00:08.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.752+0000 7fcb0ce02700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 0x7fcb08198790 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fcaf800b5c0 tx=0x7fcaf8005090 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.752+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcaf801d070 con 0x7fcb08101710 2026-03-10T09:00:08.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.752+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb0819de80 con 0x7fcb08101710 2026-03-10T09:00:08.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.752+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb0819e2f0 con 0x7fcb08101710 2026-03-10T09:00:08.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.752+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcaf800bcd0 con 0x7fcb08101710 2026-03-10T09:00:08.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.752+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcaf80217d0 con 0x7fcb08101710 2026-03-10T09:00:08.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.754+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcaf800f460 con 
0x7fcb08101710 2026-03-10T09:00:08.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.754+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcaf4005320 con 0x7fcb08101710 2026-03-10T09:00:08.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.754+0000 7fcb05ffb700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcaf0077870 0x7fcaf0079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:08.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.754+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6396+0+0 (secure 0 0 0) 0x7fcaf809b070 con 0x7fcb08101710 2026-03-10T09:00:08.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.754+0000 7fcb07fff700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcaf0077870 0x7fcaf0079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:08.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.755+0000 7fcb07fff700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcaf0077870 0x7fcaf0079d30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fcafc009a20 tx=0x7fcafc008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:08.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.757+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7fcaf80637a0 con 0x7fcb08101710 2026-03-10T09:00:08.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.918+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fcaf4005190 con 0x7fcb08101710 2026-03-10T09:00:08.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.919+0000 7fcb05ffb700 1 -- 192.168.123.105:0/1391308501 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+956 (secure 0 0 0) 0x7fcaf8062ef0 con 0x7fcb08101710 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 756/231 objects degraded (327.273%), 6 pgs degraded, 6 pgs undersized 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 756/231 objects degraded (327.273%), 6 pgs degraded, 6 pgs undersized 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T09:00:08.920 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-10T09:00:08.920 
INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1f is stuck undersized for 2m, current state active+recovering+undersized+degraded+remapped, last acting [2,3] 2026-03-10T09:00:08.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcaf0077870 msgr2=0x7fcaf0079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcaf0077870 0x7fcaf0079d30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fcafc009a20 tx=0x7fcafc008040 comp rx=0 tx=0).stop 2026-03-10T09:00:08.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 msgr2=0x7fcb08198790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:08.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 0x7fcb08198790 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fcaf800b5c0 tx=0x7fcaf8005090 comp rx=0 tx=0).stop 2026-03-10T09:00:08.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 shutdown_connections 2026-03-10T09:00:08.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcaf0077870 0x7fcaf0079d30 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.923 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcb08101710 0x7fcb08198790 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 --2- 192.168.123.105:0/1391308501 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcb08104040 0x7fcb08198cd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:08.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.922+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 >> 192.168.123.105:0/1391308501 conn(0x7fcb080fb0e0 msgr2=0x7fcb080ffd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:08.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.923+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 shutdown_connections 2026-03-10T09:00:08.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:08.923+0000 7fcb0de04700 1 -- 192.168.123.105:0/1391308501 wait complete. 
2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: from='client.44221 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: from='client.44223 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: pgmap v126: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: from='client.44225 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1937972537' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2029971338' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:00:09.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:09 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1391308501' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: from='client.44221 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: from='client.44223 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: pgmap v126: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: from='client.44225 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1937972537' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2029971338' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:00:09.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:09 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/1391308501' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:00:10.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:10 vm05.local ceph-mon[111630]: from='client.44227 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:10.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:10 vm08.local ceph-mon[101330]: from='client.44227 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:11 vm05.local ceph-mon[111630]: pgmap v127: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:11.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:11 vm08.local ceph-mon[101330]: pgmap v127: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:13 vm05.local ceph-mon[111630]: pgmap v128: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 objects degraded (327.273%); 0 B/s, 8 objects/s recovering 2026-03-10T09:00:13.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:13 vm08.local ceph-mon[101330]: pgmap v128: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 756/231 
objects degraded (327.273%); 0 B/s, 8 objects/s recovering 2026-03-10T09:00:14.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:14 vm05.local ceph-mon[111630]: osdmap e66: 6 total, 6 up, 6 in 2026-03-10T09:00:14.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:14 vm08.local ceph-mon[101330]: osdmap e66: 6 total, 6 up, 6 in 2026-03-10T09:00:15.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:15 vm05.local ceph-mon[111630]: pgmap v130: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 713/231 objects degraded (308.658%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:15.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:15 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 713/231 objects degraded (308.658%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:15.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:15 vm05.local ceph-mon[111630]: osdmap e67: 6 total, 6 up, 6 in 2026-03-10T09:00:15.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:15.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:15 vm08.local ceph-mon[101330]: pgmap v130: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 713/231 objects degraded (308.658%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:15.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:15 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 713/231 objects degraded (308.658%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:15.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:15 vm08.local ceph-mon[101330]: osdmap e67: 6 total, 6 up, 6 in 2026-03-10T09:00:15.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:15 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:16 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:16 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T09:00:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:00:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:16 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:16 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T09:00:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:00:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:17 vm05.local ceph-mon[111630]: pgmap v132: 65 pgs: 1 peering, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 706/231 objects degraded (305.628%); 0 B/s, 6 objects/s recovering 2026-03-10T09:00:17.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:17 vm08.local ceph-mon[101330]: pgmap v132: 65 pgs: 1 peering, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 706/231 objects degraded (305.628%); 0 B/s, 6 objects/s recovering 2026-03-10T09:00:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:19 vm05.local ceph-mon[111630]: pgmap v133: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 4 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 629/231 objects degraded (272.294%); 0 B/s, 12 objects/s recovering 2026-03-10T09:00:19.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:19 vm08.local ceph-mon[101330]: pgmap v133: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 4 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 629/231 objects degraded (272.294%); 0 B/s, 12 objects/s recovering 2026-03-10T09:00:20.462 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:20 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 629/231 objects degraded (272.294%), 5 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:20.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:20 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 629/231 objects degraded (272.294%), 5 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:21 vm05.local ceph-mon[111630]: pgmap v134: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 629/231 objects degraded (272.294%); 0 B/s, 12 objects/s recovering 2026-03-10T09:00:21.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:21 vm08.local ceph-mon[101330]: pgmap v134: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 629/231 objects degraded (272.294%); 0 B/s, 12 objects/s recovering 2026-03-10T09:00:23.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:23 vm05.local ceph-mon[111630]: pgmap v135: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 629/231 objects degraded (272.294%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:23.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:23 vm08.local ceph-mon[101330]: pgmap v135: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 629/231 objects degraded (272.294%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:24.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:24 vm08.local ceph-mon[101330]: pgmap v136: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 620/231 objects degraded (268.398%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:24.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:24 vm05.local ceph-mon[111630]: pgmap v136: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 620/231 objects degraded (268.398%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:25.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:25 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 620/231 objects degraded (268.398%), 5 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:25.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:25 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 620/231 objects degraded (268.398%), 5 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:26.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:26 vm05.local ceph-mon[111630]: pgmap v137: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 620/231 objects degraded (268.398%); 0 B/s, 7 objects/s recovering 2026-03-10T09:00:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:26 vm08.local ceph-mon[101330]: pgmap v137: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 620/231 objects degraded (268.398%); 0 B/s, 7 objects/s recovering 2026-03-10T09:00:28.712 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:28 vm05.local ceph-mon[111630]: osdmap e68: 6 total, 6 up, 6 in 2026-03-10T09:00:28.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:28 vm05.local ceph-mon[111630]: pgmap v139: 65 pgs: 1 active+undersized+remapped, 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:28.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:28 vm08.local ceph-mon[101330]: osdmap e68: 6 total, 6 up, 6 in 2026-03-10T09:00:28.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:28 vm08.local ceph-mon[101330]: pgmap v139: 65 pgs: 1 active+undersized+remapped, 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:29.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:29 vm08.local ceph-mon[101330]: osdmap e69: 6 total, 6 up, 6 in 2026-03-10T09:00:29.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:29 vm05.local ceph-mon[111630]: osdmap e69: 6 total, 6 up, 6 in 2026-03-10T09:00:30.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:30 vm05.local ceph-mon[111630]: pgmap v141: 65 pgs: 1 active+undersized+remapped, 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:30 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 491/231 objects degraded (212.554%), 4 pgs degraded, 5 pgs undersized (PG_DEGRADED) 
2026-03-10T09:00:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:00:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:00:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:00:30.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:30 vm08.local ceph-mon[101330]: pgmap v141: 65 pgs: 1 active+undersized+remapped, 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:30 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 491/231 objects degraded (212.554%), 4 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:00:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:00:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:00:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:00:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:00:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T09:00:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T09:00:31.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:00:32.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:32 vm05.local ceph-mon[111630]: pgmap v142: 65 pgs: 1 active+undersized+remapped, 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:32.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:32 vm08.local ceph-mon[101330]: pgmap v142: 65 pgs: 1 active+undersized+remapped, 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:35.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:34 vm05.local ceph-mon[111630]: pgmap v143: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 
active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:35.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:34 vm08.local ceph-mon[101330]: pgmap v143: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 11 objects/s recovering 2026-03-10T09:00:36.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:35 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 491/231 objects degraded (212.554%), 4 pgs degraded, 4 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:36.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:35 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 491/231 objects degraded (212.554%), 4 pgs degraded, 4 pgs undersized (PG_DEGRADED) 2026-03-10T09:00:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:36 vm05.local ceph-mon[111630]: pgmap v144: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:37.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:36 vm08.local ceph-mon[101330]: pgmap v144: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 491/231 objects degraded (212.554%); 0 B/s, 5 objects/s recovering 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.997+0000 7ff0063c1700 1 -- 192.168.123.105:0/326403510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7ff000104060 msgr2=0x7ff0001044e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.997+0000 7ff0063c1700 1 --2- 192.168.123.105:0/326403510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000104060 0x7ff0001044e0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7feffc009b00 tx=0x7feffc009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.998+0000 7ff0063c1700 1 -- 192.168.123.105:0/326403510 shutdown_connections 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.998+0000 7ff0063c1700 1 --2- 192.168.123.105:0/326403510 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000104060 0x7ff0001044e0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.998+0000 7ff0063c1700 1 --2- 192.168.123.105:0/326403510 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff000102e70 0x7ff000103290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.998+0000 7ff0063c1700 1 -- 192.168.123.105:0/326403510 >> 192.168.123.105:0/326403510 conn(0x7ff0000fe440 msgr2=0x7ff0001008a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.999+0000 7ff0063c1700 1 -- 192.168.123.105:0/326403510 shutdown_connections 2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.999+0000 7ff0063c1700 1 -- 192.168.123.105:0/326403510 wait complete. 
2026-03-10T09:00:38.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.999+0000 7ff0063c1700 1 Processor -- start 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:38.999+0000 7ff0063c1700 1 -- start start 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0063c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 0x7ff000198860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0063c1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff000104060 0x7ff000198da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0063c1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0001993c0 con 0x7ff000102e70 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0063c1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff000199500 con 0x7ff000104060 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0053bf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 0x7ff000198860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0053bf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 0x7ff000198860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:37716/0 (socket says 192.168.123.105:37716) 2026-03-10T09:00:39.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff0053bf700 1 -- 192.168.123.105:0/3877298514 learned_addr learned my addr 192.168.123.105:0/3877298514 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:39.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.000+0000 7ff004bbe700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff000104060 0x7ff000198da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7ff0053bf700 1 -- 192.168.123.105:0/3877298514 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff000104060 msgr2=0x7ff000198da0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7ff0053bf700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff000104060 0x7ff000198da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7ff0053bf700 1 -- 192.168.123.105:0/3877298514 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feffc0097e0 con 0x7ff000102e70 2026-03-10T09:00:39.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7ff0053bf700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 0x7ff000198860 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7feff000b700 tx=0x7feff000ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:00:39.003 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feff0011840 con 0x7ff000102e70 2026-03-10T09:00:39.003 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feff0011e80 con 0x7ff000102e70 2026-03-10T09:00:39.003 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feff000f550 con 0x7ff000102e70 2026-03-10T09:00:39.003 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff00019dfb0 con 0x7ff000102e70 2026-03-10T09:00:39.003 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.001+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff00019e470 con 0x7ff000102e70 2026-03-10T09:00:39.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.003+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feff000f6b0 con 0x7ff000102e70 2026-03-10T09:00:39.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.003+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff00004ea90 con 0x7ff000102e70 2026-03-10T09:00:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.006+0000 
7feff67fc700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fefec077870 0x7fefec079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.006+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7feff00997e0 con 0x7ff000102e70 2026-03-10T09:00:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.006+0000 7ff004bbe700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fefec077870 0x7fefec079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.007+0000 7ff004bbe700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fefec077870 0x7fefec079d30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7feffc0052d0 tx=0x7feffc00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.007+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feff0062fd0 con 0x7ff000102e70 2026-03-10T09:00:39.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.139+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff000108a40 con 0x7fefec077870 
2026-03-10T09:00:39.141 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.141+0000 7feff67fc700 1 -- 192.168.123.105:0/3877298514 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff000108a40 con 0x7fefec077870 2026-03-10T09:00:39.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fefec077870 msgr2=0x7fefec079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fefec077870 0x7fefec079d30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7feffc0052d0 tx=0x7feffc00b540 comp rx=0 tx=0).stop 2026-03-10T09:00:39.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 msgr2=0x7ff000198860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 0x7ff000198860 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7feff000b700 tx=0x7feff000ba10 comp rx=0 tx=0).stop 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 shutdown_connections 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7fefec077870 0x7fefec079d30 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff000102e70 0x7ff000198860 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 --2- 192.168.123.105:0/3877298514 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff000104060 0x7ff000198da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.144+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 >> 192.168.123.105:0/3877298514 conn(0x7ff0000fe440 msgr2=0x7ff000107320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.145+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 shutdown_connections 2026-03-10T09:00:39.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.145+0000 7ff0063c1700 1 -- 192.168.123.105:0/3877298514 wait complete. 
2026-03-10T09:00:39.154 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:38 vm05.local ceph-mon[111630]: pgmap v145: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 476/231 objects degraded (206.061%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:39.154 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:00:39.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.222+0000 7fca7417f700 1 -- 192.168.123.105:0/1754311029 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 msgr2=0x7fca6c1035a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.222+0000 7fca7417f700 1 --2- 192.168.123.105:0/1754311029 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c1035a0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fca5c009b00 tx=0x7fca5c009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:39.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.222+0000 7fca7417f700 1 -- 192.168.123.105:0/1754311029 shutdown_connections 2026-03-10T09:00:39.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.222+0000 7fca7417f700 1 --2- 192.168.123.105:0/1754311029 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 0x7fca6c1047e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.222+0000 7fca7417f700 1 --2- 192.168.123.105:0/1754311029 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c1035a0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.222+0000 7fca7417f700 1 -- 
192.168.123.105:0/1754311029 >> 192.168.123.105:0/1754311029 conn(0x7fca6c0fe720 msgr2=0x7fca6c100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.223+0000 7fca7417f700 1 -- 192.168.123.105:0/1754311029 shutdown_connections 2026-03-10T09:00:39.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.223+0000 7fca7417f700 1 -- 192.168.123.105:0/1754311029 wait complete. 2026-03-10T09:00:39.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.223+0000 7fca7417f700 1 Processor -- start 2026-03-10T09:00:39.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.223+0000 7fca7417f700 1 -- start start 2026-03-10T09:00:39.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.224+0000 7fca7417f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c198a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.224+0000 7fca7417f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 0x7fca6c198f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.224+0000 7fca7417f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca6c199590 con 0x7fca6c103180 2026-03-10T09:00:39.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.224+0000 7fca7417f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca6c1996d0 con 0x7fca6c104380 2026-03-10T09:00:39.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca71f1b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c198a30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca7171a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 0x7fca6c198f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca7171a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 0x7fca6c198f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:47874/0 (socket says 192.168.123.105:47874) 2026-03-10T09:00:39.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca7171a700 1 -- 192.168.123.105:0/4265303550 learned_addr learned my addr 192.168.123.105:0/4265303550 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:39.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca71f1b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c198a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:37732/0 (socket says 192.168.123.105:37732) 2026-03-10T09:00:39.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca7171a700 1 -- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 msgr2=0x7fca6c198a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca7171a700 1 --2- 192.168.123.105:0/4265303550 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c198a30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.225+0000 7fca7171a700 1 -- 192.168.123.105:0/4265303550 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca5c0097e0 con 0x7fca6c104380 2026-03-10T09:00:39.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.226+0000 7fca7171a700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 0x7fca6c198f70 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fca6800eb10 tx=0x7fca6800ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.226+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca6800cc40 con 0x7fca6c104380 2026-03-10T09:00:39.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.227+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca6c19e180 con 0x7fca6c104380 2026-03-10T09:00:39.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.227+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca6c19e6d0 con 0x7fca6c104380 2026-03-10T09:00:39.228 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.227+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca6800cda0 con 0x7fca6c104380 2026-03-10T09:00:39.228 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.227+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca68018810 con 0x7fca6c104380 2026-03-10T09:00:39.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.228+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fca68018aa0 con 0x7fca6c104380 2026-03-10T09:00:39.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.229+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca6c066e80 con 0x7fca6c104380 2026-03-10T09:00:39.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.229+0000 7fca62ffd700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca580778e0 0x7fca58079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.229+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7fca68014070 con 0x7fca6c104380 2026-03-10T09:00:39.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.229+0000 7fca71f1b700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca580778e0 0x7fca58079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.230+0000 7fca71f1b700 1 --2- 192.168.123.105:0/4265303550 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca580778e0 0x7fca58079da0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fca5c00b5c0 tx=0x7fca5c01a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.232+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fca68062bd0 con 0x7fca6c104380 2026-03-10T09:00:39.253 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:38 vm08.local ceph-mon[101330]: pgmap v145: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 476/231 objects degraded (206.061%); 0 B/s, 9 objects/s recovering 2026-03-10T09:00:39.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.357+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fca6c108cd0 con 0x7fca580778e0 2026-03-10T09:00:39.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.358+0000 7fca62ffd700 1 -- 192.168.123.105:0/4265303550 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fca6c108cd0 con 0x7fca580778e0 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca580778e0 msgr2=0x7fca58079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.361 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca580778e0 0x7fca58079da0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fca5c00b5c0 tx=0x7fca5c01a040 comp rx=0 tx=0).stop 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 msgr2=0x7fca6c198f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca6c104380 0x7fca6c198f70 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fca6800eb10 tx=0x7fca6800ee20 comp rx=0 tx=0).stop 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 shutdown_connections 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca580778e0 0x7fca58079da0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6c103180 0x7fca6c198a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 --2- 192.168.123.105:0/4265303550 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fca6c104380 0x7fca6c198f70 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 >> 192.168.123.105:0/4265303550 conn(0x7fca6c0fe720 msgr2=0x7fca6c1075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 shutdown_connections 2026-03-10T09:00:39.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.361+0000 7fca7417f700 1 -- 192.168.123.105:0/4265303550 wait complete. 2026-03-10T09:00:39.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.430+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/1049932789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e00ff860 msgr2=0x7fa7e00ffc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.430+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/1049932789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e00ff860 0x7fa7e00ffc80 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fa7c8009b50 tx=0x7fa7c8009e60 comp rx=0 tx=0).stop 2026-03-10T09:00:39.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.433+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/1049932789 shutdown_connections 2026-03-10T09:00:39.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.433+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/1049932789 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e01001c0 0x7fa7e0100640 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.433+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/1049932789 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e00ff860 0x7fa7e00ffc80 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.433+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/1049932789 >> 192.168.123.105:0/1049932789 conn(0x7fa7e00fb3c0 msgr2=0x7fa7e00fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.433+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/1049932789 shutdown_connections 2026-03-10T09:00:39.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.433+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/1049932789 wait complete. 2026-03-10T09:00:39.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7e5ff0700 1 Processor -- start 2026-03-10T09:00:39.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7e5ff0700 1 -- start start 2026-03-10T09:00:39.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7e5ff0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 0x7fa7e0198a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7e5ff0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e01001c0 0x7fa7e0198f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7e5ff0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7e01995b0 con 0x7fa7e01001c0 2026-03-10T09:00:39.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7e5ff0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fa7e01996f0 con 0x7fa7e00ff860 2026-03-10T09:00:39.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7df7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 0x7fa7e0198a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7df7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 0x7fa7e0198a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:47884/0 (socket says 192.168.123.105:47884) 2026-03-10T09:00:39.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7df7fe700 1 -- 192.168.123.105:0/3003147430 learned_addr learned my addr 192.168.123.105:0/3003147430 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:39.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7df7fe700 1 -- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e01001c0 msgr2=0x7fa7e0198f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.434+0000 7fa7deffd700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e01001c0 0x7fa7e0198f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7df7fe700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e01001c0 0x7fa7e0198f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7df7fe700 1 -- 192.168.123.105:0/3003147430 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa7c80097e0 con 0x7fa7e00ff860 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7df7fe700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 0x7fa7e0198a50 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fa7c8004ce0 tx=0x7fa7c80057f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7c801d070 con 0x7fa7e00ff860 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa7e019e140 con 0x7fa7e00ff860 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa7e0101a70 con 0x7fa7e00ff860 2026-03-10T09:00:39.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa7c800bc30 con 0x7fa7e00ff860 2026-03-10T09:00:39.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.435+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7c800f700 con 0x7fa7e00ff860 2026-03-10T09:00:39.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.436+0000 7fa7deffd700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e01001c0 0x7fa7e0198f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:00:39.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.436+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa7c8022a50 con 0x7fa7e00ff860 2026-03-10T09:00:39.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.437+0000 7fa7dcff9700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa7cc0778c0 0x7fa7cc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.437+0000 7fa7deffd700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa7cc0778c0 0x7fa7cc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.437+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7fa7c809b1e0 con 0x7fa7e00ff860 2026-03-10T09:00:39.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.437+0000 7fa7deffd700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa7cc0778c0 0x7fa7cc079d80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto 
rx=0x7fa7d0005fd0 tx=0x7fa7d0005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.438 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.438+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa7c0005320 con 0x7fa7e00ff860 2026-03-10T09:00:39.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.441+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa7c806a080 con 0x7fa7e00ff860 2026-03-10T09:00:39.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.565+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa7c0000bf0 con 0x7fa7cc0778c0 2026-03-10T09:00:39.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.570+0000 7fa7dcff9700 1 -- 192.168.123.105:0/3003147430 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7fa7c0000bf0 con 0x7fa7cc0778c0 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (9m) 3m ago 9m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (9m) 3m ago 9m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (9m) 3m ago 9m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 
2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 3m ago 9m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 3m ago 9m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (9m) 3m ago 9m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (7m) 3m ago 7m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (7m) 3m ago 7m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (7m) 3m ago 7m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (7m) 3m ago 7m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (4m) 3m ago 10m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (4m) 3m ago 9m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 3m ago 10m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 3m ago 9m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (9m) 3m ago 9m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:00:39.571 
INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (9m) 3m ago 9m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 3m ago 8m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (8m) 3m ago 8m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (8m) 3m ago 8m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (8m) 3m ago 8m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (8m) 3m ago 8m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (8m) 3m ago 8m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T09:00:39.571 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (4m) 3m ago 9m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa7cc0778c0 msgr2=0x7fa7cc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa7cc0778c0 0x7fa7cc079d80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fa7d0005fd0 tx=0x7fa7d0005ee0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 msgr2=0x7fa7e0198a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 0x7fa7e0198a50 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fa7c8004ce0 tx=0x7fa7c80057f0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 shutdown_connections 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa7cc0778c0 0x7fa7cc079d80 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7e00ff860 0x7fa7e0198a50 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 --2- 192.168.123.105:0/3003147430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7e01001c0 0x7fa7e0198f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 >> 192.168.123.105:0/3003147430 conn(0x7fa7e00fb3c0 msgr2=0x7fa7e0107e90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 -- 
192.168.123.105:0/3003147430 shutdown_connections 2026-03-10T09:00:39.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.573+0000 7fa7e5ff0700 1 -- 192.168.123.105:0/3003147430 wait complete. 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.639+0000 7f5a2452f700 1 -- 192.168.123.105:0/2440761754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c104320 msgr2=0x7f5a1c104780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.639+0000 7f5a2452f700 1 --2- 192.168.123.105:0/2440761754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c104320 0x7f5a1c104780 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f5a18009b00 tx=0x7f5a18009e10 comp rx=0 tx=0).stop 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 7f5a2452f700 1 -- 192.168.123.105:0/2440761754 shutdown_connections 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 7f5a2452f700 1 --2- 192.168.123.105:0/2440761754 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c104320 0x7f5a1c104780 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 7f5a2452f700 1 --2- 192.168.123.105:0/2440761754 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c103120 0x7f5a1c103540 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 7f5a2452f700 1 -- 192.168.123.105:0/2440761754 >> 192.168.123.105:0/2440761754 conn(0x7f5a1c0fe6c0 msgr2=0x7f5a1c100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 
7f5a2452f700 1 -- 192.168.123.105:0/2440761754 shutdown_connections 2026-03-10T09:00:39.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 7f5a2452f700 1 -- 192.168.123.105:0/2440761754 wait complete. 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.640+0000 7f5a2452f700 1 Processor -- start 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a2452f700 1 -- start start 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a2452f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 0x7f5a1c198a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a2452f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c104320 0x7f5a1c198f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a2452f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a1c1995a0 con 0x7f5a1c103120 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a2452f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a1c1996e0 con 0x7f5a1c104320 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a222cb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 0x7f5a1c198a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a222cb700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 0x7f5a1c198a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:54504/0 (socket says 192.168.123.105:54504) 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a222cb700 1 -- 192.168.123.105:0/474219987 learned_addr learned my addr 192.168.123.105:0/474219987 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:39.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.641+0000 7f5a21aca700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c104320 0x7f5a1c198f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a222cb700 1 -- 192.168.123.105:0/474219987 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c104320 msgr2=0x7f5a1c198f80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a222cb700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c104320 0x7f5a1c198f80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a222cb700 1 -- 192.168.123.105:0/474219987 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a180097e0 con 0x7f5a1c103120 2026-03-10T09:00:39.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a21aca700 1 --2- 192.168.123.105:0/474219987 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c104320 0x7f5a1c198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T09:00:39.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a222cb700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 0x7f5a1c198a40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5a1000ba70 tx=0x7f5a1000bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a1000c700 con 0x7f5a1c103120 2026-03-10T09:00:39.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a1c19e190 con 0x7f5a1c103120 2026-03-10T09:00:39.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.642+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5a1000cd40 con 0x7f5a1c103120 2026-03-10T09:00:39.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.643+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a10012340 con 0x7f5a1c103120 2026-03-10T09:00:39.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.644+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a1c19e6e0 con 0x7f5a1c103120 2026-03-10T09:00:39.644 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.644+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5a1000c860 con 0x7f5a1c103120 2026-03-10T09:00:39.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.644+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5a1c066e80 con 0x7f5a1c103120 2026-03-10T09:00:39.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.647+0000 7f5a0f7fe700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5a08077720 0x7f5a08079be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.647+0000 7f5a21aca700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5a08077720 0x7f5a08079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.648+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7f5a10098e40 con 0x7f5a1c103120 2026-03-10T09:00:39.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.648+0000 7f5a21aca700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5a08077720 0x7f5a08079be0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f5a1800b5c0 tx=0x7f5a18005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.648 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.648+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5a1009d050 con 0x7f5a1c103120 2026-03-10T09:00:39.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.817+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5a1c19e990 con 0x7f5a1c103120 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.817+0000 7f5a0f7fe700 1 -- 192.168.123.105:0/474219987 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f5a10019330 con 0x7f5a1c103120 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T09:00:39.818 
INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:00:39.818 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5a08077720 msgr2=0x7f5a08079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5a08077720 0x7f5a08079be0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f5a1800b5c0 tx=0x7f5a18005fb0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 msgr2=0x7f5a1c198a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 --2- 192.168.123.105:0/474219987 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 0x7f5a1c198a40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f5a1000ba70 tx=0x7f5a1000bd80 comp rx=0 tx=0).stop 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 shutdown_connections 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5a08077720 0x7f5a08079be0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1c103120 0x7f5a1c198a40 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 --2- 192.168.123.105:0/474219987 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1c104320 0x7f5a1c198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 >> 192.168.123.105:0/474219987 conn(0x7f5a1c0fe6c0 msgr2=0x7f5a1c107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 shutdown_connections 2026-03-10T09:00:39.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.820+0000 7f5a2452f700 1 -- 192.168.123.105:0/474219987 wait complete. 
2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.883+0000 7fb39eb91700 1 -- 192.168.123.105:0/3058632941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 msgr2=0x7fb3981047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.883+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3058632941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb3981047a0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fb38c009b50 tx=0x7fb38c009e60 comp rx=0 tx=0).stop 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 -- 192.168.123.105:0/3058632941 shutdown_connections 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3058632941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb3981047a0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3058632941 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb398103140 0x7fb398103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 -- 192.168.123.105:0/3058632941 >> 192.168.123.105:0/3058632941 conn(0x7fb3980fe6c0 msgr2=0x7fb398100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 -- 192.168.123.105:0/3058632941 shutdown_connections 2026-03-10T09:00:39.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 -- 192.168.123.105:0/3058632941 
wait complete. 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.884+0000 7fb39eb91700 1 Processor -- start 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb39eb91700 1 -- start start 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb39eb91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb398103140 0x7fb398198ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb39eb91700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb398199010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb39eb91700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb398199630 con 0x7fb398104340 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb39eb91700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb398199770 con 0x7fb398103140 2026-03-10T09:00:39.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb397fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb398199010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb397fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb398199010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:54534/0 (socket says 192.168.123.105:54534) 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.885+0000 7fb397fff700 1 -- 192.168.123.105:0/3899410437 learned_addr learned my addr 192.168.123.105:0/3899410437 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb397fff700 1 -- 192.168.123.105:0/3899410437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb398103140 msgr2=0x7fb398198ad0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb397fff700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb398103140 0x7fb398198ad0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb397fff700 1 -- 192.168.123.105:0/3899410437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb38c0097e0 con 0x7fb398104340 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb397fff700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb398199010 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fb38c005850 tx=0x7fb38c00b920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb38c01d070 con 0x7fb398104340 2026-03-10T09:00:39.887 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb39eb91700 1 -- 
192.168.123.105:0/3899410437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb39819e1c0 con 0x7fb398104340 2026-03-10T09:00:39.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.886+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb39819e6b0 con 0x7fb398104340 2026-03-10T09:00:39.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.887+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb38c00bd20 con 0x7fb398104340 2026-03-10T09:00:39.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.887+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb38c0219b0 con 0x7fb398104340 2026-03-10T09:00:39.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.888+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb38c02b430 con 0x7fb398104340 2026-03-10T09:00:39.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.888+0000 7fb395ffb700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb388077990 0x7fb388079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:00:39.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.889+0000 7fb39c92d700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb388077990 0x7fb388079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:00:39.889 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.889+0000 7fb39c92d700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb388077990 0x7fb388079e50 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb3981041a0 tx=0x7fb384006c60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:00:39.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.889+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7fb38c09b010 con 0x7fb398104340 2026-03-10T09:00:39.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.889+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb37c005320 con 0x7fb398104340 2026-03-10T09:00:39.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:39.892+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb38c0648e0 con 0x7fb398104340 2026-03-10T09:00:40.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.048+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb37c006200 con 0x7fb398104340 2026-03-10T09:00:40.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.049+0000 7fb395ffb700 1 -- 192.168.123.105:0/3899410437 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7fb38c026020 con 0x7fb398104340 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T09:00:40.050 
INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:00:40.050 
INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:00:40.050 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:00:40.051 
INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:40.051 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.052+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb388077990 msgr2=0x7fb388079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.052+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb388077990 0x7fb388079e50 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb3981041a0 tx=0x7fb384006c60 comp rx=0 tx=0).stop 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 msgr2=0x7fb398199010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb398199010 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fb38c005850 tx=0x7fb38c00b920 comp 
rx=0 tx=0).stop 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 shutdown_connections 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb388077990 0x7fb388079e50 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb398103140 0x7fb398198ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 --2- 192.168.123.105:0/3899410437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb398104340 0x7fb398199010 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 >> 192.168.123.105:0/3899410437 conn(0x7fb3980fe6c0 msgr2=0x7fb398107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 shutdown_connections 2026-03-10T09:00:40.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.053+0000 7fb39eb91700 1 -- 192.168.123.105:0/3899410437 wait complete. 
2026-03-10T09:00:40.054 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:00:40.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.127+0000 7f39025cb700 1 -- 192.168.123.105:0/3513143143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 msgr2=0x7f38fc1047b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:00:40.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.127+0000 7f39025cb700 1 --2- 192.168.123.105:0/3513143143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc1047b0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f38ec009b50 tx=0x7f38ec009e60 comp rx=0 tx=0).stop 2026-03-10T09:00:40.128 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:39 vm05.local ceph-mon[111630]: from='client.34290 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:00:40.128 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:39 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/474219987' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:00:40.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.130+0000 7f39025cb700 1 -- 192.168.123.105:0/3513143143 shutdown_connections 2026-03-10T09:00:40.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.130+0000 7f39025cb700 1 --2- 192.168.123.105:0/3513143143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc1047b0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:40.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.130+0000 7f39025cb700 1 --2- 192.168.123.105:0/3513143143 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f38fc103150 0x7f38fc103570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:00:40.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.130+0000 7f39025cb700 1 -- 192.168.123.105:0/3513143143 >> 192.168.123.105:0/3513143143 conn(0x7f38fc0fe6d0 msgr2=0x7f38fc100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:00:40.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 -- 192.168.123.105:0/3513143143 shutdown_connections 2026-03-10T09:00:40.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 -- 192.168.123.105:0/3513143143 wait complete. 
2026-03-10T09:00:40.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 Processor -- start
2026-03-10T09:00:40.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 -- start start
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f38fc103150 0x7f38fc071cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc072230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38fc072770 con 0x7f38fc104350
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.132+0000 7f39025cb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38fc0728b0 con 0x7f38fc103150
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc072230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc072230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:54550/0 (socket says 192.168.123.105:54550)
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 -- 192.168.123.105:0/321493034 learned_addr learned my addr 192.168.123.105:0/321493034 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 -- 192.168.123.105:0/321493034 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f38fc103150 msgr2=0x7f38fc071cf0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f38fc103150 0x7f38fc071cf0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 -- 192.168.123.105:0/321493034 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38ec0097e0 con 0x7f38fc104350
2026-03-10T09:00:40.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.133+0000 7f38fb7fe700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc072230 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f38ec005950 tx=0x7f38ec0057d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:00:40.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.134+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38ec01d070 con 0x7f38fc104350
2026-03-10T09:00:40.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.134+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f38ec00bb50 con 0x7f38fc104350
2026-03-10T09:00:40.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.134+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38ec00f810 con 0x7f38fc104350
2026-03-10T09:00:40.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.134+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38fc1a24a0 con 0x7f38fc104350
2026-03-10T09:00:40.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.135+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f38ec00bcc0 con 0x7f38fc104350
2026-03-10T09:00:40.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.135+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38fc1a29f0 con 0x7f38fc104350
2026-03-10T09:00:40.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.136+0000 7f38f97fa700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f38e8077910 0x7f38e8079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:00:40.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.136+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38fc066e80 con 0x7f38fc104350
2026-03-10T09:00:40.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.137+0000 7f38fbfff700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f38e8077910 0x7f38e8079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:00:40.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.137+0000 7f38fbfff700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f38e8077910 0x7f38e8079dd0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f38e4005950 tx=0x7f38e40058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:00:40.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.139+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7f38ec067b30 con 0x7f38fc104350
2026-03-10T09:00:40.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.139+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f38ec027030 con 0x7f38fc104350
2026-03-10T09:00:40.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.266+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f38fc108ca0 con 0x7f38e8077910
2026-03-10T09:00:40.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.267+0000 7f38f97fa700 1 -- 192.168.123.105:0/321493034 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f38fc108ca0 con 0x7f38e8077910
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true,
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "mgr",
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "crash",
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "mon"
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: ],
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded",
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons",
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false
2026-03-10T09:00:40.268 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.269+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f38e8077910 msgr2=0x7f38e8079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.269+0000 7f39025cb700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f38e8077910 0x7f38e8079dd0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f38e4005950 tx=0x7f38e40058e0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 msgr2=0x7f38fc072230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc072230 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f38ec005950 tx=0x7f38ec0057d0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 shutdown_connections
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f38e8077910 0x7f38e8079dd0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f38fc103150 0x7f38fc071cf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 --2- 192.168.123.105:0/321493034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38fc104350 0x7f38fc072230 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 >> 192.168.123.105:0/321493034 conn(0x7f38fc0fe6d0 msgr2=0x7f38fc107580 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 shutdown_connections
2026-03-10T09:00:40.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.270+0000 7f39025cb700 1 -- 192.168.123.105:0/321493034 wait complete.
2026-03-10T09:00:40.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:39 vm08.local ceph-mon[101330]: from='client.34290 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:40.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:39 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/474219987' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:00:40.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 -- 192.168.123.105:0/553183556 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 msgr2=0x7f61fc104780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:00:40.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 --2- 192.168.123.105:0/553183556 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc104780 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f61f8009b00 tx=0x7f61f8009e10 comp rx=0 tx=0).stop
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 -- 192.168.123.105:0/553183556 shutdown_connections
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 --2- 192.168.123.105:0/553183556 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc104780 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 --2- 192.168.123.105:0/553183556 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 0x7f61fc103540 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 -- 192.168.123.105:0/553183556 >> 192.168.123.105:0/553183556 conn(0x7f61fc0fe6c0 msgr2=0x7f61fc100b00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.337+0000 7f6204445700 1 -- 192.168.123.105:0/553183556 shutdown_connections
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 -- 192.168.123.105:0/553183556 wait complete.
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 Processor -- start
2026-03-10T09:00:40.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 -- start start
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 0x7f61fc198a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc198f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61fc1995a0 con 0x7f61fc104320
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.338+0000 7f6204445700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61fc1996e0 con 0x7f61fc103120
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc198f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc198f80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:54568/0 (socket says 192.168.123.105:54568)
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 -- 192.168.123.105:0/663076135 learned_addr learned my addr 192.168.123.105:0/663076135 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 -- 192.168.123.105:0/663076135 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 msgr2=0x7f61fc198a40 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62021e1700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 0x7f61fc198a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:00:40.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 0x7f61fc198a40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 -- 192.168.123.105:0/663076135 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61f80097e0 con 0x7f61fc104320
2026-03-10T09:00:40.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62019e0700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc198f80 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f61f8005230 tx=0x7f61f80056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:00:40.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.339+0000 7f62021e1700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 0x7f61fc198a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T09:00:40.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.340+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61f801d070 con 0x7f61fc104320
2026-03-10T09:00:40.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.340+0000 7f6204445700 1 -- 192.168.123.105:0/663076135 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f61fc19e130 con 0x7f61fc104320
2026-03-10T09:00:40.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.340+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f61f800bc50 con 0x7f61fc104320
2026-03-10T09:00:40.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.340+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61f800f800 con 0x7f61fc104320
2026-03-10T09:00:40.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.340+0000 7f6204445700 1 -- 192.168.123.105:0/663076135 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f61fc19e620 con 0x7f61fc104320
2026-03-10T09:00:40.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.341+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f61e40052f0 con 0x7f61fc104320
2026-03-10T09:00:40.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.344+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f61f8022ae0 con 0x7f61fc104320
2026-03-10T09:00:40.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.345+0000 7f61ef7fe700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f61e8077910 0x7f61e8079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:00:40.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.345+0000 7f62021e1700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f61e8077910 0x7f61e8079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:00:40.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.345+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6338+0+0 (secure 0 0 0) 0x7f61f809bc90 con 0x7f61fc104320
2026-03-10T09:00:40.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.346+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f61f80cba90 con 0x7f61fc104320
2026-03-10T09:00:40.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.346+0000 7f62021e1700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f61e8077910 0x7f61e8079dd0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f61fc104180 tx=0x7f61f0008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:00:40.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.509+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f61e4005160 con 0x7f61fc104320
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.509+0000 7f61ef7fe700 1 -- 192.168.123.105:0/663076135 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+710 (secure 0 0 0) 0x7f61f80272e0 con 0x7f61fc104320
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 476/231 objects degraded (206.061%), 4 pgs degraded, 4 pgs undersized
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 476/231 objects degraded (206.061%), 4 pgs degraded, 4 pgs undersized
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is stuck undersized for 3m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4]
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is stuck undersized for 3m, current state active+recovering+undersized+degraded+remapped, last acting [1,4]
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is stuck undersized for 3m, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4]
2026-03-10T09:00:40.510 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is stuck undersized for 3m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1]
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f61e8077910 msgr2=0x7f61e8079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f61e8077910 0x7f61e8079dd0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f61fc104180 tx=0x7f61f0008040 comp rx=0 tx=0).stop
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 msgr2=0x7f61fc198f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc198f80 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f61f8005230 tx=0x7f61f80056c0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 shutdown_connections
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f61e8077910 0x7f61e8079dd0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61fc103120 0x7f61fc198a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 --2- 192.168.123.105:0/663076135 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61fc104320 0x7f61fc198f80 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 >> 192.168.123.105:0/663076135 conn(0x7f61fc0fe6c0 msgr2=0x7f61fc107550 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 shutdown_connections
2026-03-10T09:00:40.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:00:40.512+0000 7f61ed7fa700 1 -- 192.168.123.105:0/663076135 wait complete.
2026-03-10T09:00:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:40 vm05.local ceph-mon[111630]: from='client.44235 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:40 vm05.local ceph-mon[111630]: from='client.44239 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:40 vm05.local ceph-mon[111630]: pgmap v146: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 476/231 objects degraded (206.061%); 0 B/s, 8 objects/s recovering
2026-03-10T09:00:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:40 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 476/231 objects degraded (206.061%), 4 pgs degraded, 4 pgs undersized (PG_DEGRADED)
2026-03-10T09:00:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:40 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3899410437' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T09:00:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:40 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/663076135' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T09:00:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:40 vm08.local ceph-mon[101330]: from='client.44235 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:40 vm08.local ceph-mon[101330]: from='client.44239 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:40 vm08.local ceph-mon[101330]: pgmap v146: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 476/231 objects degraded (206.061%); 0 B/s, 8 objects/s recovering
2026-03-10T09:00:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:40 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 476/231 objects degraded (206.061%), 4 pgs degraded, 4 pgs undersized (PG_DEGRADED)
2026-03-10T09:00:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:40 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3899410437' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T09:00:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:40 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/663076135' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T09:00:42.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:41 vm05.local ceph-mon[111630]: from='client.34308 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:42.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:41 vm05.local ceph-mon[111630]: osdmap e70: 6 total, 6 up, 6 in
2026-03-10T09:00:42.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:41 vm08.local ceph-mon[101330]: from='client.34308 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T09:00:42.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:41 vm08.local ceph-mon[101330]: osdmap e70: 6 total, 6 up, 6 in
2026-03-10T09:00:43.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:42 vm05.local ceph-mon[111630]: pgmap v148: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 476/231 objects degraded (206.061%); 0 B/s, 9 objects/s recovering
2026-03-10T09:00:43.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:42 vm05.local ceph-mon[111630]: osdmap e71: 6 total, 6 up, 6 in
2026-03-10T09:00:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:42 vm08.local ceph-mon[101330]: pgmap v148: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 476/231 objects degraded (206.061%); 0 B/s, 9 objects/s recovering
2026-03-10T09:00:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:42 vm08.local ceph-mon[101330]: osdmap e71: 6 total, 6 up, 6 in
2026-03-10T09:00:45.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:44 vm05.local ceph-mon[111630]: pgmap v150: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 12 objects/s recovering
2026-03-10T09:00:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:44 vm08.local ceph-mon[101330]: pgmap v150: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 12 objects/s recovering
2026-03-10T09:00:46.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:45 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 353/231 objects degraded (152.814%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-10T09:00:46.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:00:46.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T09:00:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:45 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 353/231 objects degraded (152.814%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-10T09:00:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:00:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T09:00:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:46 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:00:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:46 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-10T09:00:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:46 vm05.local ceph-mon[111630]: pgmap v151: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 5 objects/s recovering
2026-03-10T09:00:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:46 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:00:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:46 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-10T09:00:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:46 vm08.local ceph-mon[101330]: pgmap v151: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 5 objects/s recovering
2026-03-10T09:00:49.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:48 vm05.local ceph-mon[111630]: pgmap v152: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 12 objects/s recovering
2026-03-10T09:00:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:48 vm08.local ceph-mon[101330]: pgmap v152: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 12 objects/s recovering
2026-03-10T09:00:51.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:50 vm05.local ceph-mon[111630]: pgmap v153: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 10 objects/s recovering
2026-03-10T09:00:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:50 vm08.local ceph-mon[101330]: pgmap v153: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 10 objects/s recovering
2026-03-10T09:00:53.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:52 vm05.local ceph-mon[111630]: pgmap v154: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 9 objects/s recovering
2026-03-10T09:00:53.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:52 vm08.local ceph-mon[101330]: pgmap v154: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 353/231 objects degraded (152.814%); 0 B/s, 9 objects/s recovering
2026-03-10T09:00:54.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:54 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 332/231 objects degraded (143.723%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-10T09:00:54.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:54 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 332/231 objects degraded (143.723%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-10T09:00:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:55 vm08.local ceph-mon[101330]: pgmap v155: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 332/231 objects degraded (143.723%); 0 B/s, 11 objects/s recovering
2026-03-10T09:00:55.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:55 vm05.local ceph-mon[111630]: pgmap v155: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 332/231 objects degraded (143.723%); 0 B/s, 11 objects/s recovering
2026-03-10T09:00:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:57 vm05.local ceph-mon[111630]: pgmap v156: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 332/231 objects degraded (143.723%); 0 B/s, 7 objects/s recovering
2026-03-10T09:00:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:57 vm05.local ceph-mon[111630]: osdmap e72: 6 total, 6 up, 6 in
2026-03-10T09:00:57.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:57 vm08.local ceph-mon[101330]: pgmap v156: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 332/231 objects degraded (143.723%); 0 B/s, 7 objects/s recovering
2026-03-10T09:00:57.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:57 vm08.local ceph-mon[101330]: osdmap e72: 6 total, 6 up, 6 in
2026-03-10T09:00:58.454 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:58 vm08.local ceph-mon[101330]: osdmap e73: 6 total, 6 up, 6 in
2026-03-10T09:00:58.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:58 vm05.local ceph-mon[111630]: osdmap e73: 6 total, 6 up, 6 in
2026-03-10T09:00:59.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:00:59 vm05.local ceph-mon[111630]: pgmap v159: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 11 objects/s recovering
2026-03-10T09:00:59.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:00:59 vm08.local ceph-mon[101330]: pgmap v159: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 1
active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:00.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:00 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 218/231 objects degraded (94.372%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T09:01:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:00 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 218/231 objects degraded (94.372%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T09:01:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:01 vm05.local ceph-mon[111630]: pgmap v160: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:01:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:01 vm08.local ceph-mon[101330]: pgmap v160: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:01.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:01.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:01:02.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:02 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:02.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:02 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-10T09:01:02.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:02 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:02.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:02 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-10T09:01:03.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:03 vm05.local ceph-mon[111630]: pgmap v161: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 5 objects/s recovering 2026-03-10T09:01:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:03 vm08.local ceph-mon[101330]: pgmap v161: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 5 objects/s recovering 2026-03-10T09:01:04.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:04 vm05.local ceph-mon[111630]: pgmap v162: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:04.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:04 vm08.local ceph-mon[101330]: pgmap v162: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:07.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:06 vm05.local ceph-mon[111630]: pgmap v163: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 
63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 9 objects/s recovering 2026-03-10T09:01:07.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:06 vm08.local ceph-mon[101330]: pgmap v163: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 218/231 objects degraded (94.372%); 0 B/s, 9 objects/s recovering 2026-03-10T09:01:08.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:07 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 194/231 objects degraded (83.983%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T09:01:08.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:07 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 194/231 objects degraded (83.983%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T09:01:09.116 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:08 vm05.local ceph-mon[111630]: pgmap v164: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 194/231 objects degraded (83.983%); 0 B/s, 8 objects/s recovering 2026-03-10T09:01:09.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:08 vm08.local ceph-mon[101330]: pgmap v164: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 194/231 objects degraded (83.983%); 0 B/s, 8 objects/s recovering 2026-03-10T09:01:10.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.588+0000 7f6a89c00700 1 -- 192.168.123.105:0/628602471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 msgr2=0x7f6a84076e10 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:10.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.588+0000 7f6a89c00700 1 --2- 192.168.123.105:0/628602471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a84076e10 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f6a74009b00 tx=0x7f6a74009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.589+0000 7f6a89c00700 1 -- 192.168.123.105:0/628602471 shutdown_connections 2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.589+0000 7f6a89c00700 1 --2- 192.168.123.105:0/628602471 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a84076e10 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.589+0000 7f6a89c00700 1 --2- 192.168.123.105:0/628602471 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a84075740 0x7f6a84075b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.589+0000 7f6a89c00700 1 -- 192.168.123.105:0/628602471 >> 192.168.123.105:0/628602471 conn(0x7f6a840fe6c0 msgr2=0x7f6a84100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.589+0000 7f6a89c00700 1 -- 192.168.123.105:0/628602471 shutdown_connections 2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.589+0000 7f6a89c00700 1 -- 192.168.123.105:0/628602471 wait complete. 
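The pgmap lines above report degraded-object ratios greater than 100% (e.g. 353/231, 152.814%). This is expected Ceph accounting: the numerator counts degraded object *copies* across all replicas, while the denominator counts logical objects, so the ratio can exceed 100% when multiple replicas of the same objects are degraded at once. A minimal, hypothetical parser (not part of teuthology; `parse_degraded` is an illustrative name) that extracts the ratio from such a summary line:

```python
import re

# Matches the "N/M objects degraded (P%)" fragment of a pgmap summary line.
# N counts degraded object copies; M counts logical objects, so N/M can
# exceed 100% — as seen in the log above (353/231, 152.814%).
PGMAP_DEGRADED = re.compile(
    r"(?P<degraded>\d+)/(?P<total>\d+) objects degraded "
    r"\((?P<pct>[\d.]+)%\)"
)

def parse_degraded(line: str):
    """Return (degraded_copies, total_objects, percent), or None if absent."""
    m = PGMAP_DEGRADED.search(line)
    if m is None:
        return None
    return (int(m.group("degraded")), int(m.group("total")),
            float(m.group("pct")))

sample = ("pgmap v151: 65 pgs: 62 active+clean; 215 MiB data, "
          "353/231 objects degraded (152.814%); 0 B/s, 5 objects/s recovering")
print(parse_degraded(sample))  # (353, 231, 152.814)
```

The same pattern applies to the `Health check update: Degraded data redundancy` lines, which quote the identical `N/M objects degraded (P%)` fragment.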
2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a89c00700 1 Processor -- start
2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a89c00700 1 -- start start
2026-03-10T09:01:10.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a89c00700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a84075740 0x7f6a8419ce00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a89c00700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a8419d340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a89c00700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a8419d960 con 0x7f6a84076990
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a89c00700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a8419daa0 con 0x7f6a84075740
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.590+0000 7f6a82ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a8419d340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a82ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a8419d340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60006/0 (socket says 192.168.123.105:60006)
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a82ffd700 1 -- 192.168.123.105:0/4030705924 learned_addr learned my addr 192.168.123.105:0/4030705924 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a82ffd700 1 -- 192.168.123.105:0/4030705924 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a84075740 msgr2=0x7f6a8419ce00 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a837fe700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a84075740 0x7f6a8419ce00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a82ffd700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a84075740 0x7f6a8419ce00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a82ffd700 1 -- 192.168.123.105:0/4030705924 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a740097e0 con 0x7f6a84076990
2026-03-10T09:01:10.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.591+0000 7f6a82ffd700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a8419d340 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f6a740048c0 tx=0x7f6a740049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:01:10.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.592+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a7401d070 con 0x7f6a84076990
2026-03-10T09:01:10.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.592+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6a7400bc50 con 0x7f6a84076990
2026-03-10T09:01:10.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.592+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a7400f800 con 0x7f6a84076990
2026-03-10T09:01:10.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.592+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a841a24f0 con 0x7f6a84076990
2026-03-10T09:01:10.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.592+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a841a29e0 con 0x7f6a84076990
2026-03-10T09:01:10.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.593+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a84066e80 con 0x7f6a84076990
2026-03-10T09:01:10.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.595+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6a74022be0 con 0x7f6a84076990
2026-03-10T09:01:10.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.595+0000 7f6a80ff9700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a6c077990 0x7f6a6c079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:01:10.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.595+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6280+0+0 (secure 0 0 0) 0x7f6a7409bde0 con 0x7f6a84076990
2026-03-10T09:01:10.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.596+0000 7f6a837fe700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a6c077990 0x7f6a6c079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:01:10.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.596+0000 7f6a837fe700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a6c077990 0x7f6a6c079e50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6a78005950 tx=0x7f6a780058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:01:10.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.597+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6a74064600 con 0x7f6a84076990
2026-03-10T09:01:10.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.729+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6a8410eac0 con 0x7f6a6c077990
2026-03-10T09:01:10.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.730+0000 7f6a80ff9700 1 -- 192.168.123.105:0/4030705924 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f6a8410eac0 con 0x7f6a6c077990
2026-03-10T09:01:10.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.733+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a6c077990 msgr2=0x7f6a6c079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:01:10.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.733+0000 7f6a89c00700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a6c077990 0x7f6a6c079e50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6a78005950 tx=0x7f6a780058e0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.733+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 msgr2=0x7f6a8419d340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:01:10.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.733+0000 7f6a89c00700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a8419d340 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f6a740048c0 tx=0x7f6a740049a0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 shutdown_connections
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a6c077990 0x7f6a6c079e50 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6a84075740 0x7f6a8419ce00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 --2- 192.168.123.105:0/4030705924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6a84076990 0x7f6a8419d340 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 >> 192.168.123.105:0/4030705924 conn(0x7f6a840fe6c0 msgr2=0x7f6a8410d3a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 shutdown_connections
2026-03-10T09:01:10.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.734+0000 7f6a89c00700 1 -- 192.168.123.105:0/4030705924 wait complete.
2026-03-10T09:01:10.744 INFO:teuthology.orchestra.run.vm05.stdout:true
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.805+0000 7f5ca1a79700 1 -- 192.168.123.105:0/3786068914 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 msgr2=0x7f5c9c1066b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.805+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/3786068914 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c1066b0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f5c8c009b00 tx=0x7f5c8c009e10 comp rx=0 tx=0).stop
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.806+0000 7f5ca1a79700 1 -- 192.168.123.105:0/3786068914 shutdown_connections
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.806+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/3786068914 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c1066b0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.806+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/3786068914 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 0x7f5c9c103d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.806+0000 7f5ca1a79700 1 -- 192.168.123.105:0/3786068914 >> 192.168.123.105:0/3786068914 conn(0x7f5c9c0fb380 msgr2=0x7f5c9c0fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T09:01:10.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.806+0000 7f5ca1a79700 1 -- 192.168.123.105:0/3786068914 shutdown_connections
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.806+0000 7f5ca1a79700 1 -- 192.168.123.105:0/3786068914 wait complete.
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5ca1a79700 1 Processor -- start
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5ca1a79700 1 -- start start
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5ca1a79700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 0x7f5c9c1012f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5ca1a79700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c0ff940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5ca1a79700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c9c0ffe80 con 0x7f5c9c1042c0
2026-03-10T09:01:10.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5ca1a79700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c9c0fffc0 con 0x7f5c9c101990
2026-03-10T09:01:10.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5c9affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c0ff940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:01:10.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5c9affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c0ff940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60020/0 (socket says 192.168.123.105:60020)
2026-03-10T09:01:10.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.807+0000 7f5c9affd700 1 -- 192.168.123.105:0/604494560 learned_addr learned my addr 192.168.123.105:0/604494560 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T09:01:10.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.808+0000 7f5c9affd700 1 -- 192.168.123.105:0/604494560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 msgr2=0x7f5c9c1012f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T09:01:10.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.808+0000 7f5c9b7fe700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 0x7f5c9c1012f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.808+0000 7f5c9affd700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 0x7f5c9c1012f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.808+0000 7f5c9affd700 1 -- 192.168.123.105:0/604494560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5c8c0097e0 con 0x7f5c9c1042c0
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.808+0000 7f5c9b7fe700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 0x7f5c9c1012f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.809+0000 7f5c9affd700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c0ff940 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f5c8c005230 tx=0x7f5c8c0056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.809+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c8c01d070 con 0x7f5c9c1042c0
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.809+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5c8c00bc50 con 0x7f5c9c1042c0
2026-03-10T09:01:10.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.809+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c8c00f870 con 0x7f5c9c1042c0
2026-03-10T09:01:10.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.809+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5c9c100240 con 0x7f5c9c1042c0
2026-03-10T09:01:10.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.809+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5c9c100730 con 0x7f5c9c1042c0
2026-03-10T09:01:10.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.810+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5c9c1909e0 con 0x7f5c9c1042c0
2026-03-10T09:01:10.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.811+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5c8c022b50 con 0x7f5c9c1042c0
2026-03-10T09:01:10.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.813+0000 7f5c98ff9700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5c880779e0 0x7f5c88079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T09:01:10.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.813+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6280+0+0 (secure 0 0 0) 0x7f5c8c09b240 con 0x7f5c9c1042c0
2026-03-10T09:01:10.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.814+0000 7f5c9b7fe700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5c880779e0 0x7f5c88079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T09:01:10.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.814+0000 7f5c9b7fe700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5c880779e0 0x7f5c88079ea0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f5c84005fd0 tx=0x7f5c84005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T09:01:10.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.814+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5c8c063b10 con 0x7f5c9c1042c0
2026-03-10T09:01:10.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.946+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5c9c0611d0 con 0x7f5c880779e0
2026-03-10T09:01:10.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.951+0000 7f5c98ff9700 1 -- 192.168.123.105:0/604494560 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f5c9c0611d0 con 0x7f5c880779e0
2026-03-10T09:01:10.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5c880779e0 msgr2=0x7f5c88079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:01:10.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5c880779e0 0x7f5c88079ea0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f5c84005fd0 tx=0x7f5c84005dc0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 msgr2=0x7f5c9c0ff940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c0ff940 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f5c8c005230 tx=0x7f5c8c0056c0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 shutdown_connections
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5c880779e0 0x7f5c88079ea0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5c9c101990 0x7f5c9c1012f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 --2- 192.168.123.105:0/604494560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5c9c1042c0 0x7f5c9c0ff940 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.953+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 >> 192.168.123.105:0/604494560 conn(0x7f5c9c0fb380 msgr2=0x7f5c9c0fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.954+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 shutdown_connections
2026-03-10T09:01:10.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:10.954+0000 7f5ca1a79700 1 -- 192.168.123.105:0/604494560 wait complete.
2026-03-10T09:01:11.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 -- 192.168.123.105:0/2409125207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64340991b0 msgr2=0x7f6434099610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 --2- 192.168.123.105:0/2409125207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64340991b0 0x7f6434099610 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f643c066a30 tx=0x7f643c06a320 comp rx=0 tx=0).stop 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 -- 192.168.123.105:0/2409125207 shutdown_connections 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 --2- 192.168.123.105:0/2409125207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64340991b0 0x7f6434099610 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 --2- 192.168.123.105:0/2409125207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6434097fb0 0x7f64340983d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 -- 192.168.123.105:0/2409125207 >> 192.168.123.105:0/2409125207 conn(0x7f6434093530 msgr2=0x7f6434095990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 -- 192.168.123.105:0/2409125207 shutdown_connections 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.033+0000 7f643bfff700 1 -- 192.168.123.105:0/2409125207 
wait complete. 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.034+0000 7f643bfff700 1 Processor -- start 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.034+0000 7f643bfff700 1 -- start start 2026-03-10T09:01:11.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.034+0000 7f643bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6434097fb0 0x7f643413e920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.034+0000 7f643bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 0x7f643413ee60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.034+0000 7f643bfff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f643413f480 con 0x7f6434097fb0 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.034+0000 7f643bfff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f643413f5c0 con 0x7f64340991b0 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 0x7f643413ee60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 0x7f643413ee60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:33988/0 (socket says 192.168.123.105:33988) 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 -- 192.168.123.105:0/1524239202 learned_addr learned my addr 192.168.123.105:0/1524239202 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643affd700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6434097fb0 0x7f643413e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 -- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6434097fb0 msgr2=0x7f643413e920 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6434097fb0 0x7f643413e920 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 -- 192.168.123.105:0/1524239202 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f643c077040 con 0x7f64340991b0 2026-03-10T09:01:11.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643affd700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6434097fb0 0x7f643413e920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T09:01:11.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.035+0000 7f643a7fc700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 0x7f643413ee60 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f643c04f1d0 tx=0x7f643c069ac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.036+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f643c06f070 con 0x7f64340991b0 2026-03-10T09:01:11.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.036+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6434144010 con 0x7f64340991b0 2026-03-10T09:01:11.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.036+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64341444d0 con 0x7f64340991b0 2026-03-10T09:01:11.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.036+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f643c06ac00 con 0x7f64340991b0 2026-03-10T09:01:11.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.037+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6434004730 con 0x7f64340991b0 2026-03-10T09:01:11.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.037+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7f643c07de00 con 0x7f64340991b0 2026-03-10T09:01:11.038 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.038+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f643c067bd0 con 0x7f64340991b0 2026-03-10T09:01:11.038 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.038+0000 7f6423fff700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6424077910 0x7f6424079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.039 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.038+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6251+0+0 (secure 0 0 0) 0x7f643c0fd420 con 0x7f64340991b0 2026-03-10T09:01:11.039 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.038+0000 7f643affd700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6424077910 0x7f6424079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.039 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.039+0000 7f643affd700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6424077910 0x7f6424079dd0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f6430006fd0 tx=0x7f6430009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.040+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f643c0c5c60 con 0x7f64340991b0 2026-03-10T09:01:11.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.164+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f643409db00 con 0x7f6424077910 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.169+0000 7f6423fff700 1 -- 192.168.123.105:0/1524239202 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f643409db00 con 0x7f6424077910 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (9m) 3m ago 10m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (10m) 3m ago 10m 8892k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (9m) 3m ago 9m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (4m) 3m ago 10m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 3m ago 9m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (9m) 3m ago 10m 88.3M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (8m) 3m ago 8m 242M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T09:01:11.170 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (8m) 3m ago 8m 17.7M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (8m) 3m ago 8m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (8m) 3m ago 8m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (5m) 3m ago 10m 613M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (4m) 3m ago 9m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 3m ago 11m 55.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 3m ago 9m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (10m) 3m ago 10m 14.7M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (9m) 3m ago 9m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 3m ago 9m 30.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (9m) 3m ago 9m 378M 4096M 18.2.1 5be31c24972a 902f9ea11f1a 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (9m) 3m ago 9m 327M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (8m) 3m 
ago 8m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (8m) 3m ago 8m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (8m) 3m ago 8m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T09:01:11.170 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (4m) 3m ago 10m 51.3M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:01:11.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6424077910 msgr2=0x7f6424079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6424077910 0x7f6424079dd0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f6430006fd0 tx=0x7f6430009380 comp rx=0 tx=0).stop 2026-03-10T09:01:11.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 msgr2=0x7f643413ee60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 0x7f643413ee60 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f643c04f1d0 tx=0x7f643c069ac0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 shutdown_connections 2026-03-10T09:01:11.173 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6424077910 0x7f6424079dd0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.172+0000 7f643bfff700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6434097fb0 0x7f643413e920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.173+0000 7f643bfff700 1 --2- 192.168.123.105:0/1524239202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64340991b0 0x7f643413ee60 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.173+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 >> 192.168.123.105:0/1524239202 conn(0x7f6434093530 msgr2=0x7f643409c3e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.173+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 shutdown_connections 2026-03-10T09:01:11.173 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.173+0000 7f643bfff700 1 -- 192.168.123.105:0/1524239202 wait complete. 
2026-03-10T09:01:11.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:10 vm05.local ceph-mon[111630]: pgmap v165: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 194/231 objects degraded (83.983%); 0 B/s, 7 objects/s recovering 2026-03-10T09:01:11.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 -- 192.168.123.105:0/3813523317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c076990 msgr2=0x7f6b1c076e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 --2- 192.168.123.105:0/3813523317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c076990 0x7f6b1c076e10 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f6b1800b3a0 tx=0x7f6b1800b6b0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 -- 192.168.123.105:0/3813523317 shutdown_connections 2026-03-10T09:01:11.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 --2- 192.168.123.105:0/3813523317 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c076990 0x7f6b1c076e10 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 --2- 192.168.123.105:0/3813523317 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c075740 0x7f6b1c075b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 -- 192.168.123.105:0/3813523317 >> 192.168.123.105:0/3813523317 
conn(0x7f6b1c0fe6c0 msgr2=0x7f6b1c100ae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 -- 192.168.123.105:0/3813523317 shutdown_connections 2026-03-10T09:01:11.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.245+0000 7f6b23c51700 1 -- 192.168.123.105:0/3813523317 wait complete. 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.246+0000 7f6b23c51700 1 Processor -- start 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b23c51700 1 -- start start 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b23c51700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c075740 0x7f6b1c19ce10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b23c51700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 0x7f6b1c19d350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b23c51700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b1c19d970 con 0x7f6b1c076990 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b23c51700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b1c19dab0 con 0x7f6b1c075740 2026-03-10T09:01:11.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b219ed700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c075740 0x7f6b1c19ce10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b211ec700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 0x7f6b1c19d350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b211ec700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 0x7f6b1c19d350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60058/0 (socket says 192.168.123.105:60058) 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.247+0000 7f6b211ec700 1 -- 192.168.123.105:0/2852713617 learned_addr learned my addr 192.168.123.105:0/2852713617 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b211ec700 1 -- 192.168.123.105:0/2852713617 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c075740 msgr2=0x7f6b1c19ce10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b211ec700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c075740 0x7f6b1c19ce10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b211ec700 1 -- 192.168.123.105:0/2852713617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b1800b050 con 0x7f6b1c076990 2026-03-10T09:01:11.248 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b211ec700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 0x7f6b1c19d350 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f6b180062a0 tx=0x7f6b18009200 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b1800e050 con 0x7f6b1c076990 2026-03-10T09:01:11.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6b18003dc0 con 0x7f6b1c076990 2026-03-10T09:01:11.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b1c1a2500 con 0x7f6b1c076990 2026-03-10T09:01:11.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b1801db30 con 0x7f6b1c076990 2026-03-10T09:01:11.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.248+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b1c1a2a50 con 0x7f6b1c076990 2026-03-10T09:01:11.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.250+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6b18019040 con 0x7f6b1c076990 
2026-03-10T09:01:11.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.250+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b1c066e80 con 0x7f6b1c076990 2026-03-10T09:01:11.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.250+0000 7f6b12ffd700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6b08077870 0x7f6b08079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.251+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6251+0+0 (secure 0 0 0) 0x7f6b1809b100 con 0x7f6b1c076990 2026-03-10T09:01:11.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.253+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6b180649d0 con 0x7f6b1c076990 2026-03-10T09:01:11.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.254+0000 7f6b219ed700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6b08077870 0x7f6b08079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.254+0000 7f6b219ed700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6b08077870 0x7f6b08079d30 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f6b0c009ea0 tx=0x7f6b0c009450 comp rx=0 tx=0).ready entity=mgr.34104 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:10 vm08.local ceph-mon[101330]: pgmap v165: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 194/231 objects degraded (83.983%); 0 B/s, 7 objects/s recovering 2026-03-10T09:01:11.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.413+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f6b1c1a2d80 con 0x7f6b1c076990 2026-03-10T09:01:11.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.414+0000 7f6b12ffd700 1 -- 192.168.123.105:0/2852713617 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f6b18064120 con 0x7f6b1c076990 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 
19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:01:11.415 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:01:11.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6b08077870 msgr2=0x7f6b08079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6b08077870 0x7f6b08079d30 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f6b0c009ea0 tx=0x7f6b0c009450 comp rx=0 tx=0).stop 2026-03-10T09:01:11.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 msgr2=0x7f6b1c19d350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.417 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 0x7f6b1c19d350 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f6b180062a0 tx=0x7f6b18009200 comp rx=0 tx=0).stop 2026-03-10T09:01:11.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 shutdown_connections 2026-03-10T09:01:11.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6b08077870 0x7f6b08079d30 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b1c075740 0x7f6b1c19ce10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 --2- 192.168.123.105:0/2852713617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b1c076990 0x7f6b1c19d350 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 >> 192.168.123.105:0/2852713617 conn(0x7f6b1c0fe6c0 msgr2=0x7f6b1c10d360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 -- 192.168.123.105:0/2852713617 shutdown_connections 2026-03-10T09:01:11.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.417+0000 7f6b23c51700 1 -- 
192.168.123.105:0/2852713617 wait complete. 2026-03-10T09:01:11.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.485+0000 7f2ca7711700 1 -- 192.168.123.105:0/469470701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 msgr2=0x7f2ca0073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.485+0000 7f2ca7711700 1 --2- 192.168.123.105:0/469470701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca0073c70 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2c94009b50 tx=0x7f2c94009e60 comp rx=0 tx=0).stop 2026-03-10T09:01:11.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.485+0000 7f2ca7711700 1 -- 192.168.123.105:0/469470701 shutdown_connections 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.485+0000 7f2ca7711700 1 --2- 192.168.123.105:0/469470701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca0073c70 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.485+0000 7f2ca7711700 1 --2- 192.168.123.105:0/469470701 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ca0074dc0 0x7f2ca0073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.485+0000 7f2ca7711700 1 -- 192.168.123.105:0/469470701 >> 192.168.123.105:0/469470701 conn(0x7f2ca00fc460 msgr2=0x7f2ca00fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.486+0000 7f2ca7711700 1 -- 192.168.123.105:0/469470701 shutdown_connections 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.486+0000 
7f2ca7711700 1 -- 192.168.123.105:0/469470701 wait complete. 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.486+0000 7f2ca7711700 1 Processor -- start 2026-03-10T09:01:11.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.486+0000 7f2ca7711700 1 -- start start 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca7711700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca019ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca7711700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ca0074dc0 0x7f2ca019d3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca7711700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ca019d9e0 con 0x7f2ca00737f0 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca7711700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ca019db20 con 0x7f2ca0074dc0 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca54ad700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca019ce80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca54ad700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca019ce80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60086/0 (socket says 192.168.123.105:60086) 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca54ad700 1 -- 192.168.123.105:0/1148104437 learned_addr learned my addr 192.168.123.105:0/1148104437 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:11.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca4cac700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ca0074dc0 0x7f2ca019d3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca54ad700 1 -- 192.168.123.105:0/1148104437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ca0074dc0 msgr2=0x7f2ca019d3c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca54ad700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ca0074dc0 0x7f2ca019d3c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.487+0000 7f2ca54ad700 1 -- 192.168.123.105:0/1148104437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2c940097e0 con 0x7f2ca00737f0 2026-03-10T09:01:11.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.488+0000 7f2ca54ad700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca019ce80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f2c9c009fd0 tx=0x7f2c9c00edf0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.488+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2c9c009980 con 0x7f2ca00737f0 2026-03-10T09:01:11.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.488+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2c9c004d10 con 0x7f2ca00737f0 2026-03-10T09:01:11.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.488+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2c9c010430 con 0x7f2ca00737f0 2026-03-10T09:01:11.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.488+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ca01a25d0 con 0x7f2ca00737f0 2026-03-10T09:01:11.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.488+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ca01a2b20 con 0x7f2ca00737f0 2026-03-10T09:01:11.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.490+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2c9c0106a0 con 0x7f2ca00737f0 2026-03-10T09:01:11.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.490+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2ca0066e80 con 0x7f2ca00737f0 2026-03-10T09:01:11.491 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.490+0000 7f2c927fc700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2c8c077870 0x7f2c8c079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.491+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6251+0+0 (secure 0 0 0) 0x7f2c9c014070 con 0x7f2ca00737f0 2026-03-10T09:01:11.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.491+0000 7f2ca4cac700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2c8c077870 0x7f2c8c079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.491+0000 7f2ca4cac700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2c8c077870 0x7f2c8c079d30 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f2c94009b20 tx=0x7f2c94005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.494+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2c9c063710 con 0x7f2ca00737f0 2026-03-10T09:01:11.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.638+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f2ca01a2e00 con 
0x7f2ca00737f0 2026-03-10T09:01:11.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.642+0000 7f2c927fc700 1 -- 192.168.123.105:0/1148104437 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f2c9c062e60 con 0x7f2ca00737f0 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 
300 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr 
[v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:11.643 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2c8c077870 msgr2=0x7f2c8c079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2c8c077870 0x7f2c8c079d30 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f2c94009b20 tx=0x7f2c94005fb0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f2ca00737f0 msgr2=0x7f2ca019ce80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca019ce80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f2c9c009fd0 tx=0x7f2c9c00edf0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 shutdown_connections 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2c8c077870 0x7f2c8c079d30 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2ca00737f0 0x7f2ca019ce80 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 --2- 192.168.123.105:0/1148104437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2ca0074dc0 0x7f2ca019d3c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 >> 192.168.123.105:0/1148104437 conn(0x7f2ca00fc460 msgr2=0x7f2ca0102780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 shutdown_connections 
2026-03-10T09:01:11.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.645+0000 7f2ca7711700 1 -- 192.168.123.105:0/1148104437 wait complete. 2026-03-10T09:01:11.646 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.721+0000 7f3aee6b6700 1 -- 192.168.123.105:0/1247142579 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 msgr2=0x7f3ae8076e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.721+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/1247142579 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 0x7f3ae8076e10 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f3ad0009b00 tx=0x7f3ad0009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 -- 192.168.123.105:0/1247142579 shutdown_connections 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/1247142579 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 0x7f3ae8076e10 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/1247142579 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae8075b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 -- 192.168.123.105:0/1247142579 >> 192.168.123.105:0/1247142579 conn(0x7f3ae80fe6c0 msgr2=0x7f3ae8100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.722 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 -- 192.168.123.105:0/1247142579 shutdown_connections 2026-03-10T09:01:11.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 -- 192.168.123.105:0/1247142579 wait complete. 2026-03-10T09:01:11.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.722+0000 7f3aee6b6700 1 Processor -- start 2026-03-10T09:01:11.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3aee6b6700 1 -- start start 2026-03-10T09:01:11.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3aee6b6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae819cde0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3aee6b6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 0x7f3ae819d320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3aee6b6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ae819d940 con 0x7f3ae8076990 2026-03-10T09:01:11.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3aee6b6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ae819da80 con 0x7f3ae8075740 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3ae7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae819cde0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.724 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3ae7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae819cde0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34064/0 (socket says 192.168.123.105:34064) 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3ae7fff700 1 -- 192.168.123.105:0/919105356 learned_addr learned my addr 192.168.123.105:0/919105356 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.723+0000 7f3adf5ff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 0x7f3ae819d320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae7fff700 1 -- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 msgr2=0x7f3ae819d320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae7fff700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 0x7f3ae819d320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae7fff700 1 -- 192.168.123.105:0/919105356 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ad8009710 con 0x7f3ae8075740 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae7fff700 1 --2- 
192.168.123.105:0/919105356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae819cde0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f3ad800ec80 tx=0x7f3ad800ef90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ad800ccd0 con 0x7f3ae8075740 2026-03-10T09:01:11.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3ad8004500 con 0x7f3ae8075740 2026-03-10T09:01:11.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3ad00097e0 con 0x7f3ae8075740 2026-03-10T09:01:11.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ad80052c0 con 0x7f3ae8075740 2026-03-10T09:01:11.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.724+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3ae81a28f0 con 0x7f3ae8075740 2026-03-10T09:01:11.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.726+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3ad8005530 con 0x7f3ae8075740 2026-03-10T09:01:11.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.726+0000 7f3aee6b6700 
1 -- 192.168.123.105:0/919105356 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3ae8110200 con 0x7f3ae8075740 2026-03-10T09:01:11.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.726+0000 7f3ae5ffb700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3ad4077870 0x7f3ad4079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.726+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6251+0+0 (secure 0 0 0) 0x7f3ad8014070 con 0x7f3ae8075740 2026-03-10T09:01:11.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.726+0000 7f3adf5ff700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3ad4077870 0x7f3ad4079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.727+0000 7f3adf5ff700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3ad4077870 0x7f3ad4079d30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3ad0005200 tx=0x7f3ad001a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.730+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3ad8066d70 con 0x7f3ae8075740 2026-03-10T09:01:11.855 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.855+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3ae81a2410 con 0x7f3ad4077870 2026-03-10T09:01:11.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.856+0000 7f3ae5ffb700 1 -- 192.168.123.105:0/919105356 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f3ae81a2410 con 0x7f3ad4077870 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:01:11.857 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:01:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3ad4077870 msgr2=0x7f3ad4079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3ad4077870 0x7f3ad4079d30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3ad0005200 tx=0x7f3ad001a040 comp rx=0 tx=0).stop 2026-03-10T09:01:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 msgr2=0x7f3ae819cde0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae819cde0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f3ad800ec80 tx=0x7f3ad800ef90 comp rx=0 tx=0).stop 2026-03-10T09:01:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 shutdown_connections 2026-03-10T09:01:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3ad4077870 0x7f3ad4079d30 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ae8075740 0x7f3ae819cde0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.860 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 --2- 192.168.123.105:0/919105356 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ae8076990 0x7f3ae819d320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 >> 192.168.123.105:0/919105356 conn(0x7f3ae80fe6c0 msgr2=0x7f3ae810d380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.859+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 shutdown_connections 2026-03-10T09:01:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.860+0000 7f3aee6b6700 1 -- 192.168.123.105:0/919105356 wait complete. 2026-03-10T09:01:11.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.927+0000 7f203c31f700 1 -- 192.168.123.105:0/4115232796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 msgr2=0x7f20341013e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.927+0000 7f203c31f700 1 --2- 192.168.123.105:0/4115232796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f20341013e0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f2030009b50 tx=0x7f2030009e60 comp rx=0 tx=0).stop 2026-03-10T09:01:11.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.930+0000 7f203c31f700 1 -- 192.168.123.105:0/4115232796 shutdown_connections 2026-03-10T09:01:11.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.930+0000 7f203c31f700 1 --2- 192.168.123.105:0/4115232796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f20341013e0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:01:11.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.930+0000 7f203c31f700 1 --2- 192.168.123.105:0/4115232796 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f2034100220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.930+0000 7f203c31f700 1 -- 192.168.123.105:0/4115232796 >> 192.168.123.105:0/4115232796 conn(0x7f20340fb360 msgr2=0x7f20340fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.933+0000 7f203c31f700 1 -- 192.168.123.105:0/4115232796 shutdown_connections 2026-03-10T09:01:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.933+0000 7f203c31f700 1 -- 192.168.123.105:0/4115232796 wait complete. 2026-03-10T09:01:11.935 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.935+0000 7f203c31f700 1 Processor -- start 2026-03-10T09:01:11.935 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.935+0000 7f203c31f700 1 -- start start 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.935+0000 7f203c31f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f203419cdf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.935+0000 7f203c31f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f203419d330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.935+0000 7f203c31f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f203419d950 con 0x7f2034100f60 2026-03-10T09:01:11.936 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.935+0000 7f203c31f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f203419da90 con 0x7f20340ffe00 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.936+0000 7f203a0bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f203419cdf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.936+0000 7f203a0bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f203419cdf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:34074/0 (socket says 192.168.123.105:34074) 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.936+0000 7f203a0bb700 1 -- 192.168.123.105:0/3568867816 learned_addr learned my addr 192.168.123.105:0/3568867816 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.936+0000 7f20398ba700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f203419d330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.936+0000 7f203a0bb700 1 -- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 msgr2=0x7f203419d330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f203a0bb700 1 --2- 
192.168.123.105:0/3568867816 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f203419d330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f203a0bb700 1 -- 192.168.123.105:0/3568867816 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f202800b920 con 0x7f20340ffe00 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f20398ba700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f203419d330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f203a0bb700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f203419cdf0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f2028009e20 tx=0x7f202800bf60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2028010040 con 0x7f20340ffe00 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f20300097e0 con 0x7f20340ffe00 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 
(secure 0 0 0) 0x7f202800e7d0 con 0x7f20340ffe00 2026-03-10T09:01:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f20341a2900 con 0x7f20340ffe00 2026-03-10T09:01:11.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.937+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2028010040 con 0x7f20340ffe00 2026-03-10T09:01:11.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.938+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2034066e80 con 0x7f20340ffe00 2026-03-10T09:01:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.939+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2028004220 con 0x7f20340ffe00 2026-03-10T09:01:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.939+0000 7f20277fe700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2020077990 0x7f2020079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.939+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6251+0+0 (secure 0 0 0) 0x7f202806a590 con 0x7f20340ffe00 2026-03-10T09:01:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.940+0000 7f20398ba700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7f2020077990 0x7f2020079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.940+0000 7f20398ba700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2020077990 0x7f2020079e50 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f203000b5c0 tx=0x7f2030005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:11.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:11.942+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f202805d8d0 con 0x7f20340ffe00 2026-03-10T09:01:12.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.117+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f20341a2be0 con 0x7f20340ffe00 2026-03-10T09:01:12.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.118+0000 7f20277fe700 1 -- 192.168.123.105:0/3568867816 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+463 (secure 0 0 0) 0x7f202805d8d0 con 0x7f20340ffe00 2026-03-10T09:01:12.118 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 194/231 objects degraded (83.983%), 2 pgs degraded, 2 pgs undersized 2026-03-10T09:01:12.118 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 194/231 objects degraded (83.983%), 2 pgs degraded, 2 pgs undersized 2026-03-10T09:01:12.118 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1 is stuck undersized for 3m, current 
state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T09:01:12.118 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is stuck undersized for 3m, current state active+recovering+undersized+degraded+remapped, last acting [2,1] 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.120+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2020077990 msgr2=0x7f2020079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.120+0000 7f203c31f700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2020077990 0x7f2020079e50 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f203000b5c0 tx=0x7f2030005fd0 comp rx=0 tx=0).stop 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.120+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 msgr2=0x7f203419cdf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.120+0000 7f203c31f700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f203419cdf0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f2028009e20 tx=0x7f202800bf60 comp rx=0 tx=0).stop 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 shutdown_connections 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2020077990 0x7f2020079e50 unknown :-1 
s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f20340ffe00 0x7f203419cdf0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 --2- 192.168.123.105:0/3568867816 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2034100f60 0x7f203419d330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:12.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 >> 192.168.123.105:0/3568867816 conn(0x7f20340fb360 msgr2=0x7f2034104220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:12.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 shutdown_connections 2026-03-10T09:01:12.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:12.121+0000 7f203c31f700 1 -- 192.168.123.105:0/3568867816 wait complete. 
2026-03-10T09:01:12.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:11 vm05.local ceph-mon[111630]: from='client.34316 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:12.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:11 vm05.local ceph-mon[111630]: from='client.34320 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:12.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:11 vm05.local ceph-mon[111630]: osdmap e74: 6 total, 6 up, 6 in 2026-03-10T09:01:12.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:11 vm05.local ceph-mon[111630]: from='client.44247 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:12.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:11 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2852713617' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:12.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:11 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1148104437' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:01:12.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:11 vm08.local ceph-mon[101330]: from='client.34316 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:12.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:11 vm08.local ceph-mon[101330]: from='client.34320 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:12.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:11 vm08.local ceph-mon[101330]: osdmap e74: 6 total, 6 up, 6 in 2026-03-10T09:01:12.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:11 vm08.local ceph-mon[101330]: from='client.44247 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:12.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:11 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2852713617' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:12.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:11 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/1148104437' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:01:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:12 vm08.local ceph-mon[101330]: pgmap v167: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 194/231 objects degraded (83.983%); 0 B/s, 9 objects/s recovering 2026-03-10T09:01:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:12 vm08.local ceph-mon[101330]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:12 vm08.local ceph-mon[101330]: osdmap e75: 6 total, 6 up, 6 in 2026-03-10T09:01:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:12 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3568867816' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:01:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:12 vm05.local ceph-mon[111630]: pgmap v167: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 194/231 objects degraded (83.983%); 0 B/s, 9 objects/s recovering 2026-03-10T09:01:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:12 vm05.local ceph-mon[111630]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:12 vm05.local ceph-mon[111630]: osdmap e75: 6 total, 6 up, 6 in 2026-03-10T09:01:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:12 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3568867816' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:01:14.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:14 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 58/231 objects degraded (25.108%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T09:01:14.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:14 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 58/231 objects degraded (25.108%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T09:01:15.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:15 vm08.local ceph-mon[101330]: pgmap v169: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 12 objects/s recovering 2026-03-10T09:01:15.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:15 vm05.local ceph-mon[111630]: pgmap v169: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 12 objects/s recovering 2026-03-10T09:01:16.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:01:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 
2026-03-10T09:01:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:01:17.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:17 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:17 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-10T09:01:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:17 vm08.local ceph-mon[101330]: pgmap v170: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 5 objects/s recovering 2026-03-10T09:01:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:17 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T09:01:17.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:17 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-10T09:01:17.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:17 vm05.local ceph-mon[111630]: pgmap v170: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 5 objects/s recovering 2026-03-10T09:01:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:19 vm08.local ceph-mon[101330]: pgmap v171: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:19 vm05.local ceph-mon[111630]: pgmap v171: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 11 objects/s recovering 2026-03-10T09:01:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:21 vm05.local ceph-mon[111630]: pgmap v172: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 10 objects/s recovering 2026-03-10T09:01:21.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:21 vm08.local ceph-mon[101330]: pgmap v172: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 10 objects/s recovering 2026-03-10T09:01:23.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:23 vm05.local ceph-mon[111630]: pgmap v173: 65 pgs: 1 
active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 9 objects/s recovering 2026-03-10T09:01:23.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:23 vm08.local ceph-mon[101330]: pgmap v173: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 58/231 objects degraded (25.108%); 0 B/s, 9 objects/s recovering 2026-03-10T09:01:24.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:24 vm05.local ceph-mon[111630]: pgmap v174: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 49/231 objects degraded (21.212%); 0 B/s, 8 objects/s recovering 2026-03-10T09:01:24.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:24 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 49/231 objects degraded (21.212%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T09:01:24.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:24 vm08.local ceph-mon[101330]: pgmap v174: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 49/231 objects degraded (21.212%); 0 B/s, 8 objects/s recovering 2026-03-10T09:01:24.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:24 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 49/231 objects degraded (21.212%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T09:01:27.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:26 vm05.local ceph-mon[111630]: pgmap v175: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 49/231 objects degraded (21.212%); 0 B/s, 8 objects/s recovering 2026-03-10T09:01:27.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
09:01:26 vm08.local ceph-mon[101330]: pgmap v175: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 49/231 objects degraded (21.212%); 0 B/s, 8 objects/s recovering
2026-03-10T09:01:29.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:28 vm05.local ceph-mon[111630]: pgmap v176: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2/231 objects degraded (0.866%); 0 B/s, 11 objects/s recovering
2026-03-10T09:01:29.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:28 vm05.local ceph-mon[111630]: osdmap e76: 6 total, 6 up, 6 in
2026-03-10T09:01:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:28 vm08.local ceph-mon[101330]: pgmap v176: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 64 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2/231 objects degraded (0.866%); 0 B/s, 11 objects/s recovering
2026-03-10T09:01:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:28 vm08.local ceph-mon[101330]: osdmap e76: 6 total, 6 up, 6 in
2026-03-10T09:01:30.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:29 vm05.local ceph-mon[111630]: osdmap e77: 6 total, 6 up, 6 in
2026-03-10T09:01:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:29 vm08.local ceph-mon[101330]: osdmap e77: 6 total, 6 up, 6 in
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: pgmap v179: 65 pgs: 65 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 12 objects/s recovering
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 2/231 objects degraded (0.866%), 1 pg degraded, 1 pg undersized)
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: Cluster is now healthy
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: pgmap v179: 65 pgs: 65 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 12 objects/s recovering
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 2/231 objects degraded (0.866%), 1 pg degraded, 1 pg undersized)
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: Cluster is now healthy
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-10T09:01:31.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-10T09:01:32.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T09:01:32.713 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:32 vm05.local systemd[1]: Stopping Ceph osd.1 for 16587ed2-1c5e-11f1-90f6-35051361a039...
2026-03-10T09:01:32.713 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:32 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[76319]: 2026-03-10T09:01:32.359+0000 7fadd8197700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T09:01:32.713 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:32 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[76319]: 2026-03-10T09:01:32.359+0000 7fadd8197700 -1 osd.1 77 *** Got signal Terminated ***
2026-03-10T09:01:32.713 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:32 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[76319]: 2026-03-10T09:01:32.359+0000 7fadd8197700 -1 osd.1 77 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-10T09:01:33.298 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126112]: 2026-03-10 09:01:33.126025611 +0000 UTC m=+0.780599567 container died 902f9ea11f1a067a7aed88be42a8e9942da3518857e947c9059973e9e91e76e0 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, GIT_CLEAN=True, org.label-schema.build-date=20240222, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0)
2026-03-10T09:01:33.298 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126112]: 2026-03-10 09:01:33.150483727 +0000 UTC m=+0.805057683 container remove 902f9ea11f1a067a7aed88be42a8e9942da3518857e947c9059973e9e91e76e0 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD)
2026-03-10T09:01:33.298 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local bash[126112]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1
2026-03-10T09:01:33.298 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:33 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:01:33.298 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:33 vm05.local ceph-mon[111630]: Upgrade: osd.1 is safe to restart
2026-03-10T09:01:33.298 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:33 vm05.local ceph-mon[111630]: Upgrade: Updating osd.1
2026-03-10T09:01:33.298 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:33 vm05.local ceph-mon[111630]: Deploying daemon osd.1 on vm05
2026-03-10T09:01:33.298 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:33 vm05.local ceph-mon[111630]: pgmap v180: 65 pgs: 65 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 6 objects/s recovering
2026-03-10T09:01:33.298 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:33 vm05.local ceph-mon[111630]: osd.1 marked itself down and dead
2026-03-10T09:01:33.551 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.314619327 +0000 UTC m=+0.023361294 container create ae8004ccb5ae36dfcc6b6b4588f8f798809793150966b1741ffa11c8fab5caf0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-10T09:01:33.551 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.356573108 +0000 UTC m=+0.065315075 container init ae8004ccb5ae36dfcc6b6b4588f8f798809793150966b1741ffa11c8fab5caf0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0)
2026-03-10T09:01:33.551 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.363085487 +0000 UTC m=+0.071827454 container start ae8004ccb5ae36dfcc6b6b4588f8f798809793150966b1741ffa11c8fab5caf0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2)
2026-03-10T09:01:33.551 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.364400468 +0000 UTC m=+0.073142435 container attach ae8004ccb5ae36dfcc6b6b4588f8f798809793150966b1741ffa11c8fab5caf0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223)
2026-03-10T09:01:33.551 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.302084046 +0000 UTC m=+0.010826013 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T09:01:33.551 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.51960363 +0000 UTC m=+0.228345597 container died ae8004ccb5ae36dfcc6b6b4588f8f798809793150966b1741ffa11c8fab5caf0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T09:01:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:33 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T09:01:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:33 vm08.local ceph-mon[101330]: Upgrade: osd.1 is safe to restart
2026-03-10T09:01:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:33 vm08.local ceph-mon[101330]: Upgrade: Updating osd.1
2026-03-10T09:01:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:33 vm08.local ceph-mon[101330]: Deploying daemon osd.1 on vm05
2026-03-10T09:01:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:33 vm08.local ceph-mon[101330]: pgmap v180: 65 pgs: 65 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 6 objects/s recovering
2026-03-10T09:01:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:33 vm08.local ceph-mon[101330]: osd.1 marked itself down and dead
2026-03-10T09:01:33.804 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126176]: 2026-03-10 09:01:33.551224392 +0000 UTC m=+0.259966359 container remove ae8004ccb5ae36dfcc6b6b4588f8f798809793150966b1741ffa11c8fab5caf0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223)
2026-03-10T09:01:33.804 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.1.service: Deactivated successfully.
2026-03-10T09:01:33.804 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local systemd[1]: Stopped Ceph osd.1 for 16587ed2-1c5e-11f1-90f6-35051361a039.
2026-03-10T09:01:33.804 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.1.service: Consumed 51.833s CPU time.
2026-03-10T09:01:33.804 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local systemd[1]: Starting Ceph osd.1 for 16587ed2-1c5e-11f1-90f6-35051361a039...
2026-03-10T09:01:34.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-mon[111630]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-10T09:01:34.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-mon[111630]: osdmap e78: 6 total, 5 up, 6 in
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126291]: 2026-03-10 09:01:33.916884278 +0000 UTC m=+0.024342521 container create 2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126291]: 2026-03-10 09:01:33.958134614 +0000 UTC m=+0.065592857 container init 2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126291]: 2026-03-10 09:01:33.961837794 +0000 UTC m=+0.069296037 container start 2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:33 vm05.local podman[126291]: 2026-03-10 09:01:33.96742415 +0000 UTC m=+0.074882393 container attach 2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2)
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local podman[126291]: 2026-03-10 09:01:33.907413703 +0000 UTC m=+0.014871946 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.213 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:34 vm08.local ceph-mon[101330]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-10T09:01:34.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:34 vm08.local ceph-mon[101330]: osdmap e78: 6 total, 5 up, 6 in
2026-03-10T09:01:34.842 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T09:01:34.842 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-222b2d67-efe4-4188-8287-f0c9aa37f4a6/osd-block-65d8a731-173e-4188-b03d-f0602d504870 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-222b2d67-efe4-4188-8287-f0c9aa37f4a6/osd-block-65d8a731-173e-4188-b03d-f0602d504870 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/ln -snf /dev/ceph-222b2d67-efe4-4188-8287-f0c9aa37f4a6/osd-block-65d8a731-173e-4188-b03d-f0602d504870 /var/lib/ceph/osd/ceph-1/block
2026-03-10T09:01:34.843 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/ln -snf /dev/ceph-222b2d67-efe4-4188-8287-f0c9aa37f4a6/osd-block-65d8a731-173e-4188-b03d-f0602d504870 /var/lib/ceph/osd/ceph-1/block
2026-03-10T09:01:35.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:35 vm05.local ceph-mon[111630]: pgmap v182: 65 pgs: 12 stale+active+clean, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 0 objects/s recovering
2026-03-10T09:01:35.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:35 vm05.local ceph-mon[111630]: osdmap e79: 6 total, 5 up, 6 in
2026-03-10T09:01:35.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:35.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:35.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate[126305]: --> ceph-volume lvm activate successful for osd ID: 1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local bash[126291]: --> ceph-volume lvm activate successful for osd ID: 1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local conmon[126305]: conmon 2f1b3d2a8a58c64b08ef : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0.scope/container/memory.events
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local podman[126291]: 2026-03-10 09:01:34.874904148 +0000 UTC m=+0.982362391 container died 2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local podman[126291]: 2026-03-10 09:01:34.895947693 +0000 UTC m=+1.003405936 container remove 2f1b3d2a8a58c64b08ef5965358c5d47fe0927b72d116f062620589c73b947a0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, OSD_FLAVOR=default)
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:34 vm05.local podman[126552]: 2026-03-10 09:01:34.980470148 +0000 UTC m=+0.015258880 container create 306e95bddd95de9d6fe6ebc844a29d0c146373ef3a7c7c9618df561806cacae1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0)
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:35 vm05.local podman[126552]: 2026-03-10 09:01:35.01756744 +0000 UTC m=+0.052356172 container init 306e95bddd95de9d6fe6ebc844a29d0c146373ef3a7c7c9618df561806cacae1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:35 vm05.local podman[126552]: 2026-03-10 09:01:35.020693121 +0000 UTC m=+0.055481842 container start 306e95bddd95de9d6fe6ebc844a29d0c146373ef3a7c7c9618df561806cacae1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:35 vm05.local bash[126552]: 306e95bddd95de9d6fe6ebc844a29d0c146373ef3a7c7c9618df561806cacae1
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:35 vm05.local podman[126552]: 2026-03-10 09:01:34.974201605 +0000 UTC m=+0.008990347 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T09:01:35.214 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:35 vm05.local systemd[1]: Started Ceph osd.1 for 16587ed2-1c5e-11f1-90f6-35051361a039.
2026-03-10T09:01:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:35 vm08.local ceph-mon[101330]: pgmap v182: 65 pgs: 12 stale+active+clean, 53 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 0 objects/s recovering
2026-03-10T09:01:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:35 vm08.local ceph-mon[101330]: osdmap e79: 6 total, 5 up, 6 in
2026-03-10T09:01:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:35.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T09:01:36.114 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:36 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:01:36.108+0000 7fe21bfc2740 -1 Falling back to public interface
2026-03-10T09:01:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:36 vm05.local ceph-mon[111630]: pgmap v184: 65 pgs: 5 peering, 9 stale+active+clean, 51 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail
2026-03-10T09:01:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc'
2026-03-10T09:01:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:36 vm05.local
ceph-mon[111630]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T09:01:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:37.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:36 vm08.local ceph-mon[101330]: pgmap v184: 65 pgs: 5 peering, 9 stale+active+clean, 51 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:37.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:37.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:36 vm08.local ceph-mon[101330]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T09:01:37.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:37.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:38.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:37 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:38.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:37 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:38.311 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:37 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T09:01:38.312 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:37 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: pgmap v185: 65 pgs: 18 active+undersized, 5 peering, 11 active+undersized+degraded, 31 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27/231 objects degraded (11.688%) 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: Health check failed: Degraded data redundancy: 27/231 objects degraded (11.688%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: pgmap v185: 65 pgs: 18 active+undersized, 5 peering, 11 active+undersized+degraded, 31 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27/231 objects degraded (11.688%) 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: Health check failed: Degraded data redundancy: 27/231 objects degraded (11.688%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": 
"config generate-minimal-conf"}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:39.995 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:39 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:01:39.724+0000 7fe21bfc2740 -1 osd.1 0 read_superblock omap replica is missing. 
2026-03-10T09:01:40.291 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:39 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:01:39.994+0000 7fe21bfc2740 -1 osd.1 77 log_to_monitors true 2026-03-10T09:01:40.291 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:40 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:40.291 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:40 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T09:01:40.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:40 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:40.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:40 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T09:01:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:41 vm08.local ceph-mon[101330]: pgmap v186: 65 pgs: 22 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:41.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:41 vm08.local ceph-mon[101330]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T09:01:41.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:41 vm08.local ceph-mon[101330]: from='osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T09:01:41.462 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:01:41 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:01:41.031+0000 7fe213d5c640 
-1 osd.1 77 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T09:01:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:41 vm05.local ceph-mon[111630]: pgmap v186: 65 pgs: 22 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:41 vm05.local ceph-mon[111630]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T09:01:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:41 vm05.local ceph-mon[111630]: from='osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T09:01:42.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.186+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/356172061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 msgr2=0x7f9d681035a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.186+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/356172061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d681035a0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f9d50009b00 tx=0x7f9d50009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.187+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/356172061 shutdown_connections 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.187+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/356172061 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 0x7f9d681047e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.187+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/356172061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d681035a0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.187+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/356172061 >> 192.168.123.105:0/356172061 conn(0x7f9d680fe720 msgr2=0x7f9d68100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.187+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/356172061 shutdown_connections 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/356172061 wait complete. 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 Processor -- start 2026-03-10T09:01:42.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 -- start start 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d68078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 0x7f9d68079080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d68075630 con 0x7f9d68103180 2026-03-10T09:01:42.189 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.188+0000 7f9d6e5b9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d680757a0 con 0x7f9d68104380 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d677fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 0x7f9d68079080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d677fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 0x7f9d68079080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53030/0 (socket says 192.168.123.105:53030) 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d677fe700 1 -- 192.168.123.105:0/3001244528 learned_addr learned my addr 192.168.123.105:0/3001244528 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d67fff700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d68078b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d67fff700 1 -- 192.168.123.105:0/3001244528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 msgr2=0x7f9d68079080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d67fff700 1 --2- 
192.168.123.105:0/3001244528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 0x7f9d68079080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d67fff700 1 -- 192.168.123.105:0/3001244528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d500097e0 con 0x7f9d68103180 2026-03-10T09:01:42.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.189+0000 7f9d67fff700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d68078b40 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f9d50009ad0 tx=0x7f9d50004c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.190+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d5001d070 con 0x7f9d68103180 2026-03-10T09:01:42.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.190+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d50004500 con 0x7f9d68103180 2026-03-10T09:01:42.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.190+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d5000f460 con 0x7f9d68103180 2026-03-10T09:01:42.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.190+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d68075a20 con 0x7f9d68103180 2026-03-10T09:01:42.191 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.190+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d68075e80 con 0x7f9d68103180 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.191+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9d5000f5c0 con 0x7f9d68103180 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.191+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d68066e80 con 0x7f9d68103180 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.193+0000 7f9d657fa700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d54077910 0x7f9d54079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.193+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f9d5009af40 con 0x7f9d68103180 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.195+0000 7f9d677fe700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d54077910 0x7f9d54079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.195+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9d50064850 con 0x7f9d68103180 2026-03-10T09:01:42.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.195+0000 7f9d677fe700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d54077910 0x7f9d54079dd0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f9d58009cc0 tx=0x7f9d58009480 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:42 vm08.local ceph-mon[101330]: from='osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T09:01:42.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:42 vm08.local ceph-mon[101330]: osdmap e80: 6 total, 5 up, 6 in 2026-03-10T09:01:42.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:42 vm08.local ceph-mon[101330]: from='osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T09:01:42.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.324+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9d681a2ed0 con 0x7f9d54077910 2026-03-10T09:01:42.325 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:42 vm05.local ceph-mon[111630]: from='osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T09:01:42.325 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:42 vm05.local ceph-mon[111630]: osdmap e80: 6 total, 5 up, 6 in 2026-03-10T09:01:42.325 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:42 vm05.local ceph-mon[111630]: from='osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T09:01:42.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.326+0000 7f9d657fa700 1 -- 192.168.123.105:0/3001244528 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9d681a2ed0 con 0x7f9d54077910 2026-03-10T09:01:42.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d54077910 msgr2=0x7f9d54079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d54077910 0x7f9d54079dd0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f9d58009cc0 tx=0x7f9d58009480 comp rx=0 tx=0).stop 2026-03-10T09:01:42.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 msgr2=0x7f9d68078b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d68078b40 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto 
rx=0x7f9d50009ad0 tx=0x7f9d50004c70 comp rx=0 tx=0).stop 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 shutdown_connections 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9d54077910 0x7f9d54079dd0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9d68103180 0x7f9d68078b40 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.329+0000 7f9d6e5b9700 1 --2- 192.168.123.105:0/3001244528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9d68104380 0x7f9d68079080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.330+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 >> 192.168.123.105:0/3001244528 conn(0x7f9d680fe720 msgr2=0x7f9d681075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.330+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 shutdown_connections 2026-03-10T09:01:42.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.330+0000 7f9d6e5b9700 1 -- 192.168.123.105:0/3001244528 wait complete. 
2026-03-10T09:01:42.339 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:01:42.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 -- 192.168.123.105:0/364453627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0102060 msgr2=0x7f98d01024e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 --2- 192.168.123.105:0/364453627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0102060 0x7f98d01024e0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f98c0009a60 tx=0x7f98c0009d70 comp rx=0 tx=0).stop 2026-03-10T09:01:42.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 -- 192.168.123.105:0/364453627 shutdown_connections 2026-03-10T09:01:42.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 --2- 192.168.123.105:0/364453627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0102060 0x7f98d01024e0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 --2- 192.168.123.105:0/364453627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0100f00 0x7f98d0101320 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 -- 192.168.123.105:0/364453627 >> 192.168.123.105:0/364453627 conn(0x7f98d00fc460 msgr2=0x7f98d00fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 -- 192.168.123.105:0/364453627 shutdown_connections 2026-03-10T09:01:42.397 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.396+0000 7f98d4dce700 1 -- 192.168.123.105:0/364453627 wait complete. 2026-03-10T09:01:42.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.397+0000 7f98d4dce700 1 Processor -- start 2026-03-10T09:01:42.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.397+0000 7f98d4dce700 1 -- start start 2026-03-10T09:01:42.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.397+0000 7f98d4dce700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0100f00 0x7f98d01989b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.397+0000 7f98d4dce700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 0x7f98d0198ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.397+0000 7f98d4dce700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98d0199510 con 0x7f98d0102060 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.397+0000 7f98d4dce700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98d0199650 con 0x7f98d0100f00 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 0x7f98d0198ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 0x7f98d0198ef0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45682/0 (socket says 192.168.123.105:45682) 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 -- 192.168.123.105:0/1548016870 learned_addr learned my addr 192.168.123.105:0/1548016870 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98ce59c700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0100f00 0x7f98d01989b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 -- 192.168.123.105:0/1548016870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0100f00 msgr2=0x7f98d01989b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0100f00 0x7f98d01989b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 -- 192.168.123.105:0/1548016870 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98c0009710 con 0x7f98d0102060 2026-03-10T09:01:42.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.398+0000 7f98cdd9b700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 0x7f98d0198ef0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto 
rx=0x7f98c00096a0 tx=0x7f98c000f880 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.399+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98c001d070 con 0x7f98d0102060 2026-03-10T09:01:42.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.399+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f98c000fe60 con 0x7f98d0102060 2026-03-10T09:01:42.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.399+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f98d019e0a0 con 0x7f98d0102060 2026-03-10T09:01:42.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.399+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98c00177d0 con 0x7f98d0102060 2026-03-10T09:01:42.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.399+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f98d019e590 con 0x7f98d0102060 2026-03-10T09:01:42.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.400+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f98d01061d0 con 0x7f98d0102060 2026-03-10T09:01:42.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.400+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7f98c0021bf0 con 0x7f98d0102060 2026-03-10T09:01:42.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.402+0000 7f98bf7fe700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f98b8077870 0x7f98b8079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.402+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f98c009ad10 con 0x7f98d0102060 2026-03-10T09:01:42.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.402+0000 7f98ce59c700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f98b8077870 0x7f98b8079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.403+0000 7f98ce59c700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f98b8077870 0x7f98b8079d30 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f98c4005fd0 tx=0x7f98c4009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.405+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98c00635a0 con 0x7f98d0102060 2026-03-10T09:01:42.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.534+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f98d00611d0 con 0x7f98b8077870 2026-03-10T09:01:42.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.536+0000 7f98bf7fe700 1 -- 192.168.123.105:0/1548016870 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f98d00611d0 con 0x7f98b8077870 2026-03-10T09:01:42.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.538+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f98b8077870 msgr2=0x7f98b8079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.539+0000 7f98d4dce700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f98b8077870 0x7f98b8079d30 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f98c4005fd0 tx=0x7f98c4009500 comp rx=0 tx=0).stop 2026-03-10T09:01:42.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.539+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 msgr2=0x7f98d0198ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.539+0000 7f98d4dce700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 0x7f98d0198ef0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f98c00096a0 tx=0x7f98c000f880 comp rx=0 tx=0).stop 2026-03-10T09:01:42.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 shutdown_connections 2026-03-10T09:01:42.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 
--2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f98b8077870 0x7f98b8079d30 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f98d0100f00 0x7f98d01989b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 --2- 192.168.123.105:0/1548016870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f98d0102060 0x7f98d0198ef0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 >> 192.168.123.105:0/1548016870 conn(0x7f98d00fc460 msgr2=0x7f98d0103310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 shutdown_connections 2026-03-10T09:01:42.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.540+0000 7f98d4dce700 1 -- 192.168.123.105:0/1548016870 wait complete. 
2026-03-10T09:01:42.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.605+0000 7f05fddac700 1 -- 192.168.123.105:0/4057994231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 msgr2=0x7f05f8100560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.605+0000 7f05fddac700 1 --2- 192.168.123.105:0/4057994231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8100560 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f05e0009b50 tx=0x7f05e0009e60 comp rx=0 tx=0).stop 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.607+0000 7f05fddac700 1 -- 192.168.123.105:0/4057994231 shutdown_connections 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.607+0000 7f05fddac700 1 --2- 192.168.123.105:0/4057994231 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8100560 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.607+0000 7f05fddac700 1 --2- 192.168.123.105:0/4057994231 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05f80ff780 0x7f05f80ffba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.607+0000 7f05fddac700 1 -- 192.168.123.105:0/4057994231 >> 192.168.123.105:0/4057994231 conn(0x7f05f80fb380 msgr2=0x7f05f80fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.608+0000 7f05fddac700 1 -- 192.168.123.105:0/4057994231 shutdown_connections 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.608+0000 7f05fddac700 1 -- 192.168.123.105:0/4057994231 
wait complete. 2026-03-10T09:01:42.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.608+0000 7f05fddac700 1 Processor -- start 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.608+0000 7f05fddac700 1 -- start start 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05fddac700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05f80ff780 0x7f05f81989e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05fddac700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8198f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05fddac700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05f8199540 con 0x7f05f81000e0 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05fddac700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05f8199680 con 0x7f05f80ff780 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8198f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8198f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:45692/0 (socket says 192.168.123.105:45692) 2026-03-10T09:01:42.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 -- 192.168.123.105:0/2916110575 learned_addr learned my addr 192.168.123.105:0/2916110575 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:42.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 -- 192.168.123.105:0/2916110575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05f80ff780 msgr2=0x7f05f81989e0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:01:42.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05f80ff780 0x7f05f81989e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 -- 192.168.123.105:0/2916110575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05e00097e0 con 0x7f05f81000e0 2026-03-10T09:01:42.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f6ffd700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8198f20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f05e0006010 tx=0x7f05e000b890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.609+0000 7f05f4ff9700 1 -- 192.168.123.105:0/2916110575 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05e001d070 con 0x7f05f81000e0 2026-03-10T09:01:42.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.610+0000 7f05f4ff9700 1 -- 
192.168.123.105:0/2916110575 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f05e000bbe0 con 0x7f05f81000e0 2026-03-10T09:01:42.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.610+0000 7f05f4ff9700 1 -- 192.168.123.105:0/2916110575 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05e000f870 con 0x7f05f81000e0 2026-03-10T09:01:42.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.610+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05f819e0d0 con 0x7f05f81000e0 2026-03-10T09:01:42.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.610+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05f8101cf0 con 0x7f05f81000e0 2026-03-10T09:01:42.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.611+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f05f8066e80 con 0x7f05f81000e0 2026-03-10T09:01:42.615 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.615+0000 7f05f4ff9700 1 -- 192.168.123.105:0/2916110575 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f05e000bd50 con 0x7f05f81000e0 2026-03-10T09:01:42.615 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.615+0000 7f05f4ff9700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f05e40779e0 0x7f05e4079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.615+0000 7f05f77fe700 1 --2- 
192.168.123.105:0/2916110575 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f05e40779e0 0x7f05e4079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.616+0000 7f05f4ff9700 1 -- 192.168.123.105:0/2916110575 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f05e009bcb0 con 0x7f05f81000e0 2026-03-10T09:01:42.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.616+0000 7f05f4ff9700 1 -- 192.168.123.105:0/2916110575 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f05e009c130 con 0x7f05f81000e0 2026-03-10T09:01:42.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.616+0000 7f05f77fe700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f05e40779e0 0x7f05e4079ea0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f05e8009ce0 tx=0x7f05e8009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.732+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f05f81094d0 con 0x7f05e40779e0 2026-03-10T09:01:42.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.737+0000 7f05f4ff9700 1 -- 192.168.123.105:0/2916110575 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f05f81094d0 con 0x7f05e40779e0 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST 
PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (10m) 6s ago 10m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (10m) 6s ago 10m 9508k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (10m) 4m ago 10m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (4m) 6s ago 10m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (4m) 4m ago 10m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (10m) 6s ago 10m 88.6M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (8m) 6s ago 8m 176M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (8m) 6s ago 8m 19.4M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (8m) 4m ago 8m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (8m) 4m ago 8m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (5m) 6s ago 11m 620M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (5m) 4m ago 10m 491M - 19.2.3-678-ge911bdeb 
654f31e6858e 867781ac9d98 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 6s ago 11m 62.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 4m ago 10m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (10m) 6s ago 10m 14.6M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (10m) 4m ago 10m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (4m) 6s ago 9m 214M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (7s) 6s ago 9m 14.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (9m) 6s ago 9m 371M 4096M 18.2.1 5be31c24972a 32c0be0f86f2 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (9m) 4m ago 9m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (9m) 4m ago 9m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (9m) 4m ago 9m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T09:01:42.738 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (5m) 6s ago 10m 65.7M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.738+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f05e40779e0 msgr2=0x7f05e4079ea0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.738+0000 7f05fddac700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f05e40779e0 0x7f05e4079ea0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f05e8009ce0 tx=0x7f05e8009450 comp rx=0 tx=0).stop 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 msgr2=0x7f05f8198f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8198f20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f05e0006010 tx=0x7f05e000b890 comp rx=0 tx=0).stop 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 shutdown_connections 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f05e40779e0 0x7f05e4079ea0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 --2- 192.168.123.105:0/2916110575 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05f80ff780 0x7f05f81989e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 --2- 
192.168.123.105:0/2916110575 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05f81000e0 0x7f05f8198f20 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 >> 192.168.123.105:0/2916110575 conn(0x7f05f80fb380 msgr2=0x7f05f8107db0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 shutdown_connections 2026-03-10T09:01:42.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.739+0000 7f05fddac700 1 -- 192.168.123.105:0/2916110575 wait complete. 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 -- 192.168.123.105:0/3899745081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4103120 msgr2=0x7f24e4103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 --2- 192.168.123.105:0/3899745081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4103120 0x7f24e4103540 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f24cc009b00 tx=0x7f24cc009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 -- 192.168.123.105:0/3899745081 shutdown_connections 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 --2- 192.168.123.105:0/3899745081 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e4104320 0x7f24e4104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 
7f24ea715700 1 --2- 192.168.123.105:0/3899745081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4103120 0x7f24e4103540 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 -- 192.168.123.105:0/3899745081 >> 192.168.123.105:0/3899745081 conn(0x7f24e40fe6c0 msgr2=0x7f24e4100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 -- 192.168.123.105:0/3899745081 shutdown_connections 2026-03-10T09:01:42.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.802+0000 7f24ea715700 1 -- 192.168.123.105:0/3899745081 wait complete. 2026-03-10T09:01:42.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.803+0000 7f24ea715700 1 Processor -- start 2026-03-10T09:01:42.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.803+0000 7f24ea715700 1 -- start start 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.803+0000 7f24ea715700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4104320 0x7f24e4198c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.803+0000 7f24ea715700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 0x7f24e419e220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.803+0000 7f24ea715700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24e4199670 con 0x7f24e4104320 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.804+0000 7f24ea715700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24e41997e0 con 0x7f24e41991c0 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.804+0000 7f24e3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4104320 0x7f24e4198c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.804+0000 7f24dbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 0x7f24e419e220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.804+0000 7f24dbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 0x7f24e419e220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53088/0 (socket says 192.168.123.105:53088) 2026-03-10T09:01:42.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.804+0000 7f24dbfff700 1 -- 192.168.123.105:0/302118715 learned_addr learned my addr 192.168.123.105:0/302118715 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:42.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.805+0000 7f24dbfff700 1 -- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4104320 msgr2=0x7f24e4198c80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.805+0000 7f24dbfff700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4104320 0x7f24e4198c80 unknown 
:-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.805+0000 7f24dbfff700 1 -- 192.168.123.105:0/302118715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24cc0097e0 con 0x7f24e41991c0 2026-03-10T09:01:42.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.805+0000 7f24dbfff700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 0x7f24e419e220 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f24d400d900 tx=0x7f24d400dc10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.805+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f24d40049e0 con 0x7f24e41991c0 2026-03-10T09:01:42.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.806+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24e419e7c0 con 0x7f24e41991c0 2026-03-10T09:01:42.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.806+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f24e419ece0 con 0x7f24e41991c0 2026-03-10T09:01:42.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.806+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f24d4005500 con 0x7f24e41991c0 2026-03-10T09:01:42.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.806+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 
v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f24d4009d40 con 0x7f24e41991c0 2026-03-10T09:01:42.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.807+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f24d4005020 con 0x7f24e41991c0 2026-03-10T09:01:42.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.808+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f24e4066e80 con 0x7f24e41991c0 2026-03-10T09:01:42.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.808+0000 7f24e1ffb700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24d00778c0 0x7f24d0079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:42.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.808+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f24d40998c0 con 0x7f24e41991c0 2026-03-10T09:01:42.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.808+0000 7f24e3fff700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24d00778c0 0x7f24d0079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:42.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.809+0000 7f24e3fff700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24d00778c0 0x7f24d0079d80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f24cc000c00 
tx=0x7f24cc005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:42.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.811+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f24d4062010 con 0x7f24e41991c0 2026-03-10T09:01:42.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.971+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f24e419f0b0 con 0x7f24e41991c0 2026-03-10T09:01:42.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.972+0000 7f24e1ffb700 1 -- 192.168.123.105:0/302118715 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f24d4061760 con 0x7f24e41991c0 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4, 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 
19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 8, 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:01:42.973 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:01:42.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.974+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24d00778c0 msgr2=0x7f24d0079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.974+0000 7f24ea715700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24d00778c0 0x7f24d0079d80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f24cc000c00 tx=0x7f24cc005fb0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.974+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 msgr2=0x7f24e419e220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:42.975 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.974+0000 7f24ea715700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 0x7f24e419e220 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f24d400d900 tx=0x7f24d400dc10 comp rx=0 tx=0).stop 2026-03-10T09:01:42.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 shutdown_connections 2026-03-10T09:01:42.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24d00778c0 0x7f24d0079d80 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e4104320 0x7f24e4198c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 --2- 192.168.123.105:0/302118715 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e41991c0 0x7f24e419e220 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:42.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 >> 192.168.123.105:0/302118715 conn(0x7f24e40fe6c0 msgr2=0x7f24e4107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:42.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 -- 192.168.123.105:0/302118715 shutdown_connections 2026-03-10T09:01:42.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:42.975+0000 7f24ea715700 1 -- 
192.168.123.105:0/302118715 wait complete. 2026-03-10T09:01:43.051 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.051+0000 7ff0dabe8700 1 -- 192.168.123.105:0/3821153517 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 msgr2=0x7ff0d410a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.051+0000 7ff0dabe8700 1 --2- 192.168.123.105:0/3821153517 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d410a1c0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7ff0d0009b00 tx=0x7ff0d0009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:43.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.052+0000 7ff0dabe8700 1 -- 192.168.123.105:0/3821153517 shutdown_connections 2026-03-10T09:01:43.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.052+0000 7ff0dabe8700 1 --2- 192.168.123.105:0/3821153517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0d410a700 0x7ff0d410cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.052+0000 7ff0dabe8700 1 --2- 192.168.123.105:0/3821153517 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d410a1c0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.052+0000 7ff0dabe8700 1 -- 192.168.123.105:0/3821153517 >> 192.168.123.105:0/3821153517 conn(0x7ff0d406dae0 msgr2=0x7ff0d406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.052+0000 7ff0dabe8700 1 -- 192.168.123.105:0/3821153517 shutdown_connections 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.052+0000 
7ff0dabe8700 1 -- 192.168.123.105:0/3821153517 wait complete. 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0dabe8700 1 Processor -- start 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0dabe8700 1 -- start start 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0dabe8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d4116d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0dabe8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0d410a700 0x7ff0d41172c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0dabe8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0d4117930 con 0x7ff0d410a700 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0dabe8700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff0d41b33d0 con 0x7ff0d4107d90 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0d8984700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d4116d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0d8984700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d4116d80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53096/0 (socket says 192.168.123.105:53096) 2026-03-10T09:01:43.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0d8984700 1 -- 192.168.123.105:0/2367930175 learned_addr learned my addr 192.168.123.105:0/2367930175 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:43.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0d8984700 1 -- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0d410a700 msgr2=0x7ff0d41172c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0d8984700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0d410a700 0x7ff0d41172c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.053+0000 7ff0d8984700 1 -- 192.168.123.105:0/2367930175 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff0d00097e0 con 0x7ff0d4107d90 2026-03-10T09:01:43.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.054+0000 7ff0d8984700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d4116d80 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7ff0d00038c0 tx=0x7ff0d00039a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:43.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.054+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff0d001d070 con 0x7ff0d4107d90 2026-03-10T09:01:43.055 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.054+0000 7ff0dabe8700 1 -- 192.168.123.105:0/2367930175 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff0d41b3570 con 0x7ff0d4107d90 2026-03-10T09:01:43.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.054+0000 7ff0dabe8700 1 -- 192.168.123.105:0/2367930175 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff0d41b3a60 con 0x7ff0d4107d90 2026-03-10T09:01:43.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.054+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff0d0003d00 con 0x7ff0d4107d90 2026-03-10T09:01:43.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.055+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff0d0021920 con 0x7ff0d4107d90 2026-03-10T09:01:43.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.055+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff0c0005320 con 0x7ff0d4107d90 2026-03-10T09:01:43.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.055+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff0d000f4e0 con 0x7ff0d4107d90 2026-03-10T09:01:43.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.056+0000 7ff0c9ffb700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff0bc0776c0 0x7ff0bc079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.056 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.056+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(82..82 src has 1..82) v4 ==== 6222+0+0 (secure 0 0 0) 0x7ff0d009ae10 con 0x7ff0d4107d90 2026-03-10T09:01:43.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.056+0000 7ff0cbfff700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff0bc0776c0 0x7ff0bc079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.057+0000 7ff0cbfff700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff0bc0776c0 0x7ff0bc079b80 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7ff0d41aefa0 tx=0x7ff0c4005c80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:43.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.059+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff0d00635f0 con 0x7ff0d4107d90 2026-03-10T09:01:43.204 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.204+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ff0c0006200 con 0x7ff0d4107d90 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.205+0000 7ff0c9ffb700 1 -- 192.168.123.105:0/2367930175 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7ff0d0062d40 con 0x7ff0d4107d90 
2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:01:43.205 
INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:01:43.205 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 
2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:43.206 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.206+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff0bc0776c0 msgr2=0x7ff0bc079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.206+0000 7ff0bb7fe700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff0bc0776c0 0x7ff0bc079b80 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7ff0d41aefa0 tx=0x7ff0c4005c80 comp rx=0 tx=0).stop 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.206+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 msgr2=0x7ff0d4116d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.206+0000 7ff0bb7fe700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d4116d80 
secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7ff0d00038c0 tx=0x7ff0d00039a0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 shutdown_connections 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff0bc0776c0 0x7ff0bc079b80 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff0d4107d90 0x7ff0d4116d80 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 --2- 192.168.123.105:0/2367930175 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff0d410a700 0x7ff0d41172c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 >> 192.168.123.105:0/2367930175 conn(0x7ff0d406dae0 msgr2=0x7ff0d406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 shutdown_connections 2026-03-10T09:01:43.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.207+0000 7ff0bb7fe700 1 -- 192.168.123.105:0/2367930175 wait complete. 
2026-03-10T09:01:43.208 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:01:43.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:43 vm05.local ceph-mon[111630]: pgmap v188: 65 pgs: 22 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:43.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:43 vm05.local ceph-mon[111630]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:01:43.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:43 vm05.local ceph-mon[111630]: osd.1 [v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784] boot 2026-03-10T09:01:43.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:43 vm05.local ceph-mon[111630]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T09:01:43.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:43 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T09:01:43.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:43 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/302118715' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:43.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.272+0000 7fd6cfef2700 1 -- 192.168.123.105:0/3763014372 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80ff520 msgr2=0x7fd6c8106700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.272+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/3763014372 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80ff520 0x7fd6c8106700 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fd6c4009b00 tx=0x7fd6c4009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:43.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.275+0000 7fd6cfef2700 1 -- 192.168.123.105:0/3763014372 shutdown_connections 2026-03-10T09:01:43.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.275+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/3763014372 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80ff520 0x7fd6c8106700 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.275+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/3763014372 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80feb30 0x7fd6c80fef50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.275+0000 7fd6cfef2700 1 -- 192.168.123.105:0/3763014372 >> 192.168.123.105:0/3763014372 conn(0x7fd6c80fa6d0 msgr2=0x7fd6c80fcb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:43.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.277+0000 7fd6cfef2700 1 -- 192.168.123.105:0/3763014372 shutdown_connections 2026-03-10T09:01:43.277 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.277+0000 7fd6cfef2700 1 -- 192.168.123.105:0/3763014372 wait complete. 2026-03-10T09:01:43.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cfef2700 1 Processor -- start 2026-03-10T09:01:43.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cfef2700 1 -- start start 2026-03-10T09:01:43.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cfef2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80feb30 0x7fd6c8194690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cfef2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 0x7fd6c8194bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cfef2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6c81951f0 con 0x7fd6c80feb30 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cfef2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6c8195330 con 0x7fd6c80ff520 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.278+0000 7fd6cd48d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 0x7fd6c8194bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6cd48d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 0x7fd6c8194bd0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53114/0 (socket says 192.168.123.105:53114) 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6cd48d700 1 -- 192.168.123.105:0/1959228264 learned_addr learned my addr 192.168.123.105:0/1959228264 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6cd48d700 1 -- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80feb30 msgr2=0x7fd6c8194690 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6cd48d700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80feb30 0x7fd6c8194690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6cd48d700 1 -- 192.168.123.105:0/1959228264 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6c40097e0 con 0x7fd6c80ff520 2026-03-10T09:01:43.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6cd48d700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 0x7fd6c8194bd0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fd6c40052a0 tx=0x7fd6c4004b10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:43.280 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.279+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6c401d070 con 
0x7fd6c80ff520 2026-03-10T09:01:43.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.281+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd6c400bd10 con 0x7fd6c80ff520 2026-03-10T09:01:43.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.281+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6c400f830 con 0x7fd6c80ff520 2026-03-10T09:01:43.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.281+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6c8199d80 con 0x7fd6c80ff520 2026-03-10T09:01:43.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.281+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6c819a240 con 0x7fd6c80ff520 2026-03-10T09:01:43.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.283+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6c818e860 con 0x7fd6c80ff520 2026-03-10T09:01:43.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.284+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd6c400f990 con 0x7fd6c80ff520 2026-03-10T09:01:43.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.284+0000 7fd6baffd700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd6b407bb70 0x7fd6b407e030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T09:01:43.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.284+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(82..82 src has 1..82) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fd6c409b550 con 0x7fd6c80ff520 2026-03-10T09:01:43.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.285+0000 7fd6cdc8e700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd6b407bb70 0x7fd6b407e030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.286+0000 7fd6cdc8e700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd6b407bb70 0x7fd6b407e030 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fd6c81004a0 tx=0x7fd6bc006d20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:43.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.287+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd6c4063db0 con 0x7fd6c80ff520 2026-03-10T09:01:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:43 vm08.local ceph-mon[101330]: pgmap v188: 65 pgs: 22 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:43 vm08.local ceph-mon[101330]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:01:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:43 vm08.local ceph-mon[101330]: osd.1 
[v2:192.168.123.105:6810/1997326784,v1:192.168.123.105:6811/1997326784] boot 2026-03-10T09:01:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:43 vm08.local ceph-mon[101330]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T09:01:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:43 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T09:01:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:43 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/302118715' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.416+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd6c80611d0 con 0x7fd6b407bb70 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.417+0000 7fd6baffd700 1 -- 192.168.123.105:0/1959228264 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fd6c80611d0 con 0x7fd6b407bb70 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 
2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "8/23 daemons upgraded", 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:01:43.417 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.419+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd6b407bb70 msgr2=0x7fd6b407e030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.419+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd6b407bb70 0x7fd6b407e030 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fd6c81004a0 tx=0x7fd6bc006d20 comp rx=0 tx=0).stop 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.419+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 msgr2=0x7fd6c8194bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.420+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 0x7fd6c8194bd0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fd6c40052a0 tx=0x7fd6c4004b10 comp rx=0 tx=0).stop 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.420+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 shutdown_connections 
2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.420+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd6b407bb70 0x7fd6b407e030 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.420+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd6c80feb30 0x7fd6c8194690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.420+0000 7fd6cfef2700 1 --2- 192.168.123.105:0/1959228264 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd6c80ff520 0x7fd6c8194bd0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.420+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 >> 192.168.123.105:0/1959228264 conn(0x7fd6c80fa6d0 msgr2=0x7fd6c8104f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:43.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.421+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 shutdown_connections 2026-03-10T09:01:43.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.421+0000 7fd6cfef2700 1 -- 192.168.123.105:0/1959228264 wait complete. 
2026-03-10T09:01:43.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.491+0000 7fd278072700 1 -- 192.168.123.105:0/1839157985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd270074dc0 msgr2=0x7fd270073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.491+0000 7fd278072700 1 --2- 192.168.123.105:0/1839157985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd270074dc0 0x7fd270073220 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fd260009b00 tx=0x7fd260009e10 comp rx=0 tx=0).stop 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.491+0000 7fd278072700 1 -- 192.168.123.105:0/1839157985 shutdown_connections 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.491+0000 7fd278072700 1 --2- 192.168.123.105:0/1839157985 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd2700737f0 0x7fd270073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.491+0000 7fd278072700 1 --2- 192.168.123.105:0/1839157985 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd270074dc0 0x7fd270073220 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.491+0000 7fd278072700 1 -- 192.168.123.105:0/1839157985 >> 192.168.123.105:0/1839157985 conn(0x7fd2700fc4d0 msgr2=0x7fd2700fe930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 -- 192.168.123.105:0/1839157985 shutdown_connections 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 -- 192.168.123.105:0/1839157985 
wait complete. 2026-03-10T09:01:43.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 Processor -- start 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 -- start start 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 0x7fd27019ce10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd270074dc0 0x7fd27019d350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd27019d970 con 0x7fd2700737f0 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.492+0000 7fd278072700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd27019dab0 con 0x7fd270074dc0 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 0x7fd27019ce10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 0x7fd27019ce10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:45764/0 (socket says 192.168.123.105:45764) 2026-03-10T09:01:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 -- 192.168.123.105:0/574322738 learned_addr learned my addr 192.168.123.105:0/574322738 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:01:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd27560d700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd270074dc0 0x7fd27019d350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 -- 192.168.123.105:0/574322738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd270074dc0 msgr2=0x7fd27019d350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd270074dc0 0x7fd27019d350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 -- 192.168.123.105:0/574322738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd2600097e0 con 0x7fd2700737f0 2026-03-10T09:01:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.493+0000 7fd275e0e700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 0x7fd27019ce10 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fd260006010 tx=0x7fd2600049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:01:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.494+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd26001d070 con 0x7fd2700737f0 2026-03-10T09:01:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.494+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd2701a2500 con 0x7fd2700737f0 2026-03-10T09:01:43.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.494+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd2701a29f0 con 0x7fd2700737f0 2026-03-10T09:01:43.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.494+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd26000bc50 con 0x7fd2700737f0 2026-03-10T09:01:43.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.494+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd26000f890 con 0x7fd2700737f0 2026-03-10T09:01:43.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.495+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd26000f9f0 con 0x7fd2700737f0 2026-03-10T09:01:43.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.495+0000 7fd266ffd700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd25c077990 0x7fd25c079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:01:43.496 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.495+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(82..82 src has 1..82) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fd26009c1f0 con 0x7fd2700737f0 2026-03-10T09:01:43.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.495+0000 7fd27560d700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd25c077990 0x7fd25c079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:01:43.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.496+0000 7fd27560d700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd25c077990 0x7fd25c079e50 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fd26c005950 tx=0x7fd26c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:01:43.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.496+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd254005320 con 0x7fd2700737f0 2026-03-10T09:01:43.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.499+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd260064b00 con 0x7fd2700737f0 2026-03-10T09:01:43.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.662+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fd254005190 con 0x7fd2700737f0 
2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.662+0000 7fd266ffd700 1 -- 192.168.123.105:0/574322738 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+847 (secure 0 0 0) 0x7fd260027090 con 0x7fd2700737f0 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN Degraded data redundancy: 28/231 objects degraded (12.121%), 12 pgs degraded 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 28/231 objects degraded (12.121%), 12 pgs degraded 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 1.0 is active+undersized+degraded, acting [3,0] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.0 is active+undersized+degraded, acting [3,0] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1 is active+undersized+degraded, acting [2,0] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.2 is active+undersized+degraded, acting [5,0] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.3 is active+undersized+degraded, acting [5,2] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.4 is active+undersized+degraded, acting [0,4] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.6 is active+undersized+degraded, acting [3,4] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.9 is active+undersized+degraded, acting [4,0] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.a is active+undersized+degraded, acting [4,3] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.d is active+undersized+degraded, acting [3,2] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.10 is active+undersized+degraded, acting [2,0] 2026-03-10T09:01:43.663 INFO:teuthology.orchestra.run.vm05.stdout: 
pg 2.15 is active+undersized+degraded, acting [3,0] 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd25c077990 msgr2=0x7fd25c079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd25c077990 0x7fd25c079e50 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fd26c005950 tx=0x7fd26c0058e0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 msgr2=0x7fd27019ce10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 0x7fd27019ce10 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fd260006010 tx=0x7fd2600049e0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 shutdown_connections 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd25c077990 0x7fd25c079e50 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 --2- 192.168.123.105:0/574322738 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd2700737f0 0x7fd27019ce10 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 --2- 192.168.123.105:0/574322738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd270074dc0 0x7fd27019d350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 >> 192.168.123.105:0/574322738 conn(0x7fd2700fc4d0 msgr2=0x7fd2701028e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:01:43.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 shutdown_connections 2026-03-10T09:01:43.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:01:43.665+0000 7fd278072700 1 -- 192.168.123.105:0/574322738 wait complete. 
2026-03-10T09:01:44.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:44 vm08.local ceph-mon[101330]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:44.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:44 vm08.local ceph-mon[101330]: from='client.34352 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:44.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:44 vm08.local ceph-mon[101330]: from='client.34356 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:44.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:44 vm08.local ceph-mon[101330]: osdmap e82: 6 total, 6 up, 6 in 2026-03-10T09:01:44.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:44 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2367930175' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:01:44.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:44 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/574322738' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:01:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:44 vm05.local ceph-mon[111630]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:44 vm05.local ceph-mon[111630]: from='client.34352 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:44 vm05.local ceph-mon[111630]: from='client.34356 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:44 vm05.local ceph-mon[111630]: osdmap e82: 6 total, 6 up, 6 in 2026-03-10T09:01:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:44 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2367930175' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:01:44.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:44 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/574322738' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:01:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:45 vm08.local ceph-mon[101330]: from='client.44283 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:45 vm08.local ceph-mon[101330]: pgmap v191: 65 pgs: 22 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:45 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 28/231 objects degraded (12.121%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T09:01:45.321 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:45 vm05.local ceph-mon[111630]: from='client.44283 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:01:45.321 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:45 vm05.local ceph-mon[111630]: pgmap v191: 65 pgs: 22 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:45.321 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:45 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 28/231 objects degraded (12.121%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T09:01:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:46 vm05.local ceph-mon[111630]: pgmap v192: 65 pgs: 20 active+undersized, 12 active+undersized+degraded, 33 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:46 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:47.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:01:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:46 vm08.local ceph-mon[101330]: pgmap v192: 65 pgs: 20 active+undersized, 12 active+undersized+degraded, 33 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 28/231 objects degraded (12.121%) 2026-03-10T09:01:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:01:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:47 vm05.local ceph-mon[111630]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 28/231 objects degraded (12.121%), 12 pgs degraded) 2026-03-10T09:01:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:47 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:01:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:47 vm08.local ceph-mon[101330]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 28/231 objects degraded (12.121%), 12 pgs degraded) 2026-03-10T09:01:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:47 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:01:49.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:48 vm05.local ceph-mon[111630]: pgmap v193: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:49.302 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:48 vm08.local ceph-mon[101330]: pgmap v193: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:51.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:51 vm05.local ceph-mon[111630]: pgmap v194: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:51.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:51 vm08.local ceph-mon[101330]: pgmap v194: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:53 vm05.local ceph-mon[111630]: pgmap v195: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:53.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:53 vm08.local ceph-mon[101330]: pgmap v195: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:54.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:54.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:54.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T09:01:54.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:01:54.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:54 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:54.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:54.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T09:01:54.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:01:54.963 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:54 vm05.local systemd[1]: Stopping Ceph osd.2 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:01:54.963 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[83525]: 2026-03-10T09:01:54.795+0000 7f5a1a4f7700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:01:54.963 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[83525]: 2026-03-10T09:01:54.795+0000 7f5a1a4f7700 -1 osd.2 82 *** Got signal Terminated *** 2026-03-10T09:01:54.963 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:54 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[83525]: 2026-03-10T09:01:54.795+0000 7f5a1a4f7700 -1 osd.2 82 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:01:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:55 vm05.local ceph-mon[111630]: 
from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:55 vm05.local ceph-mon[111630]: Upgrade: osd.2 is safe to restart 2026-03-10T09:01:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:55 vm05.local ceph-mon[111630]: Upgrade: Updating osd.2 2026-03-10T09:01:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:55 vm05.local ceph-mon[111630]: Deploying daemon osd.2 on vm05 2026-03-10T09:01:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:55 vm05.local ceph-mon[111630]: pgmap v196: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:55 vm05.local ceph-mon[111630]: osd.2 marked itself down and dead 2026-03-10T09:01:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:55 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T09:01:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:55 vm08.local ceph-mon[101330]: Upgrade: osd.2 is safe to restart 2026-03-10T09:01:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:55 vm08.local ceph-mon[101330]: Upgrade: Updating osd.2 2026-03-10T09:01:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:55 vm08.local ceph-mon[101330]: Deploying daemon osd.2 on vm05 2026-03-10T09:01:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:55 vm08.local ceph-mon[101330]: pgmap v196: 65 pgs: 65 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:55 vm08.local ceph-mon[101330]: osd.2 marked itself down and dead 2026-03-10T09:01:56.031 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:55 vm05.local podman[131897]: 2026-03-10 09:01:55.74641974 +0000 UTC m=+0.964101583 container died 32c0be0f86f2833aa3132e02a514325c066782f3aa8b86848c9950a860187e13 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd) 2026-03-10T09:01:56.031 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:55 vm05.local podman[131897]: 2026-03-10 09:01:55.792170488 +0000 UTC m=+1.009852320 container remove 32c0be0f86f2833aa3132e02a514325c066782f3aa8b86848c9950a860187e13 
(image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=HEAD, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T09:01:56.031 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:55 vm05.local bash[131897]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2 2026-03-10T09:01:56.031 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:55 vm05.local podman[131964]: 2026-03-10 09:01:55.941827309 +0000 UTC m=+0.017734073 container create 5ef76f095256d3962b33a026cc4ef52ee7618149679ae44e1f66fccad07448f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default) 2026-03-10T09:01:56.289 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:56 vm05.local ceph-mon[111630]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:01:56.289 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:56 vm05.local ceph-mon[111630]: osdmap e83: 6 total, 5 up, 6 in 2026-03-10T09:01:56.289 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:56 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:56.289 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[131964]: 2026-03-10 09:01:55.932585081 +0000 UTC m=+0.008491845 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:01:56.289 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[131964]: 2026-03-10 09:01:56.034860522 +0000 UTC m=+0.110767296 container init 5ef76f095256d3962b33a026cc4ef52ee7618149679ae44e1f66fccad07448f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:01:56.290 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[131964]: 2026-03-10 09:01:56.038132195 +0000 UTC m=+0.114038959 container start 5ef76f095256d3962b33a026cc4ef52ee7618149679ae44e1f66fccad07448f2 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-10T09:01:56.290 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[131964]: 2026-03-10 09:01:56.048079032 +0000 UTC m=+0.123985806 container attach 5ef76f095256d3962b33a026cc4ef52ee7618149679ae44e1f66fccad07448f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T09:01:56.290 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[131964]: 2026-03-10 09:01:56.180082052 +0000 UTC 
m=+0.255988816 container died 5ef76f095256d3962b33a026cc4ef52ee7618149679ae44e1f66fccad07448f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:01:56.290 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[131964]: 2026-03-10 09:01:56.198985822 +0000 UTC m=+0.274892586 container remove 5ef76f095256d3962b33a026cc4ef52ee7618149679ae44e1f66fccad07448f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:01:56.290 
INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.2.service: Deactivated successfully. 2026-03-10T09:01:56.290 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local systemd[1]: Stopped Ceph osd.2 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T09:01:56.290 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.2.service: Consumed 43.078s CPU time. 2026-03-10T09:01:56.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:56 vm08.local ceph-mon[101330]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:01:56.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:56 vm08.local ceph-mon[101330]: osdmap e83: 6 total, 5 up, 6 in 2026-03-10T09:01:56.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:56 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local systemd[1]: Starting Ceph osd.2 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[132067]: 2026-03-10 09:01:56.496488805 +0000 UTC m=+0.018437961 container create c93f1ab586d1472c5cbec8c49f68841cdd23c914471fa5ed8a6398e9006bdc7d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[132067]: 2026-03-10 09:01:56.54088666 +0000 UTC m=+0.062835816 container init c93f1ab586d1472c5cbec8c49f68841cdd23c914471fa5ed8a6398e9006bdc7d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[132067]: 2026-03-10 09:01:56.544619548 +0000 UTC m=+0.066568704 container start c93f1ab586d1472c5cbec8c49f68841cdd23c914471fa5ed8a6398e9006bdc7d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[132067]: 2026-03-10 09:01:56.549874182 +0000 UTC m=+0.071823339 container attach c93f1ab586d1472c5cbec8c49f68841cdd23c914471fa5ed8a6398e9006bdc7d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True) 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local podman[132067]: 2026-03-10 09:01:56.488201244 +0000 UTC m=+0.010150400 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local bash[132067]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:56.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:56 vm05.local bash[132067]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:57.457 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-mon[111630]: pgmap v198: 65 pgs: 9 stale+active+clean, 56 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:57.457 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-mon[111630]: osdmap e84: 6 total, 5 up, 6 in 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: --> Failed to activate via raw: did not 
find any matching OSD to activate 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4c0de6b6-704d-4c07-b6dc-3a0200df9d56/osd-block-3a3adfaf-6208-4836-b16d-7bbb2065933b --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4c0de6b6-704d-4c07-b6dc-3a0200df9d56/osd-block-3a3adfaf-6208-4836-b16d-7bbb2065933b --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local 
ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/ln -snf /dev/ceph-4c0de6b6-704d-4c07-b6dc-3a0200df9d56/osd-block-3a3adfaf-6208-4836-b16d-7bbb2065933b /var/lib/ceph/osd/ceph-2/block 2026-03-10T09:01:57.457 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/ln -snf /dev/ceph-4c0de6b6-704d-4c07-b6dc-3a0200df9d56/osd-block-3a3adfaf-6208-4836-b16d-7bbb2065933b /var/lib/ceph/osd/ceph-2/block 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate[132078]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132067]: --> ceph-volume 
lvm activate successful for osd ID: 2 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local podman[132067]: 2026-03-10 09:01:57.489104375 +0000 UTC m=+1.011053531 container died c93f1ab586d1472c5cbec8c49f68841cdd23c914471fa5ed8a6398e9006bdc7d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local podman[132067]: 2026-03-10 09:01:57.506913858 +0000 UTC m=+1.028863014 container remove c93f1ab586d1472c5cbec8c49f68841cdd23c914471fa5ed8a6398e9006bdc7d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local podman[132337]: 2026-03-10 09:01:57.605214027 +0000 UTC m=+0.016284950 container create a555d70ff4bddd0aafe9353118eb5090320b317142599c87543e6eb71039c8bb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default) 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local podman[132337]: 2026-03-10 09:01:57.64848286 +0000 UTC m=+0.059553793 container init a555d70ff4bddd0aafe9353118eb5090320b317142599c87543e6eb71039c8bb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local podman[132337]: 2026-03-10 09:01:57.65126597 +0000 UTC m=+0.062336893 container start a555d70ff4bddd0aafe9353118eb5090320b317142599c87543e6eb71039c8bb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223) 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local bash[132337]: a555d70ff4bddd0aafe9353118eb5090320b317142599c87543e6eb71039c8bb 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local podman[132337]: 2026-03-10 09:01:57.598323822 +0000 UTC m=+0.009394755 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:01:57.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:57 vm05.local systemd[1]: Started Ceph osd.2 for 16587ed2-1c5e-11f1-90f6-35051361a039. 
2026-03-10T09:01:57.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:57 vm08.local ceph-mon[101330]: pgmap v198: 65 pgs: 9 stale+active+clean, 56 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail 2026-03-10T09:01:57.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:57 vm08.local ceph-mon[101330]: osdmap e84: 6 total, 5 up, 6 in 2026-03-10T09:01:58.389 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:01:58 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:01:58.234+0000 7f5bddbd7740 -1 Falling back to public interface 2026-03-10T09:01:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:58 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:58 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:58 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:01:58.698 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:58 vm05.local ceph-mon[111630]: pgmap v200: 65 pgs: 7 active+undersized, 7 peering, 3 stale+active+clean, 5 active+undersized+degraded, 43 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 10/231 objects degraded (4.329%) 2026-03-10T09:01:59.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:58 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:59.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:58 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:59.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:58 vm08.local ceph-mon[101330]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:01:59.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:58 vm08.local ceph-mon[101330]: pgmap v200: 65 pgs: 7 active+undersized, 7 peering, 3 stale+active+clean, 5 active+undersized+degraded, 43 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 10/231 objects degraded (4.329%) 2026-03-10T09:01:59.954 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:59 vm08.local ceph-mon[101330]: Health check failed: Reduced data availability: 4 pgs peering (PG_AVAILABILITY) 2026-03-10T09:01:59.954 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:59 vm08.local ceph-mon[101330]: Health check failed: Degraded data redundancy: 10/231 objects degraded (4.329%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T09:01:59.954 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:59 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:59.954 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:01:59 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:59.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:59 vm05.local ceph-mon[111630]: Health check failed: Reduced data availability: 4 pgs peering (PG_AVAILABILITY) 2026-03-10T09:01:59.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:59 vm05.local ceph-mon[111630]: Health check failed: Degraded data redundancy: 10/231 objects degraded (4.329%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T09:01:59.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:59 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:01:59.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:01:59 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:00.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:00 vm05.local ceph-mon[111630]: pgmap v201: 65 pgs: 11 active+undersized, 7 peering, 9 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 18/231 objects degraded (7.792%) 2026-03-10T09:02:00.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:00.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:01.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:00 vm08.local ceph-mon[101330]: pgmap v201: 65 pgs: 11 active+undersized, 7 peering, 9 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 18/231 objects degraded (7.792%) 2026-03-10T09:02:01.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:01.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.213 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:02.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 
2026-03-10T09:02:02.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:02.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:02.712 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:02:02.244+0000 7f5bddbd7740 -1 osd.2 0 read_superblock omap replica is missing. 2026-03-10T09:02:02.712 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:02:02.494+0000 7f5bddbd7740 -1 osd.2 82 log_to_monitors true 2026-03-10T09:02:03.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:03.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T09:02:03.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-mon[111630]: pgmap v202: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-10T09:02:03.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-mon[111630]: from='osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T09:02:03.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-mon[111630]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 4 pgs peering) 2026-03-10T09:02:03.213 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:02:02 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:02:02.884+0000 7f5bd5971640 -1 osd.2 82 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T09:02:03.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:02 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:03.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:02 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T09:02:03.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:02 vm08.local ceph-mon[101330]: pgmap v202: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-10T09:02:03.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:02 vm08.local ceph-mon[101330]: from='osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T09:02:03.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:02 vm08.local ceph-mon[101330]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 4 pgs peering) 2026-03-10T09:02:04.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:03 vm05.local ceph-mon[111630]: from='osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T09:02:04.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:03 vm05.local ceph-mon[111630]: osdmap e85: 6 total, 5 up, 6 in 2026-03-10T09:02:04.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:03 vm05.local ceph-mon[111630]: from='osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T09:02:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:03 vm08.local ceph-mon[101330]: from='osd.2 
[v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T09:02:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:03 vm08.local ceph-mon[101330]: osdmap e85: 6 total, 5 up, 6 in 2026-03-10T09:02:04.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:03 vm08.local ceph-mon[101330]: from='osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T09:02:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:04 vm08.local ceph-mon[101330]: pgmap v204: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-10T09:02:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:04 vm08.local ceph-mon[101330]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:02:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:04 vm08.local ceph-mon[101330]: osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082] boot 2026-03-10T09:02:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:04 vm08.local ceph-mon[101330]: osdmap e86: 6 total, 6 up, 6 in 2026-03-10T09:02:05.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T09:02:05.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:04 vm05.local ceph-mon[111630]: pgmap v204: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-10T09:02:05.213 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:04 vm05.local ceph-mon[111630]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:02:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:04 vm05.local ceph-mon[111630]: osd.2 [v2:192.168.123.105:6818/2140353082,v1:192.168.123.105:6819/2140353082] boot 2026-03-10T09:02:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:04 vm05.local ceph-mon[111630]: osdmap e86: 6 total, 6 up, 6 in 2026-03-10T09:02:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T09:02:06.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:05 vm05.local ceph-mon[111630]: osdmap e87: 6 total, 6 up, 6 in 2026-03-10T09:02:06.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:05 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 29/231 objects degraded (12.554%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:05 vm08.local ceph-mon[101330]: osdmap e87: 6 total, 6 up, 6 in 2026-03-10T09:02:06.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:05 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 29/231 objects degraded (12.554%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:07.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:06 vm05.local ceph-mon[111630]: pgmap v207: 65 pgs: 5 peering, 12 active+undersized, 10 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-10T09:02:07.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:06 vm08.local ceph-mon[101330]: pgmap v207: 65 pgs: 5 peering, 12 active+undersized, 10 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB 
avail; 26/231 objects degraded (11.255%) 2026-03-10T09:02:09.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:08 vm08.local ceph-mon[101330]: pgmap v208: 65 pgs: 5 peering, 4 active+undersized, 5 active+undersized+degraded, 51 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 14/231 objects degraded (6.061%) 2026-03-10T09:02:09.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:08 vm05.local ceph-mon[111630]: pgmap v208: 65 pgs: 5 peering, 4 active+undersized, 5 active+undersized+degraded, 51 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 14/231 objects degraded (6.061%) 2026-03-10T09:02:10.386 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:10 vm08.local ceph-mon[101330]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 14/231 objects degraded (6.061%), 5 pgs degraded) 2026-03-10T09:02:10.386 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:10 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:02:10.423 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:10 vm05.local ceph-mon[111630]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 14/231 objects degraded (6.061%), 5 pgs degraded) 2026-03-10T09:02:10.423 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:10 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:02:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:11 vm05.local ceph-mon[111630]: pgmap v209: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:11.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:11 vm08.local ceph-mon[101330]: pgmap v209: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:13 vm05.local ceph-mon[111630]: pgmap v210: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:13.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:13 vm08.local ceph-mon[101330]: pgmap v210: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:13.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.737+0000 7fc981e42700 1 -- 192.168.123.105:0/3275310287 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c107540 msgr2=0x7fc97c107960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.737+0000 7fc981e42700 1 --2- 192.168.123.105:0/3275310287 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c107540 0x7fc97c107960 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fc964009b00 tx=0x7fc964009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 -- 192.168.123.105:0/3275310287 shutdown_connections 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 --2- 192.168.123.105:0/3275310287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc97c108740 0x7fc97c108ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 --2- 192.168.123.105:0/3275310287 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c107540 0x7fc97c107960 secure :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fc964009b00 tx=0x7fc964009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 -- 192.168.123.105:0/3275310287 >> 192.168.123.105:0/3275310287 conn(0x7fc97c0766d0 msgr2=0x7fc97c078b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 
7fc981e42700 1 -- 192.168.123.105:0/3275310287 shutdown_connections 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 -- 192.168.123.105:0/3275310287 wait complete. 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 Processor -- start 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 -- start start 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc97c108740 0x7fc97c19d0f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 0x7fc97c1a26a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc97c19db40 con 0x7fc97c19d630 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.738+0000 7fc981e42700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc97c19dcb0 con 0x7fc97c108740 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 0x7fc97c1a26a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 0x7fc97c1a26a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55344/0 (socket says 192.168.123.105:55344) 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 -- 192.168.123.105:0/3452416262 learned_addr learned my addr 192.168.123.105:0/3452416262 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 -- 192.168.123.105:0/3452416262 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc97c108740 msgr2=0x7fc97c19d0f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc97c108740 0x7fc97c19d0f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 -- 192.168.123.105:0/3452416262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9640097e0 con 0x7fc97c19d630 2026-03-10T09:02:13.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc97affd700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 0x7fc97c1a26a0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fc96c00ba70 tx=0x7fc96c00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:13.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mon.0 
v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc96c00c780 con 0x7fc97c19d630 2026-03-10T09:02:13.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc96c00cdc0 con 0x7fc97c19d630 2026-03-10T09:02:13.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc97c1a2c40 con 0x7fc97c19d630 2026-03-10T09:02:13.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.739+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc96c012550 con 0x7fc97c19d630 2026-03-10T09:02:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.741+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc96c014440 con 0x7fc97c19d630 2026-03-10T09:02:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.741+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc97c1a3190 con 0x7fc97c19d630 2026-03-10T09:02:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.741+0000 7fc978ff9700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc9680778c0 0x7fc968079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.742+0000 7fc97b7fe700 1 --2- 192.168.123.105:0/3452416262 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc9680778c0 0x7fc968079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.742+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fc96c0656c0 con 0x7fc97c19d630 2026-03-10T09:02:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.742+0000 7fc97b7fe700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc9680778c0 0x7fc968079d80 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fc964000c00 tx=0x7fc964009f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:13.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.742+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc97c066e80 con 0x7fc97c19d630 2026-03-10T09:02:13.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.745+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc96c019050 con 0x7fc97c19d630 2026-03-10T09:02:13.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.876+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc97c10d090 con 0x7fc9680778c0 2026-03-10T09:02:13.878 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.877+0000 7fc978ff9700 1 -- 192.168.123.105:0/3452416262 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fc97c10d090 con 0x7fc9680778c0 2026-03-10T09:02:13.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc9680778c0 msgr2=0x7fc968079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:13.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc9680778c0 0x7fc968079d80 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fc964000c00 tx=0x7fc964009f90 comp rx=0 tx=0).stop 2026-03-10T09:02:13.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 msgr2=0x7fc97c1a26a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:13.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 0x7fc97c1a26a0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fc96c00ba70 tx=0x7fc96c00be30 comp rx=0 tx=0).stop 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 shutdown_connections 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fc9680778c0 0x7fc968079d80 
unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc97c108740 0x7fc97c19d0f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.880+0000 7fc981e42700 1 --2- 192.168.123.105:0/3452416262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc97c19d630 0x7fc97c1a26a0 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.881+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 >> 192.168.123.105:0/3452416262 conn(0x7fc97c0766d0 msgr2=0x7fc97c10b970 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.881+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 shutdown_connections 2026-03-10T09:02:13.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.881+0000 7fc981e42700 1 -- 192.168.123.105:0/3452416262 wait complete. 
2026-03-10T09:02:13.890 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.956+0000 7f3971371700 1 -- 192.168.123.105:0/3910104068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 msgr2=0x7f396c106750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.956+0000 7f3971371700 1 --2- 192.168.123.105:0/3910104068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c106750 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f395c009b50 tx=0x7f395c009e60 comp rx=0 tx=0).stop 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.956+0000 7f3971371700 1 -- 192.168.123.105:0/3910104068 shutdown_connections 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.956+0000 7f3971371700 1 --2- 192.168.123.105:0/3910104068 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c106750 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.956+0000 7f3971371700 1 --2- 192.168.123.105:0/3910104068 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f396c101a30 0x7f396c103e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.956+0000 7f3971371700 1 -- 192.168.123.105:0/3910104068 >> 192.168.123.105:0/3910104068 conn(0x7f396c0fb380 msgr2=0x7f396c0fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.957+0000 7f3971371700 1 -- 192.168.123.105:0/3910104068 shutdown_connections 2026-03-10T09:02:13.957 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.957+0000 7f3971371700 1 -- 192.168.123.105:0/3910104068 wait complete. 2026-03-10T09:02:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f3971371700 1 Processor -- start 2026-03-10T09:02:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f3971371700 1 -- start start 2026-03-10T09:02:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f3971371700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f396c101a30 0x7f396c198a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f3971371700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c198f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f3971371700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f396c1994f0 con 0x7f396c104360 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f3971371700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f396c199630 con 0x7f396c101a30 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f396a7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c198f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f396a7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c198f60 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55360/0 (socket says 192.168.123.105:55360) 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.958+0000 7f396a7fc700 1 -- 192.168.123.105:0/2206678954 learned_addr learned my addr 192.168.123.105:0/2206678954 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f396a7fc700 1 -- 192.168.123.105:0/2206678954 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f396c101a30 msgr2=0x7f396c198a20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f396a7fc700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f396c101a30 0x7f396c198a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f396a7fc700 1 -- 192.168.123.105:0/2206678954 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f395c0097e0 con 0x7f396c104360 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f396a7fc700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c198f60 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f395c005850 tx=0x7f395c00b920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f395c01d070 con 
0x7f396c104360 2026-03-10T09:02:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f395c00bca0 con 0x7f396c104360 2026-03-10T09:02:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f395c021920 con 0x7f396c104360 2026-03-10T09:02:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f396c19e090 con 0x7f396c104360 2026-03-10T09:02:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.959+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f396c19e550 con 0x7f396c104360 2026-03-10T09:02:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.960+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f396c0fcf70 con 0x7f396c104360 2026-03-10T09:02:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.963+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f395c00fc90 con 0x7f396c104360 2026-03-10T09:02:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.963+0000 7f3963fff700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f39580778c0 0x7f3958079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T09:02:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.964+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f395c09b5c0 con 0x7f396c104360 2026-03-10T09:02:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.964+0000 7f396affd700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f39580778c0 0x7f3958079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.964+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f395c063e20 con 0x7f396c104360 2026-03-10T09:02:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:13.965+0000 7f396affd700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f39580778c0 0x7f3958079d80 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f3954005fd0 tx=0x7f3954005d00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.104+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f396c0611d0 con 0x7f39580778c0 2026-03-10T09:02:14.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.108+0000 7f3963fff700 1 -- 192.168.123.105:0/2206678954 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 
(secure 0 0 0) 0x7f396c0611d0 con 0x7f39580778c0 2026-03-10T09:02:14.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.111+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f39580778c0 msgr2=0x7f3958079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.111+0000 7f3971371700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f39580778c0 0x7f3958079d80 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f3954005fd0 tx=0x7f3954005d00 comp rx=0 tx=0).stop 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.111+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 msgr2=0x7f396c198f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.111+0000 7f3971371700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c198f60 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f395c005850 tx=0x7f395c00b920 comp rx=0 tx=0).stop 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 shutdown_connections 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f39580778c0 0x7f3958079d80 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 --2- 
192.168.123.105:0/2206678954 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f396c101a30 0x7f396c198a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 --2- 192.168.123.105:0/2206678954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f396c104360 0x7f396c198f60 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 >> 192.168.123.105:0/2206678954 conn(0x7f396c0fb380 msgr2=0x7f396c1000b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 shutdown_connections 2026-03-10T09:02:14.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.112+0000 7f3971371700 1 -- 192.168.123.105:0/2206678954 wait complete. 
2026-03-10T09:02:14.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 -- 192.168.123.105:0/3769854026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 msgr2=0x7fd3341066b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 --2- 192.168.123.105:0/3769854026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd3341066b0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fd324009b50 tx=0x7fd324009e60 comp rx=0 tx=0).stop 2026-03-10T09:02:14.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 -- 192.168.123.105:0/3769854026 shutdown_connections 2026-03-10T09:02:14.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 --2- 192.168.123.105:0/3769854026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd3341066b0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 --2- 192.168.123.105:0/3769854026 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd334101990 0x7fd334103d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 -- 192.168.123.105:0/3769854026 >> 192.168.123.105:0/3769854026 conn(0x7fd3340fb360 msgr2=0x7fd3340fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 -- 192.168.123.105:0/3769854026 shutdown_connections 2026-03-10T09:02:14.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.179+0000 7fd33a70e700 1 -- 192.168.123.105:0/3769854026 
wait complete. 2026-03-10T09:02:14.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.180+0000 7fd33a70e700 1 Processor -- start 2026-03-10T09:02:14.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.180+0000 7fd33a70e700 1 -- start start 2026-03-10T09:02:14.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.180+0000 7fd33a70e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd334101990 0x7fd3341945c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.180+0000 7fd33a70e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd334194b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.180+0000 7fd33a70e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd334195120 con 0x7fd3341042c0 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.180+0000 7fd33a70e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd334195260 con 0x7fd334101990 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd334194b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd334194b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:55384/0 (socket says 192.168.123.105:55384) 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 -- 192.168.123.105:0/1930868248 learned_addr learned my addr 192.168.123.105:0/1930868248 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 -- 192.168.123.105:0/1930868248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd334101990 msgr2=0x7fd3341945c0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd334101990 0x7fd3341945c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 -- 192.168.123.105:0/1930868248 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd3240097e0 con 0x7fd3341042c0 2026-03-10T09:02:14.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3337fe700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd334194b00 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7fd324005950 tx=0x7fd3240057d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3317fa700 1 -- 192.168.123.105:0/1930868248 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd32401d070 con 0x7fd3341042c0 2026-03-10T09:02:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3317fa700 1 -- 
192.168.123.105:0/1930868248 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd32400bb50 con 0x7fd3341042c0 2026-03-10T09:02:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd3317fa700 1 -- 192.168.123.105:0/1930868248 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd32400f810 con 0x7fd3341042c0 2026-03-10T09:02:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd334199cb0 con 0x7fd3341042c0 2026-03-10T09:02:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.181+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd3341a2980 con 0x7fd3341042c0 2026-03-10T09:02:14.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.182+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd33418e7d0 con 0x7fd3341042c0 2026-03-10T09:02:14.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.185+0000 7fd3317fa700 1 -- 192.168.123.105:0/1930868248 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd32400bcc0 con 0x7fd3341042c0 2026-03-10T09:02:14.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.186+0000 7fd3317fa700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd3200779e0 0x7fd320079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.186+0000 7fd3317fa700 1 -- 
192.168.123.105:0/1930868248 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fd32409bce0 con 0x7fd3341042c0 2026-03-10T09:02:14.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.186+0000 7fd3317fa700 1 -- 192.168.123.105:0/1930868248 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd32409c160 con 0x7fd3341042c0 2026-03-10T09:02:14.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.186+0000 7fd333fff700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd3200779e0 0x7fd320079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.186+0000 7fd333fff700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd3200779e0 0x7fd320079ea0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fd31c00ba60 tx=0x7fd31c00b3f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.304+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd3340611d0 con 0x7fd3200779e0 2026-03-10T09:02:14.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.309+0000 7fd3317fa700 1 -- 192.168.123.105:0/1930868248 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fd3340611d0 con 0x7fd3200779e0 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST 
PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (10m) 14s ago 11m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (11m) 14s ago 11m 9638k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (10m) 5m ago 10m 11.0M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (5m) 14s ago 11m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 5m ago 10m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (10m) 14s ago 11m 89.2M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (9m) 14s ago 9m 176M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (9m) 14s ago 9m 19.5M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (9m) 5m ago 9m 19.9M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (9m) 5m ago 9m 16.2M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (6m) 14s ago 12m 621M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (5m) 5m ago 10m 491M - 
19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 14s ago 12m 63.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (5m) 5m ago 10m 48.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (11m) 14s ago 11m 14.9M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (10m) 5m ago 10m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (4m) 14s ago 10m 214M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (39s) 14s ago 10m 101M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (16s) 14s ago 10m 12.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (10m) 5m ago 10m 456M 4096M 18.2.1 5be31c24972a 14f5f93ea4d1 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (9m) 5m ago 9m 418M 4096M 18.2.1 5be31c24972a 155c1482d81c 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (9m) 5m ago 9m 339M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T09:02:14.311 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (5m) 14s ago 11m 65.7M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:02:14.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7fd3200779e0 msgr2=0x7fd320079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd3200779e0 0x7fd320079ea0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fd31c00ba60 tx=0x7fd31c00b3f0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 msgr2=0x7fd334194b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd334194b00 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7fd324005950 tx=0x7fd3240057d0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 shutdown_connections 2026-03-10T09:02:14.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd3200779e0 0x7fd320079ea0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd334101990 0x7fd3341945c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.314 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.313+0000 7fd33a70e700 1 --2- 192.168.123.105:0/1930868248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd3341042c0 0x7fd334194b00 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.314+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 >> 192.168.123.105:0/1930868248 conn(0x7fd3340fb360 msgr2=0x7fd3340fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.314+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 shutdown_connections 2026-03-10T09:02:14.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.314+0000 7fd33a70e700 1 -- 192.168.123.105:0/1930868248 wait complete. 2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.379+0000 7fcd14998700 1 -- 192.168.123.105:0/16375569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c101a90 msgr2=0x7fcd0c103e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.379+0000 7fcd14998700 1 --2- 192.168.123.105:0/16375569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c101a90 0x7fcd0c103e80 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fcd04009b00 tx=0x7fcd04009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 -- 192.168.123.105:0/16375569 shutdown_connections 2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 --2- 192.168.123.105:0/16375569 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd0c1043c0 0x7fcd0c1067b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 --2- 192.168.123.105:0/16375569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c101a90 0x7fcd0c103e80 secure :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fcd04009b00 tx=0x7fcd04009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 -- 192.168.123.105:0/16375569 >> 192.168.123.105:0/16375569 conn(0x7fcd0c0fb3c0 msgr2=0x7fcd0c0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 -- 192.168.123.105:0/16375569 shutdown_connections 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 -- 192.168.123.105:0/16375569 wait complete. 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 Processor -- start 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.382+0000 7fcd14998700 1 -- start start 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd14998700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd0c1043c0 0x7fcd0c19d140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd14998700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 0x7fcd0c1a26f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd14998700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd0c19db90 con 0x7fcd0c19d680 2026-03-10T09:02:14.383 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd14998700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd0c19dd00 con 0x7fcd0c1043c0 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd11f33700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 0x7fcd0c1a26f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd11f33700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 0x7fcd0c1a26f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55404/0 (socket says 192.168.123.105:55404) 2026-03-10T09:02:14.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd11f33700 1 -- 192.168.123.105:0/3246775140 learned_addr learned my addr 192.168.123.105:0/3246775140 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:14.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd11f33700 1 -- 192.168.123.105:0/3246775140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd0c1043c0 msgr2=0x7fcd0c19d140 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:02:14.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd11f33700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd0c1043c0 0x7fcd0c19d140 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.383+0000 7fcd11f33700 1 -- 192.168.123.105:0/3246775140 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd040097e0 con 0x7fcd0c19d680 2026-03-10T09:02:14.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.384+0000 7fcd11f33700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 0x7fcd0c1a26f0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fcd0000ba70 tx=0x7fcd0000be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.384+0000 7fccff7fe700 1 -- 192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd0000c780 con 0x7fcd0c19d680 2026-03-10T09:02:14.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.384+0000 7fccff7fe700 1 -- 192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcd0000cdc0 con 0x7fcd0c19d680 2026-03-10T09:02:14.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.384+0000 7fccff7fe700 1 -- 192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd00012550 con 0x7fcd0c19d680 2026-03-10T09:02:14.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.385+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd0c1a2c90 con 0x7fcd0c19d680 2026-03-10T09:02:14.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.385+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd0c1a3230 con 0x7fcd0c19d680 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.386+0000 7fcd14998700 1 -- 
192.168.123.105:0/3246775140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd0c066e80 con 0x7fcd0c19d680 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.387+0000 7fccff7fe700 1 -- 192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcd00014440 con 0x7fcd0c19d680 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.387+0000 7fccff7fe700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fccf8077990 0x7fccf8079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.387+0000 7fccff7fe700 1 -- 192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fcd000994a0 con 0x7fcd0c19d680 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.389+0000 7fcd12734700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fccf8077990 0x7fccf8079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.389+0000 7fcd12734700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fccf8077990 0x7fccf8079e50 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fcd04006010 tx=0x7fcd04005760 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.390+0000 7fccff7fe700 1 -- 
192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcd00061e10 con 0x7fcd0c19d680 2026-03-10T09:02:14.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.549+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fcd0c1a35d0 con 0x7fcd0c19d680 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.550+0000 7fccff7fe700 1 -- 192.168.123.105:0/3246775140 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fcd000146f0 con 0x7fcd0c19d680 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3, 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: 
"mds": { 2026-03-10T09:02:14.550 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T09:02:14.551 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:14.551 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:02:14.551 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 7, 2026-03-10T09:02:14.551 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7 2026-03-10T09:02:14.551 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:02:14.551 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.552+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fccf8077990 msgr2=0x7fccf8079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fccf8077990 0x7fccf8079e50 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fcd04006010 tx=0x7fcd04005760 comp rx=0 tx=0).stop 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 msgr2=0x7fcd0c1a26f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 0x7fcd0c1a26f0 secure :-1 s=READY pgs=145 cs=0 l=1 
rev1=1 crypto rx=0x7fcd0000ba70 tx=0x7fcd0000be30 comp rx=0 tx=0).stop 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 shutdown_connections 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fccf8077990 0x7fccf8079e50 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcd0c1043c0 0x7fcd0c19d140 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 --2- 192.168.123.105:0/3246775140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcd0c19d680 0x7fcd0c1a26f0 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 >> 192.168.123.105:0/3246775140 conn(0x7fcd0c0fb3c0 msgr2=0x7fcd0c0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 shutdown_connections 2026-03-10T09:02:14.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.553+0000 7fcd14998700 1 -- 192.168.123.105:0/3246775140 wait complete. 
2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.619+0000 7fbf90d02700 1 -- 192.168.123.105:0/2583243183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c074dc0 msgr2=0x7fbf8c073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.619+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2583243183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c074dc0 0x7fbf8c073220 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fbf74009a60 tx=0x7fbf74009d70 comp rx=0 tx=0).stop 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.620+0000 7fbf90d02700 1 -- 192.168.123.105:0/2583243183 shutdown_connections 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.620+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2583243183 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c0737f0 0x7fbf8c073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.620+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2583243183 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c074dc0 0x7fbf8c073220 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.620+0000 7fbf90d02700 1 -- 192.168.123.105:0/2583243183 >> 192.168.123.105:0/2583243183 conn(0x7fbf8c0fc4c0 msgr2=0x7fbf8c0fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.620+0000 7fbf90d02700 1 -- 192.168.123.105:0/2583243183 shutdown_connections 2026-03-10T09:02:14.620 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.620+0000 7fbf90d02700 1 -- 192.168.123.105:0/2583243183 
wait complete. 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf90d02700 1 Processor -- start 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf90d02700 1 -- start start 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf90d02700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c0737f0 0x7fbf8c19d070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf90d02700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c19d5b0 0x7fbf8c1a2620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf90d02700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf8c19dac0 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf90d02700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf8c19dc30 con 0x7fbf8c0737f0 2026-03-10T09:02:14.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf89d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c19d5b0 0x7fbf8c1a2620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf89d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c19d5b0 0x7fbf8c1a2620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:55428/0 (socket says 192.168.123.105:55428) 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.621+0000 7fbf89d9b700 1 -- 192.168.123.105:0/2764197230 learned_addr learned my addr 192.168.123.105:0/2764197230 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf8a59c700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c0737f0 0x7fbf8c19d070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf89d9b700 1 -- 192.168.123.105:0/2764197230 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c0737f0 msgr2=0x7fbf8c19d070 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf89d9b700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c0737f0 0x7fbf8c19d070 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf89d9b700 1 -- 192.168.123.105:0/2764197230 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf7c0097e0 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf89d9b700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c19d5b0 0x7fbf8c1a2620 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fbf7c009fc0 tx=0x7fbf7c00c660 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T09:02:14.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf7c010430 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf7c010a70 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf7c018bf0 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf74009710 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.622+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf8c1a2f80 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.623+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf8c066e80 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.627+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbf7c018d50 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.628 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.627+0000 7fbf837fe700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbf78077990 0x7fbf78079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.627+0000 7fbf8a59c700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbf78077990 0x7fbf78079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.628+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fbf7c014070 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.628+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbf7c0636d0 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.628+0000 7fbf8a59c700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbf78077990 0x7fbf78079e50 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fbf74000c00 tx=0x7fbf740058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.766+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fbf8c1a3260 con 
0x7fbf8c19d5b0 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.767+0000 7fbf837fe700 1 -- 192.168.123.105:0/2764197230 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7fbf7c062e20 con 0x7fbf8c19d5b0 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T08:53:01.426195+0000 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 
300 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:02:14.767 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr 
[v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:14.768 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.769+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbf78077990 msgr2=0x7fbf78079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.769+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbf78077990 0x7fbf78079e50 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fbf74000c00 tx=0x7fbf740058e0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.769+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7fbf8c19d5b0 msgr2=0x7fbf8c1a2620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.769+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c19d5b0 0x7fbf8c1a2620 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fbf7c009fc0 tx=0x7fbf7c00c660 comp rx=0 tx=0).stop 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 shutdown_connections 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbf78077990 0x7fbf78079e50 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf8c0737f0 0x7fbf8c19d070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 --2- 192.168.123.105:0/2764197230 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf8c19d5b0 0x7fbf8c1a2620 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 >> 192.168.123.105:0/2764197230 conn(0x7fbf8c0fc4c0 msgr2=0x7fbf8c1028d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 shutdown_connections 
2026-03-10T09:02:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.770+0000 7fbf90d02700 1 -- 192.168.123.105:0/2764197230 wait complete. 2026-03-10T09:02:14.771 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.838+0000 7fe91ba03700 1 -- 192.168.123.105:0/2912161862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914104320 msgr2=0x7fe914104780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.838+0000 7fe91ba03700 1 --2- 192.168.123.105:0/2912161862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914104320 0x7fe914104780 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fe910009b50 tx=0x7fe910009e60 comp rx=0 tx=0).stop 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.838+0000 7fe91ba03700 1 -- 192.168.123.105:0/2912161862 shutdown_connections 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.838+0000 7fe91ba03700 1 --2- 192.168.123.105:0/2912161862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914104320 0x7fe914104780 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.838+0000 7fe91ba03700 1 --2- 192.168.123.105:0/2912161862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914103120 0x7fe914103540 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.838+0000 7fe91ba03700 1 -- 192.168.123.105:0/2912161862 >> 192.168.123.105:0/2912161862 conn(0x7fe9140fe6c0 msgr2=0x7fe914100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.839 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.839+0000 7fe91ba03700 1 -- 192.168.123.105:0/2912161862 shutdown_connections 2026-03-10T09:02:14.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.839+0000 7fe91ba03700 1 -- 192.168.123.105:0/2912161862 wait complete. 2026-03-10T09:02:14.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.839+0000 7fe91ba03700 1 Processor -- start 2026-03-10T09:02:14.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe91ba03700 1 -- start start 2026-03-10T09:02:14.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe91ba03700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914103120 0x7fe9141989a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe91ba03700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 0x7fe914198ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe91ba03700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe914199500 con 0x7fe914103120 2026-03-10T09:02:14.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe91ba03700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe914199640 con 0x7fe914104320 2026-03-10T09:02:14.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe918f9e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 0x7fe914198ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.841 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe918f9e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 0x7fe914198ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:52658/0 (socket says 192.168.123.105:52658) 2026-03-10T09:02:14.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.840+0000 7fe918f9e700 1 -- 192.168.123.105:0/1093900783 learned_addr learned my addr 192.168.123.105:0/1093900783 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:14.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.841+0000 7fe91979f700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914103120 0x7fe9141989a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.841+0000 7fe918f9e700 1 -- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914103120 msgr2=0x7fe9141989a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.841+0000 7fe918f9e700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914103120 0x7fe9141989a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.841+0000 7fe918f9e700 1 -- 192.168.123.105:0/1093900783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9100097e0 con 0x7fe914104320 2026-03-10T09:02:14.842 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.841+0000 7fe918f9e700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 0x7fe914198ee0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fe910005950 tx=0x7fe910004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.842+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe91001d070 con 0x7fe914104320 2026-03-10T09:02:14.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.842+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe91419e090 con 0x7fe914104320 2026-03-10T09:02:14.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.842+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe91000bc50 con 0x7fe914104320 2026-03-10T09:02:14.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.842+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe91000f780 con 0x7fe914104320 2026-03-10T09:02:14.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.842+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe91419e520 con 0x7fe914104320 2026-03-10T09:02:14.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.843+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fe914066e80 con 0x7fe914104320 2026-03-10T09:02:14.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.844+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe910044b60 con 0x7fe914104320 2026-03-10T09:02:14.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.844+0000 7fe9067fc700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9000778e0 0x7fe900079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:14.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.844+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fe910067fe0 con 0x7fe914104320 2026-03-10T09:02:14.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.844+0000 7fe91979f700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9000778e0 0x7fe900079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:14.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.844+0000 7fe91979f700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9000778e0 0x7fe900079da0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fe914104180 tx=0x7fe908009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:14.849 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.848+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7fe91006a080 con 0x7fe914104320 2026-03-10T09:02:14.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.974+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe914108c70 con 0x7fe9000778e0 2026-03-10T09:02:14.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.975+0000 7fe9067fc700 1 -- 192.168.123.105:0/1093900783 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fe914108c70 con 0x7fe9000778e0 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "9/23 daemons upgraded", 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:02:14.976 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:02:14.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.978+0000 
7fe91ba03700 1 -- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9000778e0 msgr2=0x7fe900079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.978+0000 7fe91ba03700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9000778e0 0x7fe900079da0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fe914104180 tx=0x7fe908009450 comp rx=0 tx=0).stop 2026-03-10T09:02:14.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.978+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 msgr2=0x7fe914198ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:14.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.978+0000 7fe91ba03700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 0x7fe914198ee0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fe910005950 tx=0x7fe910004c30 comp rx=0 tx=0).stop 2026-03-10T09:02:14.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.978+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 shutdown_connections 2026-03-10T09:02:14.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.978+0000 7fe91ba03700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9000778e0 0x7fe900079da0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.979+0000 7fe91ba03700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe914103120 0x7fe9141989a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.979+0000 7fe91ba03700 1 --2- 192.168.123.105:0/1093900783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe914104320 0x7fe914198ee0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:14.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.979+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 >> 192.168.123.105:0/1093900783 conn(0x7fe9140fe6c0 msgr2=0x7fe914107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:14.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.979+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 shutdown_connections 2026-03-10T09:02:14.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:14.979+0000 7fe91ba03700 1 -- 192.168.123.105:0/1093900783 wait complete. 2026-03-10T09:02:15.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.042+0000 7f3b306dd700 1 -- 192.168.123.105:0/3754962514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 msgr2=0x7f3b28103d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:15.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.042+0000 7f3b306dd700 1 --2- 192.168.123.105:0/3754962514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b28103d80 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f3b18009b00 tx=0x7f3b18009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.046+0000 7f3b306dd700 1 -- 192.168.123.105:0/3754962514 shutdown_connections 2026-03-10T09:02:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.046+0000 7f3b306dd700 1 --2- 192.168.123.105:0/3754962514 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b281042c0 0x7f3b281066b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.046+0000 7f3b306dd700 1 --2- 192.168.123.105:0/3754962514 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b28103d80 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.046+0000 7f3b306dd700 1 -- 192.168.123.105:0/3754962514 >> 192.168.123.105:0/3754962514 conn(0x7f3b280fb380 msgr2=0x7f3b280fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.047+0000 7f3b306dd700 1 -- 192.168.123.105:0/3754962514 shutdown_connections 2026-03-10T09:02:15.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 -- 192.168.123.105:0/3754962514 wait complete. 2026-03-10T09:02:15.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 Processor -- start 2026-03-10T09:02:15.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 -- start start 2026-03-10T09:02:15.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b281989c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:15.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b281042c0 0x7f3b28198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b28199520 con 
0x7f3b28101990 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b306dd700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b28199660 con 0x7f3b281042c0 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.048+0000 7f3b2e479700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b281989c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2e479700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b281989c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:55452/0 (socket says 192.168.123.105:55452) 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2e479700 1 -- 192.168.123.105:0/378814805 learned_addr learned my addr 192.168.123.105:0/378814805 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2e479700 1 -- 192.168.123.105:0/378814805 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b281042c0 msgr2=0x7f3b28198f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2dc78700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b281042c0 0x7f3b28198f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:15.049 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2e479700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b281042c0 0x7f3b28198f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2e479700 1 -- 192.168.123.105:0/378814805 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b180097e0 con 0x7f3b28101990 2026-03-10T09:02:15.049 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.049+0000 7f3b2e479700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b281989c0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f3b180048c0 tx=0x7f3b180049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:15.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.081+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b1801d070 con 0x7f3b28101990 2026-03-10T09:02:15.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.081+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3b1800bc50 con 0x7f3b28101990 2026-03-10T09:02:15.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.081+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b1800f800 con 0x7f3b28101990 2026-03-10T09:02:15.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.081+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b2819e0b0 con 0x7f3b28101990 2026-03-10T09:02:15.082 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.082+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b2819e5a0 con 0x7f3b28101990 2026-03-10T09:02:15.084 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.083+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b280fcf70 con 0x7f3b28101990 2026-03-10T09:02:15.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.084+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3b18022be0 con 0x7f3b28101990 2026-03-10T09:02:15.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.085+0000 7f3b1f7fe700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b140778c0 0x7f3b14079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:15.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.085+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(87..87 src has 1..87) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f3b1809b120 con 0x7f3b28101990 2026-03-10T09:02:15.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.085+0000 7f3b2dc78700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b140778c0 0x7f3b14079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:15.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.086+0000 
7f3b2dc78700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b140778c0 0x7f3b14079d80 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f3b24005950 tx=0x7f3b240058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:15.089 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.089+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3b18063a30 con 0x7f3b28101990 2026-03-10T09:02:15.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.249+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f3b28066e80 con 0x7f3b28101990 2026-03-10T09:02:15.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.250+0000 7f3b1f7fe700 1 -- 192.168.123.105:0/378814805 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f3b18063180 con 0x7f3b28101990 2026-03-10T09:02:15.250 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.252+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b140778c0 msgr2=0x7f3b14079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b140778c0 0x7f3b14079d80 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f3b24005950 tx=0x7f3b240058e0 comp rx=0 
tx=0).stop 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 msgr2=0x7f3b281989c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b281989c0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f3b180048c0 tx=0x7f3b180049a0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 shutdown_connections 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3b140778c0 0x7f3b14079d80 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b28101990 0x7f3b281989c0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 --2- 192.168.123.105:0/378814805 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b281042c0 0x7f3b28198f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:15.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 >> 192.168.123.105:0/378814805 conn(0x7f3b280fb380 msgr2=0x7f3b28104f30 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:15.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.253+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 shutdown_connections 2026-03-10T09:02:15.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:15.254+0000 7f3b306dd700 1 -- 192.168.123.105:0/378814805 wait complete. 2026-03-10T09:02:15.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:15 vm05.local ceph-mon[111630]: pgmap v211: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:15.361 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:15 vm05.local ceph-mon[111630]: from='client.34374 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:15.361 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:15 vm05.local ceph-mon[111630]: from='client.34378 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:15.361 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:15 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3246775140' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:15.361 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:15 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2764197230' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:02:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:15 vm08.local ceph-mon[101330]: pgmap v211: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:15 vm08.local ceph-mon[101330]: from='client.34374 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:15 vm08.local ceph-mon[101330]: from='client.34378 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:15 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3246775140' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:15 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2764197230' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:02:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:16 vm05.local ceph-mon[111630]: from='client.34382 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:16 vm05.local ceph-mon[111630]: from='client.44301 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:16 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/378814805' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:02:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:16.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:16 vm08.local ceph-mon[101330]: from='client.34382 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:16 vm08.local ceph-mon[101330]: from='client.44301 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:16 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/378814805' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:02:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:16.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:17.255 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-mon[101330]: pgmap v212: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:17.255 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:17.255 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:17.255 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T09:02:17.255 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:17 vm05.local ceph-mon[111630]: pgmap v212: 65 pgs: 65 active+clean; 215 MiB data, 2.3 
GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T09:02:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:18.128 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:17 vm08.local systemd[1]: Stopping Ceph osd.3 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T09:02:18.128 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[63369]: 2026-03-10T09:02:17.875+0000 7f3e69824700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:02:18.128 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[63369]: 2026-03-10T09:02:17.875+0000 7f3e69824700 -1 osd.3 87 *** Got signal Terminated *** 2026-03-10T09:02:18.128 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:17 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[63369]: 2026-03-10T09:02:17.875+0000 7f3e69824700 -1 osd.3 87 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:02:18.397 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:18 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:18.397 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:18 vm08.local ceph-mon[101330]: Upgrade: osd.3 is safe to restart 2026-03-10T09:02:18.397 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:18 vm08.local ceph-mon[101330]: Upgrade: Updating osd.3 2026-03-10T09:02:18.397 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:18 vm08.local ceph-mon[101330]: Deploying daemon osd.3 on vm08 2026-03-10T09:02:18.397 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:18 vm08.local ceph-mon[101330]: osd.3 marked itself down and dead 2026-03-10T09:02:18.397 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110856]: 2026-03-10 09:02:18.220030923 +0000 UTC m=+0.356999973 container died 14f5f93ea4d1213b19feae5a03379bab7d55ec9c1c1d24d4f31fb0a348d35ddd (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, maintainer=Guillaume Abrioux , io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD) 2026-03-10T09:02:18.397 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110856]: 2026-03-10 09:02:18.249381258 +0000 UTC m=+0.386350299 container remove 14f5f93ea4d1213b19feae5a03379bab7d55ec9c1c1d24d4f31fb0a348d35ddd (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, org.label-schema.vendor=CentOS, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, 
GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, RELEASE=HEAD, org.label-schema.schema-version=1.0) 2026-03-10T09:02:18.397 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local bash[110856]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3 2026-03-10T09:02:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:18 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T09:02:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:18 vm05.local ceph-mon[111630]: Upgrade: osd.3 is safe to restart 2026-03-10T09:02:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:18 vm05.local ceph-mon[111630]: Upgrade: Updating osd.3 2026-03-10T09:02:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:18 vm05.local ceph-mon[111630]: Deploying daemon osd.3 on vm08 2026-03-10T09:02:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:18 vm05.local ceph-mon[111630]: osd.3 marked itself down and dead 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.397256685 +0000 UTC m=+0.017025348 container create 9554744c4c8557a84e2d54b8c10e503623391e5ce0229a1a0193c184b9b2b263 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default) 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.433293023 +0000 UTC m=+0.053061697 container init 9554744c4c8557a84e2d54b8c10e503623391e5ce0229a1a0193c184b9b2b263 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.442664013 +0000 UTC m=+0.062432676 container start 9554744c4c8557a84e2d54b8c10e503623391e5ce0229a1a0193c184b9b2b263 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.444165813 +0000 UTC m=+0.063934476 container attach 9554744c4c8557a84e2d54b8c10e503623391e5ce0229a1a0193c184b9b2b263 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.390362931 +0000 UTC m=+0.010131605 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.577385148 +0000 UTC m=+0.197153812 container died 
9554744c4c8557a84e2d54b8c10e503623391e5ce0229a1a0193c184b9b2b263 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3) 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[110921]: 2026-03-10 09:02:18.595670233 +0000 UTC m=+0.215438896 container remove 9554744c4c8557a84e2d54b8c10e503623391e5ce0229a1a0193c184b9b2b263 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 
vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.3.service: Deactivated successfully. 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local systemd[1]: Stopped Ceph osd.3 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T09:02:18.691 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.3.service: Consumed 58.142s CPU time. 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local systemd[1]: Starting Ceph osd.3 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[111026]: 2026-03-10 09:02:18.897831303 +0000 UTC m=+0.017330097 container create c2db249218232bb0d7ce4008a50964cbef3b0ba0ea9414e5089ab3f5ef9329c7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[111026]: 2026-03-10 09:02:18.941750085 +0000 UTC m=+0.061248889 container init c2db249218232bb0d7ce4008a50964cbef3b0ba0ea9414e5089ab3f5ef9329c7 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[111026]: 2026-03-10 09:02:18.946097723 +0000 UTC m=+0.065596517 container start c2db249218232bb0d7ce4008a50964cbef3b0ba0ea9414e5089ab3f5ef9329c7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[111026]: 2026-03-10 09:02:18.949176325 +0000 UTC 
m=+0.068675108 container attach c2db249218232bb0d7ce4008a50964cbef3b0ba0ea9414e5089ab3f5ef9329c7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:18 vm08.local podman[111026]: 2026-03-10 09:02:18.89020531 +0000 UTC m=+0.009704114 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:19.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T09:02:19.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:19 vm05.local ceph-mon[111630]: pgmap v213: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:19.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:19 vm05.local ceph-mon[111630]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:02:19.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:19 vm05.local ceph-mon[111630]: osdmap e88: 6 total, 5 up, 6 in 2026-03-10T09:02:19.528 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-mon[101330]: pgmap v213: 65 pgs: 65 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:19.528 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-mon[101330]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:02:19.528 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-mon[101330]: osdmap e88: 6 total, 5 up, 6 in 2026-03-10T09:02:19.528 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T09:02:19.528 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-33250eb0-d087-4f9c-b41b-7d90a625cb1d/osd-block-3e86065a-202f-4640-9f03-2490c913e09b --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T09:02:19.788 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-33250eb0-d087-4f9c-b41b-7d90a625cb1d/osd-block-3e86065a-202f-4640-9f03-2490c913e09b --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/ln -snf /dev/ceph-33250eb0-d087-4f9c-b41b-7d90a625cb1d/osd-block-3e86065a-202f-4640-9f03-2490c913e09b /var/lib/ceph/osd/ceph-3/block 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/ln -snf /dev/ceph-33250eb0-d087-4f9c-b41b-7d90a625cb1d/osd-block-3e86065a-202f-4640-9f03-2490c913e09b /var/lib/ceph/osd/ceph-3/block 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: 
Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate[111037]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local bash[111026]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local podman[111026]: 2026-03-10 09:02:19.900042028 +0000 UTC m=+1.019540822 container died c2db249218232bb0d7ce4008a50964cbef3b0ba0ea9414e5089ab3f5ef9329c7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:02:20.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:19 vm08.local podman[111026]: 2026-03-10 09:02:19.923670052 +0000 UTC m=+1.043168846 container remove c2db249218232bb0d7ce4008a50964cbef3b0ba0ea9414e5089ab3f5ef9329c7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-10T09:02:20.356 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:20 vm08.local podman[111288]: 2026-03-10 09:02:20.04382798 +0000 UTC m=+0.016329034 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:02:20.356 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:20 vm08.local podman[111288]: 2026-03-10 09:02:20.201592265 
+0000 UTC m=+0.174093309 container create b025f9a6ca2a74de41ae0c72e028f37ca9cc9a878231121bce43d419f8f864a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) 2026-03-10T09:02:20.558 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:20 vm05.local ceph-mon[111630]: osdmap e89: 6 total, 5 up, 6 in 2026-03-10T09:02:20.645 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:20 vm08.local ceph-mon[101330]: osdmap e89: 6 total, 5 up, 6 in 2026-03-10T09:02:20.645 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:20 vm08.local podman[111288]: 2026-03-10 09:02:20.39182448 +0000 UTC m=+0.364325524 container init b025f9a6ca2a74de41ae0c72e028f37ca9cc9a878231121bce43d419f8f864a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, 
CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:02:20.645 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:20 vm08.local podman[111288]: 2026-03-10 09:02:20.398291074 +0000 UTC m=+0.370792118 container start b025f9a6ca2a74de41ae0c72e028f37ca9cc9a878231121bce43d419f8f864a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS) 2026-03-10T09:02:20.645 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:20 vm08.local bash[111288]: b025f9a6ca2a74de41ae0c72e028f37ca9cc9a878231121bce43d419f8f864a4 2026-03-10T09:02:20.646 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:20 vm08.local systemd[1]: Started Ceph osd.3 for 16587ed2-1c5e-11f1-90f6-35051361a039. 
2026-03-10T09:02:21.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:21 vm08.local ceph-mon[101330]: pgmap v216: 65 pgs: 7 peering, 13 stale+active+clean, 45 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:21.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:21 vm08.local ceph-mon[101330]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T09:02:21.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:21.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:21.290 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:21.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:21 vm05.local ceph-mon[111630]: pgmap v216: 65 pgs: 7 peering, 13 stale+active+clean, 45 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:21.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:21 vm05.local ceph-mon[111630]: Health check failed: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T09:02:21.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:21.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:21.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:22.053 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:21 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 2026-03-10T09:02:21.776+0000 7fed4e538740 -1 Falling back to public interface 2026-03-10T09:02:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:22 vm05.local ceph-mon[111630]: pgmap v217: 65 pgs: 4 active+undersized, 7 peering, 11 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 14/231 objects degraded (6.061%) 2026-03-10T09:02:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:22 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:22 vm08.local ceph-mon[101330]: pgmap v217: 65 pgs: 4 active+undersized, 7 peering, 11 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB 
avail; 14/231 objects degraded (6.061%) 2026-03-10T09:02:22.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:22.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:22 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: Health check failed: Degraded data redundancy: 14/231 objects degraded (6.061%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:23.467 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: Health check failed: Degraded data redundancy: 14/231 objects degraded (6.061%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:23.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:23 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:23.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:23 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:24.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:24 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:24.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:24 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (7 PGs are or would become offline) 2026-03-10T09:02:24.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:24 vm08.local ceph-mon[101330]: pgmap v218: 65 pgs: 16 active+undersized, 7 peering, 15 active+undersized+degraded, 27 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 47/231 objects degraded (20.346%) 2026-03-10T09:02:24.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:24 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:24.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:24 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (7 PGs are or would become offline) 2026-03-10T09:02:24.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:24 vm05.local ceph-mon[111630]: pgmap v218: 65 pgs: 16 active+undersized, 7 peering, 15 active+undersized+degraded, 27 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 47/231 objects degraded (20.346%) 2026-03-10T09:02:26.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:25 vm05.local ceph-mon[111630]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T09:02:26.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:25 vm08.local ceph-mon[101330]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T09:02:26.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:26 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 2026-03-10T09:02:26.304+0000 7fed4e538740 -1 osd.3 0 read_superblock omap replica is missing. 
2026-03-10T09:02:26.947 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:26 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 2026-03-10T09:02:26.667+0000 7fed4e538740 -1 osd.3 87 log_to_monitors true 2026-03-10T09:02:27.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:26 vm05.local ceph-mon[111630]: pgmap v219: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 55/231 objects degraded (23.810%) 2026-03-10T09:02:27.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:26 vm05.local ceph-mon[111630]: from='osd.3 [v2:192.168.123.108:6800/624463886,v1:192.168.123.108:6801/624463886]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T09:02:27.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:26 vm05.local ceph-mon[111630]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T09:02:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:26 vm08.local ceph-mon[101330]: pgmap v219: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 215 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 55/231 objects degraded (23.810%) 2026-03-10T09:02:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:26 vm08.local ceph-mon[101330]: from='osd.3 [v2:192.168.123.108:6800/624463886,v1:192.168.123.108:6801/624463886]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T09:02:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:26 vm08.local ceph-mon[101330]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T09:02:27.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:02:26 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 
2026-03-10T09:02:26.968+0000 7fed462d2640 -1 osd.3 87 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T09:02:28.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:27 vm05.local ceph-mon[111630]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T09:02:28.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:27 vm05.local ceph-mon[111630]: osdmap e90: 6 total, 5 up, 6 in 2026-03-10T09:02:28.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:27 vm05.local ceph-mon[111630]: from='osd.3 [v2:192.168.123.108:6800/624463886,v1:192.168.123.108:6801/624463886]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:28.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:27 vm05.local ceph-mon[111630]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:28.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:27 vm08.local ceph-mon[101330]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T09:02:28.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:27 vm08.local ceph-mon[101330]: osdmap e90: 6 total, 5 up, 6 in 2026-03-10T09:02:28.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:27 vm08.local ceph-mon[101330]: from='osd.3 [v2:192.168.123.108:6800/624463886,v1:192.168.123.108:6801/624463886]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:28.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:27 vm08.local ceph-mon[101330]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", 
"id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:29 vm08.local ceph-mon[101330]: pgmap v221: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 55/231 objects degraded (23.810%) 2026-03-10T09:02:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:29 vm08.local ceph-mon[101330]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:02:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:29 vm08.local ceph-mon[101330]: osd.3 [v2:192.168.123.108:6800/624463886,v1:192.168.123.108:6801/624463886] boot 2026-03-10T09:02:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:29 vm08.local ceph-mon[101330]: osdmap e91: 6 total, 6 up, 6 in 2026-03-10T09:02:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T09:02:29.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:29 vm05.local ceph-mon[111630]: pgmap v221: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 55/231 objects degraded (23.810%) 2026-03-10T09:02:29.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:29 vm05.local ceph-mon[111630]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:02:29.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:29 vm05.local ceph-mon[111630]: osd.3 [v2:192.168.123.108:6800/624463886,v1:192.168.123.108:6801/624463886] boot 2026-03-10T09:02:29.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:29 vm05.local ceph-mon[111630]: osdmap e91: 6 total, 6 up, 6 in 2026-03-10T09:02:29.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:29 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T09:02:30.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:30 vm05.local ceph-mon[111630]: osdmap e92: 6 total, 6 up, 6 in 2026-03-10T09:02:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:30 vm08.local ceph-mon[101330]: osdmap e92: 6 total, 6 up, 6 in 2026-03-10T09:02:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:31 vm05.local ceph-mon[111630]: pgmap v224: 65 pgs: 4 peering, 17 active+undersized, 17 active+undersized+degraded, 27 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 52/231 objects degraded (22.511%) 2026-03-10T09:02:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:31 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 52/231 objects degraded (22.511%), 17 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:31.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:31 vm08.local ceph-mon[101330]: pgmap v224: 65 pgs: 4 peering, 17 active+undersized, 17 active+undersized+degraded, 27 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 52/231 objects degraded (22.511%) 2026-03-10T09:02:31.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:31 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 52/231 objects degraded (22.511%), 17 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:31.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:31 vm08.local ceph-mon[101330]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:31.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:33.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:33 vm05.local ceph-mon[111630]: pgmap v225: 65 pgs: 4 peering, 14 active+undersized, 13 active+undersized+degraded, 34 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 39/231 objects degraded (16.883%) 2026-03-10T09:02:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:33 vm08.local ceph-mon[101330]: pgmap v225: 65 pgs: 4 peering, 14 active+undersized, 13 active+undersized+degraded, 34 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 39/231 objects degraded (16.883%) 2026-03-10T09:02:34.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:34 vm05.local ceph-mon[111630]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 39/231 objects degraded (16.883%), 13 pgs degraded) 2026-03-10T09:02:34.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:34 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:02:34.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:34 vm08.local ceph-mon[101330]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 39/231 objects degraded (16.883%), 13 pgs degraded) 2026-03-10T09:02:34.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:34 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:02:35.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:35 vm05.local ceph-mon[111630]: pgmap v226: 65 pgs: 4 peering, 61 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:35 vm08.local ceph-mon[101330]: pgmap v226: 65 pgs: 4 peering, 61 
active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:37 vm05.local ceph-mon[111630]: pgmap v227: 65 pgs: 65 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:37 vm08.local ceph-mon[101330]: pgmap v227: 65 pgs: 65 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:39.381 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-mon[101330]: pgmap v228: 65 pgs: 65 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:39.382 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:39.382 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:39.382 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T09:02:39.382 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:39 vm05.local ceph-mon[111630]: pgmap v228: 65 pgs: 65 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:39 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", 
"ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:39 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:39 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T09:02:39.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:39 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:39.802 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:39 vm08.local systemd[1]: Stopping Ceph osd.4 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:02:39.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[69269]: 2026-03-10T09:02:39.538+0000 7fbdf10d9700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:02:39.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[69269]: 2026-03-10T09:02:39.539+0000 7fbdf10d9700 -1 osd.4 92 *** Got signal Terminated *** 2026-03-10T09:02:39.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:39 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[69269]: 2026-03-10T09:02:39.539+0000 7fbdf10d9700 -1 osd.4 92 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:02:40.431 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116207]: 2026-03-10 09:02:40.265698457 +0000 UTC m=+0.784196873 container died 155c1482d81c2c7c7095799c5bf065464fac4e13ab85adf404f1a0b84529b8a2 
(image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, RELEASE=HEAD, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1) 2026-03-10T09:02:40.431 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116207]: 2026-03-10 09:02:40.287163362 +0000 UTC m=+0.805661767 container remove 155c1482d81c2c7c7095799c5bf065464fac4e13ab85adf404f1a0b84529b8a2 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_CLEAN=True) 2026-03-10T09:02:40.431 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local bash[116207]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4 2026-03-10T09:02:40.431 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:40 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:40.431 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:40 vm08.local ceph-mon[101330]: Upgrade: osd.4 is safe to restart 2026-03-10T09:02:40.431 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:40 vm08.local ceph-mon[101330]: Upgrade: Updating osd.4 2026-03-10T09:02:40.431 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:40 vm08.local ceph-mon[101330]: Deploying daemon osd.4 on vm08 2026-03-10T09:02:40.432 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:40 vm08.local ceph-mon[101330]: osd.4 marked itself down and dead 2026-03-10T09:02:40.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:40 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T09:02:40.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:40 vm05.local ceph-mon[111630]: Upgrade: osd.4 is safe to restart 2026-03-10T09:02:40.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:40 vm05.local ceph-mon[111630]: Upgrade: Updating osd.4 2026-03-10T09:02:40.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:40 vm05.local ceph-mon[111630]: Deploying daemon osd.4 on vm08 2026-03-10T09:02:40.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:40 vm05.local ceph-mon[111630]: osd.4 marked itself down and dead 2026-03-10T09:02:40.738 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116273]: 2026-03-10 09:02:40.431519147 +0000 UTC m=+0.017002865 container create bf48bf03b0526cd656d54702974a2276827e32f090691c2337830dc65aa101c3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, 
OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS) 2026-03-10T09:02:40.738 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116273]: 2026-03-10 09:02:40.476460542 +0000 UTC m=+0.061944260 container init bf48bf03b0526cd656d54702974a2276827e32f090691c2337830dc65aa101c3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2) 2026-03-10T09:02:40.738 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116273]: 2026-03-10 09:02:40.47945644 +0000 UTC m=+0.064940168 container start bf48bf03b0526cd656d54702974a2276827e32f090691c2337830dc65aa101c3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223) 2026-03-10T09:02:40.738 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116273]: 2026-03-10 09:02:40.485180213 +0000 UTC m=+0.070663931 container attach bf48bf03b0526cd656d54702974a2276827e32f090691c2337830dc65aa101c3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-10T09:02:40.739 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116273]: 2026-03-10 09:02:40.424918363 +0000 UTC m=+0.010402081 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:02:40.739 
INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116292]: 2026-03-10 09:02:40.62654555 +0000 UTC m=+0.009770076 container died bf48bf03b0526cd656d54702974a2276827e32f090691c2337830dc65aa101c3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True) 2026-03-10T09:02:40.739 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116292]: 2026-03-10 09:02:40.642000407 +0000 UTC m=+0.025224933 container remove bf48bf03b0526cd656d54702974a2276827e32f090691c2337830dc65aa101c3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, 
org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0) 2026-03-10T09:02:40.739 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.4.service: Deactivated successfully. 2026-03-10T09:02:40.739 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local systemd[1]: Stopped Ceph osd.4 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T09:02:40.739 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.4.service: Consumed 48.671s CPU time. 2026-03-10T09:02:41.118 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local systemd[1]: Starting Ceph osd.4 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:02:41.118 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:40 vm08.local podman[116379]: 2026-03-10 09:02:40.939616213 +0000 UTC m=+0.019273655 container create de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate, OSD_FLAVOR=default, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:02:41.118 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local podman[116379]: 2026-03-10 09:02:40.931865045 +0000 UTC m=+0.011522498 image 
pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:02:41.118 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local podman[116379]: 2026-03-10 09:02:41.059448538 +0000 UTC m=+0.139105980 container init de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223) 2026-03-10T09:02:41.118 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local podman[116379]: 2026-03-10 09:02:41.06223281 +0000 UTC m=+0.141890252 container start de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, 
OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0) 2026-03-10T09:02:41.118 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local podman[116379]: 2026-03-10 09:02:41.118234426 +0000 UTC m=+0.197891868 container attach de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-10T09:02:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:41 vm05.local ceph-mon[111630]: pgmap v229: 65 pgs: 65 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:41 vm05.local ceph-mon[111630]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:02:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:41 vm05.local ceph-mon[111630]: osdmap e93: 6 total, 5 up, 6 in 2026-03-10T09:02:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:41 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T09:02:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-mon[101330]: pgmap v229: 65 pgs: 65 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-mon[101330]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:02:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-mon[101330]: osdmap e93: 6 total, 5 up, 6 in 2026-03-10T09:02:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:41.554 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.554 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.554 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.554 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.927 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T09:02:41.927 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.927 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: --> Failed to activate via 
raw: did not find any matching OSD to activate 2026-03-10T09:02:41.927 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-7b79d155-b1ec-4afc-8e2d-17a5b618945b/osd-block-2499eecc-b6be-48ac-ba73-53ff8a0686a4 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-7b79d155-b1ec-4afc-8e2d-17a5b618945b/osd-block-2499eecc-b6be-48ac-ba73-53ff8a0686a4 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T09:02:41.928 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/ln -snf /dev/ceph-7b79d155-b1ec-4afc-8e2d-17a5b618945b/osd-block-2499eecc-b6be-48ac-ba73-53ff8a0686a4 /var/lib/ceph/osd/ceph-4/block 
2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/ln -snf /dev/ceph-7b79d155-b1ec-4afc-8e2d-17a5b618945b/osd-block-2499eecc-b6be-48ac-ba73-53ff8a0686a4 /var/lib/ceph/osd/ceph-4/block 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate[116389]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local bash[116379]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local conmon[116389]: conmon de887f854eb6f21ea5ec : Failed to open cgroups file: 
/sys/fs/cgroup/machine.slice/libpod-de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482.scope/container/memory.events 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local podman[116379]: 2026-03-10 09:02:41.958810711 +0000 UTC m=+1.038468153 container died de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:41 vm08.local podman[116379]: 2026-03-10 09:02:41.976506271 +0000 UTC m=+1.056163713 container remove de887f854eb6f21ea5ec52a771fa4614c83e1ca052ae026e00fda896fd5a4482 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local podman[116642]: 2026-03-10 09:02:42.053522228 +0000 UTC m=+0.014168419 container create 76fe84edd71692339bfdc3304aed26202046eee7a2b8e9877d8d3339b74fd0ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default) 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local podman[116642]: 2026-03-10 09:02:42.092544404 +0000 UTC m=+0.053190595 container init 76fe84edd71692339bfdc3304aed26202046eee7a2b8e9877d8d3339b74fd0ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local podman[116642]: 2026-03-10 09:02:42.095290855 +0000 UTC m=+0.055937046 container start 76fe84edd71692339bfdc3304aed26202046eee7a2b8e9877d8d3339b74fd0ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid) 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local bash[116642]: 76fe84edd71692339bfdc3304aed26202046eee7a2b8e9877d8d3339b74fd0ae 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local podman[116642]: 2026-03-10 09:02:42.047813644 +0000 UTC m=+0.008459846 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:02:42.191 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local systemd[1]: Started Ceph osd.4 for 
16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T09:02:42.440 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:42 vm08.local ceph-mon[101330]: osdmap e94: 6 total, 5 up, 6 in 2026-03-10T09:02:42.441 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:42 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:42.441 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:42 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:42.441 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:42 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:42.445 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:42 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:02:42.439+0000 7fc5e48fd740 -1 Falling back to public interface 2026-03-10T09:02:42.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:42 vm05.local ceph-mon[111630]: osdmap e94: 6 total, 5 up, 6 in 2026-03-10T09:02:42.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:42 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:42.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:42 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:42.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:42 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:43 vm05.local ceph-mon[111630]: pgmap v232: 65 pgs: 8 peering, 5 stale+active+clean, 52 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 
2026-03-10T09:02:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:43 vm05.local ceph-mon[111630]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-10T09:02:43.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:43 vm08.local ceph-mon[101330]: pgmap v232: 65 pgs: 8 peering, 5 stale+active+clean, 52 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T09:02:43.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:43 vm08.local ceph-mon[101330]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-10T09:02:44.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:44 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:44 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:44 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:44 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:44 vm05.local ceph-mon[111630]: pgmap v233: 65 pgs: 11 active+undersized, 8 peering, 2 stale+active+clean, 7 active+undersized+degraded, 37 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-10T09:02:44.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:44 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:44 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.749 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:44 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:44 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:44.750 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:44 vm08.local ceph-mon[101330]: pgmap v233: 65 pgs: 11 active+undersized, 8 peering, 2 stale+active+clean, 7 active+undersized+degraded, 37 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-10T09:02:45.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.326+0000 7f247f0ce700 1 -- 192.168.123.105:0/2498604271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24780737f0 msgr2=0x7f2478073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.326+0000 7f247f0ce700 1 --2- 192.168.123.105:0/2498604271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24780737f0 0x7f2478073c70 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f246c009b00 tx=0x7f246c009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.327+0000 7f247f0ce700 1 -- 192.168.123.105:0/2498604271 shutdown_connections 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.327+0000 7f247f0ce700 1 --2- 192.168.123.105:0/2498604271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24780737f0 0x7f2478073c70 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.327+0000 7f247f0ce700 1 --2- 192.168.123.105:0/2498604271 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2478074dc0 0x7f2478073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.327+0000 7f247f0ce700 1 -- 192.168.123.105:0/2498604271 >> 192.168.123.105:0/2498604271 conn(0x7f24780fc460 msgr2=0x7f24780fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.327+0000 7f247f0ce700 1 -- 192.168.123.105:0/2498604271 shutdown_connections 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.327+0000 7f247f0ce700 1 -- 192.168.123.105:0/2498604271 wait complete. 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247f0ce700 1 Processor -- start 2026-03-10T09:02:45.328 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247f0ce700 1 -- start start 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247f0ce700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24780737f0 0x7f247810c260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247f0ce700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 0x7f247810c7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247f0ce700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f247810cdc0 con 0x7f2478074dc0 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247f0ce700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f24781aab80 con 0x7f24780737f0 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f2477fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 0x7f247810c7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f2477fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 0x7f247810c7a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45206/0 (socket says 192.168.123.105:45206) 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f2477fff700 1 -- 192.168.123.105:0/1846700729 learned_addr learned my addr 192.168.123.105:0/1846700729 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f2477fff700 1 -- 192.168.123.105:0/1846700729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24780737f0 msgr2=0x7f247810c260 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.328+0000 7f247ce6a700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24780737f0 0x7f247810c260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f2477fff700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24780737f0 0x7f247810c260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f2477fff700 1 -- 192.168.123.105:0/1846700729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f246c0097e0 con 0x7f2478074dc0 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f247ce6a700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24780737f0 0x7f247810c260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f2477fff700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 0x7f247810c7a0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f246c000c00 tx=0x7f246c0056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:45.329 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f2475ffb700 1 -- 192.168.123.105:0/1846700729 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f246c01d070 con 0x7f2478074dc0 2026-03-10T09:02:45.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24781aad20 con 0x7f2478074dc0 2026-03-10T09:02:45.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.329+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f24781ab1c0 con 0x7f2478074dc0 2026-03-10T09:02:45.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.330+0000 7f2475ffb700 1 -- 
192.168.123.105:0/1846700729 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f246c00bc50 con 0x7f2478074dc0 2026-03-10T09:02:45.331 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.330+0000 7f2475ffb700 1 -- 192.168.123.105:0/1846700729 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f246c00f8b0 con 0x7f2478074dc0 2026-03-10T09:02:45.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.331+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2478066e80 con 0x7f2478074dc0 2026-03-10T09:02:45.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.331+0000 7f2475ffb700 1 -- 192.168.123.105:0/1846700729 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f246c00fa10 con 0x7f2478074dc0 2026-03-10T09:02:45.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.331+0000 7f2475ffb700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24680779e0 0x7f2468079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.331+0000 7f2475ffb700 1 -- 192.168.123.105:0/1846700729 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f246c09b1c0 con 0x7f2478074dc0 2026-03-10T09:02:45.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.334+0000 7f2475ffb700 1 -- 192.168.123.105:0/1846700729 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f246c00bdc0 con 0x7f2478074dc0 2026-03-10T09:02:45.334 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.334+0000 7f247ce6a700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24680779e0 0x7f2468079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.334+0000 7f247ce6a700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24680779e0 0x7f2468079ea0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f2464005fd0 tx=0x7f2464005de0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:45.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.459+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2478103ea0 con 0x7f24680779e0 2026-03-10T09:02:45.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.460+0000 7f2475ffb700 1 -- 192.168.123.105:0/1846700729 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f2478103ea0 con 0x7f24680779e0 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24680779e0 msgr2=0x7f2468079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24680779e0 0x7f2468079ea0 secure 
:-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f2464005fd0 tx=0x7f2464005de0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 msgr2=0x7f247810c7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 0x7f247810c7a0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f246c000c00 tx=0x7f246c0056c0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 shutdown_connections 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f24680779e0 0x7f2468079ea0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24780737f0 0x7f247810c260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 --2- 192.168.123.105:0/1846700729 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2478074dc0 0x7f247810c7a0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 -- 
192.168.123.105:0/1846700729 >> 192.168.123.105:0/1846700729 conn(0x7f24780fc460 msgr2=0x7f2478102780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 shutdown_connections 2026-03-10T09:02:45.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.463+0000 7f247f0ce700 1 -- 192.168.123.105:0/1846700729 wait complete. 2026-03-10T09:02:45.473 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.528+0000 7ff5a769b700 1 -- 192.168.123.105:0/1966148370 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 msgr2=0x7ff5a0103e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.528+0000 7ff5a769b700 1 --2- 192.168.123.105:0/1966148370 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0103e20 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7ff590009b50 tx=0x7ff590009e60 comp rx=0 tx=0).stop 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 -- 192.168.123.105:0/1966148370 shutdown_connections 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 --2- 192.168.123.105:0/1966148370 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5a0104360 0x7ff5a0106750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 --2- 192.168.123.105:0/1966148370 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0103e20 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.529 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 -- 192.168.123.105:0/1966148370 >> 192.168.123.105:0/1966148370 conn(0x7ff5a00fb360 msgr2=0x7ff5a00fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 -- 192.168.123.105:0/1966148370 shutdown_connections 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 -- 192.168.123.105:0/1966148370 wait complete. 2026-03-10T09:02:45.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 Processor -- start 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.529+0000 7ff5a769b700 1 -- start start 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a769b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0194620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a769b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5a0104360 0x7ff5a0194b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a769b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff5a0195180 con 0x7ff5a0101a30 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a769b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff5a01952c0 con 0x7ff5a0104360 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a5437700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0194620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a5437700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0194620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45210/0 (socket says 192.168.123.105:45210) 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a5437700 1 -- 192.168.123.105:0/661107460 learned_addr learned my addr 192.168.123.105:0/661107460 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a4c36700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5a0104360 0x7ff5a0194b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a5437700 1 -- 192.168.123.105:0/661107460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5a0104360 msgr2=0x7ff5a0194b60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a5437700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5a0104360 0x7ff5a0194b60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.530+0000 7ff5a5437700 1 -- 
192.168.123.105:0/661107460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff5900097e0 con 0x7ff5a0101a30 2026-03-10T09:02:45.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.531+0000 7ff5a5437700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0194620 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7ff590004ce0 tx=0x7ff590005790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:45.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.531+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff59001d070 con 0x7ff5a0101a30 2026-03-10T09:02:45.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.531+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff59000bb00 con 0x7ff5a0101a30 2026-03-10T09:02:45.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.531+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff59000f7c0 con 0x7ff5a0101a30 2026-03-10T09:02:45.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.531+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff5a0199d10 con 0x7ff5a0101a30 2026-03-10T09:02:45.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.531+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff5a019a1b0 con 0x7ff5a0101a30 2026-03-10T09:02:45.533 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.532+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff59000bc70 con 0x7ff5a0101a30 2026-03-10T09:02:45.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.532+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff5a018e7e0 con 0x7ff5a0101a30 2026-03-10T09:02:45.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.533+0000 7ff5967fc700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff58c07bc80 0x7ff58c07e140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.533+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7ff59009b040 con 0x7ff5a0101a30 2026-03-10T09:02:45.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.533+0000 7ff5a4c36700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff58c07bc80 0x7ff58c07e140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.535+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff5900638d0 con 0x7ff5a0101a30 2026-03-10T09:02:45.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.536+0000 7ff5a4c36700 1 --2- 192.168.123.105:0/661107460 
>> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff58c07bc80 0x7ff58c07e140 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7ff59c005950 tx=0x7ff59c00b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:45.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.658+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff5a00611d0 con 0x7ff58c07bc80 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: Health check failed: Degraded data redundancy: 26/231 objects degraded (11.255%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:45.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:02:45.660 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:02:45.660 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:45 vm05.local ceph-mon[111630]: Upgrade: unsafe to stop osd(s) at this time (10 PGs are or would become offline) 2026-03-10T09:02:45.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.663+0000 7ff5967fc700 1 -- 192.168.123.105:0/661107460 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7ff5a00611d0 con 0x7ff58c07bc80 2026-03-10T09:02:45.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff58c07bc80 msgr2=0x7ff58c07e140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff58c07bc80 0x7ff58c07e140 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7ff59c005950 tx=0x7ff59c00b410 comp rx=0 tx=0).stop 2026-03-10T09:02:45.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 msgr2=0x7ff5a0194620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0194620 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7ff590004ce0 tx=0x7ff590005790 comp rx=0 tx=0).stop 2026-03-10T09:02:45.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 -- 
192.168.123.105:0/661107460 shutdown_connections 2026-03-10T09:02:45.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff58c07bc80 0x7ff58c07e140 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff5a0101a30 0x7ff5a0194620 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 --2- 192.168.123.105:0/661107460 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff5a0104360 0x7ff5a0194b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 >> 192.168.123.105:0/661107460 conn(0x7ff5a00fb360 msgr2=0x7ff5a00fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 shutdown_connections 2026-03-10T09:02:45.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.666+0000 7ff5a769b700 1 -- 192.168.123.105:0/661107460 wait complete. 
2026-03-10T09:02:45.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.742+0000 7fcdff255700 1 -- 192.168.123.105:0/937758307 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf810a700 msgr2=0x7fcdf810cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.742+0000 7fcdff255700 1 --2- 192.168.123.105:0/937758307 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf810a700 0x7fcdf810cb90 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7fcde8009a60 tx=0x7fcde8009d70 comp rx=0 tx=0).stop 2026-03-10T09:02:45.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.745+0000 7fcdff255700 1 -- 192.168.123.105:0/937758307 shutdown_connections 2026-03-10T09:02:45.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.745+0000 7fcdff255700 1 --2- 192.168.123.105:0/937758307 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf810a700 0x7fcdf810cb90 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.745+0000 7fcdff255700 1 --2- 192.168.123.105:0/937758307 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcdf8107d90 0x7fcdf810a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.745+0000 7fcdff255700 1 -- 192.168.123.105:0/937758307 >> 192.168.123.105:0/937758307 conn(0x7fcdf806daa0 msgr2=0x7fcdf806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.745+0000 7fcdff255700 1 -- 192.168.123.105:0/937758307 shutdown_connections 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.745+0000 7fcdff255700 1 -- 192.168.123.105:0/937758307 wait 
complete. 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdff255700 1 Processor -- start 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdff255700 1 -- start start 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdff255700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 0x7fcdf81a5280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdff255700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcdf810a700 0x7fcdf81a57c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdff255700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdf81a5de0 con 0x7fcdf8107d90 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdff255700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdf81a5f20 con 0x7fcdf810a700 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfe253700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 0x7fcdf81a5280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfe253700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 0x7fcdf81a5280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:45226/0 (socket says 192.168.123.105:45226) 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfe253700 1 -- 192.168.123.105:0/885980101 learned_addr learned my addr 192.168.123.105:0/885980101 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:45.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfda52700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcdf810a700 0x7fcdf81a57c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfe253700 1 -- 192.168.123.105:0/885980101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcdf810a700 msgr2=0x7fcdf81a57c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfe253700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcdf810a700 0x7fcdf81a57c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.751+0000 7fcdfe253700 1 -- 192.168.123.105:0/885980101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcde8009710 con 0x7fcdf8107d90 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.752+0000 7fcdfe253700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 0x7fcdf81a5280 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7fcdf000e3f0 tx=0x7fcdf000e7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.752+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdf00090d0 con 0x7fcdf8107d90 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.752+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcdf81aa970 con 0x7fcdf8107d90 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.752+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcdf81aaec0 con 0x7fcdf8107d90 2026-03-10T09:02:45.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.752+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcdf000f040 con 0x7fcdf8107d90 2026-03-10T09:02:45.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.753+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdf0014720 con 0x7fcdf8107d90 2026-03-10T09:02:45.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.754+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcdf0009230 con 0x7fcdf8107d90 2026-03-10T09:02:45.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.755+0000 7fcdef7fe700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcde4077700 0x7fcde4079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.755 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.755+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fcdf0099e40 con 0x7fcdf8107d90 2026-03-10T09:02:45.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.755+0000 7fcdfda52700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcde4077700 0x7fcde4079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.756+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcddc005320 con 0x7fcdf8107d90 2026-03-10T09:02:45.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.759+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcdf0062620 con 0x7fcdf8107d90 2026-03-10T09:02:45.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.764+0000 7fcdfda52700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcde4077700 0x7fcde4079bc0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fcde8003930 tx=0x7fcde80095f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: Health check failed: Degraded data redundancy: 26/231 objects degraded (11.255%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 
vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:02:45.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:45 vm08.local ceph-mon[101330]: Upgrade: unsafe to stop osd(s) at this time (10 PGs are or would become offline) 2026-03-10T09:02:45.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.893+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fcddc000bf0 con 0x7fcde4077700 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.898+0000 7fcdef7fe700 1 -- 192.168.123.105:0/885980101 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fcddc000bf0 con 0x7fcde4077700 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (11m) 46s ago 11m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (12m) 46s ago 12m 9638k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (11m) 2s ago 11m 11.6M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (5m) 46s ago 12m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:02:45.899 
INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 2s ago 11m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (11m) 46s ago 11m 89.2M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (9m) 46s ago 9m 176M - 18.2.1 5be31c24972a 601862397d9b 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (9m) 46s ago 9m 19.5M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (9m) 2s ago 9m 22.1M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (9m) 2s ago 9m 18.8M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (6m) 46s ago 12m 621M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (6m) 2s ago 11m 494M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (6m) 46s ago 12m 63.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (5m) 2s ago 11m 55.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (11m) 46s ago 11m 14.9M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (11m) 2s ago 11m 15.7M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:02:45.899 
INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (5m) 46s ago 11m 214M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (70s) 46s ago 10m 101M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (48s) 46s ago 10m 12.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (25s) 2s ago 10m 161M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b025f9a6ca2a 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (3s) 2s ago 10m 13.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 76fe84edd716 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (10m) 2s ago 10m 406M 4096M 18.2.1 5be31c24972a 21583bb58d82 2026-03-10T09:02:45.899 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (6m) 46s ago 11m 65.7M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.901+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcde4077700 msgr2=0x7fcde4079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.901+0000 7fcdff255700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcde4077700 0x7fcde4079bc0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fcde8003930 tx=0x7fcde80095f0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 msgr2=0x7fcdf81a5280 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 0x7fcdf81a5280 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7fcdf000e3f0 tx=0x7fcdf000e7b0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 shutdown_connections 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcde4077700 0x7fcde4079bc0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcdf8107d90 0x7fcdf81a5280 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 --2- 192.168.123.105:0/885980101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcdf810a700 0x7fcdf81a57c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 >> 192.168.123.105:0/885980101 conn(0x7fcdf806daa0 msgr2=0x7fcdf806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.902 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 shutdown_connections 2026-03-10T09:02:45.902 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.902+0000 7fcdff255700 1 -- 192.168.123.105:0/885980101 wait complete. 2026-03-10T09:02:45.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1095812818 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc076990 msgr2=0x7f5fbc076e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1095812818 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc076990 0x7f5fbc076e10 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f5fac009b50 tx=0x7f5fac009e60 comp rx=0 tx=0).stop 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1095812818 shutdown_connections 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1095812818 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc076990 0x7f5fbc076e10 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1095812818 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc075740 0x7f5fbc075b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1095812818 >> 192.168.123.105:0/1095812818 conn(0x7f5fbc0fe6c0 msgr2=0x7f5fbc100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1095812818 shutdown_connections 
2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.977+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1095812818 wait complete. 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fc1e9f700 1 Processor -- start 2026-03-10T09:02:45.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fc1e9f700 1 -- start start 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fc1e9f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc075740 0x7f5fbc19cdd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fc1e9f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc076990 0x7f5fbc19d310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fc1e9f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fbc19d930 con 0x7f5fbc075740 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fbaffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc076990 0x7f5fbc19d310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fbaffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc076990 0x7f5fbc19d310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53346/0 (socket says 192.168.123.105:53346) 
2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fbaffd700 1 -- 192.168.123.105:0/1261578186 learned_addr learned my addr 192.168.123.105:0/1261578186 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.979+0000 7f5fbb7fe700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc075740 0x7f5fbc19cdd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.978+0000 7f5fc1e9f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fbc19da70 con 0x7f5fbc076990 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.979+0000 7f5fbb7fe700 1 -- 192.168.123.105:0/1261578186 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc076990 msgr2=0x7f5fbc19d310 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.979+0000 7f5fbb7fe700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc076990 0x7f5fbc19d310 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:45.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.979+0000 7f5fbb7fe700 1 -- 192.168.123.105:0/1261578186 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fac0097e0 con 0x7f5fbc075740 2026-03-10T09:02:45.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.979+0000 7f5fbb7fe700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc075740 0x7f5fbc19cdd0 secure 
:-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f5fa400eab0 tx=0x7f5fa400edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:45.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.980+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5fa400cb20 con 0x7f5fbc075740 2026-03-10T09:02:45.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.980+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5fa400cc80 con 0x7f5fbc075740 2026-03-10T09:02:45.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.980+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5fa4018860 con 0x7f5fbc075740 2026-03-10T09:02:45.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.980+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5fbc1a2520 con 0x7f5fbc075740 2026-03-10T09:02:45.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.980+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5fbc1a2a70 con 0x7f5fbc075740 2026-03-10T09:02:45.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.981+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5fa40189c0 con 0x7f5fbc075740 2026-03-10T09:02:45.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.981+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f5fbc110200 con 0x7f5fbc075740 2026-03-10T09:02:45.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.985+0000 7f5fb8ff9700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5fa80779e0 0x7f5fa8079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:45.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.985+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f5fa4014070 con 0x7f5fbc075740 2026-03-10T09:02:45.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.985+0000 7f5fbaffd700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5fa80779e0 0x7f5fa8079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:45.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.985+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5fa400cdf0 con 0x7f5fbc075740 2026-03-10T09:02:45.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:45.985+0000 7f5fbaffd700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5fa80779e0 0x7f5fa8079ea0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5fac006010 tx=0x7f5fac0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.155 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.154+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5fbc1a3280 con 0x7f5fbc075740 2026-03-10T09:02:46.155 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.155+0000 7f5fb8ff9700 1 -- 192.168.123.105:0/1261578186 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f5fa4063bd0 con 0x7f5fbc075740 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 1, 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:02:46.156 
INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 8 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:02:46.156 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:02:46.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.158+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5fa80779e0 msgr2=0x7f5fa8079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.158+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5fa80779e0 0x7f5fa8079ea0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5fac006010 tx=0x7f5fac0058e0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.158+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc075740 msgr2=0x7f5fbc19cdd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.158+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc075740 0x7f5fbc19cdd0 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f5fa400eab0 tx=0x7f5fa400edc0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 shutdown_connections 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 
7f5fc1e9f700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5fa80779e0 0x7f5fa8079ea0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5fbc075740 0x7f5fbc19cdd0 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 7f5fc1e9f700 1 --2- 192.168.123.105:0/1261578186 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5fbc076990 0x7f5fbc19d310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 >> 192.168.123.105:0/1261578186 conn(0x7f5fbc0fe6c0 msgr2=0x7f5fbc10d380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 shutdown_connections 2026-03-10T09:02:46.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.159+0000 7f5fc1e9f700 1 -- 192.168.123.105:0/1261578186 wait complete. 
2026-03-10T09:02:46.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.236+0000 7f42ac114700 1 -- 192.168.123.105:0/3812812775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 msgr2=0x7f42a4103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.236+0000 7f42ac114700 1 --2- 192.168.123.105:0/3812812775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a4103540 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f4294009b00 tx=0x7f4294009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.239+0000 7f42ac114700 1 -- 192.168.123.105:0/3812812775 shutdown_connections 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.239+0000 7f42ac114700 1 --2- 192.168.123.105:0/3812812775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.239+0000 7f42ac114700 1 --2- 192.168.123.105:0/3812812775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a4103540 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.239+0000 7f42ac114700 1 -- 192.168.123.105:0/3812812775 >> 192.168.123.105:0/3812812775 conn(0x7f42a40fe6c0 msgr2=0x7f42a4100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.240+0000 7f42ac114700 1 -- 192.168.123.105:0/3812812775 shutdown_connections 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.240+0000 7f42ac114700 1 -- 192.168.123.105:0/3812812775 
wait complete. 2026-03-10T09:02:46.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.240+0000 7f42ac114700 1 Processor -- start 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.240+0000 7f42ac114700 1 -- start start 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42ac114700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a41989e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42ac114700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4198f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42ac114700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42a4199540 con 0x7f42a4103120 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42ac114700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42a4199680 con 0x7f42a4104320 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a96af700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4198f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a96af700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4198f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:53364/0 (socket says 192.168.123.105:53364) 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a96af700 1 -- 192.168.123.105:0/121113280 learned_addr learned my addr 192.168.123.105:0/121113280 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:46.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a9eb0700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a41989e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a96af700 1 -- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 msgr2=0x7f42a41989e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a96af700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a41989e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.241+0000 7f42a96af700 1 -- 192.168.123.105:0/121113280 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42940097e0 con 0x7f42a4104320 2026-03-10T09:02:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.242+0000 7f42a9eb0700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a41989e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T09:02:46.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.242+0000 7f42a96af700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4198f20 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f42a000eb10 tx=0x7f42a000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.242+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f42a000cca0 con 0x7f42a4104320 2026-03-10T09:02:46.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.243+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42a419e130 con 0x7f42a4104320 2026-03-10T09:02:46.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.243+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42a419e680 con 0x7f42a4104320 2026-03-10T09:02:46.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.244+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f42a000ce00 con 0x7f42a4104320 2026-03-10T09:02:46.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.244+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f42a00189c0 con 0x7f42a4104320 2026-03-10T09:02:46.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.245+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f4288005320 con 0x7f42a4104320 2026-03-10T09:02:46.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.245+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f42a0018b20 con 0x7f42a4104320 2026-03-10T09:02:46.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.245+0000 7f429affd700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4290077870 0x7f4290079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.245+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f42a0014070 con 0x7f42a4104320 2026-03-10T09:02:46.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.246+0000 7f42a9eb0700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4290077870 0x7f4290079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.246+0000 7f42a9eb0700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4290077870 0x7f4290079d30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f4294006010 tx=0x7f429401a040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.248+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f42a0062c20 con 0x7f42a4104320 2026-03-10T09:02:46.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.399+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f4288006200 con 0x7f42a4104320 2026-03-10T09:02:46.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.400+0000 7f429affd700 1 -- 192.168.123.105:0/121113280 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1918 (secure 0 0 0) 0x7f42a0062370 con 0x7f42a4104320 2026-03-10T09:02:46.401 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:modified 
2026-03-10T08:53:01.426195+0000 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24289} 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 
2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:24289} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/2466638752,v1:192.168.123.105:6827/2466638752] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:02:46.402 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:14488} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:46.403 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:46.403 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:02:46.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4290077870 msgr2=0x7f4290079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4290077870 
0x7f4290079d30 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f4294006010 tx=0x7f429401a040 comp rx=0 tx=0).stop 2026-03-10T09:02:46.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 msgr2=0x7f42a4198f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4198f20 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f42a000eb10 tx=0x7f42a000eed0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 shutdown_connections 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f4290077870 0x7f4290079d30 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f42a4103120 0x7f42a41989e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 --2- 192.168.123.105:0/121113280 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f42a4104320 0x7f42a4198f20 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.403+0000 7f42ac114700 1 
-- 192.168.123.105:0/121113280 >> 192.168.123.105:0/121113280 conn(0x7f42a40fe6c0 msgr2=0x7f42a4107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.404+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 shutdown_connections 2026-03-10T09:02:46.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.404+0000 7f42ac114700 1 -- 192.168.123.105:0/121113280 wait complete. 2026-03-10T09:02:46.405 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:02:46.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.474+0000 7f9164599700 1 -- 192.168.123.105:0/2700007802 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c104320 msgr2=0x7f915c106710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.474+0000 7f9164599700 1 --2- 192.168.123.105:0/2700007802 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c104320 0x7f915c106710 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f9158009b50 tx=0x7f9158009e60 comp rx=0 tx=0).stop 2026-03-10T09:02:46.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.475+0000 7f9164599700 1 -- 192.168.123.105:0/2700007802 shutdown_connections 2026-03-10T09:02:46.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.475+0000 7f9164599700 1 --2- 192.168.123.105:0/2700007802 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c104320 0x7f915c106710 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.475+0000 7f9164599700 1 --2- 192.168.123.105:0/2700007802 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f915c1019f0 0x7f915c103de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.475+0000 7f9164599700 1 -- 192.168.123.105:0/2700007802 >> 192.168.123.105:0/2700007802 conn(0x7f915c0fb3c0 msgr2=0x7f915c0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.475+0000 7f9164599700 1 -- 192.168.123.105:0/2700007802 shutdown_connections 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.475+0000 7f9164599700 1 -- 192.168.123.105:0/2700007802 wait complete. 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9164599700 1 Processor -- start 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9164599700 1 -- start start 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9164599700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 0x7f915c194640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9164599700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f915c104320 0x7f915c194b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9164599700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f915c1951a0 con 0x7f915c1019f0 2026-03-10T09:02:46.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9164599700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f915c1952e0 con 0x7f915c104320 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9162335700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 0x7f915c194640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9162335700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 0x7f915c194640 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45296/0 (socket says 192.168.123.105:45296) 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9162335700 1 -- 192.168.123.105:0/1916623827 learned_addr learned my addr 192.168.123.105:0/1916623827 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.476+0000 7f9161b34700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f915c104320 0x7f915c194b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.477+0000 7f9162335700 1 -- 192.168.123.105:0/1916623827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f915c104320 msgr2=0x7f915c194b80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.477+0000 7f9162335700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f915c104320 0x7f915c194b80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.477+0000 7f9162335700 1 -- 192.168.123.105:0/1916623827 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91580097e0 con 0x7f915c1019f0 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.477+0000 7f9162335700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 0x7f915c194640 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f914c009e60 tx=0x7f914c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.477+0000 7f91537fe700 1 -- 192.168.123.105:0/1916623827 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f914c00f9a0 con 0x7f915c1019f0 2026-03-10T09:02:46.478 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.477+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f915c199d90 con 0x7f915c1019f0 2026-03-10T09:02:46.478 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.478+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f915c19a250 con 0x7f915c1019f0 2026-03-10T09:02:46.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.478+0000 7f91537fe700 1 -- 192.168.123.105:0/1916623827 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f914c015040 con 0x7f915c1019f0 2026-03-10T09:02:46.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.478+0000 7f91537fe700 1 -- 192.168.123.105:0/1916623827 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f914c018560 con 0x7f915c1019f0 2026-03-10T09:02:46.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.479+0000 7f91537fe700 1 -- 
192.168.123.105:0/1916623827 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f914c0186c0 con 0x7f915c1019f0 2026-03-10T09:02:46.480 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.479+0000 7f91537fe700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9148077990 0x7f9148079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.480 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.480+0000 7f9161b34700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9148077990 0x7f9148079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.480 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.480+0000 7f91537fe700 1 -- 192.168.123.105:0/1916623827 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f914c09e770 con 0x7f915c1019f0 2026-03-10T09:02:46.480 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.480+0000 7f9161b34700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9148077990 0x7f9148079e50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9158006010 tx=0x7f9158005a90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.480 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.480+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9140005320 con 0x7f915c1019f0 2026-03-10T09:02:46.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.483+0000 7f91537fe700 1 -- 
192.168.123.105:0/1916623827 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f914c066900 con 0x7f915c1019f0 2026-03-10T09:02:46.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.613+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9140000bf0 con 0x7f9148077990 2026-03-10T09:02:46.615 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.615+0000 7f91537fe700 1 -- 192.168.123.105:0/1916623827 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f9140000bf0 con 0x7f9148077990 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "11/23 daemons upgraded", 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 
2026-03-10T09:02:46.620 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9148077990 msgr2=0x7f9148079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9148077990 0x7f9148079e50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9158006010 tx=0x7f9158005a90 comp rx=0 tx=0).stop 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 msgr2=0x7f915c194640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 0x7f915c194640 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f914c009e60 tx=0x7f914c0074a0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 shutdown_connections 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9148077990 0x7f9148079e50 secure :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9158006010 tx=0x7f9158005a90 comp rx=0 tx=0).stop 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 
7f9164599700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f915c1019f0 0x7f915c194640 secure :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f914c009e60 tx=0x7f914c0074a0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 --2- 192.168.123.105:0/1916623827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f915c104320 0x7f915c194b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.621+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 >> 192.168.123.105:0/1916623827 conn(0x7f915c0fb3c0 msgr2=0x7f915c0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.624+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 shutdown_connections 2026-03-10T09:02:46.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.624+0000 7f9164599700 1 -- 192.168.123.105:0/1916623827 wait complete. 
2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 -- 192.168.123.105:0/1898088611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 msgr2=0x7f3f18075b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/1898088611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f18075b60 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f3f00009b00 tx=0x7f3f00009e10 comp rx=0 tx=0).stop 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 -- 192.168.123.105:0/1898088611 shutdown_connections 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/1898088611 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f18076e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/1898088611 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f18075b60 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 -- 192.168.123.105:0/1898088611 >> 192.168.123.105:0/1898088611 conn(0x7f3f180fe6c0 msgr2=0x7f3f18100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 -- 192.168.123.105:0/1898088611 shutdown_connections 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.692+0000 7f3f1d41c700 1 -- 192.168.123.105:0/1898088611 
wait complete. 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.693+0000 7f3f1d41c700 1 Processor -- start 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.693+0000 7f3f1d41c700 1 -- start start 2026-03-10T09:02:46.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.693+0000 7f3f1d41c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f1819cdf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.693+0000 7f3f1d41c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f1819d330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.693+0000 7f3f1d41c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f1819d950 con 0x7f3f18075740 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.693+0000 7f3f1d41c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f1819da90 con 0x7f3f18076990 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f1819d330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f1819d330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:53396/0 (socket says 192.168.123.105:53396) 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 -- 192.168.123.105:0/2582482503 learned_addr learned my addr 192.168.123.105:0/2582482503 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f16ffd700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f1819cdf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 -- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 msgr2=0x7f3f1819cdf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f1819cdf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 -- 192.168.123.105:0/2582482503 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3f000097e0 con 0x7f3f18076990 2026-03-10T09:02:46.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f16ffd700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f1819cdf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T09:02:46.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f167fc700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f1819d330 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f3f0800b700 tx=0x7f3f0800bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.694+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f08010840 con 0x7f3f18076990 2026-03-10T09:02:46.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.695+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3f181a2540 con 0x7f3f18076990 2026-03-10T09:02:46.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.695+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3f181a2b10 con 0x7f3f18076990 2026-03-10T09:02:46.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.695+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3f08010e80 con 0x7f3f18076990 2026-03-10T09:02:46.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.696+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f0800d590 con 0x7f3f18076990 2026-03-10T09:02:46.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.696+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f3f18066e80 con 0x7f3f18076990 2026-03-10T09:02:46.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.696+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3f080109a0 con 0x7f3f18076990 2026-03-10T09:02:46.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.697+0000 7f3f0ffff700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3f040776b0 0x7f3f04079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:02:46.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.697+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(94..94 src has 1..94) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f3f08099560 con 0x7f3f18076990 2026-03-10T09:02:46.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.697+0000 7f3f16ffd700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3f040776b0 0x7f3f04079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:02:46.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.697+0000 7f3f16ffd700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3f040776b0 0x7f3f04079b70 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f3f000053e0 tx=0x7f3f00005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:02:46.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.700+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3f08061cc0 con 0x7f3f18076990 2026-03-10T09:02:46.802 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:02:46.521+0000 7fc5e48fd740 -1 osd.4 0 read_superblock omap replica is missing. 2026-03-10T09:02:46.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.874+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f3f181a2e40 con 0x7f3f18076990 2026-03-10T09:02:46.874 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='client.34408 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='client.34412 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: pgmap v234: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 35/231 objects degraded (15.152%) 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='client.34416 -' entity='client.admin' 
cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1261578186' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/121113280' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='osd.4 [v2:192.168.123.108:6808/3079572873,v1:192.168.123.108:6809/3079572873]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T09:02:46.875 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:46 vm05.local ceph-mon[111630]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T09:02:46.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.875+0000 7f3f0ffff700 1 -- 192.168.123.105:0/2582482503 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1063 (secure 0 0 0) 0x7f3f08061410 con 0x7f3f18076990 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 osds down; Reduced data availability: 1 pg peering; Degraded data redundancy: 35/231 objects degraded (15.152%), 11 pgs degraded 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: osd.4 (root=default,host=vm08) is down 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_AVAILABILITY: Reduced data availability: 1 pg peering 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.1a is stuck peering for 4m, current state peering, 
last acting [1,2] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 35/231 objects degraded (15.152%), 11 pgs degraded 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.5 is active+undersized+degraded, acting [3,0] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.7 is active+undersized+degraded, acting [3,2] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.b is active+undersized+degraded, acting [3,5] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.f is active+undersized+degraded, acting [0,5] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.13 is active+undersized+degraded, acting [0,2] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.14 is active+undersized+degraded, acting [3,5] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.18 is active+undersized+degraded, acting [5,3] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.19 is active+undersized+degraded, acting [0,2] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1a is active+undersized+degraded, acting [3,5] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1c is active+undersized+degraded, acting [5,2] 2026-03-10T09:02:46.876 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1f is active+undersized+degraded, acting [0,3] 2026-03-10T09:02:46.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.878+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3f040776b0 msgr2=0x7f3f04079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.878+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/2582482503 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3f040776b0 0x7f3f04079b70 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f3f000053e0 tx=0x7f3f00005fb0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.878+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 msgr2=0x7f3f1819d330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.878+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f1819d330 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f3f0800b700 tx=0x7f3f0800bac0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 shutdown_connections 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3f040776b0 0x7f3f04079b70 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3f18075740 0x7f3f1819cdf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 --2- 192.168.123.105:0/2582482503 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3f18076990 0x7f3f1819d330 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 >> 192.168.123.105:0/2582482503 conn(0x7f3f180fe6c0 msgr2=0x7f3f1810d380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 shutdown_connections 2026-03-10T09:02:46.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:02:46.879+0000 7f3f1d41c700 1 -- 192.168.123.105:0/2582482503 wait complete. 2026-03-10T09:02:47.302 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:02:46.824+0000 7fc5e48fd740 -1 osd.4 92 log_to_monitors true 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='client.34408 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='client.34412 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: pgmap v234: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 215 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 35/231 objects degraded (15.152%) 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='client.34416 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1261578186' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/121113280' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='osd.4 [v2:192.168.123.108:6808/3079572873,v1:192.168.123.108:6809/3079572873]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T09:02:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:46 vm08.local ceph-mon[101330]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T09:02:47.802 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:02:47.331+0000 7fc5dc697640 -1 osd.4 92 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T09:02:48.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:47 vm05.local ceph-mon[111630]: from='client.34428 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:47 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2582482503' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:02:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:47 vm05.local ceph-mon[111630]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T09:02:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:47 vm05.local ceph-mon[111630]: osdmap e95: 6 total, 5 up, 6 in 2026-03-10T09:02:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:47 vm05.local ceph-mon[111630]: from='osd.4 [v2:192.168.123.108:6808/3079572873,v1:192.168.123.108:6809/3079572873]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:48.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:47 vm05.local ceph-mon[111630]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:48.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-mon[101330]: from='client.34428 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:02:48.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/2582482503' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:02:48.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-mon[101330]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T09:02:48.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-mon[101330]: osdmap e95: 6 total, 5 up, 6 in 2026-03-10T09:02:48.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-mon[101330]: from='osd.4 [v2:192.168.123.108:6808/3079572873,v1:192.168.123.108:6809/3079572873]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:48.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:47 vm08.local ceph-mon[101330]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:02:49.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:48 vm05.local ceph-mon[111630]: pgmap v236: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 41/231 objects degraded (17.749%) 2026-03-10T09:02:49.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:48 vm05.local ceph-mon[111630]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:02:49.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:48 vm05.local ceph-mon[111630]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-10T09:02:49.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:48 vm05.local ceph-mon[111630]: osd.4 [v2:192.168.123.108:6808/3079572873,v1:192.168.123.108:6809/3079572873] boot 2026-03-10T09:02:49.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:48 
vm05.local ceph-mon[111630]: osdmap e96: 6 total, 6 up, 6 in 2026-03-10T09:02:49.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:48 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T09:02:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:48 vm08.local ceph-mon[101330]: pgmap v236: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 41/231 objects degraded (17.749%) 2026-03-10T09:02:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:48 vm08.local ceph-mon[101330]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:02:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:48 vm08.local ceph-mon[101330]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-10T09:02:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:48 vm08.local ceph-mon[101330]: osd.4 [v2:192.168.123.108:6808/3079572873,v1:192.168.123.108:6809/3079572873] boot 2026-03-10T09:02:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:48 vm08.local ceph-mon[101330]: osdmap e96: 6 total, 6 up, 6 in 2026-03-10T09:02:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:48 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T09:02:50.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:50 vm05.local ceph-mon[111630]: osdmap e97: 6 total, 6 up, 6 in 2026-03-10T09:02:50.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:50 vm05.local ceph-mon[111630]: pgmap v239: 65 pgs: 6 peering, 16 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 35/231 objects degraded (15.152%) 2026-03-10T09:02:50.713 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:50 vm05.local ceph-mon[111630]: Health check update: Degraded data redundancy: 41/231 objects degraded (17.749%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:50 vm08.local ceph-mon[101330]: osdmap e97: 6 total, 6 up, 6 in 2026-03-10T09:02:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:50 vm08.local ceph-mon[101330]: pgmap v239: 65 pgs: 6 peering, 16 active+undersized, 12 active+undersized+degraded, 31 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 35/231 objects degraded (15.152%) 2026-03-10T09:02:50.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:50 vm08.local ceph-mon[101330]: Health check update: Degraded data redundancy: 41/231 objects degraded (17.749%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T09:02:53.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:52 vm05.local ceph-mon[111630]: pgmap v240: 65 pgs: 6 peering, 13 active+undersized, 8 active+undersized+degraded, 38 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-10T09:02:53.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:52 vm08.local ceph-mon[101330]: pgmap v240: 65 pgs: 6 peering, 13 active+undersized, 8 active+undersized+degraded, 38 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-10T09:02:54.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:53 vm05.local ceph-mon[111630]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 29/231 objects degraded (12.554%), 8 pgs degraded) 2026-03-10T09:02:54.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:53 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:02:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:53 vm08.local ceph-mon[101330]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 
29/231 objects degraded (12.554%), 8 pgs degraded) 2026-03-10T09:02:54.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:53 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:02:55.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:54 vm05.local ceph-mon[111630]: pgmap v241: 65 pgs: 6 peering, 59 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:02:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:54 vm08.local ceph-mon[101330]: pgmap v241: 65 pgs: 6 peering, 59 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:02:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:56 vm08.local ceph-mon[101330]: pgmap v242: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:02:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:56 vm05.local ceph-mon[111630]: pgmap v242: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:02:59.254 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:58 vm08.local ceph-mon[101330]: pgmap v243: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:02:59.265 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:58 vm05.local ceph-mon[111630]: pgmap v243: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:00.121 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:02:59 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:03:00.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:02:59 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 
vm08.local ceph-mon[101330]: pgmap v244: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: Upgrade: osd.5 is safe to restart 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:00.980 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: pgmap v244: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:01.213 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: Upgrade: osd.5 is safe to restart 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:01.302 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:00 vm08.local systemd[1]: Stopping Ceph osd.5 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T09:03:01.303 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:01 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[75639]: 2026-03-10T09:03:01.045+0000 7f96f8226700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:03:01.303 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:01 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[75639]: 2026-03-10T09:03:01.045+0000 7f96f8226700 -1 osd.5 97 *** Got signal Terminated *** 2026-03-10T09:03:01.303 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:01 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[75639]: 2026-03-10T09:03:01.045+0000 7f96f8226700 -1 osd.5 97 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:03:02.233 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121568]: 2026-03-10 09:03:02.067589765 +0000 UTC m=+1.035176370 container died 21583bb58d82be2aa3f5c7052f1e11a637c7ed15911f5e8509432ed7551bd7b0 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.29.1) 2026-03-10T09:03:02.233 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121568]: 2026-03-10 09:03:02.088487419 +0000 UTC m=+1.056074034 container remove 21583bb58d82be2aa3f5c7052f1e11a637c7ed15911f5e8509432ed7551bd7b0 
(image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, ceph=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=-18.2.1, org.label-schema.build-date=20240222, org.label-schema.schema-version=1.0, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True) 2026-03-10T09:03:02.233 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local bash[121568]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5 2026-03-10T09:03:02.233 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:01 vm08.local ceph-mon[101330]: Upgrade: Updating osd.5 2026-03-10T09:03:02.233 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:01 vm08.local ceph-mon[101330]: Deploying daemon osd.5 on vm08 2026-03-10T09:03:02.233 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:01 vm08.local ceph-mon[101330]: osd.5 marked itself down and dead 2026-03-10T09:03:02.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:01 vm05.local ceph-mon[111630]: Upgrade: Updating osd.5 2026-03-10T09:03:02.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:01 vm05.local ceph-mon[111630]: Deploying daemon osd.5 on vm08 2026-03-10T09:03:02.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:01 vm05.local ceph-mon[111630]: osd.5 marked itself down and dead 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121634]: 2026-03-10 09:03:02.233518738 +0000 UTC m=+0.015983456 container create f6905bae1e1ce2f2c97f2cf9fc092661306d28d5ac524c1155bcad4caaec37d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121634]: 2026-03-10 09:03:02.274395425 +0000 UTC m=+0.056860134 container init f6905bae1e1ce2f2c97f2cf9fc092661306d28d5ac524c1155bcad4caaec37d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121634]: 2026-03-10 09:03:02.277600755 +0000 UTC m=+0.060065473 container start f6905bae1e1ce2f2c97f2cf9fc092661306d28d5ac524c1155bcad4caaec37d4 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121634]: 2026-03-10 09:03:02.284130887 +0000 UTC m=+0.066595605 container attach f6905bae1e1ce2f2c97f2cf9fc092661306d28d5ac524c1155bcad4caaec37d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121634]: 2026-03-10 09:03:02.227200423 +0000 UTC 
m=+0.009665141 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121652]: 2026-03-10 09:03:02.427357198 +0000 UTC m=+0.011975465 container died f6905bae1e1ce2f2c97f2cf9fc092661306d28d5ac524c1155bcad4caaec37d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121652]: 2026-03-10 09:03:02.443347998 +0000 UTC m=+0.027966255 container remove f6905bae1e1ce2f2c97f2cf9fc092661306d28d5ac524c1155bcad4caaec37d4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20260223, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0) 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.5.service: Deactivated successfully. 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local systemd[1]: Stopped Ceph osd.5 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T09:03:02.547 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local systemd[1]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.5.service: Consumed 44.752s CPU time. 2026-03-10T09:03:02.985 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:02 vm08.local ceph-mon[101330]: pgmap v245: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:02.985 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:02 vm08.local ceph-mon[101330]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:03:02.985 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:02 vm08.local ceph-mon[101330]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T09:03:02.985 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:02 vm08.local ceph-mon[101330]: osdmap e98: 6 total, 5 up, 6 in 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local systemd[1]: Starting Ceph osd.5 for 16587ed2-1c5e-11f1-90f6-35051361a039... 
2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121736]: 2026-03-10 09:03:02.751919388 +0000 UTC m=+0.018222918 container create 6208ca4c75969f764bfb1d3d57fd9aa67bd3bdad650f31f03f095109c8e27759 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121736]: 2026-03-10 09:03:02.796265329 +0000 UTC m=+0.062568869 container init 6208ca4c75969f764bfb1d3d57fd9aa67bd3bdad650f31f03f095109c8e27759 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260223, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121736]: 2026-03-10 09:03:02.799381191 +0000 UTC m=+0.065684720 container start 6208ca4c75969f764bfb1d3d57fd9aa67bd3bdad650f31f03f095109c8e27759 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223) 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121736]: 2026-03-10 09:03:02.806130193 +0000 UTC m=+0.072433722 container attach 6208ca4c75969f764bfb1d3d57fd9aa67bd3bdad650f31f03f095109c8e27759 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local podman[121736]: 2026-03-10 09:03:02.745570516 +0000 UTC m=+0.011874055 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local bash[121736]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:02.985 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:02 vm08.local bash[121736]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:03.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:02 vm05.local ceph-mon[111630]: pgmap v245: 65 pgs: 65 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:02 vm05.local ceph-mon[111630]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T09:03:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:02 vm05.local ceph-mon[111630]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T09:03:03.463 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:02 vm05.local ceph-mon[111630]: osdmap e98: 6 total, 5 up, 6 in 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev 
/dev/ceph-1551ba8f-6cdf-4dfe-99cc-2bfe75a24a09/osd-block-52d5763a-8095-46f7-9dd8-2d20a4d53ab7 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-1551ba8f-6cdf-4dfe-99cc-2bfe75a24a09/osd-block-52d5763a-8095-46f7-9dd8-2d20a4d53ab7 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/ln -snf /dev/ceph-1551ba8f-6cdf-4dfe-99cc-2bfe75a24a09/osd-block-52d5763a-8095-46f7-9dd8-2d20a4d53ab7 /var/lib/ceph/osd/ceph-5/block 2026-03-10T09:03:03.656 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/ln -snf /dev/ceph-1551ba8f-6cdf-4dfe-99cc-2bfe75a24a09/osd-block-52d5763a-8095-46f7-9dd8-2d20a4d53ab7 /var/lib/ceph/osd/ceph-5/block 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: Running 
command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate[121747]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[121736]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local podman[121736]: 2026-03-10 09:03:03.68748464 +0000 UTC m=+0.953788170 container died 6208ca4c75969f764bfb1d3d57fd9aa67bd3bdad650f31f03f095109c8e27759 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223) 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local podman[121736]: 2026-03-10 09:03:03.708985103 +0000 UTC m=+0.975288632 container remove 6208ca4c75969f764bfb1d3d57fd9aa67bd3bdad650f31f03f095109c8e27759 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-activate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local podman[122002]: 2026-03-10 09:03:03.806135465 +0000 UTC m=+0.016784617 container create 41f6c3ce6ac4c158c72321b1533f1de2882722b7f1f673d11508b17d9f40a12c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local podman[122002]: 2026-03-10 09:03:03.849598603 +0000 UTC m=+0.060247755 container init 41f6c3ce6ac4c158c72321b1533f1de2882722b7f1f673d11508b17d9f40a12c 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS) 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local podman[122002]: 2026-03-10 09:03:03.852726177 +0000 UTC m=+0.063375329 container start 41f6c3ce6ac4c158c72321b1533f1de2882722b7f1f673d11508b17d9f40a12c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS) 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local bash[122002]: 41f6c3ce6ac4c158c72321b1533f1de2882722b7f1f673d11508b17d9f40a12c 
2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local podman[122002]: 2026-03-10 09:03:03.799661398 +0000 UTC m=+0.010310560 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:03:03.996 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:03 vm08.local systemd[1]: Started Ceph osd.5 for 16587ed2-1c5e-11f1-90f6-35051361a039. 2026-03-10T09:03:04.268 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-mon[101330]: osdmap e99: 6 total, 5 up, 6 in 2026-03-10T09:03:04.268 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:04.268 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:04.268 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:03 vm05.local ceph-mon[111630]: osdmap e99: 6 total, 5 up, 6 in 2026-03-10T09:03:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:04.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T09:03:04.543 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:04 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:03:04.431+0000 7f5438c2f740 -1 Falling back to public interface 2026-03-10T09:03:05.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:04 vm08.local ceph-mon[101330]: pgmap v248: 65 pgs: 12 peering, 6 stale+active+clean, 47 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:05.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:04 vm08.local ceph-mon[101330]: Health check failed: Reduced data availability: 7 pgs peering (PG_AVAILABILITY) 2026-03-10T09:03:05.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:05.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:04 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:05.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:04 vm05.local ceph-mon[111630]: pgmap v248: 65 pgs: 12 peering, 6 stale+active+clean, 47 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:05.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:04 vm05.local ceph-mon[111630]: Health check failed: Reduced data availability: 7 pgs peering (PG_AVAILABILITY) 2026-03-10T09:03:05.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:05.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:04 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T09:03:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:06 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:06.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:06 vm05.local ceph-mon[111630]: pgmap v249: 65 pgs: 3 active+undersized, 12 peering, 4 stale+active+clean, 46 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:06 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:06 vm08.local ceph-mon[101330]: pgmap v249: 65 pgs: 3 active+undersized, 12 peering, 4 stale+active+clean, 46 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 
vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all osd 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T09:03:07.892 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T09:03:07.892 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:07 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth 
get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all osd 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T09:03:08.053 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T09:03:08.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:07 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T09:03:08.553 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:03:08.247+0000 7f5438c2f740 -1 osd.5 0 read_superblock omap replica is missing. 
2026-03-10T09:03:08.553 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:03:08.481+0000 7f5438c2f740 -1 osd.5 97 log_to_monitors true 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: osdmap e100: 6 total, 5 up, 6 in 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm05.bxdvbu"]}]: dispatch 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: Upgrade: It appears safe to stop mds.cephfs.vm05.bxdvbu 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: pgmap v251: 65 pgs: 13 active+undersized, 5 peering, 10 active+undersized+degraded, 37 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 33/231 objects degraded (14.286%) 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: Upgrade: Updating mds.cephfs.vm05.bxdvbu 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 
vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: Deploying daemon mds.cephfs.vm05.bxdvbu on vm05 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: from='osd.5 [v2:192.168.123.108:6816/3833060682,v1:192.168.123.108:6817/3833060682]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-mon[111630]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T09:03:08.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:08 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[111626]: 2026-03-10T09:03:08.618+0000 7ff673aa8640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:09.052 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:03:08.663+0000 7f54309c9640 -1 osd.5 97 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but 
require_osd_release < squid) 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: osdmap e100: 6 total, 5 up, 6 in 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm05.bxdvbu"]}]: dispatch 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: Upgrade: It appears safe to stop mds.cephfs.vm05.bxdvbu 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: pgmap v251: 65 pgs: 13 active+undersized, 5 peering, 10 active+undersized+degraded, 37 active+clean; 215 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 33/231 objects degraded (14.286%) 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: Upgrade: Updating mds.cephfs.vm05.bxdvbu 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.bxdvbu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: Deploying daemon mds.cephfs.vm05.bxdvbu on vm05 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='osd.5 [v2:192.168.123.108:6816/3833060682,v1:192.168.123.108:6817/3833060682]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T09:03:09.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:08 vm08.local ceph-mon[101330]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T09:03:09.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: osdmap e101: 6 total, 5 up, 6 in 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: Standby daemon mds.cephfs.vm05.slhztf assigned to filesystem cephfs as rank 0 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: fsmap cephfs:0/1 3 up:standby, 1 failed 
2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: from='osd.5 [v2:192.168.123.108:6816/3833060682,v1:192.168.123.108:6817/3833060682]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm05.slhztf=up:replay} 2 up:standby 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:03:09.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:09 vm05.local ceph-mon[111630]: Health check failed: Degraded data redundancy: 33/231 objects degraded (14.286%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: osdmap e101: 6 total, 5 up, 6 in 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: Standby daemon mds.cephfs.vm05.slhztf assigned to filesystem cephfs as rank 0 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local 
ceph-mon[101330]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: from='osd.5 [v2:192.168.123.108:6816/3833060682,v1:192.168.123.108:6817/3833060682]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm05.slhztf=up:replay} 2 up:standby 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T09:03:10.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:09 vm08.local ceph-mon[101330]: Health check failed: Degraded data redundancy: 33/231 objects degraded (14.286%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T09:03:10.902 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:10 vm05.local ceph-mon[111630]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:03:10.902 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:10 vm05.local ceph-mon[111630]: osd.5 [v2:192.168.123.108:6816/3833060682,v1:192.168.123.108:6817/3833060682] boot 2026-03-10T09:03:10.902 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:10 vm05.local ceph-mon[111630]: osdmap e102: 6 total, 6 up, 6 in 2026-03-10T09:03:10.902 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:10 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T09:03:10.902 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:10 vm05.local ceph-mon[111630]: pgmap v254: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 215 MiB data, 915 MiB used, 119 GiB / 120 GiB avail; 683 KiB/s rd, 0 op/s; 36/231 objects degraded (15.584%) 2026-03-10T09:03:10.902 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:10 vm05.local ceph-mon[111630]: Health check update: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T09:03:11.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:10 vm08.local ceph-mon[101330]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T09:03:11.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:10 vm08.local ceph-mon[101330]: osd.5 [v2:192.168.123.108:6816/3833060682,v1:192.168.123.108:6817/3833060682] boot 2026-03-10T09:03:11.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:10 vm08.local ceph-mon[101330]: osdmap e102: 6 total, 6 up, 6 in 2026-03-10T09:03:11.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:10 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T09:03:11.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:10 vm08.local ceph-mon[101330]: pgmap v254: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 215 MiB data, 915 MiB used, 119 GiB / 120 GiB avail; 683 KiB/s rd, 0 op/s; 36/231 objects degraded (15.584%) 2026-03-10T09:03:11.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:10 vm08.local ceph-mon[101330]: Health check update: Reduced data availability: 3 pgs peering (PG_AVAILABILITY) 2026-03-10T09:03:11.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:11 vm05.local ceph-mon[111630]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T09:03:11.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:11 
vm05.local ceph-mon[111630]: osdmap e103: 6 total, 6 up, 6 in 2026-03-10T09:03:12.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:11 vm08.local ceph-mon[101330]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T09:03:12.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:11 vm08.local ceph-mon[101330]: osdmap e103: 6 total, 6 up, 6 in 2026-03-10T09:03:13.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:12 vm05.local ceph-mon[111630]: pgmap v256: 65 pgs: 14 active+undersized, 12 active+undersized+degraded, 39 active+clean; 215 MiB data, 915 MiB used, 119 GiB / 120 GiB avail; 7.7 MiB/s rd, 3 op/s; 36/231 objects degraded (15.584%) 2026-03-10T09:03:13.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T09:03:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:12 vm08.local ceph-mon[101330]: pgmap v256: 65 pgs: 14 active+undersized, 12 active+undersized+degraded, 39 active+clean; 215 MiB data, 915 MiB used, 119 GiB / 120 GiB avail; 7.7 MiB/s rd, 3 op/s; 36/231 objects degraded (15.584%) 2026-03-10T09:03:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T09:03:15.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:14 vm08.local ceph-mon[101330]: pgmap v257: 65 pgs: 1 active+undersized, 64 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 6 op/s 2026-03-10T09:03:15.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:14 vm08.local ceph-mon[101330]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 36/231 objects degraded (15.584%), 12 pgs 
degraded) 2026-03-10T09:03:15.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:14 vm05.local ceph-mon[111630]: pgmap v257: 65 pgs: 1 active+undersized, 64 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 6 op/s 2026-03-10T09:03:15.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:14 vm05.local ceph-mon[111630]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 36/231 objects degraded (15.584%), 12 pgs degraded) 2026-03-10T09:03:16.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:15 vm08.local ceph-mon[101330]: reconnect by client.24325 192.168.123.105:0/1319997093 after 0 2026-03-10T09:03:16.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:15 vm08.local ceph-mon[101330]: reconnect by client.24333 192.168.144.1:0/2430695904 after 0.001 2026-03-10T09:03:16.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:15 vm08.local ceph-mon[101330]: mds.? [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:reconnect 2026-03-10T09:03:16.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:15 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm05.slhztf=up:reconnect} 2 up:standby 2026-03-10T09:03:16.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:15 vm05.local ceph-mon[111630]: reconnect by client.24325 192.168.123.105:0/1319997093 after 0 2026-03-10T09:03:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:15 vm05.local ceph-mon[111630]: reconnect by client.24333 192.168.144.1:0/2430695904 after 0.001 2026-03-10T09:03:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:15 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:reconnect 2026-03-10T09:03:16.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:15 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm05.slhztf=up:reconnect} 2 up:standby 2026-03-10T09:03:16.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:16 vm05.local ceph-mon[111630]: mds.? [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:rejoin 2026-03-10T09:03:16.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:16 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm05.slhztf=up:rejoin} 2 up:standby 2026-03-10T09:03:16.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:16 vm05.local ceph-mon[111630]: daemon mds.cephfs.vm05.slhztf is now active in filesystem cephfs as rank 0 2026-03-10T09:03:16.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:16 vm05.local ceph-mon[111630]: pgmap v258: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5 op/s 2026-03-10T09:03:16.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:16.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.976+0000 7f6d5c24b700 1 -- 192.168.123.105:0/643354224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d5410a700 msgr2=0x7f6d5410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.976+0000 7f6d5c24b700 1 --2- 192.168.123.105:0/643354224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d5410a700 
0x7f6d5410cb90 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6d4c00b3a0 tx=0x7f6d4c00b6b0 comp rx=0 tx=0).stop 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.978+0000 7f6d5c24b700 1 -- 192.168.123.105:0/643354224 shutdown_connections 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.978+0000 7f6d5c24b700 1 --2- 192.168.123.105:0/643354224 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d5410a700 0x7f6d5410cb90 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.978+0000 7f6d5c24b700 1 --2- 192.168.123.105:0/643354224 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d54107d90 0x7f6d5410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.978+0000 7f6d5c24b700 1 -- 192.168.123.105:0/643354224 >> 192.168.123.105:0/643354224 conn(0x7f6d5406dae0 msgr2=0x7f6d5406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.978+0000 7f6d5c24b700 1 -- 192.168.123.105:0/643354224 shutdown_connections 2026-03-10T09:03:16.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.978+0000 7f6d5c24b700 1 -- 192.168.123.105:0/643354224 wait complete. 
2026-03-10T09:03:16.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.979+0000 7f6d5c24b700 1 Processor -- start 2026-03-10T09:03:16.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.979+0000 7f6d5c24b700 1 -- start start 2026-03-10T09:03:16.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.979+0000 7f6d5c24b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d54107d90 0x7f6d541a53a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:16.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.979+0000 7f6d5c24b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 0x7f6d541a58e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:16.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.979+0000 7f6d5c24b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d541a5f00 con 0x7f6d5410a700 2026-03-10T09:03:16.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.979+0000 7f6d5c24b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d541a6040 con 0x7f6d54107d90 2026-03-10T09:03:16.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d59fe7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d54107d90 0x7f6d541a53a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:16.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d597e6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 0x7f6d541a58e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T09:03:16.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d597e6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 0x7f6d541a58e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44162/0 (socket says 192.168.123.105:44162) 2026-03-10T09:03:16.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d597e6700 1 -- 192.168.123.105:0/3009325691 learned_addr learned my addr 192.168.123.105:0/3009325691 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:16.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d597e6700 1 -- 192.168.123.105:0/3009325691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d54107d90 msgr2=0x7f6d541a53a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:16.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d597e6700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d54107d90 0x7f6d541a53a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:16.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.980+0000 7f6d597e6700 1 -- 192.168.123.105:0/3009325691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d50009710 con 0x7f6d5410a700 2026-03-10T09:03:16.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.981+0000 7f6d597e6700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 0x7f6d541a58e0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f6d4c0062a0 tx=0x7f6d4c008f50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:16.981 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.981+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d4c00e050 con 0x7f6d5410a700 2026-03-10T09:03:16.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.981+0000 7f6d5c24b700 1 -- 192.168.123.105:0/3009325691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d4c00b050 con 0x7f6d5410a700 2026-03-10T09:03:16.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.982+0000 7f6d5c24b700 1 -- 192.168.123.105:0/3009325691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d541aae50 con 0x7f6d5410a700 2026-03-10T09:03:16.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.983+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6d4c003cf0 con 0x7f6d5410a700 2026-03-10T09:03:16.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.984+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d4c01da90 con 0x7f6d5410a700 2026-03-10T09:03:16.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.984+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6d4c019040 con 0x7f6d5410a700 2026-03-10T09:03:16.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.985+0000 7f6d4affd700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6d400776c0 0x7f6d40079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:16.985 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.985+0000 7f6d59fe7700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6d400776c0 0x7f6d40079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:16.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.986+0000 7f6d5c24b700 1 -- 192.168.123.105:0/3009325691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d38005320 con 0x7f6d5410a700 2026-03-10T09:03:16.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.986+0000 7f6d59fe7700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6d400776c0 0x7f6d40079b80 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6d50011440 tx=0x7f6d50009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:16.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.986+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f6d4c0684d0 con 0x7f6d5410a700 2026-03-10T09:03:16.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:16.989+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6d4c017080 con 0x7f6d5410a700 2026-03-10T09:03:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:16 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:rejoin 2026-03-10T09:03:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:16 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm05.slhztf=up:rejoin} 2 up:standby 2026-03-10T09:03:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:16 vm08.local ceph-mon[101330]: daemon mds.cephfs.vm05.slhztf is now active in filesystem cephfs as rank 0 2026-03-10T09:03:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:16 vm08.local ceph-mon[101330]: pgmap v258: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5 op/s 2026-03-10T09:03:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:17.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.186+0000 7f6d5c24b700 1 -- 192.168.123.105:0/3009325691 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6d38000bf0 con 0x7f6d400776c0 2026-03-10T09:03:17.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.192+0000 7f6d4affd700 1 -- 192.168.123.105:0/3009325691 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f6d38000bf0 con 0x7f6d400776c0 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 -- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6d400776c0 msgr2=0x7f6d40079b80 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6d400776c0 0x7f6d40079b80 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6d50011440 tx=0x7f6d50009450 comp rx=0 tx=0).stop 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 -- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 msgr2=0x7f6d541a58e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 0x7f6d541a58e0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f6d4c0062a0 tx=0x7f6d4c008f50 comp rx=0 tx=0).stop 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 -- 192.168.123.105:0/3009325691 shutdown_connections 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6d400776c0 0x7f6d40079b80 secure :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6d50011440 tx=0x7f6d50009450 comp rx=0 tx=0).stop 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 --2- 192.168.123.105:0/3009325691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d54107d90 0x7f6d541a53a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 --2- 
192.168.123.105:0/3009325691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d5410a700 0x7f6d541a58e0 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.197 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 -- 192.168.123.105:0/3009325691 >> 192.168.123.105:0/3009325691 conn(0x7f6d5406dae0 msgr2=0x7f6d5406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:17.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.197+0000 7f6d48ff9700 1 -- 192.168.123.105:0/3009325691 shutdown_connections 2026-03-10T09:03:17.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.198+0000 7f6d48ff9700 1 -- 192.168.123.105:0/3009325691 wait complete. 2026-03-10T09:03:17.208 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:03:17.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.281+0000 7f615929c700 1 -- 192.168.123.105:0/2345440981 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c095de0 msgr2=0x7f614c096200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.282+0000 7f615929c700 1 --2- 192.168.123.105:0/2345440981 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c095de0 0x7f614c096200 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f6148009b10 tx=0x7f6148009e20 comp rx=0 tx=0).stop 2026-03-10T09:03:17.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.282+0000 7f615929c700 1 -- 192.168.123.105:0/2345440981 shutdown_connections 2026-03-10T09:03:17.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.282+0000 7f615929c700 1 --2- 192.168.123.105:0/2345440981 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c096fe0 0x7f614c097440 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.282 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.282+0000 7f615929c700 1 --2- 192.168.123.105:0/2345440981 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c095de0 0x7f614c096200 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.282+0000 7f615929c700 1 -- 192.168.123.105:0/2345440981 >> 192.168.123.105:0/2345440981 conn(0x7f614c091360 msgr2=0x7f614c0937c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:17.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.283+0000 7f615929c700 1 -- 192.168.123.105:0/2345440981 shutdown_connections 2026-03-10T09:03:17.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.283+0000 7f615929c700 1 -- 192.168.123.105:0/2345440981 wait complete. 2026-03-10T09:03:17.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.283+0000 7f615929c700 1 Processor -- start 2026-03-10T09:03:17.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f615929c700 1 -- start start 2026-03-10T09:03:17.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f615929c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 0x7f614c12b610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f6153fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 0x7f614c12b610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f6153fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 0x7f614c12b610 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44208/0 (socket says 192.168.123.105:44208) 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f6153fff700 1 -- 192.168.123.105:0/3970058507 learned_addr learned my addr 192.168.123.105:0/3970058507 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f615929c700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c096fe0 0x7f614c12bb50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f614c12c170 con 0x7f614c095de0 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.284+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f614c12c2b0 con 0x7f614c096fe0 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.285+0000 7f61537fe700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c096fe0 0x7f614c12bb50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.285+0000 7f6153fff700 1 -- 192.168.123.105:0/3970058507 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c096fe0 msgr2=0x7f614c12bb50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.285 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.285+0000 7f6153fff700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c096fe0 0x7f614c12bb50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.285+0000 7f6153fff700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6148009770 con 0x7f614c095de0 2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.285+0000 7f6153fff700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 0x7f614c12b610 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f6148005210 tx=0x7f614800bee0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.285+0000 7f61537fe700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c096fe0 0x7f614c12bb50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.286+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f614801d070 con 0x7f614c095de0 2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.286+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6148005790 con 0x7f614c095de0 2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.286+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6148017ab0 con 0x7f614c095de0 2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.286+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f614c006af0 con 0x7f614c095de0 2026-03-10T09:03:17.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.286+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f614c006fe0 con 0x7f614c095de0 2026-03-10T09:03:17.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.287+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6148017c70 con 0x7f614c095de0 2026-03-10T09:03:17.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.288+0000 7f613effd700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f614c006120 con 0x7f614c095de0 2026-03-10T09:03:17.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.288+0000 
7f61517fa700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f614407bd20 0x7f614407e1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.288+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f614809aff0 con 0x7f614c095de0 2026-03-10T09:03:17.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.288+0000 7f61537fe700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f614407bd20 0x7f614407e1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.294+0000 7f61537fe700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f614407bd20 0x7f614407e1e0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f6140005d90 tx=0x7f6140005d20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:17.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.296+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6148063850 con 0x7f614c095de0 2026-03-10T09:03:17.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.432+0000 7f613effd700 1 -- 192.168.123.105:0/3970058507 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f614c003680 con 0x7f614407bd20 
2026-03-10T09:03:17.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.434+0000 7f61517fa700 1 -- 192.168.123.105:0/3970058507 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f614c003680 con 0x7f614407bd20 2026-03-10T09:03:17.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.441+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f614407bd20 msgr2=0x7f614407e1e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.441+0000 7f615929c700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f614407bd20 0x7f614407e1e0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f6140005d90 tx=0x7f6140005d20 comp rx=0 tx=0).stop 2026-03-10T09:03:17.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.441+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 msgr2=0x7f614c12b610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.441+0000 7f615929c700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 0x7f614c12b610 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f6148005210 tx=0x7f614800bee0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.444+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 shutdown_connections 2026-03-10T09:03:17.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.444+0000 7f615929c700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7f614407bd20 0x7f614407e1e0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.444+0000 7f615929c700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f614c095de0 0x7f614c12b610 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.444+0000 7f615929c700 1 --2- 192.168.123.105:0/3970058507 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f614c096fe0 0x7f614c12bb50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.444+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 >> 192.168.123.105:0/3970058507 conn(0x7f614c091360 msgr2=0x7f614c09a210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:17.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.446+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 shutdown_connections 2026-03-10T09:03:17.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.447+0000 7f615929c700 1 -- 192.168.123.105:0/3970058507 wait complete. 
2026-03-10T09:03:17.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 -- 192.168.123.105:0/3293062124 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc98107d90 msgr2=0x7fbc9810a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 --2- 192.168.123.105:0/3293062124 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc98107d90 0x7fbc9810a1c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fbc8c009a60 tx=0x7fbc8c009d70 comp rx=0 tx=0).stop 2026-03-10T09:03:17.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 -- 192.168.123.105:0/3293062124 shutdown_connections 2026-03-10T09:03:17.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 --2- 192.168.123.105:0/3293062124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc9810a700 0x7fbc981114d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 --2- 192.168.123.105:0/3293062124 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc98107d90 0x7fbc9810a1c0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 -- 192.168.123.105:0/3293062124 >> 192.168.123.105:0/3293062124 conn(0x7fbc9806dda0 msgr2=0x7fbc98070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 -- 192.168.123.105:0/3293062124 shutdown_connections 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 -- 192.168.123.105:0/3293062124 
wait complete. 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.546+0000 7fbca0937700 1 Processor -- start 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbca0937700 1 -- start start 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbca0937700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc98107d90 0x7fbc981a10e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbca0937700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 0x7fbc981a1620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbca0937700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc981a1c40 con 0x7fbc98107d90 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbca0937700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc981a1d80 con 0x7fbc9810a700 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 0x7fbc981a1620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 0x7fbc981a1620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:38644/0 (socket says 192.168.123.105:38644) 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 -- 192.168.123.105:0/1936218336 learned_addr learned my addr 192.168.123.105:0/1936218336 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:17.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 -- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc98107d90 msgr2=0x7fbc981a10e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc98107d90 0x7fbc981a10e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 -- 192.168.123.105:0/1936218336 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc8c009710 con 0x7fbc9810a700 2026-03-10T09:03:17.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.547+0000 7fbc9ded2700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 0x7fbc981a1620 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fbc8400ea30 tx=0x7fbc8400edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:17.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.549+0000 7fbc937fe700 1 -- 192.168.123.105:0/1936218336 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc8400cc40 con 0x7fbc9810a700 2026-03-10T09:03:17.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.549+0000 
7fbc937fe700 1 -- 192.168.123.105:0/1936218336 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbc8400cda0 con 0x7fbc9810a700 2026-03-10T09:03:17.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.549+0000 7fbc937fe700 1 -- 192.168.123.105:0/1936218336 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc84010430 con 0x7fbc9810a700 2026-03-10T09:03:17.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.556+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbc981a6830 con 0x7fbc9810a700 2026-03-10T09:03:17.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.556+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbc981a6d50 con 0x7fbc9810a700 2026-03-10T09:03:17.557 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.556+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbc9819b470 con 0x7fbc9810a700 2026-03-10T09:03:17.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.558+0000 7fbc937fe700 1 -- 192.168.123.105:0/1936218336 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbc84004750 con 0x7fbc9810a700 2026-03-10T09:03:17.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.558+0000 7fbc937fe700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc880779e0 0x7fbc88079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.559+0000 7fbc937fe700 1 -- 
192.168.123.105:0/1936218336 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fbc84026080 con 0x7fbc9810a700 2026-03-10T09:03:17.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.559+0000 7fbc9e6d3700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc880779e0 0x7fbc88079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.559+0000 7fbc9e6d3700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc880779e0 0x7fbc88079ea0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fbc8c00b5c0 tx=0x7fbc8c009650 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:17.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.562+0000 7fbc937fe700 1 -- 192.168.123.105:0/1936218336 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbc84062ca0 con 0x7fbc9810a700 2026-03-10T09:03:17.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.698+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fbc980611d0 con 0x7fbc880779e0 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (11m) 78s ago 12m 24.2M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:03:17.706 
INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (12m) 78s ago 12m 9638k - 18.2.1 5be31c24972a 4f6a4fa3151a 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (11m) 12s ago 11m 11.8M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (6m) 78s ago 12m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (6m) 12s ago 11m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (11m) 78s ago 12m 89.2M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 starting - - - - 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (10m) 78s ago 10m 19.5M - 18.2.1 5be31c24972a 550cdd24738b 2026-03-10T09:03:17.706 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (10m) 12s ago 10m 22.2M - 18.2.1 5be31c24972a b9e55b365719 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (10m) 12s ago 10m 19.0M - 18.2.1 5be31c24972a d58d1e2e6ff3 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (7m) 78s ago 13m 621M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (6m) 12s ago 11m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (6m) 78s ago 13m 63.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (6m) 12s ago 11m 
56.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (12m) 78s ago 12m 14.9M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (11m) 12s ago 11m 16.0M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (5m) 78s ago 11m 214M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (102s) 78s ago 11m 101M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (80s) 78s ago 11m 12.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (57s) 12s ago 11m 163M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b025f9a6ca2a 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (35s) 12s ago 10m 119M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 76fe84edd716 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (13s) 12s ago 10m 12.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 41f6c3ce6ac4 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (6m) 78s ago 12m 65.7M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:03:17.707 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.705+0000 7fbc937fe700 1 -- 192.168.123.105:0/1936218336 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fbc980611d0 con 0x7fbc880779e0 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc880779e0 msgr2=0x7fbc88079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc880779e0 0x7fbc88079ea0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fbc8c00b5c0 tx=0x7fbc8c009650 comp rx=0 tx=0).stop 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 msgr2=0x7fbc981a1620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 0x7fbc981a1620 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fbc8400ea30 tx=0x7fbc8400edf0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 shutdown_connections 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc880779e0 0x7fbc88079ea0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc98107d90 0x7fbc981a10e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 --2- 192.168.123.105:0/1936218336 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc9810a700 0x7fbc981a1620 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 >> 192.168.123.105:0/1936218336 conn(0x7fbc9806dda0 msgr2=0x7fbc981109b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 shutdown_connections 2026-03-10T09:03:17.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.708+0000 7fbca0937700 1 -- 192.168.123.105:0/1936218336 wait complete. 2026-03-10T09:03:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 -- 192.168.123.105:0/674243693 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f523410a700 msgr2=0x7f523410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 --2- 192.168.123.105:0/674243693 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f523410a700 0x7f523410cb90 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f522c00b3a0 tx=0x7f522c00b6b0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 -- 192.168.123.105:0/674243693 shutdown_connections 2026-03-10T09:03:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 --2- 192.168.123.105:0/674243693 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f523410a700 0x7f523410cb90 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T09:03:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 --2- 192.168.123.105:0/674243693 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f523410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 -- 192.168.123.105:0/674243693 >> 192.168.123.105:0/674243693 conn(0x7f523406daa0 msgr2=0x7f523406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:17.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.815+0000 7f523958a700 1 -- 192.168.123.105:0/674243693 shutdown_connections 2026-03-10T09:03:17.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 -- 192.168.123.105:0/674243693 wait complete. 2026-03-10T09:03:17.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 Processor -- start 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 -- start start 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f5234116ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5234117020 0x7f52341b3180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5234117530 con 0x7f5234117020 2026-03-10T09:03:17.817 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.816+0000 7f523958a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52341176a0 con 0x7f5234107d90 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.817+0000 7f5233fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f5234116ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.817+0000 7f5233fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f5234116ae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:38668/0 (socket says 192.168.123.105:38668) 2026-03-10T09:03:17.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.817+0000 7f5233fff700 1 -- 192.168.123.105:0/186371881 learned_addr learned my addr 192.168.123.105:0/186371881 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.817+0000 7f52337fe700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5234117020 0x7f52341b3180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.818+0000 7f5233fff700 1 -- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5234117020 msgr2=0x7f52341b3180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.818+0000 7f5233fff700 1 --2- 
192.168.123.105:0/186371881 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5234117020 0x7f52341b3180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.818+0000 7f5233fff700 1 -- 192.168.123.105:0/186371881 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f522c00b050 con 0x7f5234107d90 2026-03-10T09:03:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.818+0000 7f5233fff700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f5234116ae0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f522400d8d0 tx=0x7f522400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:17.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.819+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5224009940 con 0x7f5234107d90 2026-03-10T09:03:17.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.819+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52341b3720 con 0x7f5234107d90 2026-03-10T09:03:17.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.819+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52341b3c70 con 0x7f5234107d90 2026-03-10T09:03:17.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.819+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5224010460 con 0x7f5234107d90 2026-03-10T09:03:17.820 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.819+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f522400f5d0 con 0x7f5234107d90 2026-03-10T09:03:17.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.820+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f522400f790 con 0x7f5234107d90 2026-03-10T09:03:17.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.821+0000 7f52317fa700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f521c0779e0 0x7f521c079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:17.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.821+0000 7f52337fe700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f521c0779e0 0x7f521c079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:17.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.821+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f5224099900 con 0x7f5234107d90 2026-03-10T09:03:17.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.822+0000 7f52337fe700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f521c0779e0 0x7f521c079ea0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f522c00bb30 tx=0x7f522c0061f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:17.822 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.822+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5220005320 con 0x7f5234107d90 2026-03-10T09:03:17.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:17.825+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5224063190 con 0x7f5234107d90 2026-03-10T09:03:17.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:active 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm05.slhztf=up:active} 2 up:standby 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:17.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:17 vm05.local ceph-mon[111630]: from='client.34440 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:18.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.017+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5220006200 con 0x7f5234107d90 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 
2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3, 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3, 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 11 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:03:18.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.021+0000 7f52317fa700 1 -- 192.168.123.105:0/186371881 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+815 (secure 0 0 0) 0x7f5224020340 con 0x7f5234107d90 2026-03-10T09:03:18.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f521c0779e0 msgr2=0x7f521c079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T09:03:18.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f521c0779e0 0x7f521c079ea0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f522c00bb30 tx=0x7f522c0061f0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 msgr2=0x7f5234116ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f5234116ae0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f522400d8d0 tx=0x7f522400dc90 comp rx=0 tx=0).stop 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 shutdown_connections 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f521c0779e0 0x7f521c079ea0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 --2- 192.168.123.105:0/186371881 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5234107d90 0x7f5234116ae0 secure :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f522400d8d0 tx=0x7f522400dc90 comp rx=0 tx=0).stop 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 --2- 192.168.123.105:0/186371881 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5234117020 0x7f52341b3180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.025+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 >> 192.168.123.105:0/186371881 conn(0x7f523406daa0 msgr2=0x7f523406e780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.026+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 shutdown_connections 2026-03-10T09:03:18.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.026+0000 7f523958a700 1 -- 192.168.123.105:0/186371881 wait complete. 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] up:active 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm05.slhztf=up:active} 2 up:standby 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:18.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:17 vm08.local ceph-mon[101330]: from='client.34440 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:18.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 7f3352601700 1 -- 192.168.123.105:0/3424209517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 msgr2=0x7f334c10a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 7f3352601700 1 --2- 192.168.123.105:0/3424209517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c10a1c0 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f3348009b00 tx=0x7f3348009e10 comp rx=0 tx=0).stop 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 7f3352601700 1 -- 192.168.123.105:0/3424209517 shutdown_connections 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 
7f3352601700 1 --2- 192.168.123.105:0/3424209517 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c10cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 7f3352601700 1 --2- 192.168.123.105:0/3424209517 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c10a1c0 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 7f3352601700 1 -- 192.168.123.105:0/3424209517 >> 192.168.123.105:0/3424209517 conn(0x7f334c06daa0 msgr2=0x7f334c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.107+0000 7f3352601700 1 -- 192.168.123.105:0/3424209517 shutdown_connections 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 -- 192.168.123.105:0/3424209517 wait complete. 
2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 Processor -- start 2026-03-10T09:03:18.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 -- start start 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c1a5240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c1a5780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f334c1a5da0 con 0x7f334c107d90 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.108+0000 7f3352601700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f334c1a5ee0 con 0x7f334c10a700 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.109+0000 7f33515ff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c1a5240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.109+0000 7f3350dfe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c1a5780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.109+0000 7f3350dfe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c1a5780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:38682/0 (socket says 192.168.123.105:38682) 2026-03-10T09:03:18.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.109+0000 7f3350dfe700 1 -- 192.168.123.105:0/2291036174 learned_addr learned my addr 192.168.123.105:0/2291036174 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:18.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.109+0000 7f3350dfe700 1 -- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 msgr2=0x7f334c1a5240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.109+0000 7f3350dfe700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c1a5240 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.110+0000 7f3350dfe700 1 -- 192.168.123.105:0/2291036174 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33480097e0 con 0x7f334c10a700 2026-03-10T09:03:18.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.110+0000 7f33515ff700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c1a5240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T09:03:18.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.110+0000 7f3350dfe700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c1a5780 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f333c00d8d0 tx=0x7f333c00dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:18.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.110+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f333c009880 con 0x7f334c10a700 2026-03-10T09:03:18.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.110+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f334c1aa990 con 0x7f334c10a700 2026-03-10T09:03:18.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.111+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f334c1aaee0 con 0x7f334c10a700 2026-03-10T09:03:18.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.111+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f333c010460 con 0x7f334c10a700 2026-03-10T09:03:18.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.111+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f333c00f5d0 con 0x7f334c10a700 2026-03-10T09:03:18.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.112+0000 7f3337fff700 1 -- 192.168.123.105:0/2291036174 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f3330005320 con 0x7f334c10a700 2026-03-10T09:03:18.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.112+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f333c0099e0 con 0x7f334c10a700 2026-03-10T09:03:18.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.113+0000 7f33427fc700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33380778c0 0x7f3338079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.113+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f333c099500 con 0x7f334c10a700 2026-03-10T09:03:18.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.113+0000 7f33515ff700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33380778c0 0x7f3338079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.113+0000 7f33515ff700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33380778c0 0x7f3338079d80 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f334800b5c0 tx=0x7f3348005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:18.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.116+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f333c061c80 con 0x7f334c10a700 2026-03-10T09:03:18.297 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.295+0000 7f3337fff700 1 -- 192.168.123.105:0/2291036174 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3330006200 con 0x7f334c10a700 2026-03-10T09:03:18.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.298+0000 7f33427fc700 1 -- 192.168.123.105:0/2291036174 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 18 v18) v1 ==== 76+0+1931 (secure 0 0 0) 0x7f333c0613d0 con 0x7f334c10a700 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:e18 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T09:03:17:783297+0000 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:epoch 17 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:03:18.301 
INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T09:03:16.775887+0000 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:03:18.301 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 101 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:up {0=14488} 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:03:18.302 
INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 14488 members: 14488 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{0:14488} state up:active seq 158 join_fscid=1 addr [v2:192.168.123.105:6828/2662194502,v1:192.168.123.105:6829/2662194502] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:24307} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:24317} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3236998387,v1:192.168.123.108:6825/3236998387] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{-1:34444} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33380778c0 msgr2=0x7f3338079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 --2- 192.168.123.105:0/2291036174 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33380778c0 0x7f3338079d80 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f334800b5c0 tx=0x7f3348005c00 comp rx=0 tx=0).stop 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 msgr2=0x7f334c1a5780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c1a5780 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f333c00d8d0 tx=0x7f333c00dbe0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 shutdown_connections 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f33380778c0 0x7f3338079d80 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f334c107d90 0x7f334c1a5240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 --2- 192.168.123.105:0/2291036174 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f334c10a700 0x7f334c1a5780 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:03:18.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 >> 192.168.123.105:0/2291036174 conn(0x7f334c06daa0 msgr2=0x7f334c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:18.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 shutdown_connections 2026-03-10T09:03:18.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.301+0000 7f3352601700 1 -- 192.168.123.105:0/2291036174 wait complete. 2026-03-10T09:03:18.310 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 18 2026-03-10T09:03:18.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.444+0000 7ff61af86700 1 -- 192.168.123.105:0/904784545 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 msgr2=0x7ff614103da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.444+0000 7ff61af86700 1 --2- 192.168.123.105:0/904784545 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614103da0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7ff608009b00 tx=0x7ff608009e10 comp rx=0 tx=0).stop 2026-03-10T09:03:18.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 -- 192.168.123.105:0/904784545 shutdown_connections 2026-03-10T09:03:18.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 --2- 192.168.123.105:0/904784545 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 0x7ff6141066d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 --2- 192.168.123.105:0/904784545 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 
0x7ff614103da0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 -- 192.168.123.105:0/904784545 >> 192.168.123.105:0/904784545 conn(0x7ff6140fb380 msgr2=0x7ff6140fd800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 -- 192.168.123.105:0/904784545 shutdown_connections 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 -- 192.168.123.105:0/904784545 wait complete. 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.445+0000 7ff61af86700 1 Processor -- start 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff61af86700 1 -- start start 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff61af86700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614198a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff61af86700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 0x7ff614198f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff61af86700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff614199510 con 0x7ff6141019b0 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff61af86700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff614199650 con 0x7ff6141042e0 2026-03-10T09:03:18.446 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614198a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614198a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44278/0 (socket says 192.168.123.105:44278) 2026-03-10T09:03:18.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 -- 192.168.123.105:0/3906148965 learned_addr learned my addr 192.168.123.105:0/3906148965 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:18.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 -- 192.168.123.105:0/3906148965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 msgr2=0x7ff614198f80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff613fff700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 0x7ff614198f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 0x7ff614198f80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.447 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 -- 192.168.123.105:0/3906148965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6080097e0 con 0x7ff6141019b0 2026-03-10T09:03:18.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.446+0000 7ff618d22700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614198a40 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7ff608009fd0 tx=0x7ff608004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:18.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.447+0000 7ff613fff700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 0x7ff614198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T09:03:18.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.447+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff60801d070 con 0x7ff6141019b0 2026-03-10T09:03:18.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.447+0000 7ff61af86700 1 -- 192.168.123.105:0/3906148965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff61419e0b0 con 0x7ff6141019b0 2026-03-10T09:03:18.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.447+0000 7ff61af86700 1 -- 192.168.123.105:0/3906148965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff61419e570 con 0x7ff6141019b0 2026-03-10T09:03:18.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.448+0000 7ff61af86700 1 -- 192.168.123.105:0/3906148965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff6140fcf90 con 0x7ff6141019b0 2026-03-10T09:03:18.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.449+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff60800bd10 con 0x7ff6141019b0 2026-03-10T09:03:18.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.449+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff60800e660 con 0x7ff6141019b0 2026-03-10T09:03:18.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.450+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff60800fa40 con 0x7ff6141019b0 2026-03-10T09:03:18.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.450+0000 
7ff611ffb700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff604077870 0x7ff604079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.451+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff60809b7e0 con 0x7ff6141019b0 2026-03-10T09:03:18.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.453+0000 7ff613fff700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff604077870 0x7ff604079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.453+0000 7ff613fff700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff604077870 0x7ff604079d30 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7ff600005fd0 tx=0x7ff600005e20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:18.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.454+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff608064090 con 0x7ff6141019b0 2026-03-10T09:03:18.658 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.657+0000 7ff61af86700 1 -- 192.168.123.105:0/3906148965 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff6140611d0 con 0x7ff604077870 
2026-03-10T09:03:18.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.660+0000 7ff611ffb700 1 -- 192.168.123.105:0/3906148965 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7ff6140611d0 con 0x7ff604077870 2026-03-10T09:03:18.664 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:03:18.664 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:03:18.664 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:03:18.664 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:03:18.664 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:03:18.664 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: "mon", 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: "osd" 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "12/23 daemons upgraded", 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading mds daemons", 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 -- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff604077870 msgr2=0x7ff604079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 
1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff604077870 0x7ff604079d30 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7ff600005fd0 tx=0x7ff600005e20 comp rx=0 tx=0).stop 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 -- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 msgr2=0x7ff614198a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614198a40 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7ff608009fd0 tx=0x7ff608004ab0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 -- 192.168.123.105:0/3906148965 shutdown_connections 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff604077870 0x7ff604079d30 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff6141019b0 0x7ff614198a40 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 --2- 192.168.123.105:0/3906148965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6141042e0 0x7ff614198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.664+0000 7ff5ff7fe700 1 -- 192.168.123.105:0/3906148965 >> 192.168.123.105:0/3906148965 conn(0x7ff6140fb380 msgr2=0x7ff614100030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:18.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.667+0000 7ff5ff7fe700 1 -- 192.168.123.105:0/3906148965 shutdown_connections 2026-03-10T09:03:18.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.667+0000 7ff5ff7fe700 1 -- 192.168.123.105:0/3906148965 wait complete. 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 -- 192.168.123.105:0/921018920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 msgr2=0x7f78581066e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 --2- 192.168.123.105:0/921018920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f78581066e0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f785000b3a0 tx=0x7f785000b6b0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 -- 192.168.123.105:0/921018920 shutdown_connections 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 --2- 192.168.123.105:0/921018920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f78581066e0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 --2- 192.168.123.105:0/921018920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78581019c0 0x7f7858103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 -- 192.168.123.105:0/921018920 >> 192.168.123.105:0/921018920 conn(0x7f78580fb3b0 msgr2=0x7f78580fd810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:18.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 -- 192.168.123.105:0/921018920 shutdown_connections 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.779+0000 7f785f28b700 1 -- 192.168.123.105:0/921018920 wait complete. 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785f28b700 1 Processor -- start 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785f28b700 1 -- start start 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785f28b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78581019c0 0x7f785819eef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785f28b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f785819f430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785f28b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f785819fa70 con 0x7f78581019c0 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785f28b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f785819fbe0 con 0x7f78581042f0 2026-03-10T09:03:18.781 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785d027700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78581019c0 0x7f785819eef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785c826700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f785819f430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785c826700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f785819f430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:38710/0 (socket says 192.168.123.105:38710) 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.780+0000 7f785c826700 1 -- 192.168.123.105:0/2659160949 learned_addr learned my addr 192.168.123.105:0/2659160949 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f785c826700 1 -- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78581019c0 msgr2=0x7f785819eef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f785c826700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78581019c0 0x7f785819eef0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:18.781 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f785c826700 1 -- 192.168.123.105:0/2659160949 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f785000b050 con 0x7f78581042f0 2026-03-10T09:03:18.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f785c826700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f785819f430 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f7850007b30 tx=0x7f7850003c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f785000e070 con 0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f785f28b700 1 -- 192.168.123.105:0/2659160949 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f785810c9c0 con 0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f785f28b700 1 -- 192.168.123.105:0/2659160949 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f785810cf10 con 0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7850003f90 con 0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.781+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7850012cb0 con 
0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.783+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7850019040 con 0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.783+0000 7f784e7fc700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7844077910 0x7f7844079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.783+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f785009aea0 con 0x7f78581042f0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.784+0000 7f785d027700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7844077910 0x7f7844079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.784+0000 7f785d027700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7844077910 0x7f7844079dd0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f78581a0450 tx=0x7f7854009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:18.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.786+0000 7f785f28b700 1 -- 192.168.123.105:0/2659160949 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f783c005320 
con 0x7f78581042f0 2026-03-10T09:03:18.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:18.790+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f78500636d0 con 0x7f78581042f0 2026-03-10T09:03:19.022 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.021+0000 7f785f28b700 1 -- 192.168.123.105:0/2659160949 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f783c005190 con 0x7f78581042f0 2026-03-10T09:03:19.022 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.022+0000 7f784e7fc700 1 -- 192.168.123.105:0/2659160949 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f7850062e20 con 0x7f78581042f0 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.025+0000 7f7843fff700 1 -- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7844077910 msgr2=0x7f7844079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.025+0000 7f7843fff700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7844077910 0x7f7844079dd0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f78581a0450 tx=0x7f7854009380 comp rx=0 tx=0).stop 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.025+0000 7f7843fff700 1 -- 192.168.123.105:0/2659160949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 msgr2=0x7f785819f430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.025+0000 7f7843fff700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f785819f430 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f7850007b30 tx=0x7f7850003c30 comp rx=0 tx=0).stop 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.025+0000 7f7843fff700 1 -- 192.168.123.105:0/2659160949 shutdown_connections 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.026+0000 7f7843fff700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7844077910 0x7f7844079dd0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.026+0000 7f7843fff700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78581019c0 0x7f785819eef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.026+0000 7f7843fff700 1 --2- 192.168.123.105:0/2659160949 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78581042f0 0x7f785819f430 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.026+0000 7f7843fff700 1 -- 192.168.123.105:0/2659160949 >> 192.168.123.105:0/2659160949 conn(0x7f78580fb3b0 msgr2=0x7f78580fd810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.026+0000 7f7843fff700 1 -- 192.168.123.105:0/2659160949 shutdown_connections 2026-03-10T09:03:19.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:19.026+0000 
7f7843fff700 1 -- 192.168.123.105:0/2659160949 wait complete. 2026-03-10T09:03:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: from='client.34448 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: from='client.44351 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:19.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: mds.? [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:boot 2026-03-10T09:03:19.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm05.slhztf=up:active} 3 up:standby 2026-03-10T09:03:19.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T09:03:19.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: pgmap v259: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 249 B/s wr, 7 op/s 2026-03-10T09:03:19.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/186371881' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:19.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:18 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2291036174' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:03:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: from='client.34448 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: from='client.44351 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: mds.? [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:boot 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm05.slhztf=up:active} 3 up:standby 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: pgmap v259: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 249 B/s wr, 7 op/s 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/186371881' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:19.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:18 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/2291036174' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:03:20.086 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:19 vm05.local ceph-mon[111630]: from='client.34462 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:20.087 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:19 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2659160949' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:03:20.087 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:19 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.087 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:19 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.087 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:19 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.087 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:19 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:19 vm08.local ceph-mon[101330]: from='client.34462 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:20.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:19 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/2659160949' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:03:20.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:19 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:19 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:19 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.139 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:19 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: pgmap v260: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 307 B/s wr, 8 op/s 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:20.969 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:20.970 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:20.970 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:20.970 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:20 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm05.slhztf"]}]: dispatch 2026-03-10T09:03:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: pgmap v260: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 307 B/s wr, 8 op/s 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 
vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:20 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm05.slhztf"]}]: dispatch 2026-03-10T09:03:21.906 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[111626]: 2026-03-10T09:03:21.673+0000 7ff673aa8640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:22.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Detected new or changed devices on vm05 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Upgrade: It appears safe to stop mds.cephfs.vm05.slhztf 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Upgrade: Updating mds.cephfs.vm05.slhztf 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: 
Deploying daemon mds.cephfs.vm05.slhztf on vm05 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: osdmap e104: 6 total, 6 up, 6 in 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Standby daemon mds.cephfs.vm08.ssijow assigned to filesystem cephfs as rank 0 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm08.ssijow=up:replay} 2 up:standby 2026-03-10T09:03:22.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:21 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Detected new or changed devices on vm05 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Upgrade: It appears safe to stop mds.cephfs.vm05.slhztf 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Upgrade: Updating mds.cephfs.vm05.slhztf 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.slhztf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Deploying daemon mds.cephfs.vm05.slhztf on vm05 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: osdmap e104: 6 total, 6 up, 6 in 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Standby daemon mds.cephfs.vm08.ssijow assigned to filesystem cephfs as rank 0 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T09:03:22.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm08.ssijow=up:replay} 2 up:standby 2026-03-10T09:03:22.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:21 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T09:03:23.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:22 vm05.local ceph-mon[111630]: pgmap v262: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 511 B/s wr, 7 op/s 2026-03-10T09:03:23.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:22 vm08.local ceph-mon[101330]: pgmap v262: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 511 B/s wr, 7 op/s 2026-03-10T09:03:25.173 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:24 vm08.local ceph-mon[101330]: pgmap v263: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 9.8 MiB/s rd, 5.4 KiB/s wr, 9 op/s 2026-03-10T09:03:25.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:24 vm05.local ceph-mon[111630]: pgmap v263: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 9.8 MiB/s rd, 5.4 KiB/s wr, 9 op/s 2026-03-10T09:03:26.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:26 vm05.local ceph-mon[111630]: pgmap v264: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 10 MiB/s rd, 5.4 KiB/s wr, 10 op/s 2026-03-10T09:03:26.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:26 vm08.local ceph-mon[101330]: pgmap v264: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 10 MiB/s rd, 5.4 KiB/s wr, 10 op/s 2026-03-10T09:03:27.463 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:27 vm05.local ceph-mon[111630]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:reconnect 2026-03-10T09:03:27.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:27 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm08.ssijow=up:reconnect} 2 up:standby 2026-03-10T09:03:27.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:27 vm05.local ceph-mon[111630]: reconnect by client.24333 192.168.144.1:0/2430695904 after 0 2026-03-10T09:03:27.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:27 vm05.local ceph-mon[111630]: reconnect by client.24325 192.168.123.105:0/1319997093 after 0 2026-03-10T09:03:27.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:27 vm08.local ceph-mon[101330]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:reconnect 2026-03-10T09:03:27.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:27 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm08.ssijow=up:reconnect} 2 up:standby 2026-03-10T09:03:27.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:27 vm08.local ceph-mon[101330]: reconnect by client.24333 192.168.144.1:0/2430695904 after 0 2026-03-10T09:03:27.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:27 vm08.local ceph-mon[101330]: reconnect by client.24325 192.168.123.105:0/1319997093 after 0 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:rejoin 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm08.ssijow=up:rejoin} 2 up:standby 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: daemon mds.cephfs.vm08.ssijow is now active in filesystem cephfs as rank 0 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: pgmap v265: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 5.3 KiB/s wr, 11 op/s 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:28.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:rejoin 2026-03-10T09:03:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm08.ssijow=up:rejoin} 2 up:standby 2026-03-10T09:03:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: daemon mds.cephfs.vm08.ssijow is now active in filesystem cephfs as rank 0 2026-03-10T09:03:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: pgmap v265: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 5.3 KiB/s wr, 11 op/s 2026-03-10T09:03:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:28.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:29.704 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:29 vm08.local ceph-mon[101330]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T09:03:29.704 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:29 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:03:29.704 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:29 vm08.local ceph-mon[101330]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:active 2026-03-10T09:03:29.704 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:29 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.105:6828/930707688,v1:192.168.123.105:6829/930707688] up:boot 2026-03-10T09:03:29.704 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:29 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm08.ssijow=up:active} 3 up:standby 2026-03-10T09:03:29.704 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T09:03:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:29 vm05.local ceph-mon[111630]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T09:03:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:29 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:03:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:29 vm05.local ceph-mon[111630]: mds.? [v2:192.168.123.108:6826/573665424,v1:192.168.123.108:6827/573665424] up:active 2026-03-10T09:03:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:29 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.105:6828/930707688,v1:192.168.123.105:6829/930707688] up:boot 2026-03-10T09:03:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:29 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm08.ssijow=up:active} 3 up:standby 2026-03-10T09:03:29.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.slhztf"}]: dispatch 2026-03-10T09:03:30.637 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.637 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.637 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:30 vm05.local ceph-mon[111630]: pgmap v266: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 5.2 KiB/s wr, 10 op/s 2026-03-10T09:03:30.637 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.637 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:30 vm08.local ceph-mon[101330]: pgmap v266: 65 pgs: 65 active+clean; 215 MiB data, 916 
MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 5.2 KiB/s wr, 10 op/s 2026-03-10T09:03:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:30.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.159 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.159 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:32.159 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.159 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.159 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.xfzrbx"]}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:32.160 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.xfzrbx"]}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.xfzrbx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:32.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:32 vm05.local ceph-mon[111630]: Upgrade: It appears safe to stop mds.cephfs.vm08.xfzrbx 2026-03-10T09:03:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:32 vm05.local ceph-mon[111630]: 
Upgrade: Updating mds.cephfs.vm08.xfzrbx 2026-03-10T09:03:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:32 vm05.local ceph-mon[111630]: Deploying daemon mds.cephfs.vm08.xfzrbx on vm08 2026-03-10T09:03:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:32 vm05.local ceph-mon[111630]: pgmap v267: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.9 KiB/s wr, 9 op/s 2026-03-10T09:03:33.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:32 vm05.local ceph-mon[111630]: osdmap e105: 6 total, 6 up, 6 in 2026-03-10T09:03:33.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:32 vm08.local ceph-mon[101330]: Upgrade: It appears safe to stop mds.cephfs.vm08.xfzrbx 2026-03-10T09:03:33.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:32 vm08.local ceph-mon[101330]: Upgrade: Updating mds.cephfs.vm08.xfzrbx 2026-03-10T09:03:33.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:32 vm08.local ceph-mon[101330]: Deploying daemon mds.cephfs.vm08.xfzrbx on vm08 2026-03-10T09:03:33.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:32 vm08.local ceph-mon[101330]: pgmap v267: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.9 KiB/s wr, 9 op/s 2026-03-10T09:03:33.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:32 vm08.local ceph-mon[101330]: osdmap e105: 6 total, 6 up, 6 in 2026-03-10T09:03:34.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:33 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm08.ssijow=up:active} 2 up:standby 2026-03-10T09:03:34.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:33 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm08.ssijow=up:active} 2 up:standby 2026-03-10T09:03:35.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:34 vm08.local ceph-mon[101330]: pgmap v269: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 4.9 
KiB/s wr, 10 op/s 2026-03-10T09:03:35.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:34 vm05.local ceph-mon[111630]: pgmap v269: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 4.9 KiB/s wr, 10 op/s 2026-03-10T09:03:36.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:36.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:35 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:36.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:36.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:36.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:35 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:36.939 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: pgmap v270: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 4.9 KiB/s wr, 9 op/s 2026-03-10T09:03:37.211 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.211 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.108:6824/3904677772,v1:192.168.123.108:6825/3904677772] up:boot 2026-03-10T09:03:37.211 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm08.ssijow=up:active} 3 up:standby 2026-03-10T09:03:37.211 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:36 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: pgmap v270: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 4.9 KiB/s wr, 9 op/s 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.108:6824/3904677772,v1:192.168.123.108:6825/3904677772] up:boot 2026-03-10T09:03:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm08.ssijow=up:active} 3 up:standby 2026-03-10T09:03:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T09:03:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:37.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:36 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[111626]: 2026-03-10T09:03:38.170+0000 7ff673aa8640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: Detected new or changed devices on vm08 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.ssijow"]}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
09:03:38 vm05.local ceph-mon[111630]: Upgrade: It appears safe to stop mds.cephfs.vm08.ssijow 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: Upgrade: Updating mds.cephfs.vm08.ssijow 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: Deploying daemon mds.cephfs.vm08.ssijow on vm08 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: pgmap v271: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 3.6 MiB/s rd, 4.8 KiB/s wr, 6 op/s 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: osdmap e106: 6 total, 6 up, 6 in 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local 
ceph-mon[111630]: Standby daemon mds.cephfs.vm05.bxdvbu assigned to filesystem cephfs as rank 0 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T09:03:38.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:38 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:replay} 2 up:standby 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Detected new or changed devices on vm08 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", 
"format": "json"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm08.ssijow"]}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Upgrade: It appears safe to stop mds.cephfs.vm08.ssijow 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Upgrade: Updating mds.cephfs.vm08.ssijow 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ssijow", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", 
"allow"]}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Deploying daemon mds.cephfs.vm08.ssijow on vm08 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: pgmap v271: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 3.6 MiB/s rd, 4.8 KiB/s wr, 6 op/s 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: osdmap e106: 6 total, 6 up, 6 in 2026-03-10T09:03:38.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Standby daemon mds.cephfs.vm05.bxdvbu assigned to filesystem cephfs as rank 0 2026-03-10T09:03:38.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T09:03:38.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T09:03:38.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:38 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:replay} 2 up:standby 2026-03-10T09:03:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:41 vm08.local ceph-mon[101330]: pgmap v273: 65 pgs: 65 active+clean; 215 MiB data, 916 
MiB used, 119 GiB / 120 GiB avail; 2.3 MiB/s rd, 6.0 KiB/s wr, 7 op/s 2026-03-10T09:03:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:41 vm05.local ceph-mon[111630]: pgmap v273: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 2.3 MiB/s rd, 6.0 KiB/s wr, 7 op/s 2026-03-10T09:03:42.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:42 vm05.local ceph-mon[111630]: pgmap v274: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 6.8 MiB/s rd, 0 B/s wr, 4 op/s 2026-03-10T09:03:42.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:42 vm05.local ceph-mon[111630]: mds.? [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:reconnect 2026-03-10T09:03:42.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:42 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:reconnect} 2 up:standby 2026-03-10T09:03:42.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:42 vm08.local ceph-mon[101330]: pgmap v274: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 6.8 MiB/s rd, 0 B/s wr, 4 op/s 2026-03-10T09:03:42.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:42 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:reconnect 2026-03-10T09:03:42.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:42 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:reconnect} 2 up:standby 2026-03-10T09:03:43.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:43 vm08.local ceph-mon[101330]: reconnect by client.24325 192.168.123.105:0/1319997093 after 0.00600001 2026-03-10T09:03:43.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:43 vm08.local ceph-mon[101330]: reconnect by client.24333 192.168.144.1:0/2430695904 after 0.00600001 2026-03-10T09:03:43.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:43 vm08.local ceph-mon[101330]: mds.? [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:rejoin 2026-03-10T09:03:43.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:43 vm08.local ceph-mon[101330]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:rejoin} 2 up:standby 2026-03-10T09:03:43.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:43 vm08.local ceph-mon[101330]: daemon mds.cephfs.vm05.bxdvbu is now active in filesystem cephfs as rank 0 2026-03-10T09:03:43.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:43 vm05.local ceph-mon[111630]: reconnect by client.24325 192.168.123.105:0/1319997093 after 0.00600001 2026-03-10T09:03:43.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:43 vm05.local ceph-mon[111630]: reconnect by client.24333 192.168.144.1:0/2430695904 after 0.00600001 2026-03-10T09:03:43.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:43 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:rejoin 2026-03-10T09:03:43.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:43 vm05.local ceph-mon[111630]: fsmap cephfs:1/1 {0=cephfs.vm05.bxdvbu=up:rejoin} 2 up:standby 2026-03-10T09:03:43.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:43 vm05.local ceph-mon[111630]: daemon mds.cephfs.vm05.bxdvbu is now active in filesystem cephfs as rank 0 2026-03-10T09:03:45.319 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: pgmap v275: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 0 B/s wr, 6 op/s 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: Cluster is now healthy 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: mds.? [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:active 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: mds.? 
[v2:192.168.123.108:6826/42427465,v1:192.168.123.108:6827/42427465] up:boot 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T09:03:45.320 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: pgmap v275: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 0 B/s wr, 6 op/s 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: Cluster is now healthy 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: mds.? [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] up:active 2026-03-10T09:03:45.359 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: mds.? 
[v2:192.168.123.108:6826/42427465,v1:192.168.123.108:6827/42427465] up:boot 2026-03-10T09:03:45.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: fsmap cephfs:1 {0=cephfs.vm05.bxdvbu=up:active} 3 up:standby 2026-03-10T09:03:45.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ssijow"}]: dispatch 2026-03-10T09:03:46.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:46.463 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:46.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: pgmap v276: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 0 B/s wr, 6 op/s 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.bxdvbu"}]': finished 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.slhztf"}]: dispatch 2026-03-10T09:03:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.slhztf"}]': finished 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ssijow"}]: dispatch 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ssijow"}]': finished 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.xfzrbx"}]': finished 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.464 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.464 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:47 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: pgmap v276: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 0 B/s wr, 6 op/s 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.bxdvbu"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.bxdvbu"}]': finished 2026-03-10T09:03:47.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.slhztf"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.slhztf"}]': finished 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ssijow"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ssijow"}]': finished 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.xfzrbx"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.xfzrbx"}]': finished 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:47 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all mds 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all rgw 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: Upgrade: Updating ceph-exporter.vm05 (1/2) 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local 
ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:48.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:48 vm05.local ceph-mon[111630]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all mds 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all rgw 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: Upgrade: Updating ceph-exporter.vm05 (1/2) 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:48.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:48 vm08.local ceph-mon[101330]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.112+0000 7fca41111700 1 -- 
192.168.123.105:0/4088295948 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c100ea0 msgr2=0x7fca3c1012c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.112+0000 7fca41111700 1 --2- 192.168.123.105:0/4088295948 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c100ea0 0x7fca3c1012c0 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7fca24009b50 tx=0x7fca24009e60 comp rx=0 tx=0).stop 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.115+0000 7fca41111700 1 -- 192.168.123.105:0/4088295948 shutdown_connections 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.115+0000 7fca41111700 1 --2- 192.168.123.105:0/4088295948 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca3c102090 0x7fca3c102510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.115+0000 7fca41111700 1 --2- 192.168.123.105:0/4088295948 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c100ea0 0x7fca3c1012c0 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.115+0000 7fca41111700 1 -- 192.168.123.105:0/4088295948 >> 192.168.123.105:0/4088295948 conn(0x7fca3c0fc470 msgr2=0x7fca3c0fe8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.115+0000 7fca41111700 1 -- 192.168.123.105:0/4088295948 shutdown_connections 2026-03-10T09:03:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.115+0000 7fca41111700 1 -- 192.168.123.105:0/4088295948 wait complete. 
2026-03-10T09:03:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.116+0000 7fca41111700 1 Processor -- start 2026-03-10T09:03:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.116+0000 7fca41111700 1 -- start start 2026-03-10T09:03:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.116+0000 7fca41111700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca3c100ea0 0x7fca3c194640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.116+0000 7fca41111700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 0x7fca3c194b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.116+0000 7fca41111700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca3c1951a0 con 0x7fca3c102090 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.116+0000 7fca41111700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca3c1952e0 con 0x7fca3c100ea0 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 0x7fca3c194b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 0x7fca3c194b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:58988/0 (socket says 192.168.123.105:58988) 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 -- 192.168.123.105:0/2088097644 learned_addr learned my addr 192.168.123.105:0/2088097644 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3ad9d700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca3c100ea0 0x7fca3c194640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 -- 192.168.123.105:0/2088097644 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca3c100ea0 msgr2=0x7fca3c194640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca3c100ea0 0x7fca3c194640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 -- 192.168.123.105:0/2088097644 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca240097e0 con 0x7fca3c102090 2026-03-10T09:03:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.117+0000 7fca3a59c700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 0x7fca3c194b80 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7fca2c00d900 tx=0x7fca2c00dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:03:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.118+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca2c0041d0 con 0x7fca3c102090 2026-03-10T09:03:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.118+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca2c004330 con 0x7fca3c102090 2026-03-10T09:03:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.118+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca3c199d90 con 0x7fca3c102090 2026-03-10T09:03:49.119 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.118+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca3c19a2b0 con 0x7fca3c102090 2026-03-10T09:03:49.119 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.118+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca2c010460 con 0x7fca3c102090 2026-03-10T09:03:49.120 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.119+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca3c066e80 con 0x7fca3c102090 2026-03-10T09:03:49.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.122+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fca2c009730 con 0x7fca3c102090 2026-03-10T09:03:49.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.123+0000 
7fca33fff700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca2807bda0 0x7fca2807e260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.123+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fca2c0998d0 con 0x7fca3c102090 2026-03-10T09:03:49.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.123+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fca2c0c99f0 con 0x7fca3c102090 2026-03-10T09:03:49.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.123+0000 7fca3ad9d700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca2807bda0 0x7fca2807e260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.124+0000 7fca3ad9d700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca2807bda0 0x7fca2807e260 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fca24006010 tx=0x7fca2400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.256+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fca3c106a70 con 0x7fca2807bda0 
2026-03-10T09:03:49.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.257+0000 7fca33fff700 1 -- 192.168.123.105:0/2088097644 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+441 (secure 0 0 0) 0x7fca3c106a70 con 0x7fca2807bda0 2026-03-10T09:03:49.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.260+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca2807bda0 msgr2=0x7fca2807e260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.260+0000 7fca41111700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca2807bda0 0x7fca2807e260 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fca24006010 tx=0x7fca2400b540 comp rx=0 tx=0).stop 2026-03-10T09:03:49.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.260+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 msgr2=0x7fca3c194b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.260+0000 7fca41111700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 0x7fca3c194b80 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7fca2c00d900 tx=0x7fca2c00dcc0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.260+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 shutdown_connections 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.261+0000 7fca41111700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7fca2807bda0 0x7fca2807e260 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.261+0000 7fca41111700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca3c100ea0 0x7fca3c194640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.261+0000 7fca41111700 1 --2- 192.168.123.105:0/2088097644 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca3c102090 0x7fca3c194b80 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.261+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 >> 192.168.123.105:0/2088097644 conn(0x7fca3c0fc470 msgr2=0x7fca3c105350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.261+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 shutdown_connections 2026-03-10T09:03:49.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.261+0000 7fca41111700 1 -- 192.168.123.105:0/2088097644 wait complete. 
2026-03-10T09:03:49.271 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.340+0000 7f7c52da9700 1 -- 192.168.123.105:0/2292047547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 msgr2=0x7f7c4c103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.340+0000 7f7c52da9700 1 --2- 192.168.123.105:0/2292047547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c103560 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f7c3c009b50 tx=0x7f7c3c009e60 comp rx=0 tx=0).stop 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 -- 192.168.123.105:0/2292047547 shutdown_connections 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 --2- 192.168.123.105:0/2292047547 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7c4c104340 0x7f7c4c1047a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 --2- 192.168.123.105:0/2292047547 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c103560 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 -- 192.168.123.105:0/2292047547 >> 192.168.123.105:0/2292047547 conn(0x7f7c4c0fe6c0 msgr2=0x7f7c4c100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 -- 192.168.123.105:0/2292047547 shutdown_connections 2026-03-10T09:03:49.341 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 -- 192.168.123.105:0/2292047547 wait complete. 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.341+0000 7f7c52da9700 1 Processor -- start 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c52da9700 1 -- start start 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c52da9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c198a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c52da9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7c4c104340 0x7f7c4c198fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c52da9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c4c1995c0 con 0x7f7c4c103140 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c52da9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c4c199700 con 0x7f7c4c104340 2026-03-10T09:03:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c50b45700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c198a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c50b45700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c198a60 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:58998/0 (socket says 192.168.123.105:58998) 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c50b45700 1 -- 192.168.123.105:0/3706685467 learned_addr learned my addr 192.168.123.105:0/3706685467 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.342+0000 7f7c4bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7c4c104340 0x7f7c4c198fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c50b45700 1 -- 192.168.123.105:0/3706685467 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7c4c104340 msgr2=0x7f7c4c198fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c50b45700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7c4c104340 0x7f7c4c198fa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c50b45700 1 -- 192.168.123.105:0/3706685467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c40009710 con 0x7f7c4c103140 2026-03-10T09:03:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c50b45700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c198a60 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f7c3c009b20 
tx=0x7f7c3c0052a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c3c01d070 con 0x7f7c4c103140 2026-03-10T09:03:49.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7c3c022470 con 0x7f7c4c103140 2026-03-10T09:03:49.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.343+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7c3c0097e0 con 0x7f7c4c103140 2026-03-10T09:03:49.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.344+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c3c00f670 con 0x7f7c4c103140 2026-03-10T09:03:49.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.345+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7c4c19e4b0 con 0x7f7c4c103140 2026-03-10T09:03:49.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.345+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7c3c0225e0 con 0x7f7c4c103140 2026-03-10T09:03:49.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.346+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7c4c066e80 con 
0x7f7c4c103140 2026-03-10T09:03:49.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.346+0000 7f7c49ffb700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7c340776c0 0x7f7c34079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.346+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f7c3c09afe0 con 0x7f7c4c103140 2026-03-10T09:03:49.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.347+0000 7f7c4bfff700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7c340776c0 0x7f7c34079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.348+0000 7f7c4bfff700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7c340776c0 0x7f7c34079b80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f7c40011440 tx=0x7f7c40009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.349+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7c3c0637a0 con 0x7f7c4c103140 2026-03-10T09:03:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:49 vm05.local ceph-mon[111630]: pgmap v277: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.9 KiB/s wr, 10 
op/s 2026-03-10T09:03:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:49 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:49 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:49.479 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:49 vm08.local ceph-mon[101330]: pgmap v277: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.9 KiB/s wr, 10 op/s 2026-03-10T09:03:49.479 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:49 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:49.480 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:49 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:49.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.492+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7c4c19e820 con 0x7f7c340776c0 2026-03-10T09:03:49.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.493+0000 7f7c49ffb700 1 -- 192.168.123.105:0/3706685467 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+441 (secure 0 0 0) 0x7f7c4c19e820 con 0x7f7c340776c0 2026-03-10T09:03:49.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7c340776c0 msgr2=0x7f7c34079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.497 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7c340776c0 0x7f7c34079b80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f7c40011440 tx=0x7f7c40009450 comp rx=0 tx=0).stop 2026-03-10T09:03:49.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 msgr2=0x7f7c4c198a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c198a60 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f7c3c009b20 tx=0x7f7c3c0052a0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 shutdown_connections 2026-03-10T09:03:49.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f7c340776c0 0x7f7c34079b80 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 --2- 192.168.123.105:0/3706685467 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7c4c103140 0x7f7c4c198a60 secure :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f7c3c009b20 tx=0x7f7c3c0052a0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 --2- 192.168.123.105:0/3706685467 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7c4c104340 0x7f7c4c198fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.496+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 >> 192.168.123.105:0/3706685467 conn(0x7f7c4c0fe6c0 msgr2=0x7f7c4c107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.497+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 shutdown_connections 2026-03-10T09:03:49.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.497+0000 7f7c52da9700 1 -- 192.168.123.105:0/3706685467 wait complete. 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3001192909 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a41001c0 msgr2=0x7ff8a4100640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3001192909 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a41001c0 0x7ff8a4100640 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7ff898009b00 tx=0x7ff898009e10 comp rx=0 tx=0).stop 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3001192909 shutdown_connections 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3001192909 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a41001c0 0x7ff8a4100640 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 --2- 
192.168.123.105:0/3001192909 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a40ff860 0x7ff8a40ffc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3001192909 >> 192.168.123.105:0/3001192909 conn(0x7ff8a40fb3c0 msgr2=0x7ff8a40fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.581 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3001192909 shutdown_connections 2026-03-10T09:03:49.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.581+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3001192909 wait complete. 2026-03-10T09:03:49.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.582+0000 7ff8aaf4d700 1 Processor -- start 2026-03-10T09:03:49.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.582+0000 7ff8aaf4d700 1 -- start start 2026-03-10T09:03:49.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.582+0000 7ff8aaf4d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a40ff860 0x7ff8a41989c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.582+0000 7ff8aaf4d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a41001c0 0x7ff8a4198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.582+0000 7ff8aaf4d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8a4199490 con 0x7ff8a40ff860 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.582+0000 7ff8aaf4d700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8a41995d0 con 0x7ff8a41001c0 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a41001c0 0x7ff8a4198f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a8ce9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a40ff860 0x7ff8a41989c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a41001c0 0x7ff8a4198f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:59852/0 (socket says 192.168.123.105:59852) 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a3fff700 1 -- 192.168.123.105:0/3094013066 learned_addr learned my addr 192.168.123.105:0/3094013066 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a8ce9700 1 -- 192.168.123.105:0/3094013066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a41001c0 msgr2=0x7ff8a4198f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a8ce9700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a41001c0 0x7ff8a4198f00 
unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a8ce9700 1 -- 192.168.123.105:0/3094013066 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff8980097e0 con 0x7ff8a40ff860 2026-03-10T09:03:49.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.583+0000 7ff8a8ce9700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a40ff860 0x7ff8a41989c0 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7ff89400cc60 tx=0x7ff89400cf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.584+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8940041e0 con 0x7ff8a40ff860 2026-03-10T09:03:49.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.584+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff894004d10 con 0x7ff8a40ff860 2026-03-10T09:03:49.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.584+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff8a419e090 con 0x7ff8a40ff860 2026-03-10T09:03:49.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.584+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff894005020 con 0x7ff8a40ff860 2026-03-10T09:03:49.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.584+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff8a419e5b0 con 0x7ff8a40ff860 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.586+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff894004750 con 0x7ff8a40ff860 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.586+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff8a4066e80 con 0x7ff8a40ff860 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.586+0000 7ff8a1ffb700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff88c0778c0 0x7ff88c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.586+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff894098ff0 con 0x7ff8a40ff860 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.586+0000 7ff8a3fff700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff88c0778c0 0x7ff88c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.588+0000 7ff8a3fff700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff88c0778c0 0x7ff88c079d80 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto 
rx=0x7ff898006010 tx=0x7ff89800b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.589+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff894061790 con 0x7ff8a40ff860 2026-03-10T09:03:49.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.727+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7ff8a419e8e0 con 0x7ff88c0778c0 2026-03-10T09:03:49.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.735+0000 7ff8a1ffb700 1 -- 192.168.123.105:0/3094013066 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7ff8a419e8e0 con 0x7ff88c0778c0 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (12m) 20s ago 12m 24.3M - 0.25.0 c8568f914cd2 3be9db7ff6a0 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 starting - - - - 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (12m) 4s ago 12m 12.2M - 18.2.1 5be31c24972a 17a1d108c2c1 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (6m) 20s ago 13m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (6m) 4s ago 12m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:03:49.736 
INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (12m) 20s ago 12m 90.1M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (32s) 20s ago 10m 21.1M - 19.2.3-678-ge911bdeb 654f31e6858e a19223339e34 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (22s) 20s ago 10m 12.2M - 19.2.3-678-ge911bdeb 654f31e6858e 0ae3b9e36731 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (5s) 4s ago 10m 13.1M - 19.2.3-678-ge911bdeb 654f31e6858e 3540ff73b5c2 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (14s) 4s ago 10m 18.2M - 19.2.3-678-ge911bdeb 654f31e6858e 4b83d0ce9fe6 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (7m) 20s ago 13m 625M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (7m) 4s ago 12m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (7m) 20s ago 13m 68.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (6m) 4s ago 12m 60.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (13m) 20s ago 13m 14.9M - 1.5.0 0da6a335fe13 3b3db2a6030c 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (12m) 4s ago 12m 16.0M - 1.5.0 0da6a335fe13 f55df0cefc61 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (6m) 20s ago 12m 218M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 
4f1dac46f59b 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 20s ago 11m 115M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (112s) 20s ago 11m 102M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (89s) 4s ago 11m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b025f9a6ca2a 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (67s) 4s ago 11m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 76fe84edd716 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (45s) 4s ago 11m 108M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 41f6c3ce6ac4 2026-03-10T09:03:49.736 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (7m) 20s ago 12m 66.4M - 2.43.0 a07b618ecd1d 1879cf55d022 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff88c0778c0 msgr2=0x7ff88c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff88c0778c0 0x7ff88c079d80 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7ff898006010 tx=0x7ff89800b540 comp rx=0 tx=0).stop 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a40ff860 msgr2=0x7ff8a41989c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.739 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a40ff860 0x7ff8a41989c0 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7ff89400cc60 tx=0x7ff89400cf70 comp rx=0 tx=0).stop 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 shutdown_connections 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff88c0778c0 0x7ff88c079d80 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8a40ff860 0x7ff8a41989c0 unknown :-1 s=CLOSED pgs=177 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 --2- 192.168.123.105:0/3094013066 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8a41001c0 0x7ff8a4198f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 >> 192.168.123.105:0/3094013066 conn(0x7ff8a40fb3c0 msgr2=0x7ff8a4107e90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 -- 192.168.123.105:0/3094013066 shutdown_connections 2026-03-10T09:03:49.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.739+0000 7ff8aaf4d700 1 -- 
192.168.123.105:0/3094013066 wait complete. 2026-03-10T09:03:49.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.828+0000 7feb4f326700 1 -- 192.168.123.105:0/530560676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 msgr2=0x7feb481067d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:49.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.828+0000 7feb4f326700 1 --2- 192.168.123.105:0/530560676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb481067d0 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7feb44009b00 tx=0x7feb44009e10 comp rx=0 tx=0).stop 2026-03-10T09:03:49.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.829+0000 7feb4f326700 1 -- 192.168.123.105:0/530560676 shutdown_connections 2026-03-10T09:03:49.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.829+0000 7feb4f326700 1 --2- 192.168.123.105:0/530560676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb481067d0 unknown :-1 s=CLOSED pgs=178 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.829+0000 7feb4f326700 1 --2- 192.168.123.105:0/530560676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb48101ab0 0x7feb48103ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.829+0000 7feb4f326700 1 -- 192.168.123.105:0/530560676 >> 192.168.123.105:0/530560676 conn(0x7feb480fb3c0 msgr2=0x7feb480fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:49.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.829+0000 7feb4f326700 1 -- 192.168.123.105:0/530560676 shutdown_connections 2026-03-10T09:03:49.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.830+0000 
7feb4f326700 1 -- 192.168.123.105:0/530560676 wait complete. 2026-03-10T09:03:49.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.830+0000 7feb4f326700 1 Processor -- start 2026-03-10T09:03:49.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.831+0000 7feb4f326700 1 -- start start 2026-03-10T09:03:49.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.831+0000 7feb4f326700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb48101ab0 0x7feb481945f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.831+0000 7feb4f326700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb48194b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.831+0000 7feb4f326700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb48195150 con 0x7feb481043e0 2026-03-10T09:03:49.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.831+0000 7feb4f326700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb48195290 con 0x7feb48101ab0 2026-03-10T09:03:49.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.832+0000 7feb4c8c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb48194b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.832+0000 7feb4c8c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb48194b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:39230/0 (socket says 192.168.123.105:39230) 2026-03-10T09:03:49.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.832+0000 7feb4c8c1700 1 -- 192.168.123.105:0/1524982786 learned_addr learned my addr 192.168.123.105:0/1524982786 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:49.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.832+0000 7feb4c8c1700 1 -- 192.168.123.105:0/1524982786 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb48101ab0 msgr2=0x7feb481945f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:03:49.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.832+0000 7feb4c8c1700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb48101ab0 0x7feb481945f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:49.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.832+0000 7feb4c8c1700 1 -- 192.168.123.105:0/1524982786 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb440097e0 con 0x7feb481043e0 2026-03-10T09:03:49.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.833+0000 7feb4c8c1700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb48194b30 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7feb44005850 tx=0x7feb4400b890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.834+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb4401d070 con 0x7feb481043e0 2026-03-10T09:03:49.835 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.834+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feb4400bdd0 con 0x7feb481043e0 2026-03-10T09:03:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.834+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb4400f8e0 con 0x7feb481043e0 2026-03-10T09:03:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.835+0000 7feb4f326700 1 -- 192.168.123.105:0/1524982786 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feb48199ce0 con 0x7feb481043e0 2026-03-10T09:03:49.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.836+0000 7feb4f326700 1 -- 192.168.123.105:0/1524982786 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feb4819a250 con 0x7feb481043e0 2026-03-10T09:03:49.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.837+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feb44022c50 con 0x7feb481043e0 2026-03-10T09:03:49.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.838+0000 7feb3a7fc700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7feb340778c0 0x7feb34079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:49.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.838+0000 7feb4d0c2700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7feb340778c0 0x7feb34079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:49.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.839+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feb280052f0 con 0x7feb481043e0 2026-03-10T09:03:49.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.839+0000 7feb4d0c2700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7feb340778c0 0x7feb34079d80 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7feb3c009780 tx=0x7feb3c006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:49.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.841+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7feb44067fc0 con 0x7feb481043e0 2026-03-10T09:03:49.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:49.844+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feb440a0050 con 0x7feb481043e0 2026-03-10T09:03:50.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.025+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7feb28005c90 con 0x7feb481043e0 2026-03-10T09:03:50.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.026+0000 7feb3a7fc700 1 -- 192.168.123.105:0/1524982786 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7feb44027070 con 0x7feb481043e0 2026-03-10T09:03:50.026 
INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:03:50.026 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:03:50.026 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:03:50.026 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:50.026 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:03:50.027 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.028+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7feb340778c0 msgr2=0x7feb34079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.028+0000 7feb33fff700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7feb340778c0 0x7feb34079d80 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7feb3c009780 tx=0x7feb3c006cb0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.028+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 msgr2=0x7feb48194b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.028+0000 7feb33fff700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb48194b30 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7feb44005850 tx=0x7feb4400b890 comp rx=0 tx=0).stop 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.029+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 shutdown_connections 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.029+0000 7feb33fff700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7feb340778c0 0x7feb34079d80 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.029+0000 7feb33fff700 1 --2- 192.168.123.105:0/1524982786 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb48101ab0 0x7feb481945f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.029+0000 7feb33fff700 1 --2- 192.168.123.105:0/1524982786 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb481043e0 0x7feb48194b30 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.029+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 >> 192.168.123.105:0/1524982786 conn(0x7feb480fb3c0 msgr2=0x7feb48100130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:50.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.030+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 shutdown_connections 2026-03-10T09:03:50.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.030+0000 7feb33fff700 1 -- 192.168.123.105:0/1524982786 wait complete. 2026-03-10T09:03:50.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.102+0000 7f82aaf69700 1 -- 192.168.123.105:0/3227434034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4076990 msgr2=0x7f82a4076e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.102+0000 7f82aaf69700 1 --2- 192.168.123.105:0/3227434034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4076990 0x7f82a4076e10 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f82a0009b00 tx=0x7f82a0009e10 comp rx=0 tx=0).stop 2026-03-10T09:03:50.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.103+0000 7f82aaf69700 1 -- 192.168.123.105:0/3227434034 shutdown_connections 2026-03-10T09:03:50.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.103+0000 7f82aaf69700 1 --2- 192.168.123.105:0/3227434034 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4076990 0x7f82a4076e10 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.103+0000 7f82aaf69700 1 --2- 
192.168.123.105:0/3227434034 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4075740 0x7f82a4075b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.103+0000 7f82aaf69700 1 -- 192.168.123.105:0/3227434034 >> 192.168.123.105:0/3227434034 conn(0x7f82a40fe440 msgr2=0x7f82a41008a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:50.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.104+0000 7f82aaf69700 1 -- 192.168.123.105:0/3227434034 shutdown_connections 2026-03-10T09:03:50.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.104+0000 7f82aaf69700 1 -- 192.168.123.105:0/3227434034 wait complete. 2026-03-10T09:03:50.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.104+0000 7f82aaf69700 1 Processor -- start 2026-03-10T09:03:50.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.105+0000 7f82aaf69700 1 -- start start 2026-03-10T09:03:50.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.105+0000 7f82aaf69700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4075740 0x7f82a4071d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.105+0000 7f82aaf69700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 0x7f82a4072260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.105+0000 7f82aaf69700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82a40727f0 con 0x7f82a4075740 2026-03-10T09:03:50.105 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.105+0000 7f82aaf69700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82a4072960 con 0x7f82a4076990 2026-03-10T09:03:50.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.105+0000 7f82a9f67700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4075740 0x7f82a4071d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.106+0000 7f82a9766700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 0x7f82a4072260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.106+0000 7f82a9766700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 0x7f82a4072260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:55680/0 (socket says 192.168.123.105:55680) 2026-03-10T09:03:50.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.106+0000 7f82a9766700 1 -- 192.168.123.105:0/781154529 learned_addr learned my addr 192.168.123.105:0/781154529 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:50.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.106+0000 7f82a9766700 1 -- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4075740 msgr2=0x7f82a4071d20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.107+0000 7f82a9766700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4075740 0x7f82a4071d20 unknown 
:-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.107+0000 7f82a9766700 1 -- 192.168.123.105:0/781154529 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f82a00097e0 con 0x7f82a4076990 2026-03-10T09:03:50.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.107+0000 7f82a9f67700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4075740 0x7f82a4071d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T09:03:50.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.107+0000 7f82a9766700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 0x7f82a4072260 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f82a0005f50 tx=0x7f82a00049f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:50.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.108+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82a001d070 con 0x7f82a4076990 2026-03-10T09:03:50.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.108+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f82a000bc50 con 0x7f82a4076990 2026-03-10T09:03:50.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.108+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82a41029c0 con 0x7f82a4076990 2026-03-10T09:03:50.108 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.108+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82a000f830 con 0x7f82a4076990 2026-03-10T09:03:50.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.108+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82a4102eb0 con 0x7f82a4076990 2026-03-10T09:03:50.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.109+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82a404ea90 con 0x7f82a4076990 2026-03-10T09:03:50.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.110+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f82a0022a50 con 0x7f82a4076990 2026-03-10T09:03:50.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.110+0000 7f829affd700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f82900776b0 0x7f8290079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.110+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f82a009bd90 con 0x7f82a4076990 2026-03-10T09:03:50.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.111+0000 7f82a9f67700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f82900776b0 0x7f8290079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.111+0000 7f82a9f67700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f82900776b0 0x7f8290079b70 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f829400afd0 tx=0x7f829400a380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:50.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.117+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f82a0064510 con 0x7f82a4076990 2026-03-10T09:03:50.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.284+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f82a4103100 con 0x7f82a4076990 2026-03-10T09:03:50.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.295+0000 7f829affd700 1 -- 192.168.123.105:0/781154529 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 30 v30) v1 ==== 76+0+1973 (secure 0 0 0) 0x7f82a0063c60 con 0x7f82a4076990 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:e30 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T09:03:44:361690+0000 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor 
table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:epoch 30 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T09:03:44.361687+0000 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 106 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T09:03:50.299 
INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:up {0=34444} 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 34444 members: 34444 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:34444} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:34470} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3904677772,v1:192.168.123.108:6825/3904677772] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:44367} state up:standby seq 1 join_fscid=1 addr 
[v2:192.168.123.105:6828/930707688,v1:192.168.123.105:6829/930707688] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:03:50.299 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:44373} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/42427465,v1:192.168.123.108:6827/42427465] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:03:50.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.301+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f82900776b0 msgr2=0x7f8290079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.301+0000 7f82aaf69700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f82900776b0 0x7f8290079b70 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f829400afd0 tx=0x7f829400a380 comp rx=0 tx=0).stop 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.301+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 msgr2=0x7f82a4072260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.301+0000 7f82aaf69700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 0x7f82a4072260 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f82a0005f50 tx=0x7f82a00049f0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 shutdown_connections 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 --2- 192.168.123.105:0/781154529 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f82900776b0 0x7f8290079b70 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f82a4075740 0x7f82a4071d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 --2- 192.168.123.105:0/781154529 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f82a4076990 0x7f82a4072260 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 >> 192.168.123.105:0/781154529 conn(0x7f82a40fe440 msgr2=0x7f82a410d120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 shutdown_connections 2026-03-10T09:03:50.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.302+0000 7f82aaf69700 1 -- 192.168.123.105:0/781154529 wait complete. 
2026-03-10T09:03:50.306 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 30 2026-03-10T09:03:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.432+0000 7fe9ff30f700 1 -- 192.168.123.105:0/3419777458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f8072b50 msgr2=0x7fe9f8072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.432+0000 7fe9ff30f700 1 --2- 192.168.123.105:0/3419777458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f8072b50 0x7fe9f8072f70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fe9f4007780 tx=0x7fe9f400c050 comp rx=0 tx=0).stop 2026-03-10T09:03:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.432+0000 7fe9ff30f700 1 -- 192.168.123.105:0/3419777458 shutdown_connections 2026-03-10T09:03:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.432+0000 7fe9ff30f700 1 --2- 192.168.123.105:0/3419777458 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9f8075a40 0x7fe9f8077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.432+0000 7fe9ff30f700 1 --2- 192.168.123.105:0/3419777458 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f8072b50 0x7fe9f8072f70 secure :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fe9f4007780 tx=0x7fe9f400c050 comp rx=0 tx=0).stop 2026-03-10T09:03:50.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.432+0000 7fe9ff30f700 1 -- 192.168.123.105:0/3419777458 >> 192.168.123.105:0/3419777458 conn(0x7fe9f806dae0 msgr2=0x7fe9f806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:50.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='client.34478 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:50.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: Upgrade: Updating ceph-exporter.vm08 (2/2) 2026-03-10T09:03:50.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:50.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T09:03:50.433 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:50.434 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-10T09:03:50.434 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='client.34482 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:50.434 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='client.34486 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:50.434 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: pgmap v278: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-10T09:03:50.434 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:50 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1524982786' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:50.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.433+0000 7fe9ff30f700 1 -- 192.168.123.105:0/3419777458 shutdown_connections 2026-03-10T09:03:50.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 -- 192.168.123.105:0/3419777458 wait complete. 2026-03-10T09:03:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 Processor -- start 2026-03-10T09:03:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 -- start start 2026-03-10T09:03:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9f8075a40 0x7fe9f812bdb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 0x7fe9f812e780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9f8083e10 con 0x7fe9f8075a40 2026-03-10T09:03:50.436 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.435+0000 7fe9ff30f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9f812ecc0 con 0x7fe9f812c2f0 2026-03-10T09:03:50.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.436+0000 7fe9fc8aa700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 0x7fe9f812e780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.437+0000 7fe9fc8aa700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 0x7fe9f812e780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:55700/0 (socket says 192.168.123.105:55700) 2026-03-10T09:03:50.437 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.437+0000 7fe9fc8aa700 1 -- 192.168.123.105:0/878164487 learned_addr learned my addr 192.168.123.105:0/878164487 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:50.439 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.439+0000 7fe9fd0ab700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9f8075a40 0x7fe9f812bdb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.439+0000 7fe9fc8aa700 1 -- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9f8075a40 msgr2=0x7fe9f812bdb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.439+0000 7fe9fc8aa700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9f8075a40 0x7fe9f812bdb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.439+0000 7fe9fc8aa700 1 -- 192.168.123.105:0/878164487 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9f4007430 con 0x7fe9f812c2f0 
2026-03-10T09:03:50.440 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.440+0000 7fe9fc8aa700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 0x7fe9f812e780 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fe9f000d350 tx=0x7fe9f000d710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:50.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.440+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9f00155b0 con 0x7fe9f812c2f0 2026-03-10T09:03:50.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.440+0000 7fe9ff30f700 1 -- 192.168.123.105:0/878164487 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9f812ef50 con 0x7fe9f812c2f0 2026-03-10T09:03:50.441 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.440+0000 7fe9ff30f700 1 -- 192.168.123.105:0/878164487 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9f812f4a0 con 0x7fe9f812c2f0 2026-03-10T09:03:50.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.441+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe9f000f040 con 0x7fe9f812c2f0 2026-03-10T09:03:50.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.441+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9f00149c0 con 0x7fe9f812c2f0 2026-03-10T09:03:50.443 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.442+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe9f0014c20 con 
0x7fe9f812c2f0 2026-03-10T09:03:50.443 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.443+0000 7fe9ff30f700 1 -- 192.168.123.105:0/878164487 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe9dc005320 con 0x7fe9f812c2f0 2026-03-10T09:03:50.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.443+0000 7fe9ee7fc700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9e4077910 0x7fe9e4079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.443+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fe9f00995b0 con 0x7fe9f812c2f0 2026-03-10T09:03:50.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.444+0000 7fe9fd0ab700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9e4077910 0x7fe9e4079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.445+0000 7fe9fd0ab700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9e4077910 0x7fe9e4079dd0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fe9f4000c00 tx=0x7fe9f4005d80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:50.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.449+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fe9f0061de0 con 0x7fe9f812c2f0 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='client.34478 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: Upgrade: Updating ceph-exporter.vm08 (2/2) 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='client.34482 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='client.34486 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: pgmap v278: 65 pgs: 65 active+clean; 215 MiB 
data, 916 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.2 KiB/s wr, 10 op/s 2026-03-10T09:03:50.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:50 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1524982786' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:50.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.676+0000 7fe9ff30f700 1 -- 192.168.123.105:0/878164487 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe9dc000bf0 con 0x7fe9e4077910 2026-03-10T09:03:50.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.682+0000 7fe9ee7fc700 1 -- 192.168.123.105:0/878164487 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+441 (secure 0 0 0) 0x7fe9dc000bf0 con 0x7fe9e4077910 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "mon", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "mds", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "osd" 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "progress": 
"16/23 daemons upgraded", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading ceph-exporter daemons", 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:03:50.686 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 -- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9e4077910 msgr2=0x7fe9e4079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe9e4077910 0x7fe9e4079dd0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fe9f4000c00 tx=0x7fe9f4005d80 comp rx=0 tx=0).stop 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 -- 192.168.123.105:0/878164487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 msgr2=0x7fe9f812e780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 0x7fe9f812e780 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fe9f000d350 tx=0x7fe9f000d710 comp rx=0 tx=0).stop 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 -- 192.168.123.105:0/878164487 shutdown_connections 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7fe9e4077910 0x7fe9e4079dd0 secure :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fe9f4000c00 tx=0x7fe9f4005d80 comp rx=0 tx=0).stop 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.692+0000 7fe9e3fff700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe9f8075a40 0x7fe9f812bdb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.693+0000 7fe9e3fff700 1 --2- 192.168.123.105:0/878164487 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe9f812c2f0 0x7fe9f812e780 secure :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fe9f000d350 tx=0x7fe9f000d710 comp rx=0 tx=0).stop 2026-03-10T09:03:50.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.693+0000 7fe9e3fff700 1 -- 192.168.123.105:0/878164487 >> 192.168.123.105:0/878164487 conn(0x7fe9f806dae0 msgr2=0x7fe9f806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:50.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.694+0000 7fe9e3fff700 1 -- 192.168.123.105:0/878164487 shutdown_connections 2026-03-10T09:03:50.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.696+0000 7fe9e3fff700 1 -- 192.168.123.105:0/878164487 wait complete. 
2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 -- 192.168.123.105:0/1403220320 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784075a40 msgr2=0x7f5784077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 --2- 192.168.123.105:0/1403220320 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784075a40 0x7f5784077ed0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f577c00d3f0 tx=0x7f577c00d700 comp rx=0 tx=0).stop 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 -- 192.168.123.105:0/1403220320 shutdown_connections 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 --2- 192.168.123.105:0/1403220320 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784075a40 0x7f5784077ed0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 --2- 192.168.123.105:0/1403220320 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784072b50 0x7f5784072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 -- 192.168.123.105:0/1403220320 >> 192.168.123.105:0/1403220320 conn(0x7f578406dae0 msgr2=0x7f578406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 -- 192.168.123.105:0/1403220320 shutdown_connections 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.805+0000 7f5789056700 1 -- 192.168.123.105:0/1403220320 
wait complete. 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.806+0000 7f5789056700 1 Processor -- start 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.806+0000 7f5789056700 1 -- start start 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.806+0000 7f5789056700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784072b50 0x7f5784082f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.806+0000 7f5789056700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 0x7f5784083910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.806+0000 7f5789056700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f578412e760 con 0x7f5784083490 2026-03-10T09:03:50.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.806+0000 7f5789056700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f578412e8d0 con 0x7f5784072b50 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 0x7f5784083910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 0x7f5784083910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:39290/0 (socket says 192.168.123.105:39290) 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 -- 192.168.123.105:0/3158058745 learned_addr learned my addr 192.168.123.105:0/3158058745 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f5782d9d700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784072b50 0x7f5784082f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 -- 192.168.123.105:0/3158058745 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784072b50 msgr2=0x7f5784082f50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784072b50 0x7f5784082f50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 -- 192.168.123.105:0/3158058745 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f577c007ed0 con 0x7f5784083490 2026-03-10T09:03:50.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.808+0000 7f578259c700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 0x7f5784083910 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f577c003c60 tx=0x7f577c004b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T09:03:50.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.809+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f577c01c070 con 0x7f5784083490 2026-03-10T09:03:50.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.809+0000 7f5789056700 1 -- 192.168.123.105:0/3158058745 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f578412eb50 con 0x7f5784083490 2026-03-10T09:03:50.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.809+0000 7f5789056700 1 -- 192.168.123.105:0/3158058745 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f578412f0a0 con 0x7f5784083490 2026-03-10T09:03:50.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.809+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f577c00deb0 con 0x7f5784083490 2026-03-10T09:03:50.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.809+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f577c0176c0 con 0x7f5784083490 2026-03-10T09:03:50.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.811+0000 7f5789056700 1 -- 192.168.123.105:0/3158058745 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5770005320 con 0x7f5784083490 2026-03-10T09:03:50.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.819+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f577c00f660 con 0x7f5784083490 2026-03-10T09:03:50.820 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.819+0000 7f576bfff700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f576c077910 0x7f576c079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:03:50.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.819+0000 7f5782d9d700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f576c077910 0x7f576c079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:03:50.820 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.819+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f577c013070 con 0x7f5784083490 2026-03-10T09:03:50.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.822+0000 7f5782d9d700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f576c077910 0x7f576c079dd0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f5774005950 tx=0x7f577400b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:03:50.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:50.826+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f577c0642a0 con 0x7f5784083490 2026-03-10T09:03:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.022+0000 7f5789056700 1 -- 192.168.123.105:0/3158058745 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 
0x7f5770005190 con 0x7f5784083490 2026-03-10T09:03:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.023+0000 7f576bfff700 1 -- 192.168.123.105:0/3158058745 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f577c007480 con 0x7f5784083490 2026-03-10T09:03:51.024 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.026+0000 7f5769ffb700 1 -- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f576c077910 msgr2=0x7f576c079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.026+0000 7f5769ffb700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f576c077910 0x7f576c079dd0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f5774005950 tx=0x7f577400b410 comp rx=0 tx=0).stop 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 -- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 msgr2=0x7f5784083910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 0x7f5784083910 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f577c003c60 tx=0x7f577c004b40 comp rx=0 tx=0).stop 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 -- 192.168.123.105:0/3158058745 shutdown_connections 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 
7f5769ffb700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f576c077910 0x7f576c079dd0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5784072b50 0x7f5784082f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 --2- 192.168.123.105:0/3158058745 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5784083490 0x7f5784083910 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 -- 192.168.123.105:0/3158058745 >> 192.168.123.105:0/3158058745 conn(0x7f578406dae0 msgr2=0x7f578406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:03:51.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 -- 192.168.123.105:0/3158058745 shutdown_connections 2026-03-10T09:03:51.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:03:51.027+0000 7f5769ffb700 1 -- 192.168.123.105:0/3158058745 wait complete. 2026-03-10T09:03:51.296 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:51 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/781154529' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:03:51.296 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:51 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:51.296 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:51 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:51.296 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:51 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:51.296 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:51 vm08.local ceph-mon[101330]: from='client.44391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:51.296 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:51 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3158058745' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:03:51.606 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:51 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/781154529' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:03:51.606 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:51 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:51.606 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:51 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:51.607 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:51 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:51.607 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:51 vm05.local ceph-mon[111630]: from='client.44391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:03:51.607 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:51 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3158058745' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: pgmap v279: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 4.1 KiB/s wr, 10 op/s 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.655 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:52 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: pgmap v279: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 4.1 KiB/s wr, 10 op/s 2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:52.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:52 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:53 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:53 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:53 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:53 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.917 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:53 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.917 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:53 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.917 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:53 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:53.917 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:53 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.731 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]': finished 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]': finished 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all iscsi 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all nfs 2026-03-10T09:03:54.732 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: Upgrade: Setting container_image for all nvmeof 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: pgmap v280: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4.1 KiB/s wr, 8 op/s 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: Upgrade: Updating node-exporter.vm05 (1/2) 2026-03-10T09:03:54.732 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:54 vm05.local ceph-mon[111630]: Deploying daemon node-exporter.vm05 on vm05 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]': finished 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": 
"client.ceph-exporter.vm08"}]': finished 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all iscsi 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all nfs 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: Upgrade: Setting container_image for all nvmeof 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: pgmap v280: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4.1 KiB/s wr, 8 op/s 2026-03-10T09:03:55.053 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: Upgrade: Updating node-exporter.vm05 (1/2) 2026-03-10T09:03:55.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:54 vm08.local ceph-mon[101330]: Deploying daemon node-exporter.vm05 on vm05 2026-03-10T09:03:57.167 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:56 vm08.local ceph-mon[101330]: pgmap v281: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 345 KiB/s rd, 4.1 KiB/s wr, 5 op/s 2026-03-10T09:03:57.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:56 vm05.local ceph-mon[111630]: pgmap v281: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 345 KiB/s rd, 4.1 KiB/s wr, 5 op/s 2026-03-10T09:03:58.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:57 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:58.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:57 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:58.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:57 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:58.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:57 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:03:59.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:58 vm08.local ceph-mon[101330]: Upgrade: Updating node-exporter.vm08 (2/2) 2026-03-10T09:03:59.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:58 vm08.local ceph-mon[101330]: Deploying daemon node-exporter.vm08 on vm08 2026-03-10T09:03:59.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:03:58 vm08.local ceph-mon[101330]: pgmap v282: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 3.4 
KiB/s rd, 4.1 KiB/s wr, 5 op/s 2026-03-10T09:03:59.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:58 vm05.local ceph-mon[111630]: Upgrade: Updating node-exporter.vm08 (2/2) 2026-03-10T09:03:59.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:58 vm05.local ceph-mon[111630]: Deploying daemon node-exporter.vm08 on vm08 2026-03-10T09:03:59.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:03:58 vm05.local ceph-mon[111630]: pgmap v282: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 3.4 KiB/s rd, 4.1 KiB/s wr, 5 op/s 2026-03-10T09:04:01.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:01 vm05.local ceph-mon[111630]: pgmap v283: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 2.2 KiB/s rd, 1 op/s 2026-03-10T09:04:01.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:01.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:01.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:01.196 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:01.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:01 vm08.local ceph-mon[101330]: pgmap v283: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail; 2.2 KiB/s rd, 1 op/s 2026-03-10T09:04:01.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:01 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:01.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:01.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:01.228 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: pgmap v284: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.542 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:02 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: pgmap v284: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:02.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:02 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 
vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local 
ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:03.905 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:03 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:04.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:04.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:03 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:04 vm05.local ceph-mon[111630]: Upgrade: Updating prometheus.vm05 2026-03-10T09:04:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:04 vm05.local ceph-mon[111630]: pgmap v285: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:05.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:04 vm05.local ceph-mon[111630]: Deploying daemon prometheus.vm05 on vm05 2026-03-10T09:04:05.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:04 vm08.local ceph-mon[101330]: Upgrade: Updating prometheus.vm05 2026-03-10T09:04:05.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:04 vm08.local ceph-mon[101330]: pgmap v285: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:05.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:04 vm08.local ceph-mon[101330]: Deploying daemon prometheus.vm05 on vm05 2026-03-10T09:04:07.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:06 vm05.local ceph-mon[111630]: pgmap v286: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:07.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:06 vm08.local ceph-mon[101330]: pgmap v286: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:09.287 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:09 vm05.local ceph-mon[111630]: pgmap v287: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:09.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:09 vm08.local ceph-mon[101330]: pgmap v287: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:10.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:10 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:10.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:10 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:10.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:10 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:10.550 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:10 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:10.550 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:10 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:10.550 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:10 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:11.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:11 vm05.local ceph-mon[111630]: pgmap v288: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:11.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:11 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:11 vm08.local ceph-mon[101330]: pgmap v288: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:11.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:11.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:11 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.019 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.019 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.019 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: pgmap v289: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.020 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:12 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: pgmap v289: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:13.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:12 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:14.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:13 vm05.local ceph-mon[111630]: Upgrade: Updating alertmanager.vm05 2026-03-10T09:04:14.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:13 vm05.local ceph-mon[111630]: Deploying daemon alertmanager.vm05 on vm05 2026-03-10T09:04:14.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:13 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:14.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:13 vm05.local 
ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:14.214 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:13 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:14.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:13 vm08.local ceph-mon[101330]: Upgrade: Updating alertmanager.vm05 2026-03-10T09:04:14.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:13 vm08.local ceph-mon[101330]: Deploying daemon alertmanager.vm05 on vm05 2026-03-10T09:04:14.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:13 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:14.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:13 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:14.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:13 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:15.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:14 vm05.local ceph-mon[111630]: pgmap v290: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:15.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:14 vm08.local ceph-mon[101330]: pgmap v290: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:16.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:16.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:16.430 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: pgmap v291: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:17 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: pgmap v291: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:17 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:18.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:18 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T09:04:18.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:18 vm08.local ceph-mon[101330]: Upgrade: Updating grafana.vm05 2026-03-10T09:04:18.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:18 vm08.local ceph-mon[101330]: Deploying daemon grafana.vm05 on vm05 2026-03-10T09:04:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:18 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T09:04:18.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:18 vm05.local ceph-mon[111630]: Upgrade: Updating grafana.vm05 2026-03-10T09:04:18.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:18 vm05.local ceph-mon[111630]: Deploying daemon grafana.vm05 on vm05 2026-03-10T09:04:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:19 vm08.local ceph-mon[101330]: pgmap v292: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:19 vm05.local ceph-mon[111630]: pgmap v292: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.122+0000 7f8cd2f40700 1 -- 192.168.123.105:0/3194273974 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc072b50 msgr2=0x7f8ccc072f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.122+0000 7f8cd2f40700 1 --2- 192.168.123.105:0/3194273974 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc072b50 0x7f8ccc072f70 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f8cbc008790 tx=0x7f8cbc00ae50 comp rx=0 tx=0).stop 2026-03-10T09:04:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.122+0000 
7f8cd2f40700 1 -- 192.168.123.105:0/3194273974 shutdown_connections 2026-03-10T09:04:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.122+0000 7f8cd2f40700 1 --2- 192.168.123.105:0/3194273974 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc075a40 0x7f8ccc077ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.122+0000 7f8cd2f40700 1 --2- 192.168.123.105:0/3194273974 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc072b50 0x7f8ccc072f70 unknown :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.122+0000 7f8cd2f40700 1 -- 192.168.123.105:0/3194273974 >> 192.168.123.105:0/3194273974 conn(0x7f8ccc06dae0 msgr2=0x7f8ccc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 -- 192.168.123.105:0/3194273974 shutdown_connections 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 -- 192.168.123.105:0/3194273974 wait complete. 
2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 Processor -- start 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 -- start start 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc075a40 0x7f8ccc0830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 0x7f8ccc12e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ccc083af0 con 0x7f8ccc075a40 2026-03-10T09:04:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd2f40700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ccc083c60 con 0x7f8ccc0835e0 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8cd0cdc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc075a40 0x7f8ccc0830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8ccbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 0x7f8ccc12e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.123+0000 7f8ccbfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 0x7f8ccc12e3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42412/0 (socket says 192.168.123.105:42412) 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8ccbfff700 1 -- 192.168.123.105:0/1605085045 learned_addr learned my addr 192.168.123.105:0/1605085045 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8ccbfff700 1 -- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc075a40 msgr2=0x7f8ccc0830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8ccbfff700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc075a40 0x7f8ccc0830a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8ccbfff700 1 -- 192.168.123.105:0/1605085045 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8cbc008440 con 0x7f8ccc0835e0 2026-03-10T09:04:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8ccbfff700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 0x7f8ccc12e3f0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f8cc40060b0 tx=0x7f8cc40076f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:21.126 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8cc4010040 con 0x7f8ccc0835e0 2026-03-10T09:04:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8cd2f40700 1 -- 192.168.123.105:0/1605085045 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ccc12e990 con 0x7f8ccc0835e0 2026-03-10T09:04:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.124+0000 7f8cd2f40700 1 -- 192.168.123.105:0/1605085045 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ccc12eee0 con 0x7f8ccc0835e0 2026-03-10T09:04:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.125+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8cc4009240 con 0x7f8ccc0835e0 2026-03-10T09:04:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.125+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8cc4016600 con 0x7f8ccc0835e0 2026-03-10T09:04:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.126+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8cc4004ad0 con 0x7f8ccc0835e0 2026-03-10T09:04:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.127+0000 7f8cc9ffb700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cb4077910 0x7f8cb4079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.128 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.127+0000 7f8cd0cdc700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cb4077910 0x7f8cb4079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.127+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8cc4099e70 con 0x7f8ccc0835e0 2026-03-10T09:04:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.126+0000 7f8cd2f40700 1 -- 192.168.123.105:0/1605085045 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8cb8005320 con 0x7f8ccc0835e0 2026-03-10T09:04:21.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.128+0000 7f8cd0cdc700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cb4077910 0x7f8cb4079dd0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f8ccc12a360 tx=0x7f8cbc00b360 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:21.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.131+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8cc4062570 con 0x7f8ccc0835e0 2026-03-10T09:04:21.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.259+0000 7f8cd2f40700 1 -- 192.168.123.105:0/1605085045 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f8cb8000bf0 con 0x7f8cb4077910 2026-03-10T09:04:21.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.260+0000 7f8cc9ffb700 1 -- 192.168.123.105:0/1605085045 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7f8cb8000bf0 con 0x7f8cb4077910 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 -- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cb4077910 msgr2=0x7f8cb4079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cb4077910 0x7f8cb4079dd0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f8ccc12a360 tx=0x7f8cbc00b360 comp rx=0 tx=0).stop 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 -- 192.168.123.105:0/1605085045 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 msgr2=0x7f8ccc12e3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 0x7f8ccc12e3f0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f8cc40060b0 tx=0x7f8cc40076f0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 -- 192.168.123.105:0/1605085045 shutdown_connections 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 --2- 192.168.123.105:0/1605085045 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cb4077910 0x7f8cb4079dd0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ccc075a40 0x7f8ccc0830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 --2- 192.168.123.105:0/1605085045 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ccc0835e0 0x7f8ccc12e3f0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 -- 192.168.123.105:0/1605085045 >> 192.168.123.105:0/1605085045 conn(0x7f8ccc06dae0 msgr2=0x7f8ccc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 -- 192.168.123.105:0/1605085045 shutdown_connections 2026-03-10T09:04:21.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.263+0000 7f8cb37fe700 1 -- 192.168.123.105:0/1605085045 wait complete. 
2026-03-10T09:04:21.275 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:04:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:21 vm08.local ceph-mon[101330]: pgmap v293: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:21.352 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:21 vm05.local ceph-mon[111630]: pgmap v293: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:21.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 -- 192.168.123.105:0/1905153190 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80075a10 msgr2=0x7fbc80077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 --2- 192.168.123.105:0/1905153190 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80075a10 0x7fbc80077ea0 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7fbc7800bc30 tx=0x7fbc7800bf40 comp rx=0 tx=0).stop 2026-03-10T09:04:21.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 -- 192.168.123.105:0/1905153190 shutdown_connections 2026-03-10T09:04:21.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 --2- 192.168.123.105:0/1905153190 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80075a10 0x7fbc80077ea0 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 --2- 192.168.123.105:0/1905153190 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80072b20 0x7fbc80072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.352 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 -- 192.168.123.105:0/1905153190 >> 192.168.123.105:0/1905153190 conn(0x7fbc8006daa0 msgr2=0x7fbc8006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 -- 192.168.123.105:0/1905153190 shutdown_connections 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.349+0000 7fbc7ffff700 1 -- 192.168.123.105:0/1905153190 wait complete. 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7ffff700 1 Processor -- start 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7ffff700 1 -- start start 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80072b20 0x7fbc800830c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80083600 0x7fbc801bb960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7ffff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc80083b10 con 0x7fbc80072b20 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7ffff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc80083c80 con 0x7fbc80083600 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80083600 0x7fbc801bb960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80083600 0x7fbc801bb960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42434/0 (socket says 192.168.123.105:42434) 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 -- 192.168.123.105:0/2540083917 learned_addr learned my addr 192.168.123.105:0/2540083917 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 -- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80072b20 msgr2=0x7fbc800830c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80072b20 0x7fbc800830c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 -- 192.168.123.105:0/2540083917 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc7800b8e0 con 0x7fbc80083600 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.350+0000 7fbc7e7fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fbc80083600 0x7fbc801bb960 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fbc7800bfb0 tx=0x7fbc78008db0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.351+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc78009d70 con 0x7fbc80083600 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.351+0000 7fbc7ffff700 1 -- 192.168.123.105:0/2540083917 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbc801bbea0 con 0x7fbc80083600 2026-03-10T09:04:21.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.351+0000 7fbc7ffff700 1 -- 192.168.123.105:0/2540083917 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbc801bc360 con 0x7fbc80083600 2026-03-10T09:04:21.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.351+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbc78007690 con 0x7fbc80083600 2026-03-10T09:04:21.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.351+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc7800ce80 con 0x7fbc80083600 2026-03-10T09:04:21.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.352+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbc780264a0 con 0x7fbc80083600 2026-03-10T09:04:21.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.352+0000 7fbc84a48700 1 --2- 192.168.123.105:0/2540083917 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc68077920 0x7fbc68079de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.353+0000 7fbc7effd700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc68077920 0x7fbc68079de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.353+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fbc7809b9f0 con 0x7fbc80083600 2026-03-10T09:04:21.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.357+0000 7fbc7ffff700 1 -- 192.168.123.105:0/2540083917 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbc6c005320 con 0x7fbc80083600 2026-03-10T09:04:21.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.357+0000 7fbc7effd700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc68077920 0x7fbc68079de0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fbc70005950 tx=0x7fbc700058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:21.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.361+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbc780640f0 con 0x7fbc80083600 2026-03-10T09:04:21.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.490+0000 
7fbc7ffff700 1 -- 192.168.123.105:0/2540083917 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbc6c000bf0 con 0x7fbc68077920 2026-03-10T09:04:21.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.493+0000 7fbc84a48700 1 -- 192.168.123.105:0/2540083917 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7fbc6c000bf0 con 0x7fbc68077920 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 -- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc68077920 msgr2=0x7fbc68079de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc68077920 0x7fbc68079de0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fbc70005950 tx=0x7fbc700058e0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 -- 192.168.123.105:0/2540083917 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80083600 msgr2=0x7fbc801bb960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80083600 0x7fbc801bb960 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fbc7800bfb0 tx=0x7fbc78008db0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 -- 192.168.123.105:0/2540083917 
shutdown_connections 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fbc68077920 0x7fbc68079de0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc80072b20 0x7fbc800830c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.497+0000 7fbc667fc700 1 --2- 192.168.123.105:0/2540083917 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc80083600 0x7fbc801bb960 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.498+0000 7fbc667fc700 1 -- 192.168.123.105:0/2540083917 >> 192.168.123.105:0/2540083917 conn(0x7fbc8006daa0 msgr2=0x7fbc80075680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.498+0000 7fbc667fc700 1 -- 192.168.123.105:0/2540083917 shutdown_connections 2026-03-10T09:04:21.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.498+0000 7fbc667fc700 1 -- 192.168.123.105:0/2540083917 wait complete. 
2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 -- 192.168.123.105:0/1406189159 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7400072d90 msgr2=0x7f74000731b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 --2- 192.168.123.105:0/1406189159 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7400072d90 0x7f74000731b0 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f73f0009b50 tx=0x7f73f0009e60 comp rx=0 tx=0).stop 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 -- 192.168.123.105:0/1406189159 shutdown_connections 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 --2- 192.168.123.105:0/1406189159 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 0x7f7400078110 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 --2- 192.168.123.105:0/1406189159 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7400072d90 0x7f74000731b0 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 -- 192.168.123.105:0/1406189159 >> 192.168.123.105:0/1406189159 conn(0x7f740006dda0 msgr2=0x7f7400070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 -- 192.168.123.105:0/1406189159 shutdown_connections 2026-03-10T09:04:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.574+0000 7f7405940700 1 -- 192.168.123.105:0/1406189159 
wait complete. 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f7405940700 1 Processor -- start 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f7405940700 1 -- start start 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f7405940700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 0x7f7400083a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f7405940700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f740012bdb0 0x7f740012e240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f7405940700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f740012e810 con 0x7f740012bdb0 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f7405940700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f740012e980 con 0x7f7400075c80 2026-03-10T09:04:21.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f73feffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 0x7f7400083a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f73feffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 0x7f7400083a90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:42452/0 (socket says 192.168.123.105:42452) 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f73feffd700 1 -- 192.168.123.105:0/90510346 learned_addr learned my addr 192.168.123.105:0/90510346 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.575+0000 7f73fe7fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f740012bdb0 0x7f740012e240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f73fe7fc700 1 -- 192.168.123.105:0/90510346 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 msgr2=0x7f7400083a90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f73fe7fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 0x7f7400083a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f73fe7fc700 1 -- 192.168.123.105:0/90510346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73f00097e0 con 0x7f740012bdb0 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f73fe7fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f740012bdb0 0x7f740012e240 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f73f800c420 tx=0x7f73f800c730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73f800e050 con 0x7f740012bdb0 2026-03-10T09:04:21.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f7405940700 1 -- 192.168.123.105:0/90510346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f740012ec00 con 0x7f740012bdb0 2026-03-10T09:04:21.579 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.576+0000 7f7405940700 1 -- 192.168.123.105:0/90510346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f740012f150 con 0x7f740012bdb0 2026-03-10T09:04:21.580 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.578+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f73f800f040 con 0x7f740012bdb0 2026-03-10T09:04:21.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.578+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73f8013610 con 0x7f740012bdb0 2026-03-10T09:04:21.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.578+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f73f8013830 con 0x7f740012bdb0 2026-03-10T09:04:21.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.578+0000 7f740493e700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f73e8077ab0 0x7f73e8079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.584 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.578+0000 7f73feffd700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f73e8077ab0 0x7f73e8079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.578+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f73f809aa10 con 0x7f740012bdb0 2026-03-10T09:04:21.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.579+0000 7f73feffd700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f73e8077ab0 0x7f73e8079f70 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f7400083850 tx=0x7f73f0019040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:21.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.579+0000 7f7405940700 1 -- 192.168.123.105:0/90510346 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f74000709c0 con 0x7f740012bdb0 2026-03-10T09:04:21.587 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.585+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f73f8063190 con 0x7f740012bdb0 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.718+0000 7f7405940700 1 -- 192.168.123.105:0/90510346 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f740012ed90 con 
0x7f73e8077ab0 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.723+0000 7f740493e700 1 -- 192.168.123.105:0/90510346 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f740012ed90 con 0x7f73e8077ab0 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (8s) 6s ago 13m 15.1M - 0.25.0 c8568f914cd2 2f2e8b2aa368 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (32s) 6s ago 13m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e a95a49a68fec 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (31s) 20s ago 13m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e 296c2a7f0ba5 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (7m) 6s ago 13m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (7m) 20s ago 13m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (12m) 6s ago 13m 90.2M - 9.4.7 954c08fa6188 91937b4745e7 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (64s) 6s ago 11m 106M - 19.2.3-678-ge911bdeb 654f31e6858e a19223339e34 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (54s) 6s ago 11m 14.7M - 19.2.3-678-ge911bdeb 654f31e6858e 0ae3b9e36731 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (37s) 20s ago 11m 15.5M - 19.2.3-678-ge911bdeb 654f31e6858e 3540ff73b5c2 2026-03-10T09:04:21.725 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (46s) 20s ago 11m 18.4M - 19.2.3-678-ge911bdeb 654f31e6858e 4b83d0ce9fe6 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (8m) 6s ago 14m 631M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (7m) 20s ago 12m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (7m) 6s ago 14m 70.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (7m) 20s ago 12m 57.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (24s) 6s ago 13m 8702k - 1.7.0 72c9c2088986 389c3ddf4b37 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (21s) 20s ago 12m 5347k - 1.7.0 72c9c2088986 8cc5d2924193 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (6m) 6s ago 12m 221M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 6s ago 12m 115M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 6s ago 12m 103M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 20s ago 12m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b025f9a6ca2a 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (99s) 20s ago 12m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 76fe84edd716 2026-03-10T09:04:21.725 
INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (77s) 20s ago 11m 109M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 41f6c3ce6ac4 2026-03-10T09:04:21.725 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (12s) 6s ago 13m 46.7M - 2.51.0 1d3b7f56885b 17d47f7668ac 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 -- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f73e8077ab0 msgr2=0x7f73e8079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f73e8077ab0 0x7f73e8079f70 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f7400083850 tx=0x7f73f0019040 comp rx=0 tx=0).stop 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 -- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f740012bdb0 msgr2=0x7f740012e240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f740012bdb0 0x7f740012e240 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f73f800c420 tx=0x7f73f800c730 comp rx=0 tx=0).stop 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 -- 192.168.123.105:0/90510346 shutdown_connections 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f73e8077ab0 
0x7f73e8079f70 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7400075c80 0x7f7400083a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 --2- 192.168.123.105:0/90510346 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f740012bdb0 0x7f740012e240 unknown :-1 s=CLOSED pgs=185 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 -- 192.168.123.105:0/90510346 >> 192.168.123.105:0/90510346 conn(0x7f740006dda0 msgr2=0x7f740007c1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.729+0000 7f73e67fc700 1 -- 192.168.123.105:0/90510346 shutdown_connections 2026-03-10T09:04:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.730+0000 7f73e67fc700 1 -- 192.168.123.105:0/90510346 wait complete. 
2026-03-10T09:04:21.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 -- 192.168.123.105:0/991024210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e810a700 msgr2=0x7fb3e810cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 --2- 192.168.123.105:0/991024210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e810a700 0x7fb3e810cb90 secure :-1 s=READY pgs=186 cs=0 l=1 rev1=1 crypto rx=0x7fb3e001c580 tx=0x7fb3e001c890 comp rx=0 tx=0).stop 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 -- 192.168.123.105:0/991024210 shutdown_connections 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 --2- 192.168.123.105:0/991024210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e810a700 0x7fb3e810cb90 unknown :-1 s=CLOSED pgs=186 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 --2- 192.168.123.105:0/991024210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3e8107d90 0x7fb3e810a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 -- 192.168.123.105:0/991024210 >> 192.168.123.105:0/991024210 conn(0x7fb3e806dae0 msgr2=0x7fb3e806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 -- 192.168.123.105:0/991024210 shutdown_connections 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.801+0000 7fb3edd27700 1 -- 192.168.123.105:0/991024210 wait 
complete. 2026-03-10T09:04:21.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3edd27700 1 Processor -- start 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3edd27700 1 -- start start 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3edd27700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3e8107d90 0x7fb3e819cd30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3edd27700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 0x7fb3e81a22a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3edd27700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3e819d6f0 con 0x7fb3e819d270 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3edd27700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3e819d860 con 0x7fb3e8107d90 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 0x7fb3e81a22a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 0x7fb3e81a22a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:37556/0 (socket says 192.168.123.105:37556) 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e6ffd700 1 -- 192.168.123.105:0/1227651310 learned_addr learned my addr 192.168.123.105:0/1227651310 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e77fe700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3e8107d90 0x7fb3e819cd30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e6ffd700 1 -- 192.168.123.105:0/1227651310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3e8107d90 msgr2=0x7fb3e819cd30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e6ffd700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb3e8107d90 0x7fb3e819cd30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.802+0000 7fb3e6ffd700 1 -- 192.168.123.105:0/1227651310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3e001c060 con 0x7fb3e819d270 2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.803+0000 7fb3e6ffd700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 0x7fb3e81a22a0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7fb3e0009750 tx=0x7fb3e0007b90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:04:21.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.803+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3e0003d60 con 0x7fb3e819d270 2026-03-10T09:04:21.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.803+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3e001f030 con 0x7fb3e819d270 2026-03-10T09:04:21.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.803+0000 7fb3edd27700 1 -- 192.168.123.105:0/1227651310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3e81a27e0 con 0x7fb3e819d270 2026-03-10T09:04:21.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.803+0000 7fb3edd27700 1 -- 192.168.123.105:0/1227651310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3e81a2d00 con 0x7fb3e819d270 2026-03-10T09:04:21.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.804+0000 7fb3edd27700 1 -- 192.168.123.105:0/1227651310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3e804ea90 con 0x7fb3e819d270 2026-03-10T09:04:21.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.807+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3e002d8c0 con 0x7fb3e819d270 2026-03-10T09:04:21.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.807+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb3e0007c90 con 0x7fb3e819d270 2026-03-10T09:04:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.807+0000 
7fb3e4ff9700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3d0077ab0 0x7fb3d0079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.807+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fb3e00ad680 con 0x7fb3e819d270 2026-03-10T09:04:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.808+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb3e00ada10 con 0x7fb3e819d270 2026-03-10T09:04:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.808+0000 7fb3e77fe700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3d0077ab0 0x7fb3d0079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:21.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.812+0000 7fb3e77fe700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3d0077ab0 0x7fb3d0079f70 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fb3e819e210 tx=0x7fb3d8009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:21.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.972+0000 7fb3edd27700 1 -- 192.168.123.105:0/1227651310 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fb3e81a2fe0 con 0x7fb3e819d270 2026-03-10T09:04:21.977 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.976+0000 7fb3e4ff9700 1 -- 192.168.123.105:0/1227651310 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fb3e0075e00 con 0x7fb3e819d270
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:    "mon": {
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:    "mgr": {
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:    "osd": {
2026-03-10T09:04:21.977 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:    "mds": {
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:    "overall": {
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:    }
2026-03-10T09:04:21.978 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T09:04:21.980
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 -- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3d0077ab0 msgr2=0x7fb3d0079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3d0077ab0 0x7fb3d0079f70 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fb3e819e210 tx=0x7fb3d8009380 comp rx=0 tx=0).stop 2026-03-10T09:04:21.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 -- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 msgr2=0x7fb3e81a22a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:21.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 0x7fb3e81a22a0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7fb3e0009750 tx=0x7fb3e0007b90 comp rx=0 tx=0).stop 2026-03-10T09:04:21.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 -- 192.168.123.105:0/1227651310 shutdown_connections 2026-03-10T09:04:21.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb3d0077ab0 0x7fb3d0079f70 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fb3e8107d90 0x7fb3e819cd30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 --2- 192.168.123.105:0/1227651310 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3e819d270 0x7fb3e81a22a0 unknown :-1 s=CLOSED pgs=187 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:21.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 -- 192.168.123.105:0/1227651310 >> 192.168.123.105:0/1227651310 conn(0x7fb3e806dae0 msgr2=0x7fb3e806e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:21.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 -- 192.168.123.105:0/1227651310 shutdown_connections 2026-03-10T09:04:21.981 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:21.980+0000 7fb3ce7fc700 1 -- 192.168.123.105:0/1227651310 wait complete. 
2026-03-10T09:04:22.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 -- 192.168.123.105:0/2742126367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c075a40 msgr2=0x7fb74c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 --2- 192.168.123.105:0/2742126367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c075a40 0x7fb74c077ed0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fb74400b3a0 tx=0x7fb74400b6b0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 -- 192.168.123.105:0/2742126367 shutdown_connections 2026-03-10T09:04:22.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 --2- 192.168.123.105:0/2742126367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c075a40 0x7fb74c077ed0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 --2- 192.168.123.105:0/2742126367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 0x7fb74c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.050 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 -- 192.168.123.105:0/2742126367 >> 192.168.123.105:0/2742126367 conn(0x7fb74c06dae0 msgr2=0x7fb74c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 -- 192.168.123.105:0/2742126367 shutdown_connections 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.049+0000 7fb75204a700 1 -- 192.168.123.105:0/2742126367 
wait complete. 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb75204a700 1 Processor -- start 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb75204a700 1 -- start start 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb75204a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 0x7fb74c083030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb75204a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 0x7fb74c1b30c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb75204a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb74c083a80 con 0x7fb74c072b50 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb75204a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb74c083bc0 con 0x7fb74c083570 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 0x7fb74c1b30c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 0x7fb74c1b30c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:42476/0 (socket says 192.168.123.105:42476) 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 -- 192.168.123.105:0/3131152322 learned_addr learned my addr 192.168.123.105:0/3131152322 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74b7fe700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 0x7fb74c083030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 -- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 msgr2=0x7fb74c083030 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 0x7fb74c083030 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 -- 192.168.123.105:0/3131152322 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb73c009710 con 0x7fb74c083570 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74b7fe700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 0x7fb74c083030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.050+0000 7fb74affd700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 0x7fb74c1b30c0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fb74400ba80 tx=0x7fb744009580 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:22.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.051+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb74400e050 con 0x7fb74c083570 2026-03-10T09:04:22.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.051+0000 7fb75204a700 1 -- 192.168.123.105:0/3131152322 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb74400b050 con 0x7fb74c083570 2026-03-10T09:04:22.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.051+0000 7fb75204a700 1 -- 192.168.123.105:0/3131152322 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb74c1b3990 con 0x7fb74c083570 2026-03-10T09:04:22.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.052+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb744003e80 con 0x7fb74c083570 2026-03-10T09:04:22.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.052+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb74401baf0 con 0x7fb74c083570 2026-03-10T09:04:22.055 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.053+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb744019040 con 
0x7fb74c083570 2026-03-10T09:04:22.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.057+0000 7fb748ff9700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb734079c00 0x7fb73407c0c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.057+0000 7fb74b7fe700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb734079c00 0x7fb73407c0c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.057+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fb74409bf50 con 0x7fb74c083570 2026-03-10T09:04:22.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.058+0000 7fb75204a700 1 -- 192.168.123.105:0/3131152322 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb74c04ea90 con 0x7fb74c083570 2026-03-10T09:04:22.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.058+0000 7fb74b7fe700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb734079c00 0x7fb73407c0c0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fb73c00f790 tx=0x7fb73c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:22.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.061+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) 
v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb7440646d0 con 0x7fb74c083570
2026-03-10T09:04:22.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.200+0000 7fb75204a700 1 -- 192.168.123.105:0/3131152322 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb74c1b40f0 con 0x7fb74c083570
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.201+0000 7fb748ff9700 1 -- 192.168.123.105:0/3131152322 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 30 v30) v1 ==== 76+0+1973 (secure 0 0 0) 0x7fb74401f7d0 con 0x7fb74c083570
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:e30
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T09:03:44:361690+0000
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1)
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:epoch 30
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T09:03:44.361687+0000
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:root 0
2026-03-10T09:04:22.202 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {}
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 106
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:in 0
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:up {0=34444}
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:failed
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:damaged
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:stopped
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3]
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:balancer
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 34444 members: 34444
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:34444} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons:
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:34470} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3904677772,v1:192.168.123.108:6825/3904677772] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:44367} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6828/930707688,v1:192.168.123.105:6829/930707688] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T09:04:22.203 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:44373} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/42427465,v1:192.168.123.108:6827/42427465] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.205+0000 7fb7327fc700 1 -- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb734079c00 msgr2=0x7fb73407c0c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.205+0000 7fb7327fc700 1 --2- 192.168.123.105:0/3131152322 >>
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb734079c00 0x7fb73407c0c0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fb73c00f790 tx=0x7fb73c009450 comp rx=0 tx=0).stop 2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.205+0000 7fb7327fc700 1 -- 192.168.123.105:0/3131152322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 msgr2=0x7fb74c1b30c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.205+0000 7fb7327fc700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 0x7fb74c1b30c0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fb74400ba80 tx=0x7fb744009580 comp rx=0 tx=0).stop 2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.206+0000 7fb7327fc700 1 -- 192.168.123.105:0/3131152322 shutdown_connections 2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.206+0000 7fb7327fc700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb734079c00 0x7fb73407c0c0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.206+0000 7fb7327fc700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb74c072b50 0x7fb74c083030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.206+0000 7fb7327fc700 1 --2- 192.168.123.105:0/3131152322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb74c083570 0x7fb74c1b30c0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:04:22.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.206+0000 7fb7327fc700 1 -- 192.168.123.105:0/3131152322 >> 192.168.123.105:0/3131152322 conn(0x7fb74c06dae0 msgr2=0x7fb74c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:22.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.206+0000 7fb7327fc700 1 -- 192.168.123.105:0/3131152322 shutdown_connections 2026-03-10T09:04:22.208 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.207+0000 7fb7327fc700 1 -- 192.168.123.105:0/3131152322 wait complete. 2026-03-10T09:04:22.208 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 30 2026-03-10T09:04:22.286 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:22 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1227651310' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:22.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.285+0000 7f846da4e700 1 -- 192.168.123.105:0/3317207710 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8468107d90 msgr2=0x7f846810a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.285+0000 7f846da4e700 1 --2- 192.168.123.105:0/3317207710 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8468107d90 0x7f846810a1c0 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f8458009b00 tx=0x7f8458009e10 comp rx=0 tx=0).stop 2026-03-10T09:04:22.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 -- 192.168.123.105:0/3317207710 shutdown_connections 2026-03-10T09:04:22.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 --2- 192.168.123.105:0/3317207710 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f846810a700 0x7f846810cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:04:22.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 --2- 192.168.123.105:0/3317207710 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8468107d90 0x7f846810a1c0 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 -- 192.168.123.105:0/3317207710 >> 192.168.123.105:0/3317207710 conn(0x7f846806dda0 msgr2=0x7f8468070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:22.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 -- 192.168.123.105:0/3317207710 shutdown_connections 2026-03-10T09:04:22.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 -- 192.168.123.105:0/3317207710 wait complete. 2026-03-10T09:04:22.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 Processor -- start 2026-03-10T09:04:22.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.288+0000 7f846da4e700 1 -- start start 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f846da4e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 0x7f84681a5520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f846da4e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f846810a700 0x7f84681a5a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f846da4e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84681a6080 con 0x7f846810a700 2026-03-10T09:04:22.291 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f846da4e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84681a61c0 con 0x7f8468107d90 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f8466ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 0x7f84681a5520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f8466ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 0x7f84681a5520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42490/0 (socket says 192.168.123.105:42490) 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f8466ffd700 1 -- 192.168.123.105:0/2066007616 learned_addr learned my addr 192.168.123.105:0/2066007616 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f845ffff700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f846810a700 0x7f84681a5a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f8466ffd700 1 -- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f846810a700 msgr2=0x7f84681a5a60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f8466ffd700 1 --2- 
192.168.123.105:0/2066007616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f846810a700 0x7f84681a5a60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.289+0000 7f8466ffd700 1 -- 192.168.123.105:0/2066007616 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84580097e0 con 0x7f8468107d90 2026-03-10T09:04:22.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.290+0000 7f8466ffd700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 0x7f84681a5520 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f8458005230 tx=0x7f845800bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:22.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.292+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f845801d070 con 0x7f8468107d90 2026-03-10T09:04:22.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.292+0000 7f846da4e700 1 -- 192.168.123.105:0/2066007616 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84681aac10 con 0x7f8468107d90 2026-03-10T09:04:22.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.292+0000 7f846da4e700 1 -- 192.168.123.105:0/2066007616 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84681ab100 con 0x7f8468107d90 2026-03-10T09:04:22.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.292+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f845800bdf0 con 0x7f8468107d90 2026-03-10T09:04:22.294 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.292+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8458021b20 con 0x7f8468107d90 2026-03-10T09:04:22.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.293+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8448005320 con 0x7f8468107d90 2026-03-10T09:04:22.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.293+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f845802b430 con 0x7f8468107d90 2026-03-10T09:04:22.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.294+0000 7f846ca4c700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f84540779e0 0x7f8454079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.294+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f845809c3b0 con 0x7f8468107d90 2026-03-10T09:04:22.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.294+0000 7f845ffff700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f84540779e0 0x7f8454079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.294+0000 7f845ffff700 1 --2- 192.168.123.105:0/2066007616 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f84540779e0 0x7f8454079ea0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f8450005950 tx=0x7f845000b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:22.298 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.296+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8458064b30 con 0x7f8468107d90 2026-03-10T09:04:22.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.424+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8448000bf0 con 0x7f84540779e0 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.427+0000 7f846ca4c700 1 -- 192.168.123.105:0/2066007616 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7f8448000bf0 con 0x7f84540779e0 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "ceph-exporter", 2026-03-10T09:04:22.429 
INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "osd", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "mon", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "mds" 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "18/23 daemons upgraded", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading grafana daemons", 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:04:22.429 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f84540779e0 msgr2=0x7f8454079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f84540779e0 0x7f8454079ea0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f8450005950 tx=0x7f845000b410 comp rx=0 tx=0).stop 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 msgr2=0x7f84681a5520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 0x7f84681a5520 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f8458005230 
tx=0x7f845800bac0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 shutdown_connections 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f84540779e0 0x7f8454079ea0 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8468107d90 0x7f84681a5520 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 --2- 192.168.123.105:0/2066007616 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f846810a700 0x7f84681a5a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 >> 192.168.123.105:0/2066007616 conn(0x7f846806dda0 msgr2=0x7f846810c150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 shutdown_connections 2026-03-10T09:04:22.435 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.433+0000 7f845e7fc700 1 -- 192.168.123.105:0/2066007616 wait complete. 
2026-03-10T09:04:22.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 -- 192.168.123.105:0/1989799781 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac075a40 msgr2=0x7f6aac077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 --2- 192.168.123.105:0/1989799781 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac075a40 0x7f6aac077ed0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f6aa400b3a0 tx=0x7f6aa400b6b0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 -- 192.168.123.105:0/1989799781 shutdown_connections 2026-03-10T09:04:22.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 --2- 192.168.123.105:0/1989799781 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac075a40 0x7f6aac077ed0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 --2- 192.168.123.105:0/1989799781 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 0x7f6aac072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 -- 192.168.123.105:0/1989799781 >> 192.168.123.105:0/1989799781 conn(0x7f6aac06dae0 msgr2=0x7f6aac06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 -- 192.168.123.105:0/1989799781 shutdown_connections 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.503+0000 7f6ab2b91700 1 -- 192.168.123.105:0/1989799781 
wait complete. 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab2b91700 1 Processor -- start 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab2b91700 1 -- start start 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab2b91700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 0x7f6aac083030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab2b91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 0x7f6aac1b30c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab2b91700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6aac083a80 con 0x7f6aac072b50 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab2b91700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6aac083bc0 con 0x7f6aac083570 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6aabfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 0x7f6aac1b30c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6aabfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 0x7f6aac1b30c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:42510/0 (socket says 192.168.123.105:42510) 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6aabfff700 1 -- 192.168.123.105:0/2774616996 learned_addr learned my addr 192.168.123.105:0/2774616996 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab092d700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 0x7f6aac083030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6aabfff700 1 -- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 msgr2=0x7f6aac083030 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6aabfff700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 0x7f6aac083030 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6aabfff700 1 -- 192.168.123.105:0/2774616996 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a9c009710 con 0x7f6aac083570 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.504+0000 7f6ab092d700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 0x7f6aac083030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.505+0000 7f6aabfff700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 0x7f6aac1b30c0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f6aa400ba80 tx=0x7f6aa4009580 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.505+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6aa400e050 con 0x7f6aac083570 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.505+0000 7f6ab2b91700 1 -- 192.168.123.105:0/2774616996 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6aa400b050 con 0x7f6aac083570 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.505+0000 7f6ab2b91700 1 -- 192.168.123.105:0/2774616996 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6aac1b3990 con 0x7f6aac083570 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.506+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6aa4003e80 con 0x7f6aac083570 2026-03-10T09:04:22.507 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.506+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6aa401baf0 con 0x7f6aac083570 2026-03-10T09:04:22.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.507+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6aa4019040 con 
0x7f6aac083570 2026-03-10T09:04:22.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.507+0000 7f6aa9ffb700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a94078b10 0x7f6a9407afd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:22.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.507+0000 7f6ab092d700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a94078b10 0x7f6a9407afd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:22.508 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.507+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f6aa409ae30 con 0x7f6aac083570 2026-03-10T09:04:22.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.507+0000 7f6ab2b91700 1 -- 192.168.123.105:0/2774616996 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a98005320 con 0x7f6aac083570 2026-03-10T09:04:22.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.508+0000 7f6ab092d700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a94078b10 0x7f6a9407afd0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f6a9c00f790 tx=0x7f6a9c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:22.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.510+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) 
v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6aa40635b0 con 0x7f6aac083570 2026-03-10T09:04:22.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:22 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1227651310' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:22.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.695+0000 7f6ab2b91700 1 -- 192.168.123.105:0/2774616996 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f6a980059f0 con 0x7f6aac083570 2026-03-10T09:04:22.702 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.700+0000 7f6aa9ffb700 1 -- 192.168.123.105:0/2774616996 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f6aa4017090 con 0x7f6aac083570 2026-03-10T09:04:22.702 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T09:04:22.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 -- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a94078b10 msgr2=0x7f6a9407afd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a94078b10 0x7f6a9407afd0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f6a9c00f790 tx=0x7f6a9c009450 comp rx=0 tx=0).stop 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 -- 192.168.123.105:0/2774616996 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 msgr2=0x7f6aac1b30c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:22.710 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 0x7f6aac1b30c0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f6aa400ba80 tx=0x7f6aa4009580 comp rx=0 tx=0).stop 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 -- 192.168.123.105:0/2774616996 shutdown_connections 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f6a94078b10 0x7f6a9407afd0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6aac072b50 0x7f6aac083030 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 --2- 192.168.123.105:0/2774616996 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6aac083570 0x7f6aac1b30c0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.707+0000 7f6a937fe700 1 -- 192.168.123.105:0/2774616996 >> 192.168.123.105:0/2774616996 conn(0x7f6aac06dae0 msgr2=0x7f6aac06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.708+0000 7f6a937fe700 1 -- 192.168.123.105:0/2774616996 shutdown_connections 2026-03-10T09:04:22.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:22.708+0000 7f6a937fe700 1 -- 
192.168.123.105:0/2774616996 wait complete. 2026-03-10T09:04:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:23 vm05.local ceph-mon[111630]: from='client.44399 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:23 vm05.local ceph-mon[111630]: from='client.44403 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:23 vm05.local ceph-mon[111630]: from='client.34512 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:23 vm05.local ceph-mon[111630]: pgmap v294: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:23 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3131152322' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:04:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:23 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2774616996' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:04:23.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:23 vm08.local ceph-mon[101330]: from='client.44399 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:23.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:23 vm08.local ceph-mon[101330]: from='client.44403 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:23.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:23 vm08.local ceph-mon[101330]: from='client.34512 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:23.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:23 vm08.local ceph-mon[101330]: pgmap v294: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:23.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:23 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3131152322' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:04:23.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:23 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/2774616996' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:04:24.234 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:24 vm05.local ceph-mon[111630]: from='client.44417 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:24.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:24 vm08.local ceph-mon[101330]: from='client.44417 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:25.105 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:25 vm05.local ceph-mon[111630]: pgmap v295: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:25.105 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:25 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:25.105 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:25 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:25.105 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:25 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:25.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:25 vm08.local ceph-mon[101330]: pgmap v295: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:25 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:25.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:25 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:25.553 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:25 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:26.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:26 vm05.local ceph-mon[111630]: pgmap v296: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:26.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:26.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:26.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:26.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:26 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:26 vm08.local ceph-mon[101330]: pgmap v296: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:26 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:26 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:28.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:28.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: Upgrade: Finalizing container_image settings 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 
2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": 
"config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: Upgrade: Complete! 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T09:04:28.554 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.555 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:28 vm08.local ceph-mon[101330]: pgmap v297: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: Upgrade: Finalizing container_image settings 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": 
"mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: 
from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 
2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: Upgrade: Complete! 
2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 
vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:28.714 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:28 vm05.local ceph-mon[111630]: pgmap v297: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:30.626 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:30 vm08.local ceph-mon[101330]: pgmap v298: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:30.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:30 vm05.local ceph-mon[111630]: pgmap v298: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:31.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:31.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:04:32.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:32 vm05.local ceph-mon[111630]: pgmap v299: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:04:32.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:32 vm08.local ceph-mon[101330]: pgmap v299: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:35.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:34 vm05.local ceph-mon[111630]: pgmap v300: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:35.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:34 vm08.local ceph-mon[101330]: pgmap v300: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:37.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:37 vm05.local ceph-mon[111630]: pgmap v301: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:37 vm08.local ceph-mon[101330]: pgmap v301: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:39 vm05.local ceph-mon[111630]: pgmap v302: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:39.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:39 vm08.local ceph-mon[101330]: pgmap v302: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:41.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:41 vm05.local ceph-mon[111630]: pgmap v303: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:41 vm08.local ceph-mon[101330]: pgmap v303: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:43 vm05.local ceph-mon[111630]: pgmap v304: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:04:43.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:43 vm08.local ceph-mon[101330]: pgmap v304: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:45.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:45 vm05.local ceph-mon[111630]: pgmap v305: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:45.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:45 vm08.local ceph-mon[101330]: pgmap v305: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:46.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:04:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:47 vm05.local ceph-mon[111630]: pgmap v306: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:47 vm08.local ceph-mon[101330]: pgmap v306: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:49 vm05.local ceph-mon[111630]: pgmap v307: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:49.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:49 vm08.local ceph-mon[101330]: pgmap v307: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:51.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:51 vm08.local 
ceph-mon[101330]: pgmap v308: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:51.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:51 vm05.local ceph-mon[111630]: pgmap v308: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.780+0000 7f3369b84700 1 -- 192.168.123.105:0/3322479125 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364103150 msgr2=0x7f3364103570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.780+0000 7f3369b84700 1 --2- 192.168.123.105:0/3322479125 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364103150 0x7f3364103570 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f3358009b00 tx=0x7f3358009e10 comp rx=0 tx=0).stop 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.781+0000 7f3369b84700 1 -- 192.168.123.105:0/3322479125 shutdown_connections 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.781+0000 7f3369b84700 1 --2- 192.168.123.105:0/3322479125 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364104350 0x7f33641047b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.781+0000 7f3369b84700 1 --2- 192.168.123.105:0/3322479125 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364103150 0x7f3364103570 unknown :-1 s=CLOSED pgs=189 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.781+0000 7f3369b84700 1 -- 192.168.123.105:0/3322479125 >> 192.168.123.105:0/3322479125 conn(0x7f33640fe6d0 msgr2=0x7f3364100b30 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 -- 192.168.123.105:0/3322479125 shutdown_connections 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 -- 192.168.123.105:0/3322479125 wait complete. 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 Processor -- start 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 -- start start 2026-03-10T09:04:52.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 0x7f3364198a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364104350 0x7f3364198f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3364199560 con 0x7f3364104350 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.782+0000 7f3369b84700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33641996a0 con 0x7f3364103150 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f33637fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 0x7f3364198a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f3362ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364104350 0x7f3364198f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f33637fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 0x7f3364198a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:41034/0 (socket says 192.168.123.105:41034) 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f33637fe700 1 -- 192.168.123.105:0/3001789674 learned_addr learned my addr 192.168.123.105:0/3001789674 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f33637fe700 1 -- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364104350 msgr2=0x7f3364198f40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f33637fe700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364104350 0x7f3364198f40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f33637fe700 1 -- 192.168.123.105:0/3001789674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33580097e0 con 0x7f3364103150 2026-03-10T09:04:52.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 
7f33637fe700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 0x7f3364198a00 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f335800bb40 tx=0x7f335800bb70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:52.785 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.783+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f335801d070 con 0x7f3364103150 2026-03-10T09:04:52.785 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.784+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f336419e0f0 con 0x7f3364103150 2026-03-10T09:04:52.785 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.784+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f336419e5e0 con 0x7f3364103150 2026-03-10T09:04:52.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.784+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3358022470 con 0x7f3364103150 2026-03-10T09:04:52.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.784+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f335800f670 con 0x7f3364103150 2026-03-10T09:04:52.786 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.785+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3350005320 con 0x7f3364103150 2026-03-10T09:04:52.787 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.785+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3358004b50 con 0x7f3364103150 2026-03-10T09:04:52.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.786+0000 7f3360ff9700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f334c0778c0 0x7f334c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:52.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.786+0000 7f3362ffd700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f334c0778c0 0x7f334c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:52.788 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.787+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f335809b010 con 0x7f3364103150 2026-03-10T09:04:52.788 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.787+0000 7f3362ffd700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f334c0778c0 0x7f334c079d80 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f3354009ea0 tx=0x7f3354009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:52.789 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.788+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3358063840 con 0x7f3364103150 
2026-03-10T09:04:52.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.915+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3350000bf0 con 0x7f334c0778c0 2026-03-10T09:04:52.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.916+0000 7f3360ff9700 1 -- 192.168.123.105:0/3001789674 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f3350000bf0 con 0x7f334c0778c0 2026-03-10T09:04:52.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f334c0778c0 msgr2=0x7f334c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:52.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f334c0778c0 0x7f334c079d80 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f3354009ea0 tx=0x7f3354009450 comp rx=0 tx=0).stop 2026-03-10T09:04:52.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 msgr2=0x7f3364198a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:52.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 0x7f3364198a00 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f335800bb40 tx=0x7f335800bb70 comp rx=0 tx=0).stop 2026-03-10T09:04:52.921 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 shutdown_connections 2026-03-10T09:04:52.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f334c0778c0 0x7f334c079d80 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:52.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3364103150 0x7f3364198a00 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:52.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 --2- 192.168.123.105:0/3001789674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3364104350 0x7f3364198f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:52.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.919+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 >> 192.168.123.105:0/3001789674 conn(0x7f33640fe6d0 msgr2=0x7f3364107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:52.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.920+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 shutdown_connections 2026-03-10T09:04:52.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:52.920+0000 7f3369b84700 1 -- 192.168.123.105:0/3001789674 wait complete. 
2026-03-10T09:04:52.991 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T09:04:53.197 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:04:53.349 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:53 vm05.local ceph-mon[111630]: pgmap v309: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:53.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.460+0000 7f29dc5be700 1 -- 192.168.123.105:0/3113299083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 msgr2=0x7f29d4107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:53.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.460+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3113299083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d4107d40 secure :-1 s=READY pgs=190 cs=0 l=1 rev1=1 crypto rx=0x7f29d0009b00 tx=0x7f29d0009e10 comp rx=0 tx=0).stop 2026-03-10T09:04:53.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.461+0000 7f29dc5be700 1 -- 192.168.123.105:0/3113299083 shutdown_connections 2026-03-10T09:04:53.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.461+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3113299083 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d4107d40 unknown :-1 s=CLOSED pgs=190 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:53.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.461+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3113299083 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f29d4103340 0x7f29d4103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:53.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.461+0000 7f29dc5be700 1 -- 192.168.123.105:0/3113299083 >> 192.168.123.105:0/3113299083 conn(0x7f29d40febd0 msgr2=0x7f29d4100ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:53.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.462+0000 7f29dc5be700 1 -- 192.168.123.105:0/3113299083 shutdown_connections 2026-03-10T09:04:53.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.462+0000 7f29dc5be700 1 -- 192.168.123.105:0/3113299083 wait complete. 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.462+0000 7f29dc5be700 1 Processor -- start 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.462+0000 7f29dc5be700 1 -- start start 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29dc5be700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29d4103340 0x7f29d4198db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29dc5be700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d41992f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29dc5be700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29d41999d0 con 0x7f29d4103cf0 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29dc5be700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29d419d760 con 0x7f29d4103340 
2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d41992f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d41992f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:41358/0 (socket says 192.168.123.105:41358) 2026-03-10T09:04:53.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 -- 192.168.123.105:0/3077130456 learned_addr learned my addr 192.168.123.105:0/3077130456 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:53.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 -- 192.168.123.105:0/3077130456 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29d4103340 msgr2=0x7f29d4198db0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:04:53.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29d4103340 0x7f29d4198db0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:53.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 -- 192.168.123.105:0/3077130456 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29d00097e0 con 0x7f29d4103cf0 2026-03-10T09:04:53.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29d9b59700 1 
--2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d41992f0 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7f29d000ba30 tx=0x7f29d000bb10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:53.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.463+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f29d001d070 con 0x7f29d4103cf0 2026-03-10T09:04:53.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.464+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f29d000f460 con 0x7f29d4103cf0 2026-03-10T09:04:53.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.464+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f29d0021620 con 0x7f29d4103cf0 2026-03-10T09:04:53.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.464+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29d419d9e0 con 0x7f29d4103cf0 2026-03-10T09:04:53.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.464+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29d419ded0 con 0x7f29d4103cf0 2026-03-10T09:04:53.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.465+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f29d404ea90 con 0x7f29d4103cf0 2026-03-10T09:04:53.470 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.466+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f29d000f5d0 con 0x7f29d4103cf0 2026-03-10T09:04:53.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.466+0000 7f29cb7fe700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29c00778c0 0x7f29c0079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:53.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.466+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f29d009ab20 con 0x7f29d4103cf0 2026-03-10T09:04:53.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.469+0000 7f29da35a700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29c00778c0 0x7f29c0079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:53.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.469+0000 7f29da35a700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29c00778c0 0x7f29c0079d80 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f29c4009ce0 tx=0x7f29c4009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:53.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.469+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f29d00631a0 con 0x7f29d4103cf0 
2026-03-10T09:04:53.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:53 vm08.local ceph-mon[101330]: pgmap v309: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:53.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.591+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f29d419e1b0 con 0x7f29c00778c0 2026-03-10T09:04:53.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.597+0000 7f29cb7fe700 1 -- 192.168.123.105:0/3077130456 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f29d419e1b0 con 0x7f29c00778c0 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (40s) 27s ago 14m 15.1M - 0.25.0 c8568f914cd2 2f2e8b2aa368 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (64s) 27s ago 14m 10.3M - 19.2.3-678-ge911bdeb 654f31e6858e a95a49a68fec 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (63s) 52s ago 13m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e 296c2a7f0ba5 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (7m) 27s ago 14m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (7m) 52s ago 13m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (29s) 27s ago 13m 41.7M - 10.4.0 c8b91775d855 ff4933ec3b1b 2026-03-10T09:04:53.599 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (96s) 27s ago 12m 106M - 19.2.3-678-ge911bdeb 654f31e6858e a19223339e34 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (86s) 27s ago 11m 14.7M - 19.2.3-678-ge911bdeb 654f31e6858e 0ae3b9e36731 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (69s) 52s ago 11m 15.5M - 19.2.3-678-ge911bdeb 654f31e6858e 3540ff73b5c2 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (78s) 52s ago 11m 18.4M - 19.2.3-678-ge911bdeb 654f31e6858e 4b83d0ce9fe6 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (8m) 27s ago 14m 638M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (8m) 52s ago 13m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (8m) 27s ago 14m 71.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:04:53.599 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (7m) 52s ago 13m 57.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (56s) 27s ago 14m 9051k - 1.7.0 72c9c2088986 389c3ddf4b37 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (53s) 52s ago 13m 5347k - 1.7.0 72c9c2088986 8cc5d2924193 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (7m) 27s ago 13m 222M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (3m) 27s ago 12m 117M 4096M 19.2.3-678-ge911bdeb 
654f31e6858e 306e95bddd95 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 27s ago 12m 103M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 52s ago 12m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b025f9a6ca2a 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (2m) 52s ago 12m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 76fe84edd716 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (109s) 52s ago 12m 109M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 41f6c3ce6ac4 2026-03-10T09:04:53.600 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (44s) 27s ago 13m 57.6M - 2.51.0 1d3b7f56885b 17d47f7668ac 2026-03-10T09:04:53.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29c00778c0 msgr2=0x7f29c0079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29c00778c0 0x7f29c0079d80 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f29c4009ce0 tx=0x7f29c4009450 comp rx=0 tx=0).stop 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 msgr2=0x7f29d41992f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f29d4103cf0 0x7f29d41992f0 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7f29d000ba30 tx=0x7f29d000bb10 comp rx=0 tx=0).stop 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 shutdown_connections 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f29c00778c0 0x7f29c0079d80 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29d4103340 0x7f29d4198db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:53.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 --2- 192.168.123.105:0/3077130456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f29d4103cf0 0x7f29d41992f0 unknown :-1 s=CLOSED pgs=191 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:53.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.602+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 >> 192.168.123.105:0/3077130456 conn(0x7f29d40febd0 msgr2=0x7f29d4100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:53.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.603+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 shutdown_connections 2026-03-10T09:04:53.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:53.603+0000 7f29dc5be700 1 -- 192.168.123.105:0/3077130456 wait complete. 
2026-03-10T09:04:53.665 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-10T09:04:53.817 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:04:54.084 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.082+0000 7f42527fe700 1 -- 192.168.123.105:0/4167547109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c068730 msgr2=0x7f424c068b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:54.084 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.082+0000 7f42527fe700 1 --2- 192.168.123.105:0/4167547109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c068730 0x7f424c068b10 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f4234009b00 tx=0x7f4234009e10 comp rx=0 tx=0).stop 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.083+0000 7f42527fe700 1 -- 192.168.123.105:0/4167547109 shutdown_connections 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.083+0000 7f42527fe700 1 --2- 192.168.123.105:0/4167547109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f424c0690e0 0x7f424c105b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.083+0000 7f42527fe700 1 --2- 192.168.123.105:0/4167547109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c068730 0x7f424c068b10 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.085 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.083+0000 7f42527fe700 1 -- 192.168.123.105:0/4167547109 >> 192.168.123.105:0/4167547109 conn(0x7f424c075960 msgr2=0x7f424c075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.083+0000 7f42527fe700 1 -- 192.168.123.105:0/4167547109 shutdown_connections 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.083+0000 7f42527fe700 1 -- 192.168.123.105:0/4167547109 wait complete. 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.084+0000 7f42527fe700 1 Processor -- start 2026-03-10T09:04:54.085 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.084+0000 7f42527fe700 1 -- start start 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.084+0000 7f42527fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f424c068730 0x7f424c198eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.084+0000 7f42527fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 0x7f424c1993f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.084+0000 7f42527fe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f424c199ad0 con 0x7f424c0690e0 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.084+0000 7f42527fe700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f424c19d860 con 0x7f424c068730 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 0x7f424c1993f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 0x7f424c1993f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:41380/0 (socket says 192.168.123.105:41380) 2026-03-10T09:04:54.086 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 -- 192.168.123.105:0/4277058141 learned_addr learned my addr 192.168.123.105:0/4277058141 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424bfff700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f424c068730 0x7f424c198eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 -- 192.168.123.105:0/4277058141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f424c068730 msgr2=0x7f424c198eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f424c068730 0x7f424c198eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 -- 
192.168.123.105:0/4277058141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42340097e0 con 0x7f424c0690e0 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.085+0000 7f424b7fe700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 0x7f424c1993f0 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f423c00b6d0 tx=0x7f423c00ba90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.086+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f423c010820 con 0x7f424c0690e0 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.086+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f423c010e60 con 0x7f424c0690e0 2026-03-10T09:04:54.087 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.086+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f424c19db40 con 0x7f424c0690e0 2026-03-10T09:04:54.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.086+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f424c19e090 con 0x7f424c0690e0 2026-03-10T09:04:54.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.086+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f423c017570 con 0x7f424c0690e0 2026-03-10T09:04:54.089 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.087+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f423c0176d0 con 0x7f424c0690e0 2026-03-10T09:04:54.089 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.088+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f424c04ea90 con 0x7f424c0690e0 2026-03-10T09:04:54.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.088+0000 7f42497fa700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f42380779e0 0x7f4238079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:54.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.088+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f423c010980 con 0x7f424c0690e0 2026-03-10T09:04:54.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.089+0000 7f424bfff700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f42380779e0 0x7f4238079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:54.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.091+0000 7f424bfff700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f42380779e0 0x7f4238079ea0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f4234005950 tx=0x7f42340058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:04:54.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.092+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f423c062bb0 con 0x7f424c0690e0 2026-03-10T09:04:54.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.231+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f424c19e3c0 con 0x7f42380779e0 2026-03-10T09:04:54.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.232+0000 7f42497fa700 1 -- 192.168.123.105:0/4277058141 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f424c19e3c0 con 0x7f42380779e0 2026-03-10T09:04:54.233 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:04:54.233 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": null, 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": false, 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout: "which": "", 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout: "progress": null, 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout: "message": "", 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T09:04:54.234 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:04:54.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.243+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f42380779e0 msgr2=0x7f4238079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T09:04:54.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.243+0000 7f42527fe700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f42380779e0 0x7f4238079ea0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f4234005950 tx=0x7f42340058e0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.243+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 msgr2=0x7f424c1993f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:54.244 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.243+0000 7f42527fe700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 0x7f424c1993f0 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f423c00b6d0 tx=0x7f423c00ba90 comp rx=0 tx=0).stop 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.243+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 shutdown_connections 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.244+0000 7f42527fe700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f42380779e0 0x7f4238079ea0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.244+0000 7f42527fe700 1 --2- 192.168.123.105:0/4277058141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f424c068730 0x7f424c198eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.244+0000 7f42527fe700 1 --2- 192.168.123.105:0/4277058141 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f424c0690e0 0x7f424c1993f0 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.244+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 >> 192.168.123.105:0/4277058141 conn(0x7f424c075960 msgr2=0x7f424c0feac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.244+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 shutdown_connections 2026-03-10T09:04:54.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.244+0000 7f42527fe700 1 -- 192.168.123.105:0/4277058141 wait complete. 2026-03-10T09:04:54.443 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:54 vm05.local ceph-mon[111630]: from='client.44425 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:54.470 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-10T09:04:54.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:54 vm08.local ceph-mon[101330]: from='client.44425 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:54.619 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:04:54.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.940+0000 7f9e0918b700 1 -- 192.168.123.105:0/955812924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 msgr2=0x7f9e041037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T09:04:54.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.940+0000 7f9e0918b700 1 --2- 192.168.123.105:0/955812924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e041037a0 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f9df4009b00 tx=0x7f9df4009e10 comp rx=0 tx=0).stop 2026-03-10T09:04:54.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.940+0000 7f9e0918b700 1 -- 192.168.123.105:0/955812924 shutdown_connections 2026-03-10T09:04:54.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.940+0000 7f9e0918b700 1 --2- 192.168.123.105:0/955812924 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 0x7f9e04107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.940+0000 7f9e0918b700 1 --2- 192.168.123.105:0/955812924 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e041037a0 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.940+0000 7f9e0918b700 1 -- 192.168.123.105:0/955812924 >> 192.168.123.105:0/955812924 conn(0x7f9e040fec30 msgr2=0x7f9e04101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:54.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.942+0000 7f9e0918b700 1 -- 192.168.123.105:0/955812924 shutdown_connections 2026-03-10T09:04:54.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.942+0000 7f9e0918b700 1 -- 192.168.123.105:0/955812924 wait complete. 
2026-03-10T09:04:54.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.943+0000 7f9e0918b700 1 Processor -- start 2026-03-10T09:04:54.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.943+0000 7f9e0918b700 1 -- start start 2026-03-10T09:04:54.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.943+0000 7f9e0918b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e04198ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.943+0000 7f9e0918b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 0x7f9e04199400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.943+0000 7f9e0918b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e04199ae0 con 0x7f9e041033c0 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.943+0000 7f9e0918b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e0419d870 con 0x7f9e04103d70 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e04198ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e04198ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:41400/0 (socket says 192.168.123.105:41400) 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 -- 192.168.123.105:0/1329657873 learned_addr learned my addr 192.168.123.105:0/1329657873 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9dfbfff700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 0x7f9e04199400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 -- 192.168.123.105:0/1329657873 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 msgr2=0x7f9e04199400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 0x7f9e04199400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 -- 192.168.123.105:0/1329657873 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9df40097e0 con 0x7f9e041033c0 2026-03-10T09:04:54.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9dfbfff700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 0x7f9e04199400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T09:04:54.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.944+0000 7f9e02d9d700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e04198ec0 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f9df4005b40 tx=0x7f9df400bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:54.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.945+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9df401d070 con 0x7f9e041033c0 2026-03-10T09:04:54.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.945+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9df400f460 con 0x7f9e041033c0 2026-03-10T09:04:54.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.945+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e0419daf0 con 0x7f9e041033c0 2026-03-10T09:04:54.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.945+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9e0419dfe0 con 0x7f9e041033c0 2026-03-10T09:04:54.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.946+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9df4021620 con 0x7f9e041033c0 2026-03-10T09:04:54.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.946+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9df402b430 con 
0x7f9e041033c0 2026-03-10T09:04:54.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.947+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9de8005320 con 0x7f9e041033c0 2026-03-10T09:04:54.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.950+0000 7f9e00d99700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9de40778c0 0x7f9de4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:54.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.950+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f9df409b2f0 con 0x7f9e041033c0 2026-03-10T09:04:54.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.950+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9df409d7c0 con 0x7f9e041033c0 2026-03-10T09:04:54.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.950+0000 7f9dfbfff700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9de40778c0 0x7f9de4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:54.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:54.951+0000 7f9dfbfff700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9de40778c0 0x7f9de4079d80 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f9e0419a4e0 tx=0x7f9dec009450 comp rx=0 tx=0).ready 
entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:55.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.120+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9de80059f0 con 0x7f9e041033c0 2026-03-10T09:04:55.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.121+0000 7f9e00d99700 1 -- 192.168.123.105:0/1329657873 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f9df40639f0 con 0x7f9e041033c0 2026-03-10T09:04:55.122 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_OK 2026-03-10T09:04:55.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9de40778c0 msgr2=0x7f9de4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:55.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9de40778c0 0x7f9de4079d80 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f9e0419a4e0 tx=0x7f9dec009450 comp rx=0 tx=0).stop 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 msgr2=0x7f9e04198ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e04198ec0 secure :-1 s=READY pgs=195 cs=0 
l=1 rev1=1 crypto rx=0x7f9df4005b40 tx=0x7f9df400bfd0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 shutdown_connections 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9de40778c0 0x7f9de4079d80 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9e041033c0 0x7f9e04198ec0 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 --2- 192.168.123.105:0/1329657873 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9e04103d70 0x7f9e04199400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 >> 192.168.123.105:0/1329657873 conn(0x7f9e040fec30 msgr2=0x7f9e04100200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 shutdown_connections 2026-03-10T09:04:55.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.123+0000 7f9e0918b700 1 -- 192.168.123.105:0/1329657873 wait complete. 
2026-03-10T09:04:55.193 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:55 vm05.local ceph-mon[111630]: from='client.34536 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:55.193 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:55 vm05.local ceph-mon[111630]: pgmap v310: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:55.193 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:55 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1329657873' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:04:55.211 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T09:04:55.419 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:04:55.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:55 vm08.local ceph-mon[101330]: from='client.34536 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:55 vm08.local ceph-mon[101330]: pgmap v310: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:55.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:55 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/1329657873' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 -- 192.168.123.105:0/1830695894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a410d0f0 msgr2=0x7ff9a410d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/1830695894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a410d0f0 0x7ff9a410d570 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7ff99c01c320 tx=0x7ff99c01c630 comp rx=0 tx=0).stop 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 -- 192.168.123.105:0/1830695894 shutdown_connections 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/1830695894 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a410d0f0 0x7ff9a410d570 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/1830695894 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 0x7ff9a410f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 -- 192.168.123.105:0/1830695894 >> 192.168.123.105:0/1830695894 conn(0x7ff9a406ce20 msgr2=0x7ff9a406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 -- 192.168.123.105:0/1830695894 shutdown_connections 2026-03-10T09:04:55.795 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.791+0000 7ff9a3fff700 1 -- 192.168.123.105:0/1830695894 wait complete. 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a3fff700 1 Processor -- start 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a3fff700 1 -- start start 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 0x7ff9a419d010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a419d550 0x7ff9a41a19c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a3fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9a419db70 con 0x7ff9a419d550 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a3fff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9a419dce0 con 0x7ff9a410f340 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 0x7ff9a419d010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a27fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a419d550 0x7ff9a41a19c0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 0x7ff9a419d010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:41102/0 (socket says 192.168.123.105:41102) 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a2ffd700 1 -- 192.168.123.105:0/678140445 learned_addr learned my addr 192.168.123.105:0/678140445 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a27fc700 1 -- 192.168.123.105:0/678140445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 msgr2=0x7ff9a419d010 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a27fc700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 0x7ff9a419d010 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a27fc700 1 -- 192.168.123.105:0/678140445 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff99c01c060 con 0x7ff9a419d550 2026-03-10T09:04:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.792+0000 7ff9a27fc700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a419d550 0x7ff9a41a19c0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7ff99c01cab0 tx=0x7ff99c009300 
comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.794+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff99c0094e0 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.794+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9a41a1f60 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.794+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9a41a24b0 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.795+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff99c01f040 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.796+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff99c02eb70 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.796+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff99c02a070 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.797+0000 7ff98bfff700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff98c0779e0 0x7ff98c079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).connect 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.797+0000 7ff9a2ffd700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff98c0779e0 0x7ff98c079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.798+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff99c0acbb0 con 0x7ff9a419d550 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.798+0000 7ff9a2ffd700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff98c0779e0 0x7ff98c079ea0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7ff994009d30 tx=0x7ff9940094b0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:04:55.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.798+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff990005320 con 0x7ff9a419d550 2026-03-10T09:04:55.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.801+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff99c075330 con 0x7ff9a419d550 2026-03-10T09:04:55.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.970+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff990006200 con 
0x7ff9a419d550 2026-03-10T09:04:55.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.971+0000 7ff98bfff700 1 -- 192.168.123.105:0/678140445 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7ff99c028070 con 0x7ff9a419d550 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T09:04:55.973 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:04:55.973 
INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.974+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff98c0779e0 msgr2=0x7ff98c079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.974+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff98c0779e0 0x7ff98c079ea0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7ff994009d30 tx=0x7ff9940094b0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.974+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a419d550 msgr2=0x7ff9a41a19c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.974+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a419d550 0x7ff9a41a19c0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7ff99c01cab0 tx=0x7ff99c009300 comp rx=0 tx=0).stop 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 shutdown_connections 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff98c0779e0 0x7ff98c079ea0 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/678140445 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff9a410f340 0x7ff9a419d010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 --2- 192.168.123.105:0/678140445 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff9a419d550 0x7ff9a41a19c0 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 >> 192.168.123.105:0/678140445 conn(0x7ff9a406ce20 msgr2=0x7ff9a4109910 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 shutdown_connections 2026-03-10T09:04:55.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:04:55.975+0000 7ff9a3fff700 1 -- 192.168.123.105:0/678140445 wait complete. 2026-03-10T09:04:56.059 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-10T09:04:56.230 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:04:56.256 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:56 vm05.local ceph-mon[111630]: from='client.34540 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:56.256 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:56 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/678140445' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:56 vm08.local ceph-mon[101330]: from='client.34540 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:04:56.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:56 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/678140445' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:04:56.460 INFO:teuthology.orchestra.run.vm05.stdout:wait for servicemap items w/ changing names to refresh 2026-03-10T09:04:56.502 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-10T09:04:56.651 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:04:57.171 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:57 vm05.local ceph-mon[111630]: pgmap v311: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:57.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:57 vm08.local ceph-mon[101330]: pgmap v311: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:59.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:04:59 vm05.local ceph-mon[111630]: pgmap v312: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:04:59.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:04:59 vm08.local ceph-mon[101330]: pgmap v312: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
09:05:01 vm08.local ceph-mon[101330]: pgmap v313: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:01 vm05.local ceph-mon[111630]: pgmap v313: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:03 vm05.local ceph-mon[111630]: pgmap v314: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:03 vm08.local ceph-mon[101330]: pgmap v314: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:05 vm05.local ceph-mon[111630]: pgmap v315: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:05 vm08.local ceph-mon[101330]: pgmap v315: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:07.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:07 vm05.local ceph-mon[111630]: pgmap v316: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:07.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:07 vm08.local ceph-mon[101330]: pgmap v316: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 
119 GiB / 120 GiB avail 2026-03-10T09:05:08.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:08 vm08.local ceph-mon[101330]: pgmap v317: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:08.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:08 vm05.local ceph-mon[111630]: pgmap v317: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:11.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:10 vm05.local ceph-mon[111630]: pgmap v318: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:11.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:10 vm08.local ceph-mon[101330]: pgmap v318: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:13.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:12 vm05.local ceph-mon[111630]: pgmap v319: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:12 vm08.local ceph-mon[101330]: pgmap v319: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:15.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:14 vm05.local ceph-mon[111630]: pgmap v320: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:15.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:14 vm08.local ceph-mon[101330]: pgmap v320: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:16.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:15 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:16.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:15 vm08.local ceph-mon[101330]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:17.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:16 vm05.local ceph-mon[111630]: pgmap v321: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:17.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:16 vm08.local ceph-mon[101330]: pgmap v321: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:19.223 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:18 vm05.local ceph-mon[111630]: pgmap v322: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:18 vm08.local ceph-mon[101330]: pgmap v322: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:21.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:20 vm08.local ceph-mon[101330]: pgmap v323: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:21.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:20 vm05.local ceph-mon[111630]: pgmap v323: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:23.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:22 vm08.local ceph-mon[101330]: pgmap v324: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:23.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:22 vm05.local ceph-mon[111630]: pgmap v324: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:25.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:25 vm05.local ceph-mon[111630]: pgmap v325: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:25.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
09:05:25 vm08.local ceph-mon[101330]: pgmap v325: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:27.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:27 vm08.local ceph-mon[101330]: pgmap v326: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:27.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:27 vm05.local ceph-mon[111630]: pgmap v326: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:28.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:05:28.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:05:28.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:05:28.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:05:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:05:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:05:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
09:05:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:05:28.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:05:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:29 vm08.local ceph-mon[101330]: pgmap v327: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:29.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:29 vm05.local ceph-mon[111630]: pgmap v327: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:31.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:31 vm08.local ceph-mon[101330]: pgmap v328: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:31.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:31 vm05.local ceph-mon[111630]: pgmap v328: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:33.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:33 vm08.local ceph-mon[101330]: pgmap v329: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:33.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:33 vm05.local ceph-mon[111630]: pgmap v329: 65 pgs: 65 active+clean; 
215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:34.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:34 vm08.local ceph-mon[101330]: pgmap v330: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:34.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:34 vm05.local ceph-mon[111630]: pgmap v330: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:36 vm05.local ceph-mon[111630]: pgmap v331: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:37.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:36 vm08.local ceph-mon[101330]: pgmap v331: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:39.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:38 vm05.local ceph-mon[111630]: pgmap v332: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:39.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:38 vm08.local ceph-mon[101330]: pgmap v332: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:41.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:40 vm05.local ceph-mon[111630]: pgmap v333: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:40 vm08.local ceph-mon[101330]: pgmap v333: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:43.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:42 vm05.local ceph-mon[111630]: pgmap v334: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:42 vm08.local ceph-mon[101330]: pgmap v334: 65 pgs: 65 active+clean; 
215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:45.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:44 vm05.local ceph-mon[111630]: pgmap v335: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:44 vm08.local ceph-mon[101330]: pgmap v335: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:46.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:45 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:45 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:05:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:46 vm08.local ceph-mon[101330]: pgmap v336: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:47.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:46 vm05.local ceph-mon[111630]: pgmap v336: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:49.223 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:48 vm05.local ceph-mon[111630]: pgmap v337: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:48 vm08.local ceph-mon[101330]: pgmap v337: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:50 vm08.local ceph-mon[101330]: pgmap v338: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:51.462 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:50 vm05.local ceph-mon[111630]: pgmap v338: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:53 vm05.local ceph-mon[111630]: pgmap v339: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:53.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:53 vm08.local ceph-mon[101330]: pgmap v339: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:55 vm05.local ceph-mon[111630]: pgmap v340: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:55.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:55 vm08.local ceph-mon[101330]: pgmap v340: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:56.917 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T09:05:57.076 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:05:57.126 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:57 vm05.local ceph-mon[111630]: pgmap v341: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.356+0000 7f8106c52700 1 -- 192.168.123.105:0/1023720238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103cf0 msgr2=0x7f8100107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:57.359 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.356+0000 7f8106c52700 1 --2- 192.168.123.105:0/1023720238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103cf0 0x7f8100107d40 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f80f4009b00 tx=0x7f80f4009e10 comp rx=0 tx=0).stop 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.357+0000 7f8106c52700 1 -- 192.168.123.105:0/1023720238 shutdown_connections 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.357+0000 7f8106c52700 1 --2- 192.168.123.105:0/1023720238 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103cf0 0x7f8100107d40 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.357+0000 7f8106c52700 1 --2- 192.168.123.105:0/1023720238 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8100103340 0x7f8100103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.357+0000 7f8106c52700 1 -- 192.168.123.105:0/1023720238 >> 192.168.123.105:0/1023720238 conn(0x7f81000feb90 msgr2=0x7f8100100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.358+0000 7f8106c52700 1 -- 192.168.123.105:0/1023720238 shutdown_connections 2026-03-10T09:05:57.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.358+0000 7f8106c52700 1 -- 192.168.123.105:0/1023720238 wait complete. 
2026-03-10T09:05:57.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.358+0000 7f8106c52700 1 Processor -- start 2026-03-10T09:05:57.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f8106c52700 1 -- start start 2026-03-10T09:05:57.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f8106c52700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 0x7f8100198e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:57.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f81049ee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 0x7f8100198e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:57.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f81049ee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 0x7f8100198e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:50012/0 (socket says 192.168.123.105:50012) 2026-03-10T09:05:57.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f8106c52700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8100103cf0 0x7f8100199360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f8106c52700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81001999b0 con 0x7f8100103340 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f8106c52700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f8100199af0 con 0x7f8100103cf0 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.359+0000 7f81049ee700 1 -- 192.168.123.105:0/1536552202 learned_addr learned my addr 192.168.123.105:0/1536552202 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.360+0000 7f80fffff700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8100103cf0 0x7f8100199360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.360+0000 7f81049ee700 1 -- 192.168.123.105:0/1536552202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8100103cf0 msgr2=0x7f8100199360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.360+0000 7f81049ee700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8100103cf0 0x7f8100199360 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.360+0000 7f81049ee700 1 -- 192.168.123.105:0/1536552202 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80f40097e0 con 0x7f8100103340 2026-03-10T09:05:57.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.360+0000 7f81049ee700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 0x7f8100198e20 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f80f000cc60 tx=0x7f80f000cf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:05:57.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.361+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80f00049e0 con 0x7f8100103340 2026-03-10T09:05:57.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.361+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f80f0007cf0 con 0x7f8100103340 2026-03-10T09:05:57.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.361+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80f000f450 con 0x7f8100103340 2026-03-10T09:05:57.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.361+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f810019d940 con 0x7f8100103340 2026-03-10T09:05:57.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.361+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f810019de60 con 0x7f8100103340 2026-03-10T09:05:57.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.362+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f80f0007810 con 0x7f8100103340 2026-03-10T09:05:57.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.363+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f810010b650 con 0x7f8100103340 2026-03-10T09:05:57.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.364+0000 
7f80fdffb700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f80e80778c0 0x7f80e8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:57.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.364+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f80f0099a60 con 0x7f8100103340 2026-03-10T09:05:57.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.366+0000 7f80fffff700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f80e80778c0 0x7f80e8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:57.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.366+0000 7f80fffff700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f80e80778c0 0x7f80e8079d80 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f80f4009ad0 tx=0x7f80f4005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:05:57.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.366+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f80f0062160 con 0x7f8100103340 2026-03-10T09:05:57.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.495+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 --> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f810019a1f0 con 0x7f80e80778c0 
2026-03-10T09:05:57.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.501+0000 7f80fdffb700 1 -- 192.168.123.105:0/1536552202 <== mgr.34104 v2:192.168.123.105:6800/567882508 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f810019a1f0 con 0x7f80e80778c0 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (104s) 91s ago 15m 15.1M - 0.25.0 c8568f914cd2 2f2e8b2aa368 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (2m) 91s ago 15m 10.3M - 19.2.3-678-ge911bdeb 654f31e6858e a95a49a68fec 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (2m) 115s ago 14m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e 296c2a7f0ba5 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (8m) 91s ago 15m 7834k - 19.2.3-678-ge911bdeb 654f31e6858e 8f7c3cd210c7 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (8m) 115s ago 14m 8296k - 19.2.3-678-ge911bdeb 654f31e6858e dc71c3161edd 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (93s) 91s ago 14m 41.7M - 10.4.0 c8b91775d855 ff4933ec3b1b 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.bxdvbu vm05 running (2m) 91s ago 13m 106M - 19.2.3-678-ge911bdeb 654f31e6858e a19223339e34 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.slhztf vm05 running (2m) 91s ago 13m 14.7M - 19.2.3-678-ge911bdeb 654f31e6858e 0ae3b9e36731 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ssijow vm08 running (2m) 115s ago 13m 15.5M - 19.2.3-678-ge911bdeb 654f31e6858e 3540ff73b5c2 2026-03-10T09:05:57.503 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.xfzrbx vm08 running (2m) 115s ago 13m 18.4M - 19.2.3-678-ge911bdeb 654f31e6858e 4b83d0ce9fe6 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.rxwgjc vm05 *:8443,9283,8765 running (9m) 91s ago 15m 638M - 19.2.3-678-ge911bdeb 654f31e6858e d8c53d04a173 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.rpongu vm08 *:8443,9283,8765 running (9m) 115s ago 14m 495M - 19.2.3-678-ge911bdeb 654f31e6858e 867781ac9d98 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (9m) 91s ago 15m 71.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cdc9176bec28 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (9m) 115s ago 14m 57.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 34546aa1422b 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 91s ago 15m 9051k - 1.7.0 72c9c2088986 389c3ddf4b37 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (116s) 115s ago 14m 5347k - 1.7.0 72c9c2088986 8cc5d2924193 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (8m) 91s ago 14m 222M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 4f1dac46f59b 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (4m) 91s ago 14m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 306e95bddd95 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (3m) 91s ago 13m 103M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a555d70ff4bd 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (3m) 115s ago 13m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b025f9a6ca2a 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (3m) 115s ago 13m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 76fe84edd716 2026-03-10T09:05:57.503 
INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (2m) 115s ago 13m 109M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 41f6c3ce6ac4 2026-03-10T09:05:57.503 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (108s) 91s ago 14m 57.6M - 2.51.0 1d3b7f56885b 17d47f7668ac 2026-03-10T09:05:57.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f80e80778c0 msgr2=0x7f80e8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:57.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f80e80778c0 0x7f80e8079d80 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f80f4009ad0 tx=0x7f80f4005fb0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 msgr2=0x7f8100198e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:57.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 0x7f8100198e20 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f80f000cc60 tx=0x7f80f000cf70 comp rx=0 tx=0).stop 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 shutdown_connections 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] 
conn(0x7f80e80778c0 0x7f80e8079d80 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8100103340 0x7f8100198e20 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.504+0000 7f8106c52700 1 --2- 192.168.123.105:0/1536552202 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8100103cf0 0x7f8100199360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.505+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 >> 192.168.123.105:0/1536552202 conn(0x7f81000feb90 msgr2=0x7f81001075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.505+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 shutdown_connections 2026-03-10T09:05:57.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:57.505+0000 7f8106c52700 1 -- 192.168.123.105:0/1536552202 wait complete. 
2026-03-10T09:05:57.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:57 vm08.local ceph-mon[101330]: pgmap v341: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:57.574 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T09:05:57.733 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:05:58.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.005+0000 7f5976d2c700 1 -- 192.168.123.105:0/4270180917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 msgr2=0x7f59700756a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.005+0000 7f5976d2c700 1 --2- 192.168.123.105:0/4270180917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 0x7f59700756a0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f5960009b00 tx=0x7f5960009e10 comp rx=0 tx=0).stop 2026-03-10T09:05:58.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.006+0000 7f5976d2c700 1 -- 192.168.123.105:0/4270180917 shutdown_connections 2026-03-10T09:05:58.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.006+0000 7f5976d2c700 1 --2- 192.168.123.105:0/4270180917 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 0x7f59701117e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.006+0000 7f5976d2c700 1 --2- 192.168.123.105:0/4270180917 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f59700752c0 0x7f59700756a0 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.006+0000 7f5976d2c700 1 -- 192.168.123.105:0/4270180917 >> 192.168.123.105:0/4270180917 conn(0x7f59700feb50 msgr2=0x7f5970100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:58.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.006+0000 7f5976d2c700 1 -- 192.168.123.105:0/4270180917 shutdown_connections 2026-03-10T09:05:58.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.006+0000 7f5976d2c700 1 -- 192.168.123.105:0/4270180917 wait complete. 2026-03-10T09:05:58.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.007+0000 7f5976d2c700 1 Processor -- start 2026-03-10T09:05:58.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.007+0000 7f5976d2c700 1 -- start start 2026-03-10T09:05:58.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.007+0000 7f5976d2c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 0x7f597019d180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:58.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.008+0000 7f5976d2c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 0x7f597019d6c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:58.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.008+0000 7f5976d2c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f597019dda0 con 0x7f59700752c0 2026-03-10T09:05:58.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.008+0000 7f5976d2c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f59701a1b30 con 0x7f5970075be0 
2026-03-10T09:05:58.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.009+0000 7f596ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 0x7f597019d6c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:58.010 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.009+0000 7f5974ac8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 0x7f597019d180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:58.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.009+0000 7f596ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 0x7f597019d6c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36964/0 (socket says 192.168.123.105:36964) 2026-03-10T09:05:58.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.009+0000 7f596ffff700 1 -- 192.168.123.105:0/2907803409 learned_addr learned my addr 192.168.123.105:0/2907803409 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:05:58.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.010+0000 7f5974ac8700 1 -- 192.168.123.105:0/2907803409 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 msgr2=0x7f597019d6c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.010+0000 7f5974ac8700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 0x7f597019d6c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.011 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.010+0000 7f5974ac8700 1 -- 192.168.123.105:0/2907803409 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f59600097e0 con 0x7f59700752c0 2026-03-10T09:05:58.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.010+0000 7f5974ac8700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 0x7f597019d180 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f5960009b00 tx=0x7f596000bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:05:58.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.011+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f596001d070 con 0x7f59700752c0 2026-03-10T09:05:58.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.011+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f596000b8b0 con 0x7f59700752c0 2026-03-10T09:05:58.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.011+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5960021410 con 0x7f59700752c0 2026-03-10T09:05:58.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.011+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f59701a1e10 con 0x7f59700752c0 2026-03-10T09:05:58.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.011+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f59701a2360 con 
0x7f59700752c0 2026-03-10T09:05:58.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.013+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f597010ef50 con 0x7f59700752c0 2026-03-10T09:05:58.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.013+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f596000f460 con 0x7f59700752c0 2026-03-10T09:05:58.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.014+0000 7f596dffb700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5958077870 0x7f5958079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:58.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.014+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f596009aa50 con 0x7f59700752c0 2026-03-10T09:05:58.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.014+0000 7f596ffff700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5958077870 0x7f5958079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:58.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.015+0000 7f596ffff700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5958077870 0x7f5958079d30 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f597019e7a0 tx=0x7f5964009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T09:05:58.017 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.016+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5960063150 con 0x7f59700752c0 2026-03-10T09:05:58.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.178+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f597004ea90 con 0x7f59700752c0 2026-03-10T09:05:58.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.179+0000 7f596dffb700 1 -- 192.168.123.105:0/2907803409 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f59600628a0 con 0x7f59700752c0 2026-03-10T09:05:58.180 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T09:05:58.180 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 
2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T09:05:58.181 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T09:05:58.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5958077870 msgr2=0x7f5958079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5958077870 0x7f5958079d30 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f597019e7a0 tx=0x7f5964009450 comp rx=0 tx=0).stop 2026-03-10T09:05:58.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 msgr2=0x7f597019d180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 0x7f597019d180 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f5960009b00 tx=0x7f596000bd80 comp rx=0 tx=0).stop 2026-03-10T09:05:58.183 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 shutdown_connections 2026-03-10T09:05:58.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5958077870 0x7f5958079d30 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.182+0000 7f5976d2c700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f59700752c0 0x7f597019d180 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.183+0000 7f5976d2c700 1 --2- 192.168.123.105:0/2907803409 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5970075be0 0x7f597019d6c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.183+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 >> 192.168.123.105:0/2907803409 conn(0x7f59700feb50 msgr2=0x7f59700ff7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:58.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.183+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 shutdown_connections 2026-03-10T09:05:58.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.183+0000 7f5976d2c700 1 -- 192.168.123.105:0/2907803409 wait complete. 
2026-03-10T09:05:58.244 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-10T09:05:58.400 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.690+0000 7fe67bf25700 1 -- 192.168.123.105:0/2930289282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6740690e0 msgr2=0x7fe674105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.690+0000 7fe67bf25700 1 --2- 192.168.123.105:0/2930289282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6740690e0 0x7fe674105b50 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fe670009b00 tx=0x7fe670009e10 comp rx=0 tx=0).stop 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.691+0000 7fe67bf25700 1 -- 192.168.123.105:0/2930289282 shutdown_connections 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.691+0000 7fe67bf25700 1 --2- 192.168.123.105:0/2930289282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe6740690e0 0x7fe674105b50 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.691+0000 7fe67bf25700 1 --2- 192.168.123.105:0/2930289282 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe674068730 0x7fe674068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.693 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.691+0000 7fe67bf25700 1 -- 192.168.123.105:0/2930289282 >> 192.168.123.105:0/2930289282 conn(0x7fe674075960 msgr2=0x7fe674075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.692+0000 7fe67bf25700 1 -- 192.168.123.105:0/2930289282 shutdown_connections 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.692+0000 7fe67bf25700 1 -- 192.168.123.105:0/2930289282 wait complete. 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.692+0000 7fe67bf25700 1 Processor -- start 2026-03-10T09:05:58.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.692+0000 7fe67bf25700 1 -- start start 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe67bf25700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe674068730 0x7fe67419ca50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe67bf25700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 0x7fe67419cf90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe67bf25700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe67419d620 con 0x7fe674068730 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe67bf25700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe674196ad0 con 0x7fe6740690e0 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 0x7fe67419cf90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 0x7fe67419cf90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36974/0 (socket says 192.168.123.105:36974) 2026-03-10T09:05:58.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 -- 192.168.123.105:0/3829922452 learned_addr learned my addr 192.168.123.105:0/3829922452 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe679cc1700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe674068730 0x7fe67419ca50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 -- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe674068730 msgr2=0x7fe67419ca50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe674068730 0x7fe67419ca50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 -- 
192.168.123.105:0/3829922452 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe6700097e0 con 0x7fe6740690e0 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe6794c0700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 0x7fe67419cf90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fe670000c00 tx=0x7fe670004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.693+0000 7fe679cc1700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe674068730 0x7fe67419ca50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.694+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe67001d070 con 0x7fe6740690e0 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.694+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe670004b90 con 0x7fe6740690e0 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.694+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe674196d50 con 0x7fe6740690e0 2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.694+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe67000f700 con 0x7fe6740690e0 
2026-03-10T09:05:58.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.694+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe674197240 con 0x7fe6740690e0 2026-03-10T09:05:58.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.695+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe674109470 con 0x7fe6740690e0 2026-03-10T09:05:58.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.697+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe67000bc50 con 0x7fe6740690e0 2026-03-10T09:05:58.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.697+0000 7fe66affd700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe6600778c0 0x7fe660079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:58.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.697+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fe67009bd10 con 0x7fe6740690e0 2026-03-10T09:05:58.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.697+0000 7fe679cc1700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe6600778c0 0x7fe660079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:58.699 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.698+0000 7fe679cc1700 1 --2- 192.168.123.105:0/3829922452 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe6600778c0 0x7fe660079d80 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fe664005fd0 tx=0x7fe664005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:05:58.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.699+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe670064490 con 0x7fe6740690e0 2026-03-10T09:05:58.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.867+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe67404ea90 con 0x7fe6740690e0 2026-03-10T09:05:58.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.868+0000 7fe66affd700 1 -- 192.168.123.105:0/3829922452 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fe670063be0 con 0x7fe6740690e0 2026-03-10T09:05:58.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.872+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe6600778c0 msgr2=0x7fe660079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.872+0000 7fe67bf25700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe6600778c0 0x7fe660079d80 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fe664005fd0 tx=0x7fe664005ee0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.872+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 msgr2=0x7fe67419cf90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:58.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.872+0000 7fe67bf25700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 0x7fe67419cf90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fe670000c00 tx=0x7fe670004970 comp rx=0 tx=0).stop 2026-03-10T09:05:58.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.873+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 shutdown_connections 2026-03-10T09:05:58.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.873+0000 7fe67bf25700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe6600778c0 0x7fe660079d80 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.873+0000 7fe67bf25700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe674068730 0x7fe67419ca50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.873+0000 7fe67bf25700 1 --2- 192.168.123.105:0/3829922452 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe6740690e0 0x7fe67419cf90 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:58.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.873+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 >> 192.168.123.105:0/3829922452 conn(0x7fe674075960 msgr2=0x7fe6740feaa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:58.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.874+0000 7fe67bf25700 1 -- 
192.168.123.105:0/3829922452 shutdown_connections 2026-03-10T09:05:58.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:58.874+0000 7fe67bf25700 1 -- 192.168.123.105:0/3829922452 wait complete. 2026-03-10T09:05:58.887 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T09:05:58.944 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-10T09:05:59.088 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:05:59.130 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:59 vm05.local ceph-mon[111630]: from='client.34552 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T09:05:59.130 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:59 vm05.local ceph-mon[111630]: pgmap v342: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:59.130 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:59 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2907803409' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:05:59.130 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:05:59 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3829922452' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:05:59.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.404+0000 7f8be941f700 1 -- 192.168.123.105:0/996743069 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be4105be0 msgr2=0x7f8be4105fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:59.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.404+0000 7f8be941f700 1 --2- 192.168.123.105:0/996743069 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be4105be0 0x7f8be4105fc0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f8bd4009b50 tx=0x7f8bd4009e60 comp rx=0 tx=0).stop 2026-03-10T09:05:59.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.408+0000 7f8be941f700 1 -- 192.168.123.105:0/996743069 shutdown_connections 2026-03-10T09:05:59.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.408+0000 7f8be941f700 1 --2- 192.168.123.105:0/996743069 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be40684d0 0x7f8be4068950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.408+0000 7f8be941f700 1 --2- 192.168.123.105:0/996743069 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be4105be0 0x7f8be4105fc0 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.408+0000 7f8be941f700 1 -- 192.168.123.105:0/996743069 >> 192.168.123.105:0/996743069 conn(0x7f8be40756b0 msgr2=0x7f8be4075ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:59.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.410+0000 7f8be941f700 1 -- 192.168.123.105:0/996743069 shutdown_connections 2026-03-10T09:05:59.412 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.410+0000 7f8be941f700 1 -- 192.168.123.105:0/996743069 wait complete. 2026-03-10T09:05:59.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be941f700 1 Processor -- start 2026-03-10T09:05:59.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be941f700 1 -- start start 2026-03-10T09:05:59.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be941f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be40684d0 0x7f8be419ce70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:59.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be941f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be4105be0 0x7f8be419d3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:59.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be941f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8be419da90 con 0x7f8be40684d0 2026-03-10T09:05:59.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be941f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8be41a1820 con 0x7f8be4105be0 2026-03-10T09:05:59.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.411+0000 7f8be37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be4105be0 0x7f8be419d3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:59.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.412+0000 7f8be3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be40684d0 0x7f8be419ce70 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:59.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.412+0000 7f8be37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be4105be0 0x7f8be419d3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36988/0 (socket says 192.168.123.105:36988) 2026-03-10T09:05:59.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.412+0000 7f8be37fe700 1 -- 192.168.123.105:0/2762094746 learned_addr learned my addr 192.168.123.105:0/2762094746 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:05:59.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.415+0000 7f8be3fff700 1 -- 192.168.123.105:0/2762094746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be4105be0 msgr2=0x7f8be419d3b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:59.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.415+0000 7f8be3fff700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be4105be0 0x7f8be419d3b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.415+0000 7f8be3fff700 1 -- 192.168.123.105:0/2762094746 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8bd40097e0 con 0x7f8be40684d0 2026-03-10T09:05:59.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.415+0000 7f8be3fff700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be40684d0 0x7f8be419ce70 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f8bd4009b50 
tx=0x7f8bd40048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:05:59.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.415+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bd401d070 con 0x7f8be40684d0 2026-03-10T09:05:59.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.416+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8bd4022470 con 0x7f8be40684d0 2026-03-10T09:05:59.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.416+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bd400f670 con 0x7f8be40684d0 2026-03-10T09:05:59.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.416+0000 7f8be941f700 1 -- 192.168.123.105:0/2762094746 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8be41a1aa0 con 0x7f8be40684d0 2026-03-10T09:05:59.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.416+0000 7f8be941f700 1 -- 192.168.123.105:0/2762094746 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8be41a1f90 con 0x7f8be40684d0 2026-03-10T09:05:59.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.417+0000 7f8be941f700 1 -- 192.168.123.105:0/2762094746 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8be404f2e0 con 0x7f8be40684d0 2026-03-10T09:05:59.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.418+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8bd40225e0 con 
0x7f8be40684d0 2026-03-10T09:05:59.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.418+0000 7f8be17fa700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8bcc0779e0 0x7f8bcc079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:05:59.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.418+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8bd409bc10 con 0x7f8be40684d0 2026-03-10T09:05:59.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.421+0000 7f8be37fe700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8bcc0779e0 0x7f8bcc079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:05:59.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.422+0000 7f8be37fe700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8bcc0779e0 0x7f8bcc079ea0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f8be419e490 tx=0x7f8bd800b500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:05:59.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.422+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8bd4064390 con 0x7f8be40684d0 2026-03-10T09:05:59.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:59 vm08.local ceph-mon[101330]: from='client.34552 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 
2026-03-10T09:05:59.558 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:59 vm08.local ceph-mon[101330]: pgmap v342: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:05:59.558 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:59 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2907803409' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:05:59.558 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:05:59 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3829922452' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:05:59.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.600+0000 7f8be941f700 1 -- 192.168.123.105:0/2762094746 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8be41a2270 con 0x7f8be40684d0 2026-03-10T09:05:59.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.609+0000 7f8be17fa700 1 -- 192.168.123.105:0/2762094746 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f8bd4063ae0 con 0x7f8be40684d0 2026-03-10T09:05:59.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.611+0000 7f8bcaffd700 1 -- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8bcc0779e0 msgr2=0x7f8bcc079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:59.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.611+0000 7f8bcaffd700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8bcc0779e0 0x7f8bcc079ea0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f8be419e490 tx=0x7f8bd800b500 comp rx=0 tx=0).stop 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.611+0000 7f8bcaffd700 1 -- 
192.168.123.105:0/2762094746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be40684d0 msgr2=0x7f8be419ce70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.611+0000 7f8bcaffd700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be40684d0 0x7f8be419ce70 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f8bd4009b50 tx=0x7f8bd40048c0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.612+0000 7f8bcaffd700 1 -- 192.168.123.105:0/2762094746 shutdown_connections 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.612+0000 7f8bcaffd700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8bcc0779e0 0x7f8bcc079ea0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.612+0000 7f8bcaffd700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be40684d0 0x7f8be419ce70 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.612+0000 7f8bcaffd700 1 --2- 192.168.123.105:0/2762094746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be4105be0 0x7f8be419d3b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:05:59.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.612+0000 7f8bcaffd700 1 -- 192.168.123.105:0/2762094746 >> 192.168.123.105:0/2762094746 conn(0x7f8be40756b0 msgr2=0x7f8be40fdbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:05:59.618 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.617+0000 7f8bcaffd700 1 -- 192.168.123.105:0/2762094746 shutdown_connections 2026-03-10T09:05:59.618 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:05:59.617+0000 7f8bcaffd700 1 -- 192.168.123.105:0/2762094746 wait complete. 2026-03-10T09:05:59.627 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T09:05:59.696 DEBUG:teuthology.parallel:result is None 2026-03-10T09:05:59.696 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T09:05:59.700 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T09:05:59.700 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- bash -c 'ceph fs dump' 2026-03-10T09:05:59.873 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:00.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:00 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2762094746' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.230+0000 7f37882a3700 1 -- 192.168.123.105:0/3196430176 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103340 msgr2=0x7f3780103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.230+0000 7f37882a3700 1 --2- 192.168.123.105:0/3196430176 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103340 0x7f3780103720 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f3770009b00 tx=0x7f3770009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.231+0000 7f37882a3700 1 -- 192.168.123.105:0/3196430176 shutdown_connections 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.231+0000 7f37882a3700 1 --2- 192.168.123.105:0/3196430176 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103cf0 0x7f3780107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.231+0000 7f37882a3700 1 --2- 192.168.123.105:0/3196430176 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103340 0x7f3780103720 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.231+0000 7f37882a3700 1 -- 192.168.123.105:0/3196430176 >> 192.168.123.105:0/3196430176 conn(0x7f37800feb90 msgr2=0x7f3780100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:00.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.231+0000 7f37882a3700 1 -- 192.168.123.105:0/3196430176 shutdown_connections 2026-03-10T09:06:00.232 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.231+0000 7f37882a3700 1 -- 192.168.123.105:0/3196430176 wait complete. 2026-03-10T09:06:00.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f37882a3700 1 Processor -- start 2026-03-10T09:06:00.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f37882a3700 1 -- start start 2026-03-10T09:06:00.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f37882a3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 0x7f3780198e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:00.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f37882a3700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103cf0 0x7f3780199370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f37882a3700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3780199a50 con 0x7f3780103340 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f37882a3700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f378019d7e0 con 0x7f3780103cf0 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f378603f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 0x7f3780198e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f378603f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 0x7f3780198e30 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45414/0 (socket says 192.168.123.105:45414) 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f378603f700 1 -- 192.168.123.105:0/4221016957 learned_addr learned my addr 192.168.123.105:0/4221016957 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.232+0000 7f378603f700 1 -- 192.168.123.105:0/4221016957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103cf0 msgr2=0x7f3780199370 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f378583e700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103cf0 0x7f3780199370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f378603f700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103cf0 0x7f3780199370 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f378603f700 1 -- 192.168.123.105:0/4221016957 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f377c009710 con 0x7f3780103340 2026-03-10T09:06:00.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f378603f700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 0x7f3780198e30 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto 
rx=0x7f377000c010 tx=0x7f377000bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:00.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f377001d070 con 0x7f3780103340 2026-03-10T09:06:00.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f37700097e0 con 0x7f3780103340 2026-03-10T09:06:00.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f378019ddc0 con 0x7f3780103340 2026-03-10T09:06:00.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f377000f460 con 0x7f3780103340 2026-03-10T09:06:00.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.233+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3770021620 con 0x7f3780103340 2026-03-10T09:06:00.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.235+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f377002b430 con 0x7f3780103340 2026-03-10T09:06:00.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.235+0000 7f37777fe700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f376c0778c0 0x7f376c079d80 unknown :-1 s=NONE 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:00.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.235+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f377009bf50 con 0x7f3780103340 2026-03-10T09:06:00.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.236+0000 7f378583e700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f376c0778c0 0x7f376c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:00.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.236+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3764005320 con 0x7f3780103340 2026-03-10T09:06:00.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.237+0000 7f378583e700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f376c0778c0 0x7f376c079d80 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f378019a450 tx=0x7f377c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:00.243 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.241+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3770064650 con 0x7f3780103340 2026-03-10T09:06:00.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.409+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3764006200 con 0x7f3780103340 2026-03-10T09:06:00.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.409+0000 7f37777fe700 1 -- 192.168.123.105:0/4221016957 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 30 v30) v1 ==== 76+0+1973 (secure 0 0 0) 0x7f3770063da0 con 0x7f3780103340 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:e30 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T09:03:44:361690+0000 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:epoch 30 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T08:52:52.346264+0000 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T09:03:44.361687+0000 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 
2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 106 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:up {0=34444} 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T09:06:00.412 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 34444 members: 34444 2026-03-10T09:06:00.413 
INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.bxdvbu{0:34444} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.105:6826/2948722085,v1:192.168.123.105:6827/2948722085] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.xfzrbx{-1:34470} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3904677772,v1:192.168.123.108:6825/3904677772] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.slhztf{-1:44367} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6828/930707688,v1:192.168.123.105:6829/930707688] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:06:00.413 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ssijow{-1:44373} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/42427465,v1:192.168.123.108:6827/42427465] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T09:06:00.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.413+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f376c0778c0 msgr2=0x7f376c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.413+0000 7f37882a3700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f376c0778c0 0x7f376c079d80 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f378019a450 tx=0x7f377c009450 comp rx=0 tx=0).stop 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.413+0000 
7f37882a3700 1 -- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 msgr2=0x7f3780198e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.413+0000 7f37882a3700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 0x7f3780198e30 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f377000c010 tx=0x7f377000bab0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.413+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 shutdown_connections 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.413+0000 7f37882a3700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f376c0778c0 0x7f376c079d80 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.414+0000 7f37882a3700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3780103340 0x7f3780198e30 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.414+0000 7f37882a3700 1 --2- 192.168.123.105:0/4221016957 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3780103cf0 0x7f3780199370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.414+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 >> 192.168.123.105:0/4221016957 conn(0x7f37800feb90 msgr2=0x7f3780100f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:00.415 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.414+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 shutdown_connections 2026-03-10T09:06:00.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.414+0000 7f37882a3700 1 -- 192.168.123.105:0/4221016957 wait complete. 2026-03-10T09:06:00.416 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 30 2026-03-10T09:06:00.464 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 2026-03-10T09:06:00.467 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 2026-03-10T09:06:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:00 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2762094746' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T09:06:00.634 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.903+0000 7fba83b14700 1 -- 192.168.123.105:0/181169649 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c0fff40 msgr2=0x7fba7c1003c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.903+0000 7fba83b14700 1 --2- 192.168.123.105:0/181169649 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c0fff40 0x7fba7c1003c0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fba78009b00 tx=0x7fba78009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.904+0000 7fba83b14700 1 -- 192.168.123.105:0/181169649 shutdown_connections 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.904+0000 7fba83b14700 1 --2- 192.168.123.105:0/181169649 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c0fff40 0x7fba7c1003c0 secure :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fba78009b00 tx=0x7fba78009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.904+0000 7fba83b14700 1 --2- 192.168.123.105:0/181169649 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 0x7fba7c1021a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.904+0000 7fba83b14700 1 -- 192.168.123.105:0/181169649 >> 192.168.123.105:0/181169649 conn(0x7fba7c076b70 msgr2=0x7fba7c076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.905+0000 7fba83b14700 1 -- 192.168.123.105:0/181169649 shutdown_connections 2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.905+0000 7fba83b14700 1 -- 192.168.123.105:0/181169649 wait complete. 
2026-03-10T09:06:00.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.905+0000 7fba83b14700 1 Processor -- start 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.905+0000 7fba83b14700 1 -- start start 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba83b14700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 0x7fba7c199040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba83b14700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 0x7fba7c19d9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba83b14700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba7c199ba0 con 0x7fba7c199580 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba83b14700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba7c199d10 con 0x7fba7c101dc0 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba810af700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 0x7fba7c19d9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba810af700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 0x7fba7c19d9f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:45434/0 (socket says 192.168.123.105:45434) 2026-03-10T09:06:00.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba810af700 1 -- 192.168.123.105:0/3062594237 learned_addr learned my addr 192.168.123.105:0/3062594237 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba810af700 1 -- 192.168.123.105:0/3062594237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 msgr2=0x7fba7c199040 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.906+0000 7fba818b0700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 0x7fba7c199040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba810af700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 0x7fba7c199040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba810af700 1 -- 192.168.123.105:0/3062594237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fba780097e0 con 0x7fba7c199580 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba818b0700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 0x7fba7c199040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba810af700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 0x7fba7c19d9f0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fba78009fd0 tx=0x7fba780049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba7801d070 con 0x7fba7c199580 2026-03-10T09:06:00.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fba7800bc50 con 0x7fba7c199580 2026-03-10T09:06:00.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fba7c19df90 con 0x7fba7c199580 2026-03-10T09:06:00.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.907+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fba7c19e4e0 con 0x7fba7c199580 2026-03-10T09:06:00.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.908+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba78022620 con 0x7fba7c199580 2026-03-10T09:06:00.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.909+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fba7c10ab30 con 0x7fba7c199580 2026-03-10T09:06:00.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.911+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fba780229e0 con 0x7fba7c199580 2026-03-10T09:06:00.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.911+0000 7fba6effd700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fba680778c0 0x7fba68079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:00.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.911+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fba7809b420 con 0x7fba7c199580 2026-03-10T09:06:00.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.913+0000 7fba818b0700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fba680778c0 0x7fba68079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:00.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.913+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fba78063aa0 con 0x7fba7c199580 2026-03-10T09:06:00.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:00.914+0000 7fba818b0700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fba680778c0 0x7fba68079d80 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fba70005fd0 
tx=0x7fba70005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:01.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.063+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fba7c04ea90 con 0x7fba7c199580 2026-03-10T09:06:01.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.066+0000 7fba6effd700 1 -- 192.168.123.105:0/3062594237 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 30 v30) v1 ==== 94+0+5257 (secure 0 0 0) 0x7fba780631f0 con 0x7fba7c199580 2026-03-10T09:06:01.068 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:01.068 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":30,"btime":"2026-03-10T09:03:44:361690+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34470,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904677772","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904677772},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904677772}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44373,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/42427465","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":42427465},{"type":"v1","addr":"192.168.123.108:6827","nonce":42427465}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:44.361687+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":106,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34444},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34444":{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":27,"state":"up:active","state_seq":10,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34444,"qdb_cluster":[34444]},"id":1}]} 2026-03-10T09:06:01.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fba680778c0 msgr2=0x7fba68079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:01.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fba680778c0 0x7fba68079d80 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fba70005fd0 tx=0x7fba70005dc0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 msgr2=0x7fba7c19d9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 0x7fba7c19d9f0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fba78009fd0 tx=0x7fba780049e0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 shutdown_connections 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fba680778c0 0x7fba68079d80 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fba7c101dc0 0x7fba7c199040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 --2- 192.168.123.105:0/3062594237 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fba7c199580 0x7fba7c19d9f0 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.069+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 >> 192.168.123.105:0/3062594237 conn(0x7fba7c076b70 msgr2=0x7fba7c10c710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.070+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 shutdown_connections 2026-03-10T09:06:01.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.070+0000 7fba83b14700 1 -- 192.168.123.105:0/3062594237 wait complete. 
2026-03-10T09:06:01.072 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 30 2026-03-10T09:06:01.313 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 10, 'max_mds': 1, 'flags': 18} 2026-03-10T09:06:01.313 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 11 2026-03-10T09:06:01.342 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:01 vm05.local ceph-mon[111630]: pgmap v343: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:01.342 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:01 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/4221016957' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:06:01.342 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:01.342 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:01 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3062594237' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T09:06:01.470 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:01.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:01 vm08.local ceph-mon[101330]: pgmap v343: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:01.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:01 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/4221016957' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T09:06:01.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:01.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:01 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3062594237' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.913+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/2654741414 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 msgr2=0x7f1e68069200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.913+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/2654741414 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e68069200 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f1e5c009b00 tx=0x7f1e5c009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/2654741414 shutdown_connections 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/2654741414 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 0x7f1e68069bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/2654741414 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e68069200 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/2654741414 >> 192.168.123.105:0/2654741414 conn(0x7f1e68076b30 msgr2=0x7f1e68076f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/2654741414 shutdown_connections 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/2654741414 wait complete. 2026-03-10T09:06:01.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 Processor -- start 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.914+0000 7f1e6f3ad700 1 -- start start 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6f3ad700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 0x7f1e6810af60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6f3ad700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e6810b4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6f3ad700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e6810bb80 con 0x7f1e68105f30 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6f3ad700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e680718b0 con 0x7f1e68069740 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6d149700 1 
--2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 0x7f1e6810af60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6c948700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e6810b4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6d149700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 0x7f1e6810af60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:39210/0 (socket says 192.168.123.105:39210) 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6c948700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e6810b4a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45460/0 (socket says 192.168.123.105:45460) 2026-03-10T09:06:01.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.915+0000 7f1e6c948700 1 -- 192.168.123.105:0/388116525 learned_addr learned my addr 192.168.123.105:0/388116525 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:01.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e6c948700 1 -- 192.168.123.105:0/388116525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 msgr2=0x7f1e6810af60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:01.917 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e6c948700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 0x7f1e6810af60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:01.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e6c948700 1 -- 192.168.123.105:0/388116525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1e5c0097e0 con 0x7f1e68105f30 2026-03-10T09:06:01.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e6c948700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e6810b4a0 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f1e6400eb10 tx=0x7f1e6400eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:01.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e6400cca0 con 0x7f1e68105f30 2026-03-10T09:06:01.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1e6400ce00 con 0x7f1e68105f30 2026-03-10T09:06:01.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.916+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1e68071b30 con 0x7f1e68105f30 2026-03-10T09:06:01.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.917+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7f1e68072080 con 0x7f1e68105f30 2026-03-10T09:06:01.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.918+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e640105e0 con 0x7f1e68105f30 2026-03-10T09:06:01.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.919+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1e68068a10 con 0x7f1e68105f30 2026-03-10T09:06:01.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.922+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1e64010810 con 0x7f1e68105f30 2026-03-10T09:06:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.923+0000 7f1e5a7fc700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1e540778c0 0x7f1e54079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:01.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.923+0000 7f1e6d149700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1e540778c0 0x7f1e54079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:01.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.924+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f1e64014070 con 0x7f1e68105f30 2026-03-10T09:06:01.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.924+0000 
7f1e6d149700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1e540778c0 0x7f1e54079d80 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f1e5c006010 tx=0x7f1e5c00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:01.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:01.924+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1e640d09f0 con 0x7f1e68105f30 2026-03-10T09:06:02.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.064+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 11, "format": "json"} v 0) v1 -- 0x7f1e681065f0 con 0x7f1e68105f30 2026-03-10T09:06:02.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.065+0000 7f1e5a7fc700 1 -- 192.168.123.105:0/388116525 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 11, "format": "json"}]=0 dumped fsmap epoch 11 v30) v1 ==== 107+0+4911 (secure 0 0 0) 0x7f1e64062840 con 0x7f1e68105f30 2026-03-10T09:06:02.067 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:02.067 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":11,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14488,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":11,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T08:53:00.417769+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24289},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24289":{"gid":24289,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":9,"state":"up:rejoin","state_seq":4,"addr":"192.168.123.105:6827/2466638752","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2466638752},{"type":"v1","addr":"192.168.123.105:6827","nonce":2466638752}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:02.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.068+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1e540778c0 msgr2=0x7f1e54079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:02.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.068+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1e540778c0 0x7f1e54079d80 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f1e5c006010 tx=0x7f1e5c00b540 comp rx=0 tx=0).stop 2026-03-10T09:06:02.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.068+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 msgr2=0x7f1e6810b4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T09:06:02.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.068+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e6810b4a0 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f1e6400eb10 tx=0x7f1e6400eed0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.069+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 shutdown_connections 2026-03-10T09:06:02.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.069+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f1e540778c0 0x7f1e54079d80 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.069+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1e68069740 0x7f1e6810af60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.073+0000 7f1e6f3ad700 1 --2- 192.168.123.105:0/388116525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1e68105f30 0x7f1e6810b4a0 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.073+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 >> 192.168.123.105:0/388116525 conn(0x7f1e68076b30 msgr2=0x7f1e680fefc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:02.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.076+0000 7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 shutdown_connections 2026-03-10T09:06:02.077 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.076+0000 
7f1e6f3ad700 1 -- 192.168.123.105:0/388116525 wait complete. 2026-03-10T09:06:02.081 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 11 2026-03-10T09:06:02.125 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 12 2026-03-10T09:06:02.330 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:02.445 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:02 vm05.local ceph-mon[111630]: pgmap v344: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:02.446 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:02 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/388116525' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T09:06:02.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:02 vm08.local ceph-mon[101330]: pgmap v344: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:02.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:02 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/388116525' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T09:06:02.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.714+0000 7fd0d1881700 1 -- 192.168.123.105:0/3345357103 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 msgr2=0x7fd0cc107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.714+0000 7fd0d1881700 1 --2- 192.168.123.105:0/3345357103 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc107d40 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7fd0bc009b00 tx=0x7fd0bc009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.715+0000 7fd0d1881700 1 -- 192.168.123.105:0/3345357103 shutdown_connections 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.715+0000 7fd0d1881700 1 --2- 192.168.123.105:0/3345357103 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc107d40 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.715+0000 7fd0d1881700 1 --2- 192.168.123.105:0/3345357103 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd0cc103340 0x7fd0cc103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.715+0000 7fd0d1881700 1 -- 192.168.123.105:0/3345357103 >> 192.168.123.105:0/3345357103 conn(0x7fd0cc0feb90 msgr2=0x7fd0cc100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.715+0000 7fd0d1881700 1 -- 192.168.123.105:0/3345357103 shutdown_connections 
2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.715+0000 7fd0d1881700 1 -- 192.168.123.105:0/3345357103 wait complete. 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.716+0000 7fd0d1881700 1 Processor -- start 2026-03-10T09:06:02.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.716+0000 7fd0d1881700 1 -- start start 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.716+0000 7fd0d1881700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd0cc103340 0x7fd0cc198e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.716+0000 7fd0d1881700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc199360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.716+0000 7fd0d1881700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0cc1999b0 con 0x7fd0cc103cf0 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0d1881700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0cc199af0 con 0x7fd0cc103340 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc199360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 
0x7fd0cc199360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45490/0 (socket says 192.168.123.105:45490) 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 -- 192.168.123.105:0/2343513367 learned_addr learned my addr 192.168.123.105:0/2343513367 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 -- 192.168.123.105:0/2343513367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd0cc103340 msgr2=0x7fd0cc198e20 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd0cc103340 0x7fd0cc198e20 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 -- 192.168.123.105:0/2343513367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0bc0097e0 con 0x7fd0cc103cf0 2026-03-10T09:06:02.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0ca7fc700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc199360 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fd0bc004930 tx=0x7fd0bc004a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:02.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.717+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7fd0bc01d070 con 0x7fd0cc103cf0 2026-03-10T09:06:02.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.718+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd0bc00bc50 con 0x7fd0cc103cf0 2026-03-10T09:06:02.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.718+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd0cc19d8e0 con 0x7fd0cc103cf0 2026-03-10T09:06:02.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.718+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd0cc19ddd0 con 0x7fd0cc103cf0 2026-03-10T09:06:02.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.718+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0bc022620 con 0x7fd0cc103cf0 2026-03-10T09:06:02.721 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.719+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd0cc04ea90 con 0x7fd0cc103cf0 2026-03-10T09:06:02.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.720+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd0bc022780 con 0x7fd0cc103cf0 2026-03-10T09:06:02.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.720+0000 7fd0d087f700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd0b80778c0 0x7fd0b8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).connect 2026-03-10T09:06:02.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.720+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fd0bc09b490 con 0x7fd0cc103cf0 2026-03-10T09:06:02.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.722+0000 7fd0caffd700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd0b80778c0 0x7fd0b8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:02.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.723+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd0bc063b10 con 0x7fd0cc103cf0 2026-03-10T09:06:02.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.723+0000 7fd0caffd700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd0b80778c0 0x7fd0b8079d80 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fd0b400a9b0 tx=0x7fd0b4005c90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:02.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.860+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7fd0cc19a240 con 0x7fd0cc103cf0 2026-03-10T09:06:02.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.863+0000 7fd0d087f700 1 -- 192.168.123.105:0/2343513367 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": 
"json"}]=0 dumped fsmap epoch 12 v30) v1 ==== 107+0+4911 (secure 0 0 0) 0x7fd0bc063260 con 0x7fd0cc103cf0 2026-03-10T09:06:02.865 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:02.865 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":12,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14488,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base 
v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T08:53:01.426195+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored 
in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24289},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24289":{"gid":24289,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":9,"state":"up:active","state_seq":5,"addr":"192.168.123.105:6827/2466638752","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2466638752},{"type":"v1","addr":"192.168.123.105:6827","nonce":2466638752}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:02.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.867+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd0b80778c0 msgr2=0x7fd0b8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:02.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.867+0000 7fd0d1881700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd0b80778c0 0x7fd0b8079d80 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fd0b400a9b0 tx=0x7fd0b4005c90 comp rx=0 tx=0).stop 2026-03-10T09:06:02.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.867+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 msgr2=0x7fd0cc199360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.867+0000 7fd0d1881700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc199360 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fd0bc004930 tx=0x7fd0bc004a10 comp rx=0 tx=0).stop 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 shutdown_connections 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fd0b80778c0 0x7fd0b8079d80 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd0cc103340 0x7fd0cc198e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 --2- 192.168.123.105:0/2343513367 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd0cc103cf0 0x7fd0cc199360 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 >> 192.168.123.105:0/2343513367 conn(0x7fd0cc0feb90 msgr2=0x7fd0cc1000f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 -- 
192.168.123.105:0/2343513367 shutdown_connections 2026-03-10T09:06:02.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:02.868+0000 7fd0d1881700 1 -- 192.168.123.105:0/2343513367 wait complete. 2026-03-10T09:06:02.871 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T09:06:02.922 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 13 2026-03-10T09:06:03.092 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:03.387 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:03 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2343513367' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T09:06:03.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.385+0000 7fa5fc900700 1 -- 192.168.123.105:0/2847083337 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4102da0 msgr2=0x7fa5f4103180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:03.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.385+0000 7fa5fc900700 1 --2- 192.168.123.105:0/2847083337 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4102da0 0x7fa5f4103180 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7fa5e0009b00 tx=0x7fa5e0009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:03.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.386+0000 7fa5fc900700 1 -- 192.168.123.105:0/2847083337 shutdown_connections 2026-03-10T09:06:03.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.386+0000 7fa5fc900700 1 --2- 192.168.123.105:0/2847083337 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa5f4069180 0x7fa5f4069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.386+0000 7fa5fc900700 1 --2- 192.168.123.105:0/2847083337 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4102da0 0x7fa5f4103180 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.386+0000 7fa5fc900700 1 -- 192.168.123.105:0/2847083337 >> 192.168.123.105:0/2847083337 conn(0x7fa5f4076b70 msgr2=0x7fa5f4076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:03.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.387+0000 7fa5fc900700 1 -- 192.168.123.105:0/2847083337 shutdown_connections 2026-03-10T09:06:03.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.388+0000 7fa5fc900700 1 -- 192.168.123.105:0/2847083337 wait complete. 2026-03-10T09:06:03.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fc900700 1 Processor -- start 2026-03-10T09:06:03.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fc900700 1 -- start start 2026-03-10T09:06:03.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fc900700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 0x7fa5f419c9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:03.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fa69c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 0x7fa5f419c9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fa69c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 
0x7fa5f419c9f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45502/0 (socket says 192.168.123.105:45502) 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fc900700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa5f4102da0 0x7fa5f419cf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.389+0000 7fa5fa69c700 1 -- 192.168.123.105:0/3465914368 learned_addr learned my addr 192.168.123.105:0/3465914368 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5f419d5c0 con 0x7fa5f4069180 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5f4196b10 con 0x7fa5f4102da0 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5f9e9b700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa5f4102da0 0x7fa5f419cf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:03.391 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5fa69c700 1 -- 192.168.123.105:0/3465914368 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa5f4102da0 msgr2=0x7fa5f419cf30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:03.392 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5fa69c700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa5f4102da0 0x7fa5f419cf30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5fa69c700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5e00097e0 con 0x7fa5f4069180 2026-03-10T09:06:03.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.390+0000 7fa5fa69c700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 0x7fa5f419c9f0 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7fa5e0009ad0 tx=0x7fa5e000bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:03.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.391+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5e001d070 con 0x7fa5f4069180 2026-03-10T09:06:03.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.391+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa5e000f460 con 0x7fa5f4069180 2026-03-10T09:06:03.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.391+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5f4196d90 con 0x7fa5f4069180 2026-03-10T09:06:03.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.391+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fa5f4197280 con 0x7fa5f4069180 2026-03-10T09:06:03.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.392+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa5f404ea90 con 0x7fa5f4069180 2026-03-10T09:06:03.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.392+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5e0003bf0 con 0x7fa5f4069180 2026-03-10T09:06:03.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.396+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa5e002b440 con 0x7fa5f4069180 2026-03-10T09:06:03.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.397+0000 7fa5ef7fe700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa5e40778e0 0x7fa5e4079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:03.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.397+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fa5e009b480 con 0x7fa5f4069180 2026-03-10T09:06:03.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.397+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa5e009b910 con 0x7fa5f4069180 2026-03-10T09:06:03.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.397+0000 7fa5f9e9b700 1 --2- 192.168.123.105:0/3465914368 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa5e40778e0 0x7fa5e4079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:03.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.398+0000 7fa5f9e9b700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa5e40778e0 0x7fa5e4079da0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fa5f40ffd30 tx=0x7fa5e8009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:03.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.538+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7fa5f4066e80 con 0x7fa5f4069180 2026-03-10T09:06:03.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.540+0000 7fa5ef7fe700 1 -- 192.168.123.105:0/3465914368 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v30) v1 ==== 107+0+4119 (secure 0 0 0) 0x7fa5e0063b80 con 0x7fa5f4069180 2026-03-10T09:06:03.541 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:03.541 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":13,"btime":"2026-03-10T09:03:08:618946+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14488,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:08.618611+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:03.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.542+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa5e40778e0 msgr2=0x7fa5e4079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:03.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.542+0000 7fa5fc900700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa5e40778e0 0x7fa5e4079da0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fa5f40ffd30 tx=0x7fa5e8009450 comp rx=0 tx=0).stop 2026-03-10T09:06:03.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.543+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 msgr2=0x7fa5f419c9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:03.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.543+0000 7fa5fc900700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 0x7fa5f419c9f0 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7fa5e0009ad0 tx=0x7fa5e000bab0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.543+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 shutdown_connections 2026-03-10T09:06:03.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.543+0000 7fa5fc900700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fa5e40778e0 0x7fa5e4079da0 unknown :-1 
s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.544+0000 7fa5fc900700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa5f4069180 0x7fa5f419c9f0 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.544+0000 7fa5fc900700 1 --2- 192.168.123.105:0/3465914368 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa5f4102da0 0x7fa5f419cf30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:03.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.544+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 >> 192.168.123.105:0/3465914368 conn(0x7fa5f4076b70 msgr2=0x7fa5f40fdeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:03.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.544+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 shutdown_connections 2026-03-10T09:06:03.546 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:03.544+0000 7fa5fc900700 1 -- 192.168.123.105:0/3465914368 wait complete. 2026-03-10T09:06:03.547 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 13 2026-03-10T09:06:03.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:03 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/2343513367' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T09:06:04.012 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 14 2026-03-10T09:06:04.169 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:04.554 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:04 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3465914368' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T09:06:04.554 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:04 vm05.local ceph-mon[111630]: pgmap v345: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:04.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.709+0000 7f836c0a0700 1 -- 192.168.123.105:0/1754966775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ffde0 msgr2=0x7f836410ae70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:04.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.709+0000 7f836c0a0700 1 --2- 192.168.123.105:0/1754966775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ffde0 0x7f836410ae70 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f8354009b00 tx=0x7f8354009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.710+0000 7f836c0a0700 1 -- 192.168.123.105:0/1754966775 shutdown_connections 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.710+0000 7f836c0a0700 1 --2- 192.168.123.105:0/1754966775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ffde0 0x7f836410ae70 unknown :-1 s=CLOSED pgs=214 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.710+0000 7f836c0a0700 1 --2- 192.168.123.105:0/1754966775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83640ff4c0 0x7f83640ff8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.710+0000 7f836c0a0700 1 -- 192.168.123.105:0/1754966775 >> 192.168.123.105:0/1754966775 conn(0x7f8364074bd0 msgr2=0x7f8364074fe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.710+0000 7f836c0a0700 1 -- 192.168.123.105:0/1754966775 shutdown_connections 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.710+0000 7f836c0a0700 1 -- 192.168.123.105:0/1754966775 wait complete. 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.711+0000 7f836c0a0700 1 Processor -- start 2026-03-10T09:06:04.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.711+0000 7f836c0a0700 1 -- start start 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.711+0000 7f836c0a0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 0x7f836419b880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.711+0000 7f836c0a0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83640ffde0 0x7f83641948a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.711+0000 7f836c0a0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8364194de0 con 
0x7f83640ff4c0 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.711+0000 7f836c0a0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8364194f50 con 0x7f83640ffde0 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 0x7f836419b880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 0x7f836419b880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45514/0 (socket says 192.168.123.105:45514) 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 -- 192.168.123.105:0/2525904912 learned_addr learned my addr 192.168.123.105:0/2525904912 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 -- 192.168.123.105:0/2525904912 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83640ffde0 msgr2=0x7f83641948a0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83640ffde0 0x7f83641948a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 -- 
192.168.123.105:0/2525904912 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f83540097e0 con 0x7f83640ff4c0 2026-03-10T09:06:04.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.712+0000 7f8369e3c700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 0x7f836419b880 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f836000ebf0 tx=0x7f836000c2d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:04.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.713+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f836000cd00 con 0x7f83640ff4c0 2026-03-10T09:06:04.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.713+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f836000ce60 con 0x7f83640ff4c0 2026-03-10T09:06:04.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.713+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8360010640 con 0x7f83640ff4c0 2026-03-10T09:06:04.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.713+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8364195150 con 0x7f83640ff4c0 2026-03-10T09:06:04.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.713+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8364195670 con 0x7f83640ff4c0 2026-03-10T09:06:04.716 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.715+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f83600107a0 con 0x7f83640ff4c0 2026-03-10T09:06:04.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.715+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f836404ea90 con 0x7f83640ff4c0 2026-03-10T09:06:04.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.715+0000 7f835affd700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83500778c0 0x7f8350079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:04.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.715+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8360014070 con 0x7f83640ff4c0 2026-03-10T09:06:04.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.718+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f83600626e0 con 0x7f83640ff4c0 2026-03-10T09:06:04.719 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.718+0000 7f836963b700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83500778c0 0x7f8350079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:04.720 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.719+0000 7f836963b700 1 --2- 
192.168.123.105:0/2525904912 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83500778c0 0x7f8350079d80 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f835400b5c0 tx=0x7f8354005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:04.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:04 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3465914368' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T09:06:04.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:04 vm08.local ceph-mon[101330]: pgmap v345: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:04.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.864+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7f8364195dd0 con 0x7f83640ff4c0 2026-03-10T09:06:04.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.867+0000 7f835affd700 1 -- 192.168.123.105:0/2525904912 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v30) v1 ==== 107+0+4130 (secure 0 0 0) 0x7f8360005740 con 0x7f83640ff4c0 2026-03-10T09:06:04.869 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:04.869 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":14,"btime":"2026-03-10T09:03:08:628249+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:08.628242+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14488},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14488":{"gid":14488,"name":"cephfs.vm05.slhztf","rank":0,"incarnation":14,"state":"up:replay","state_seq":2,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:04.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.869+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83500778c0 msgr2=0x7f8350079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:04.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.869+0000 7f836c0a0700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83500778c0 0x7f8350079d80 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f835400b5c0 tx=0x7f8354005fb0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.869+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 msgr2=0x7f836419b880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.869+0000 7f836c0a0700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 0x7f836419b880 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f836000ebf0 tx=0x7f836000c2d0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 shutdown_connections 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83500778c0 0x7f8350079d80 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83640ff4c0 0x7f836419b880 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 --2- 192.168.123.105:0/2525904912 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83640ffde0 0x7f83641948a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 >> 192.168.123.105:0/2525904912 conn(0x7f8364074bd0 msgr2=0x7f8364101490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 shutdown_connections 2026-03-10T09:06:04.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:04.870+0000 7f836c0a0700 1 -- 192.168.123.105:0/2525904912 wait complete. 2026-03-10T09:06:04.872 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 14 2026-03-10T09:06:05.588 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:05 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2525904912' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T09:06:05.607 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 15 2026-03-10T09:06:05.762 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:05.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:05 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2525904912' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.366+0000 7f588f050700 1 -- 192.168.123.105:0/603608426 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5888101a80 msgr2=0x7f5888105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.366+0000 7f588f050700 1 --2- 192.168.123.105:0/603608426 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5888101a80 0x7f5888105ad0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f587c009a60 tx=0x7f587c009d70 comp rx=0 tx=0).stop 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.368+0000 7f588f050700 1 -- 192.168.123.105:0/603608426 shutdown_connections 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.368+0000 7f588f050700 1 --2- 192.168.123.105:0/603608426 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5888101a80 0x7f5888105ad0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.368+0000 7f588f050700 1 --2- 192.168.123.105:0/603608426 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58881010d0 0x7f58881014b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.368+0000 7f588f050700 1 -- 192.168.123.105:0/603608426 >> 192.168.123.105:0/603608426 conn(0x7f58880fc920 msgr2=0x7f58880fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:06.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.368+0000 7f588f050700 1 -- 192.168.123.105:0/603608426 shutdown_connections 2026-03-10T09:06:06.370 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.368+0000 7f588f050700 1 -- 192.168.123.105:0/603608426 wait complete. 2026-03-10T09:06:06.370 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.369+0000 7f588f050700 1 Processor -- start 2026-03-10T09:06:06.370 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.369+0000 7f588f050700 1 -- start start 2026-03-10T09:06:06.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f588f050700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58881010d0 0x7f588819c960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f588f050700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 0x7f588819cea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f588f050700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f588819d530 con 0x7f5888101a80 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f588f050700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7f58881969e0 con 0x7f58881010d0 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 0x7f588819cea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 0x7f588819cea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45526/0 (socket says 192.168.123.105:45526) 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 -- 192.168.123.105:0/3546169073 learned_addr learned my addr 192.168.123.105:0/3546169073 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 -- 192.168.123.105:0/3546169073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58881010d0 msgr2=0x7f588819c960 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58881010d0 0x7f588819c960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 -- 192.168.123.105:0/3546169073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58780097e0 con 0x7f5888101a80 2026-03-10T09:06:06.372 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.370+0000 7f5887fff700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 0x7f588819cea0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f587c00f6c0 tx=0x7f587c00f7a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.371+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f587c01d070 con 0x7f5888101a80 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.371+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f587c00fdb0 con 0x7f5888101a80 2026-03-10T09:06:06.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.371+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f587c0177c0 con 0x7f5888101a80 2026-03-10T09:06:06.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.371+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f587c009710 con 0x7f5888101a80 2026-03-10T09:06:06.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.371+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5888196fc0 con 0x7f5888101a80 2026-03-10T09:06:06.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.372+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f588804ea90 con 0x7f5888101a80 2026-03-10T09:06:06.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.373+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f587c021b80 con 0x7f5888101a80 2026-03-10T09:06:06.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.375+0000 7f5885ffb700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f58700778c0 0x7f5870079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:06.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.375+0000 7f588cdec700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f58700778c0 0x7f5870079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:06.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.376+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f587c067580 con 0x7f5888101a80 2026-03-10T09:06:06.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.376+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f587c062fa0 con 0x7f5888101a80 2026-03-10T09:06:06.378 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.376+0000 7f588cdec700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f58700778c0 0x7f5870079d80 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f58780097b0 tx=0x7f5878009700 comp rx=0 tx=0).ready 
entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:06.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.527+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f5888066e80 con 0x7f5888101a80 2026-03-10T09:06:06.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.527+0000 7f5885ffb700 1 -- 192.168.123.105:0/3546169073 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v30) v1 ==== 107+0+4135 (secure 0 0 0) 0x7f587c026090 con 0x7f5888101a80 2026-03-10T09:06:06.529 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:06.529 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":15,"btime":"2026-03-10T09:03:14:349349+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:13.813995+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14488},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14488":{"gid":14488,"name":"cephfs.vm05.slhztf","rank":0,"incarnation":14,"state":"up:reconnect","state_seq":156,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:06.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.530+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f58700778c0 msgr2=0x7f5870079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:06.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.530+0000 7f588f050700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f58700778c0 0x7f5870079d80 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f58780097b0 tx=0x7f5878009700 comp rx=0 tx=0).stop 2026-03-10T09:06:06.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.530+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 msgr2=0x7f588819cea0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:06.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.530+0000 7f588f050700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 0x7f588819cea0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f587c00f6c0 tx=0x7f587c00f7a0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.530+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 shutdown_connections 2026-03-10T09:06:06.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.531+0000 7f588f050700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f58700778c0 0x7f5870079d80 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.531+0000 7f588f050700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58881010d0 0x7f588819c960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.531+0000 7f588f050700 1 --2- 192.168.123.105:0/3546169073 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5888101a80 0x7f588819cea0 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:06.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.531+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 >> 192.168.123.105:0/3546169073 conn(0x7f58880fc920 msgr2=0x7f58880fecc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:06.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.531+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 shutdown_connections 2026-03-10T09:06:06.532 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:06.531+0000 7f588f050700 1 -- 192.168.123.105:0/3546169073 wait complete. 2026-03-10T09:06:06.534 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 15 2026-03-10T09:06:06.610 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 16 2026-03-10T09:06:06.785 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:06.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:06 vm08.local ceph-mon[101330]: pgmap v346: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:06.811 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:06 vm05.local ceph-mon[111630]: pgmap v346: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.057+0000 7f9ca2723700 1 -- 192.168.123.105:0/3319365478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102050 msgr2=0x7f9c9c102430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.057+0000 7f9ca2723700 1 --2- 192.168.123.105:0/3319365478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102050 0x7f9c9c102430 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f9c8c009b00 tx=0x7f9c8c009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.058+0000 7f9ca2723700 1 -- 192.168.123.105:0/3319365478 shutdown_connections 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.058+0000 7f9ca2723700 1 --2- 192.168.123.105:0/3319365478 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f9c9c102970 0x7f9c9c10ae60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.058+0000 7f9ca2723700 1 --2- 192.168.123.105:0/3319365478 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102050 0x7f9c9c102430 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.058+0000 7f9ca2723700 1 -- 192.168.123.105:0/3319365478 >> 192.168.123.105:0/3319365478 conn(0x7f9c9c0fb820 msgr2=0x7f9c9c0fdc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:07.059 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.058+0000 7f9ca2723700 1 -- 192.168.123.105:0/3319365478 shutdown_connections 2026-03-10T09:06:07.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.058+0000 7f9ca2723700 1 -- 192.168.123.105:0/3319365478 wait complete. 
2026-03-10T09:06:07.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.059+0000 7f9ca2723700 1 Processor -- start 2026-03-10T09:06:07.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.059+0000 7f9ca2723700 1 -- start start 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.059+0000 7f9ca2723700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9c9c102050 0x7f9c9c19b800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.059+0000 7f9ca2723700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 0x7f9c9c194820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.059+0000 7f9ca2723700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c9c19bf20 con 0x7f9c9c102970 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.059+0000 7f9ca2723700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c9c194d60 con 0x7f9c9c102050 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.060+0000 7f9c93fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 0x7f9c9c194820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.060+0000 7f9c93fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 0x7f9c9c194820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:45540/0 (socket says 192.168.123.105:45540) 2026-03-10T09:06:07.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.060+0000 7f9c93fff700 1 -- 192.168.123.105:0/328952935 learned_addr learned my addr 192.168.123.105:0/328952935 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:07.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.060+0000 7f9c93fff700 1 -- 192.168.123.105:0/328952935 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9c9c102050 msgr2=0x7f9c9c19b800 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:06:07.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.060+0000 7f9c93fff700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9c9c102050 0x7f9c9c19b800 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.060+0000 7f9c93fff700 1 -- 192.168.123.105:0/328952935 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c8c0097e0 con 0x7f9c9c102970 2026-03-10T09:06:07.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.061+0000 7f9c93fff700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 0x7f9c9c194820 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f9c8400ec90 tx=0x7f9c8400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:07.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.061+0000 7f9c99ffb700 1 -- 192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c8400cbc0 con 0x7f9c9c102970 2026-03-10T09:06:07.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.061+0000 7f9c99ffb700 1 -- 
192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9c8400cd20 con 0x7f9c9c102970 2026-03-10T09:06:07.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.061+0000 7f9c99ffb700 1 -- 192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c84010430 con 0x7f9c9c102970 2026-03-10T09:06:07.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.062+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c9c195010 con 0x7f9c9c102970 2026-03-10T09:06:07.064 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.063+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c9c195560 con 0x7f9c9c102970 2026-03-10T09:06:07.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.064+0000 7f9c99ffb700 1 -- 192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9c84004750 con 0x7f9c9c102970 2026-03-10T09:06:07.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.064+0000 7f9c99ffb700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9c7c0778c0 0x7f9c7c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:07.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.064+0000 7f9c99ffb700 1 -- 192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f9c84014070 con 0x7f9c9c102970 2026-03-10T09:06:07.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.064+0000 7f9c9bfff700 1 --2- 192.168.123.105:0/328952935 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9c7c0778c0 0x7f9c7c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:07.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.065+0000 7f9c9bfff700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9c7c0778c0 0x7f9c7c079d80 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f9c8c00b5c0 tx=0x7f9c8c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:07.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.065+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c9c04ea90 con 0x7f9c9c102970 2026-03-10T09:06:07.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.069+0000 7f9c99ffb700 1 -- 192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9c84061e70 con 0x7f9c9c102970 2026-03-10T09:06:07.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.221+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f9c9c195d50 con 0x7f9c9c102970 2026-03-10T09:06:07.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.222+0000 7f9c99ffb700 1 -- 192.168.123.105:0/328952935 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v30) v1 ==== 107+0+4132 (secure 0 0 0) 0x7f9c840615c0 con 0x7f9c9c102970 2026-03-10T09:06:07.223 
INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:07.223 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":16,"btime":"2026-03-10T09:03:15:759107+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses 
versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:14.760866+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14488},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14488":{"gid":14488,"name":"cephfs.vm05.slhztf","rank":0,"incarnation":14,"state":"up:rejoin","state_seq":157,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9c7c0778c0 msgr2=0x7f9c7c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9c7c0778c0 0x7f9c7c079d80 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f9c8c00b5c0 tx=0x7f9c8c005fb0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 msgr2=0x7f9c9c194820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:07.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 0x7f9c9c194820 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f9c8400ec90 tx=0x7f9c8400c5b0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 shutdown_connections 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f9c7c0778c0 0x7f9c7c079d80 unknown :-1 s=CLOSED pgs=151 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9c9c102050 0x7f9c9c19b800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 --2- 192.168.123.105:0/328952935 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9c9c102970 0x7f9c9c194820 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.224+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 >> 192.168.123.105:0/328952935 conn(0x7f9c9c0fb820 msgr2=0x7f9c9c100470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.225+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 shutdown_connections 2026-03-10T09:06:07.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.225+0000 7f9ca2723700 1 -- 192.168.123.105:0/328952935 wait complete. 2026-03-10T09:06:07.227 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 16 2026-03-10T09:06:07.298 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 17 2026-03-10T09:06:07.471 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:07.579 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:07 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3546169073' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T09:06:07.579 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:07 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/328952935' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.770+0000 7f9603e21700 1 -- 192.168.123.105:0/808214248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc1009b0 msgr2=0x7f95fc104a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.770+0000 7f9603e21700 1 --2- 192.168.123.105:0/808214248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc1009b0 0x7f95fc104a00 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f95f4009b00 tx=0x7f95f4009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.770+0000 7f9603e21700 1 -- 192.168.123.105:0/808214248 shutdown_connections 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.770+0000 7f9603e21700 1 --2- 192.168.123.105:0/808214248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc1009b0 0x7f95fc104a00 secure :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f95f4009b00 tx=0x7f95f4009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.770+0000 7f9603e21700 1 --2- 192.168.123.105:0/808214248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc100000 0x7f95fc1003e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.770+0000 7f9603e21700 1 -- 192.168.123.105:0/808214248 >> 192.168.123.105:0/808214248 
conn(0x7f95fc076e80 msgr2=0x7f95fc077290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.771+0000 7f9603e21700 1 -- 192.168.123.105:0/808214248 shutdown_connections 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.771+0000 7f9603e21700 1 -- 192.168.123.105:0/808214248 wait complete. 2026-03-10T09:06:07.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.771+0000 7f9603e21700 1 Processor -- start 2026-03-10T09:06:07.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.771+0000 7f9603e21700 1 -- start start 2026-03-10T09:06:07.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f9603e21700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc100000 0x7f95fc196e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:07.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f9603e21700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 0x7f95fc19b7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:07.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f9603e21700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95fc197990 con 0x7f95fc197370 2026-03-10T09:06:07.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f9603e21700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95fc197b00 con 0x7f95fc100000 2026-03-10T09:06:07.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f96013bc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 0x7f95fc19b7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:07.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f96013bc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 0x7f95fc19b7e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45554/0 (socket says 192.168.123.105:45554) 2026-03-10T09:06:07.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.772+0000 7f96013bc700 1 -- 192.168.123.105:0/1469269621 learned_addr learned my addr 192.168.123.105:0/1469269621 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:07.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.773+0000 7f96013bc700 1 -- 192.168.123.105:0/1469269621 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc100000 msgr2=0x7f95fc196e30 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:07.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.773+0000 7f96013bc700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc100000 0x7f95fc196e30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.773+0000 7f96013bc700 1 -- 192.168.123.105:0/1469269621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95ec009e30 con 0x7f95fc197370 2026-03-10T09:06:07.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.773+0000 7f96013bc700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 0x7f95fc19b7e0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f95f4000c00 tx=0x7f95f400bbf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T09:06:07.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.774+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95f401d070 con 0x7f95fc197370 2026-03-10T09:06:07.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.774+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f95f400f460 con 0x7f95fc197370 2026-03-10T09:06:07.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.774+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95f4021620 con 0x7f95fc197370 2026-03-10T09:06:07.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.774+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95f40097e0 con 0x7f95fc197370 2026-03-10T09:06:07.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.774+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95fc19c0f0 con 0x7f95fc197370 2026-03-10T09:06:07.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.776+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f95f400fad0 con 0x7f95fc197370 2026-03-10T09:06:07.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.776+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95fc107300 con 0x7f95fc197370 2026-03-10T09:06:07.780 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.776+0000 7f95faffd700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f95e40778c0 0x7f95e4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:07.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.776+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f95f409ab30 con 0x7f95fc197370 2026-03-10T09:06:07.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.779+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f95f4063360 con 0x7f95fc197370 2026-03-10T09:06:07.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.780+0000 7f9601bbd700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f95e40778c0 0x7f95e4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:07.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.780+0000 7f9601bbd700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f95e40778c0 0x7f95e4079d80 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f95ec0075a0 tx=0x7f95ec009b10 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:07.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:07 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/3546169073' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T09:06:07.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:07 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/328952935' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T09:06:07.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.935+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7f95fc198190 con 0x7f95fc197370 2026-03-10T09:06:07.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.936+0000 7f95faffd700 1 -- 192.168.123.105:0/1469269621 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v30) v1 ==== 107+0+4141 (secure 0 0 0) 0x7f95f4062ab0 con 0x7f95fc197370 2026-03-10T09:06:07.938 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:07.939 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":17,"btime":"2026-03-10T09:03:16:775888+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:16.775887+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14488},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14488":{"gid":14488,"name":"cephfs.vm05.slhztf","rank":0,"incarnation":14,"state":"up:active","state_seq":158,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14488,"qdb_cluster":[14488]},"id":1}]} 2026-03-10T09:06:07.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.940+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f95e40778c0 msgr2=0x7f95e4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:07.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.940+0000 7f9603e21700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f95e40778c0 0x7f95e4079d80 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f95ec0075a0 tx=0x7f95ec009b10 comp rx=0 tx=0).stop 2026-03-10T09:06:07.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.940+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 msgr2=0x7f95fc19b7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:07.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.940+0000 7f9603e21700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 0x7f95fc19b7e0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f95f4000c00 tx=0x7f95f400bbf0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.941+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 shutdown_connections 2026-03-10T09:06:07.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.941+0000 7f9603e21700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f95e40778c0 0x7f95e4079d80 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:07.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.941+0000 7f9603e21700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95fc100000 0x7f95fc196e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.941+0000 7f9603e21700 1 --2- 192.168.123.105:0/1469269621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95fc197370 0x7f95fc19b7e0 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:07.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.942+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 >> 192.168.123.105:0/1469269621 conn(0x7f95fc076e80 msgr2=0x7f95fc0fe8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:07.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.942+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 shutdown_connections 2026-03-10T09:06:07.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:07.942+0000 7f9603e21700 1 -- 192.168.123.105:0/1469269621 wait complete. 2026-03-10T09:06:07.944 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 17 2026-03-10T09:06:07.997 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 18 2026-03-10T09:06:08.157 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:08.546 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:08 vm05.local ceph-mon[111630]: pgmap v347: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:08.546 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:08 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1469269621' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T09:06:08.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.651+0000 7fca80acf700 1 -- 192.168.123.105:0/3470506571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c100740 msgr2=0x7fca7c104c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:08.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.651+0000 7fca80acf700 1 --2- 192.168.123.105:0/3470506571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c100740 0x7fca7c104c10 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7fca6c009b00 tx=0x7fca6c009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:08.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.652+0000 7fca80acf700 1 -- 192.168.123.105:0/3470506571 shutdown_connections 2026-03-10T09:06:08.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.652+0000 7fca80acf700 1 --2- 192.168.123.105:0/3470506571 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c100740 0x7fca7c104c10 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:08.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.652+0000 7fca80acf700 1 --2- 192.168.123.105:0/3470506571 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c0ffe20 0x7fca7c100200 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:08.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.652+0000 7fca80acf700 1 -- 192.168.123.105:0/3470506571 >> 192.168.123.105:0/3470506571 conn(0x7fca7c0fb800 msgr2=0x7fca7c0fdc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:08.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.652+0000 7fca80acf700 1 -- 192.168.123.105:0/3470506571 shutdown_connections 
2026-03-10T09:06:08.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.652+0000 7fca80acf700 1 -- 192.168.123.105:0/3470506571 wait complete. 2026-03-10T09:06:08.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.653+0000 7fca80acf700 1 Processor -- start 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.653+0000 7fca80acf700 1 -- start start 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca80acf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c0ffe20 0x7fca7c19d230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca80acf700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 0x7fca7c19d770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca80acf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca7c19de50 con 0x7fca7c0ffe20 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca80acf700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca7c1a1be0 con 0x7fca7c100740 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca79d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 0x7fca7c19d770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca79d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 
0x7fca7c19d770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:39336/0 (socket says 192.168.123.105:39336) 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca79d9b700 1 -- 192.168.123.105:0/1440424248 learned_addr learned my addr 192.168.123.105:0/1440424248 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:08.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca79d9b700 1 -- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c0ffe20 msgr2=0x7fca7c19d230 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca7a59c700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c0ffe20 0x7fca7c19d230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca79d9b700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c0ffe20 0x7fca7c19d230 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca79d9b700 1 -- 192.168.123.105:0/1440424248 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca6c0097e0 con 0x7fca7c100740 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.654+0000 7fca7a59c700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c0ffe20 0x7fca7c19d230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.655+0000 7fca79d9b700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 0x7fca7c19d770 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fca6c005850 tx=0x7fca6c004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.655+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca6c01d070 con 0x7fca7c100740 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.655+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fca6c00bc50 con 0x7fca7c100740 2026-03-10T09:06:08.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.655+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca7c1a1e60 con 0x7fca7c100740 2026-03-10T09:06:08.657 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.655+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca6c00f700 con 0x7fca7c100740 2026-03-10T09:06:08.657 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.655+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca7c1a2350 con 0x7fca7c100740 2026-03-10T09:06:08.657 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.656+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca7c102370 con 0x7fca7c100740 2026-03-10T09:06:08.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.657+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fca6c022a50 con 0x7fca7c100740 2026-03-10T09:06:08.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.658+0000 7fca737fe700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca680778c0 0x7fca68079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:08.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.658+0000 7fca7a59c700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca680778c0 0x7fca68079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:08.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.658+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fca6c09bdc0 con 0x7fca7c100740 2026-03-10T09:06:08.660 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.658+0000 7fca7a59c700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca680778c0 0x7fca68079d80 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7fca64007900 tx=0x7fca64008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:08.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.659+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 
<== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fca6c0644c0 con 0x7fca7c100740 2026-03-10T09:06:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:08 vm08.local ceph-mon[101330]: pgmap v347: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:08 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1469269621' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T09:06:08.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.804+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7fca7c068a30 con 0x7fca7c100740 2026-03-10T09:06:08.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.805+0000 7fca737fe700 1 -- 192.168.123.105:0/1440424248 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v30) v1 ==== 107+0+4992 (secure 0 0 0) 0x7fca6c063c10 con 0x7fca7c100740 2026-03-10T09:06:08.808 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:08.808 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":18,"btime":"2026-03-10T09:03:17:783297+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:16.775887+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":101,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14488},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14488":{"gid":14488,"name":"cephfs.vm05.slhztf","rank":0,"incarnation":14,"state":"up:active","state_seq":158,"addr":"192.168.123.105:6829/2662194502","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":2662194502},{"type":"v1","addr":"192.168.123.105:6829","nonce":2662194502}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14488,"qdb_cluster":[14488]},"id":1}]} 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.809+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca680778c0 msgr2=0x7fca68079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.809+0000 7fca80acf700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca680778c0 0x7fca68079d80 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7fca64007900 tx=0x7fca64008040 comp rx=0 tx=0).stop 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.809+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 msgr2=0x7fca7c19d770 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.809+0000 7fca80acf700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 0x7fca7c19d770 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fca6c005850 tx=0x7fca6c004970 comp rx=0 tx=0).stop 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 shutdown_connections 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fca680778c0 0x7fca68079d80 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca7c0ffe20 0x7fca7c19d230 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 --2- 192.168.123.105:0/1440424248 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca7c100740 0x7fca7c19d770 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 >> 192.168.123.105:0/1440424248 conn(0x7fca7c0fb800 msgr2=0x7fca7c111290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:08.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 shutdown_connections 2026-03-10T09:06:08.811 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:08.810+0000 7fca80acf700 1 -- 192.168.123.105:0/1440424248 wait complete. 2026-03-10T09:06:08.812 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 18 2026-03-10T09:06:08.860 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 19 2026-03-10T09:06:09.007 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.286+0000 7f83b9c11700 1 -- 192.168.123.105:0/2451102057 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 msgr2=0x7f83b4107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.286+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2451102057 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b4107d40 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f83a4009b00 tx=0x7f83a4009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.286+0000 7f83b9c11700 1 -- 192.168.123.105:0/2451102057 shutdown_connections 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.286+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2451102057 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b4107d40 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.286+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2451102057 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 0x7f83b4103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.286+0000 7f83b9c11700 1 -- 192.168.123.105:0/2451102057 >> 192.168.123.105:0/2451102057 conn(0x7f83b40feb90 msgr2=0x7f83b4100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.287+0000 7f83b9c11700 1 -- 192.168.123.105:0/2451102057 shutdown_connections 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.287+0000 7f83b9c11700 1 -- 192.168.123.105:0/2451102057 wait complete. 2026-03-10T09:06:09.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.287+0000 7f83b9c11700 1 Processor -- start 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.287+0000 7f83b9c11700 1 -- start start 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b9c11700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 0x7f83b40752a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b9c11700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b40757e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b9c11700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83b40793a0 con 0x7f83b4103cf0 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b9c11700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83b4075d20 con 0x7f83b4103340 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 
7f83b2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b40757e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b40757e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:45604/0 (socket says 192.168.123.105:45604) 2026-03-10T09:06:09.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b2ffd700 1 -- 192.168.123.105:0/2101874895 learned_addr learned my addr 192.168.123.105:0/2101874895 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:09.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b37fe700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 0x7f83b40752a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:09.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b2ffd700 1 -- 192.168.123.105:0/2101874895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 msgr2=0x7f83b40752a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:09.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b2ffd700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 0x7f83b40752a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:09.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 
7f83b2ffd700 1 -- 192.168.123.105:0/2101874895 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f83a40097e0 con 0x7f83b4103cf0 2026-03-10T09:06:09.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b37fe700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 0x7f83b40752a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:06:09.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.288+0000 7f83b2ffd700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b40757e0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f83a4009fd0 tx=0x7f83a40049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:09.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.289+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f83a401d070 con 0x7f83b4103cf0 2026-03-10T09:06:09.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.289+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f83b4075fa0 con 0x7f83b4103cf0 2026-03-10T09:06:09.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.289+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f83b41a7540 con 0x7f83b4103cf0 2026-03-10T09:06:09.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.289+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f83a400bc50 con 
0x7f83b4103cf0 2026-03-10T09:06:09.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.289+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f83a4017610 con 0x7f83b4103cf0 2026-03-10T09:06:09.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.290+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f83a400f460 con 0x7f83b4103cf0 2026-03-10T09:06:09.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.291+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8394005320 con 0x7f83b4103cf0 2026-03-10T09:06:09.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.291+0000 7f83b0ff9700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83a0077910 0x7f83a0079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:09.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.291+0000 7f83b37fe700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83a0077910 0x7f83a0079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:09.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.292+0000 7f83b37fe700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83a0077910 0x7f83a0079dd0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f839c005fd0 tx=0x7f839c005e20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:06:09.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.292+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f83a409b260 con 0x7f83b4103cf0 2026-03-10T09:06:09.295 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.294+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f83a4063a10 con 0x7f83b4103cf0 2026-03-10T09:06:09.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.446+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7f8394005190 con 0x7f83b4103cf0 2026-03-10T09:06:09.448 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.447+0000 7f83b0ff9700 1 -- 192.168.123.105:0/2101874895 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v30) v1 ==== 107+0+4187 (secure 0 0 0) 0x7f83a4063160 con 0x7f83b4103cf0 2026-03-10T09:06:09.449 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:09.449 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":19,"btime":"2026-03-10T09:03:21:674239+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24307,"name":"cephfs.vm08.ssijow","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:21.674237+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:09.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.449+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83a0077910 msgr2=0x7f83a0079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:09.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.449+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83a0077910 0x7f83a0079dd0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f839c005fd0 tx=0x7f839c005e20 comp rx=0 tx=0).stop 2026-03-10T09:06:09.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.449+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 msgr2=0x7f83b40757e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:09.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.449+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b40757e0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f83a4009fd0 tx=0x7f83a40049e0 comp rx=0 tx=0).stop 2026-03-10T09:06:09.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 shutdown_connections 2026-03-10T09:06:09.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f83a0077910 0x7f83a0079dd0 secure :-1 
s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f839c005fd0 tx=0x7f839c005e20 comp rx=0 tx=0).stop 2026-03-10T09:06:09.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83b4103340 0x7f83b40752a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:09.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 --2- 192.168.123.105:0/2101874895 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83b4103cf0 0x7f83b40757e0 secure :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f83a4009fd0 tx=0x7f83a40049e0 comp rx=0 tx=0).stop 2026-03-10T09:06:09.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 >> 192.168.123.105:0/2101874895 conn(0x7f83b40feb90 msgr2=0x7f83b41075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:09.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 shutdown_connections 2026-03-10T09:06:09.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:09.450+0000 7f83b9c11700 1 -- 192.168.123.105:0/2101874895 wait complete. 2026-03-10T09:06:09.453 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 19 2026-03-10T09:06:09.693 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 20 2026-03-10T09:06:09.710 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:09 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1440424248' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T09:06:09.710 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:09 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/2101874895' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T09:06:09.857 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:10.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:09 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1440424248' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T09:06:10.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:09 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2101874895' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T09:06:10.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.351+0000 7f1404d50700 1 -- 192.168.123.105:0/4196143499 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 msgr2=0x7f1400108750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:10.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.351+0000 7f1404d50700 1 --2- 192.168.123.105:0/4196143499 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f1400108750 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f13e8009b00 tx=0x7f13e8009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:10.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.351+0000 7f1404d50700 1 -- 192.168.123.105:0/4196143499 shutdown_connections 2026-03-10T09:06:10.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.351+0000 7f1404d50700 1 --2- 192.168.123.105:0/4196143499 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f1400108750 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:10.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.351+0000 7f1404d50700 1 --2- 192.168.123.105:0/4196143499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f140010f660 0x7f1400107d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:10.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.351+0000 7f1404d50700 1 -- 192.168.123.105:0/4196143499 >> 192.168.123.105:0/4196143499 conn(0x7f140006d0f0 msgr2=0x7f140006d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:10.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.353+0000 7f1404d50700 1 -- 192.168.123.105:0/4196143499 shutdown_connections 2026-03-10T09:06:10.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.353+0000 7f1404d50700 1 -- 192.168.123.105:0/4196143499 wait complete. 
2026-03-10T09:06:10.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.353+0000 7f1404d50700 1 Processor -- start 2026-03-10T09:06:10.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.353+0000 7f1404d50700 1 -- start start 2026-03-10T09:06:10.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f1404d50700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f14001ab870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:10.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f1404d50700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f140010f660 0x7f14001abdb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:10.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f1404d50700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14001a5860 con 0x7f14001082d0 2026-03-10T09:06:10.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f1404d50700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14001a59d0 con 0x7f140010f660 2026-03-10T09:06:10.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13fe59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f14001ab870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:10.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13fe59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f14001ab870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:60636/0 (socket says 192.168.123.105:60636) 2026-03-10T09:06:10.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13fe59c700 1 -- 192.168.123.105:0/1108247170 learned_addr learned my addr 192.168.123.105:0/1108247170 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:10.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13f5bff700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f140010f660 0x7f14001abdb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:10.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13fe59c700 1 -- 192.168.123.105:0/1108247170 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f140010f660 msgr2=0x7f14001abdb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:10.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13fe59c700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f140010f660 0x7f14001abdb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.354+0000 7f13fe59c700 1 -- 192.168.123.105:0/1108247170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13f0009710 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.355+0000 7f13fe59c700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f14001ab870 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f13f000ec80 tx=0x7f13f000ef90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.355+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13f000ccd0 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.355+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f13f0004500 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.355+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13f00052c0 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.355+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f13e80097e0 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.355+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f14001a6070 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.357+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f13f001e030 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.357+0000 7f13f7fff700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f13ec077990 0x7f13ec079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:10.359 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.357+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f13f0014070 con 0x7f14001082d0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.358+0000 7f13f5bff700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f13ec077990 0x7f13ec079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:10.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.358+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f140010a820 con 0x7f14001082d0 2026-03-10T09:06:10.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.358+0000 7f13f5bff700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f13ec077990 0x7f13ec079e50 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f14001a7090 tx=0x7f13e800b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:10.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.361+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f13f00625d0 con 0x7f14001082d0 2026-03-10T09:06:10.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.510+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f1400066e80 con 
0x7f14001082d0 2026-03-10T09:06:10.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.511+0000 7f13f7fff700 1 -- 192.168.123.105:0/1108247170 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v30) v1 ==== 107+0+4198 (secure 0 0 0) 0x7f13f0061d20 con 0x7f14001082d0 2026-03-10T09:06:10.513 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:10.513 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":20,"btime":"2026-03-10T09:03:21:683410+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":20,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:21.683406+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm08.ssijow","rank":0,"incarnation":20,"state":"up:replay","state_seq":2,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:10.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f13ec077990 msgr2=0x7f13ec079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:10.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f13ec077990 0x7f13ec079e50 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f14001a7090 tx=0x7f13e800b540 comp rx=0 tx=0).stop 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 msgr2=0x7f14001ab870 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f14001ab870 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f13f000ec80 tx=0x7f13f000ef90 comp rx=0 tx=0).stop 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 shutdown_connections 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f13ec077990 0x7f13ec079e50 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f14001082d0 0x7f14001ab870 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 --2- 192.168.123.105:0/1108247170 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f140010f660 0x7f14001abdb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 >> 192.168.123.105:0/1108247170 conn(0x7f140006d0f0 msgr2=0x7f140010d510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:10.516 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 shutdown_connections 2026-03-10T09:06:10.516 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:10.514+0000 7f1404d50700 1 -- 192.168.123.105:0/1108247170 wait complete. 2026-03-10T09:06:10.517 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 20 2026-03-10T09:06:10.565 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 21 2026-03-10T09:06:10.759 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:10.909 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:10 vm05.local ceph-mon[111630]: pgmap v348: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:10.909 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:10 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1108247170' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T09:06:11.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:10 vm08.local ceph-mon[101330]: pgmap v348: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:11.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:10 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/1108247170' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.131+0000 7f3339576700 1 -- 192.168.123.105:0/2857350234 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103cf0 msgr2=0x7f3334107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.131+0000 7f3339576700 1 --2- 192.168.123.105:0/2857350234 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103cf0 0x7f3334107d40 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f3324009a60 tx=0x7f3324009d70 comp rx=0 tx=0).stop 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.132+0000 7f3339576700 1 -- 192.168.123.105:0/2857350234 shutdown_connections 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.132+0000 7f3339576700 1 --2- 192.168.123.105:0/2857350234 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103cf0 0x7f3334107d40 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.132+0000 7f3339576700 1 --2- 192.168.123.105:0/2857350234 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103340 0x7f3334103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.132+0000 7f3339576700 1 -- 192.168.123.105:0/2857350234 >> 192.168.123.105:0/2857350234 conn(0x7f33340feb90 msgr2=0x7f3334100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:11.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.132+0000 7f3339576700 1 -- 192.168.123.105:0/2857350234 shutdown_connections 
2026-03-10T09:06:11.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.132+0000 7f3339576700 1 -- 192.168.123.105:0/2857350234 wait complete. 2026-03-10T09:06:11.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.133+0000 7f3339576700 1 Processor -- start 2026-03-10T09:06:11.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.133+0000 7f3339576700 1 -- start start 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.133+0000 7f3339576700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 0x7f333410fce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.133+0000 7f3339576700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103cf0 0x7f3334110260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.133+0000 7f3339576700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3334110940 con 0x7f3334103340 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.133+0000 7f3339576700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3334113680 con 0x7f3334103cf0 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 0x7f333410fce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 
0x7f333410fce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60648/0 (socket says 192.168.123.105:60648) 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f33327fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103cf0 0x7f3334110260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 -- 192.168.123.105:0/4072458647 learned_addr learned my addr 192.168.123.105:0/4072458647 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 -- 192.168.123.105:0/4072458647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103cf0 msgr2=0x7f3334110260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103cf0 0x7f3334110260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 -- 192.168.123.105:0/4072458647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3324009710 con 0x7f3334103340 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f33327fc700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103cf0 0x7f3334110260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.134+0000 7f3332ffd700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 0x7f333410fce0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f331c00eab0 tx=0x7f331c00edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.135+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f331c00cb80 con 0x7f3334103340 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.135+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f331c004d10 con 0x7f3334103340 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.135+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f331c010430 con 0x7f3334103340 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.135+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3334113960 con 0x7f3334103340 2026-03-10T09:06:11.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.135+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3334113eb0 con 0x7f3334103340 2026-03-10T09:06:11.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.137+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) 
v1 ==== 100115+0+0 (secure 0 0 0) 0x7f331c010660 con 0x7f3334103340 2026-03-10T09:06:11.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.137+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f333404ea90 con 0x7f3334103340 2026-03-10T09:06:11.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.140+0000 7f332bfff700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3320077920 0x7f3320079de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:11.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.140+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f331c014070 con 0x7f3334103340 2026-03-10T09:06:11.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.140+0000 7f33327fc700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3320077920 0x7f3320079de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:11.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.140+0000 7f33327fc700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3320077920 0x7f3320079de0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f3334111300 tx=0x7f33240058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:11.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.141+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f331c062670 con 0x7f3334103340 2026-03-10T09:06:11.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.287+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7f33341110e0 con 0x7f3334103340 2026-03-10T09:06:11.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.287+0000 7f332bfff700 1 -- 192.168.123.105:0/4072458647 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v30) v1 ==== 107+0+4203 (secure 0 0 0) 0x7f331c061dc0 con 0x7f3334103340 2026-03-10T09:06:11.289 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:11.289 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":21,"btime":"2026-03-10T09:03:26:219880+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode 
in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:26.169736+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline 
data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm08.ssijow","rank":0,"incarnation":20,"state":"up:reconnect","state_seq":159,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:11.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.290+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3320077920 msgr2=0x7f3320079de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.290+0000 7f3339576700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3320077920 0x7f3320079de0 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f3334111300 tx=0x7f33240058e0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.290+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 msgr2=0x7f333410fce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.290+0000 7f3339576700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 0x7f333410fce0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f331c00eab0 tx=0x7f331c00edc0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.290+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 shutdown_connections 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.290+0000 7f3339576700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f3320077920 0x7f3320079de0 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.291+0000 7f3339576700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3334103340 0x7f333410fce0 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.291+0000 7f3339576700 1 --2- 192.168.123.105:0/4072458647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3334103cf0 0x7f3334110260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.291+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 >> 192.168.123.105:0/4072458647 conn(0x7f33340feb90 msgr2=0x7f3334106d20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.291+0000 7f3339576700 1 -- 
192.168.123.105:0/4072458647 shutdown_connections 2026-03-10T09:06:11.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.291+0000 7f3339576700 1 -- 192.168.123.105:0/4072458647 wait complete. 2026-03-10T09:06:11.293 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 21 2026-03-10T09:06:11.355 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 22 2026-03-10T09:06:11.504 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.759+0000 7ff97478c700 1 -- 192.168.123.105:0/3303971519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073130 msgr2=0x7ff96c073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.759+0000 7ff97478c700 1 --2- 192.168.123.105:0/3303971519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073130 0x7ff96c073510 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7ff95c009b00 tx=0x7ff95c009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 -- 192.168.123.105:0/3303971519 shutdown_connections 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 --2- 192.168.123.105:0/3303971519 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c073a50 0x7ff96c111940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 --2- 192.168.123.105:0/3303971519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073130 
0x7ff96c073510 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 -- 192.168.123.105:0/3303971519 >> 192.168.123.105:0/3303971519 conn(0x7ff96c0fc920 msgr2=0x7ff96c0fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 -- 192.168.123.105:0/3303971519 shutdown_connections 2026-03-10T09:06:11.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 -- 192.168.123.105:0/3303971519 wait complete. 2026-03-10T09:06:11.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.760+0000 7ff97478c700 1 Processor -- start 2026-03-10T09:06:11.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff97478c700 1 -- start start 2026-03-10T09:06:11.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff97478c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073a50 0x7ff96c19d300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:11.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff97478c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 0x7ff96c1a1cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff97478c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff96c19de60 con 0x7ff96c073a50 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff97478c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff96c19dfd0 con 0x7ff96c19d840 2026-03-10T09:06:11.763 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff971d27700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 0x7ff96c1a1cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff971d27700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 0x7ff96c1a1cb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36904/0 (socket says 192.168.123.105:36904) 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.761+0000 7ff971d27700 1 -- 192.168.123.105:0/1821004061 learned_addr learned my addr 192.168.123.105:0/1821004061 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff971d27700 1 -- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073a50 msgr2=0x7ff96c19d300 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff972528700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073a50 0x7ff96c19d300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff971d27700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073a50 0x7ff96c19d300 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.763 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff971d27700 1 -- 192.168.123.105:0/1821004061 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff95c0097e0 con 0x7ff96c19d840 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff972528700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073a50 0x7ff96c19d300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:06:11.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff971d27700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 0x7ff96c1a1cb0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7ff96800d8d0 tx=0x7ff96800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:11.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.762+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff968009940 con 0x7ff96c19d840 2026-03-10T09:06:11.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.763+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff96c1a22b0 con 0x7ff96c19d840 2026-03-10T09:06:11.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.763+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff968010460 con 0x7ff96c19d840 2026-03-10T09:06:11.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.763+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff96800f5d0 con 0x7ff96c19d840 2026-03-10T09:06:11.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.763+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff96c1a2800 con 0x7ff96c19d840 2026-03-10T09:06:11.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.764+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff96c10f0c0 con 0x7ff96c19d840 2026-03-10T09:06:11.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.765+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff968010a90 con 0x7ff96c19d840 2026-03-10T09:06:11.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.765+0000 7ff9637fe700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff9580778c0 0x7ff958079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:11.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.765+0000 7ff972528700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff9580778c0 0x7ff958079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:11.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.765+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff968099ff0 con 0x7ff96c19d840 2026-03-10T09:06:11.767 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.766+0000 7ff972528700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff9580778c0 0x7ff958079d80 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7ff95c005950 tx=0x7ff95c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:11.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.767+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff968062770 con 0x7ff96c19d840 2026-03-10T09:06:11.853 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:11 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/4072458647' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T09:06:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.910+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7ff96c1a2ae0 con 0x7ff96c19d840 2026-03-10T09:06:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.911+0000 7ff9637fe700 1 -- 192.168.123.105:0/1821004061 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v30) v1 ==== 107+0+4200 (secure 0 0 0) 0x7ff968061ec0 con 0x7ff96c19d840 2026-03-10T09:06:11.913 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:11.913 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":22,"btime":"2026-03-10T09:03:27:246231+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:26.252435+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm08.ssijow","rank":0,"incarnation":20,"state":"up:rejoin","state_seq":160,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.914+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff9580778c0 msgr2=0x7ff958079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.914+0000 7ff97478c700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff9580778c0 0x7ff958079d80 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7ff95c005950 tx=0x7ff95c0058e0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 msgr2=0x7ff96c1a1cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 0x7ff96c1a1cb0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7ff96800d8d0 tx=0x7ff96800dc90 comp rx=0 tx=0).stop 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 shutdown_connections 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7ff9580778c0 0x7ff958079d80 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff96c073a50 0x7ff96c19d300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 --2- 192.168.123.105:0/1821004061 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff96c19d840 0x7ff96c1a1cb0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 >> 192.168.123.105:0/1821004061 conn(0x7ff96c0fc920 msgr2=0x7ff96c103450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.915+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 shutdown_connections 2026-03-10T09:06:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:11.916+0000 7ff97478c700 1 -- 192.168.123.105:0/1821004061 wait complete. 2026-03-10T09:06:11.918 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 22 2026-03-10T09:06:11.961 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 23 2026-03-10T09:06:12.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:11 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/4072458647' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T09:06:12.104 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:12.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.336+0000 7f00a70ee700 1 -- 192.168.123.105:0/3855019862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 msgr2=0x7f00a01036c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:12.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.336+0000 7f00a70ee700 1 --2- 192.168.123.105:0/3855019862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a01036c0 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f0090009b00 tx=0x7f0090009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:12.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.337+0000 7f00a70ee700 1 -- 192.168.123.105:0/3855019862 shutdown_connections 2026-03-10T09:06:12.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.337+0000 7f00a70ee700 1 --2- 192.168.123.105:0/3855019862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 0x7f00a0107ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.337+0000 7f00a70ee700 1 --2- 192.168.123.105:0/3855019862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a01036c0 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.337+0000 7f00a70ee700 1 -- 192.168.123.105:0/3855019862 >> 192.168.123.105:0/3855019862 conn(0x7f00a00feb50 msgr2=0x7f00a0100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:12.339 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.338+0000 7f00a70ee700 1 -- 192.168.123.105:0/3855019862 shutdown_connections 2026-03-10T09:06:12.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.338+0000 7f00a70ee700 1 -- 192.168.123.105:0/3855019862 wait complete. 2026-03-10T09:06:12.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.338+0000 7f00a70ee700 1 Processor -- start 2026-03-10T09:06:12.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.338+0000 7f00a70ee700 1 -- start start 2026-03-10T09:06:12.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f00a70ee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a0198d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:12.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f00a70ee700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 0x7f00a01992d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:12.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f00a70ee700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00a01999b0 con 0x7f00a01032e0 2026-03-10T09:06:12.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f00a70ee700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00a019d740 con 0x7f00a0103c90 2026-03-10T09:06:12.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f009ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 0x7f00a01992d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:12.341 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f009ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 0x7f00a01992d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36926/0 (socket says 192.168.123.105:36926) 2026-03-10T09:06:12.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.339+0000 7f009ffff700 1 -- 192.168.123.105:0/1825189207 learned_addr learned my addr 192.168.123.105:0/1825189207 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:12.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.340+0000 7f00a4e8a700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a0198d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:12.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.340+0000 7f00a4e8a700 1 -- 192.168.123.105:0/1825189207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 msgr2=0x7f00a01992d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:12.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.340+0000 7f00a4e8a700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 0x7f00a01992d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.340+0000 7f00a4e8a700 1 -- 192.168.123.105:0/1825189207 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00900097e0 con 0x7f00a01032e0 2026-03-10T09:06:12.342 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.340+0000 7f00a4e8a700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a0198d90 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f0090009fd0 tx=0x7f00900048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:12.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.341+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f009001d070 con 0x7f00a01032e0 2026-03-10T09:06:12.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.341+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0090022470 con 0x7f00a01032e0 2026-03-10T09:06:12.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.341+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f009000f670 con 0x7f00a01032e0 2026-03-10T09:06:12.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.341+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00a019d9c0 con 0x7f00a01032e0 2026-03-10T09:06:12.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.341+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00a019ddd0 con 0x7f00a01032e0 2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.342+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f009000f7d0 con 0x7f00a01032e0 
2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.342+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f00a010b680 con 0x7f00a01032e0 2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.343+0000 7f009dffb700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0088077870 0x7f0088079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.343+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f009009af40 con 0x7f00a01032e0 2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.343+0000 7f009ffff700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0088077870 0x7f0088079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.344+0000 7f009ffff700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0088077870 0x7f0088079d30 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f00a019a3b0 tx=0x7f009400b480 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:12.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.346+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f0090064850 con 0x7f00a01032e0 2026-03-10T09:06:12.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.483+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7f00a019a0f0 con 0x7f00a01032e0 2026-03-10T09:06:12.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.484+0000 7f009dffb700 1 -- 192.168.123.105:0/1825189207 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v30) v1 ==== 107+0+5057 (secure 0 0 0) 0x7f0090063fa0 con 0x7f00a01032e0 2026-03-10T09:06:12.486 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:12.486 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":23,"btime":"2026-03-10T09:03:28:259240+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24317,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3236998387","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3236998387},{"type":"v1","addr":"192.168.123.108:6825","nonce":3236998387}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:28.259239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm08.ssijow","rank":0,"incarnation":20,"state":"up:active","state_seq":161,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-10T09:06:12.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.487+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0088077870 msgr2=0x7f0088079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:12.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.487+0000 7f00a70ee700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0088077870 0x7f0088079d30 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f00a019a3b0 tx=0x7f009400b480 comp rx=0 tx=0).stop 2026-03-10T09:06:12.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.487+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 msgr2=0x7f00a0198d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:12.488 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.487+0000 7f00a70ee700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a0198d90 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f0090009fd0 tx=0x7f00900048c0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 shutdown_connections 2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f0088077870 0x7f0088079d30 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00a01032e0 0x7f00a0198d90 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 --2- 192.168.123.105:0/1825189207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00a0103c90 0x7f00a01992d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 >> 192.168.123.105:0/1825189207 conn(0x7f00a00feb50 msgr2=0x7f00a0100f10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 shutdown_connections 2026-03-10T09:06:12.489 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.488+0000 7f00a70ee700 1 -- 192.168.123.105:0/1825189207 wait complete. 2026-03-10T09:06:12.490 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 23 2026-03-10T09:06:12.536 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 24 2026-03-10T09:06:12.692 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:12.739 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:12 vm05.local ceph-mon[111630]: pgmap v349: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:12.739 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:12 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/1821004061' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T09:06:12.739 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:12 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1825189207' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.941+0000 7f5029487700 1 -- 192.168.123.105:0/3435938850 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5024068df0 msgr2=0x7f502410d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.941+0000 7f5029487700 1 --2- 192.168.123.105:0/3435938850 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5024068df0 0x7f502410d5b0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f5014009b30 tx=0x7f5014009e40 comp rx=0 tx=0).stop 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 -- 192.168.123.105:0/3435938850 shutdown_connections 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 --2- 192.168.123.105:0/3435938850 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5024068df0 0x7f502410d5b0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 --2- 192.168.123.105:0/3435938850 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f50240684d0 0x7f50240688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 -- 192.168.123.105:0/3435938850 >> 192.168.123.105:0/3435938850 
conn(0x7f5024075960 msgr2=0x7f5024075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 -- 192.168.123.105:0/3435938850 shutdown_connections 2026-03-10T09:06:12.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 -- 192.168.123.105:0/3435938850 wait complete. 2026-03-10T09:06:12.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.942+0000 7f5029487700 1 Processor -- start 2026-03-10T09:06:12.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f5029487700 1 -- start start 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f5029487700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f50240684d0 0x7f5024198d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f5029487700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 0x7f50241992a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f5029487700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5024199980 con 0x7f50240684d0 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f5029487700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f502419d710 con 0x7f5024068df0 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f5022ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f50240684d0 0x7f5024198d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f50227fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 0x7f50241992a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f50227fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 0x7f50241992a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36936/0 (socket says 192.168.123.105:36936) 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.943+0000 7f50227fc700 1 -- 192.168.123.105:0/3285362172 learned_addr learned my addr 192.168.123.105:0/3285362172 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f50227fc700 1 -- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f50240684d0 msgr2=0x7f5024198d60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f50227fc700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f50240684d0 0x7f5024198d60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f50227fc700 1 -- 192.168.123.105:0/3285362172 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50140097e0 con 0x7f5024068df0 2026-03-10T09:06:12.945 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f50227fc700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 0x7f50241992a0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f50140048c0 tx=0x7f50140048f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:12.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f501401d070 con 0x7f5024068df0 2026-03-10T09:06:12.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f502419d9f0 con 0x7f5024068df0 2026-03-10T09:06:12.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.944+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f502419df40 con 0x7f5024068df0 2026-03-10T09:06:12.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.945+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5014022470 con 0x7f5024068df0 2026-03-10T09:06:12.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.945+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f501400f670 con 0x7f5024068df0 2026-03-10T09:06:12.948 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.946+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f502410ad20 con 0x7f5024068df0 2026-03-10T09:06:12.948 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.947+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5014022ac0 con 0x7f5024068df0 2026-03-10T09:06:12.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.948+0000 7f501bfff700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5010077870 0x7f5010079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:12.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.948+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f501409afb0 con 0x7f5024068df0 2026-03-10T09:06:12.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.948+0000 7f5022ffd700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5010077870 0x7f5010079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:12.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.948+0000 7f5022ffd700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5010077870 0x7f5010079d30 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f500c005950 tx=0x7f500c00b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:12.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:12.950+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f50140d19f0 con 0x7f5024068df0 2026-03-10T09:06:13.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:12 vm08.local ceph-mon[101330]: pgmap v349: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:13.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:12 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1821004061' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T09:06:13.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:12 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1825189207' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T09:06:13.097 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.096+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7f502404ea90 con 0x7f5024068df0 2026-03-10T09:06:13.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.098+0000 7f501bfff700 1 -- 192.168.123.105:0/3285362172 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v30) v1 ==== 107+0+4274 (secure 0 0 0) 0x7f5014063760 con 0x7f5024068df0 2026-03-10T09:06:13.100 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:13.100 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":24,"btime":"2026-03-10T09:03:32:850013+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:28.259239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm08.ssijow","rank":0,"incarnation":20,"state":"up:active","state_seq":161,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5010077870 msgr2=0x7f5010079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5010077870 0x7f5010079d30 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f500c005950 tx=0x7f500c00b410 comp rx=0 tx=0).stop 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 msgr2=0x7f50241992a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 0x7f50241992a0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f50140048c0 tx=0x7f50140048f0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 shutdown_connections 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f5010077870 0x7f5010079d30 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f50240684d0 0x7f5024198d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 --2- 192.168.123.105:0/3285362172 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5024068df0 0x7f50241992a0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.103 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 >> 192.168.123.105:0/3285362172 conn(0x7f5024075960 msgr2=0x7f50240fe960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:13.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 shutdown_connections 2026-03-10T09:06:13.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.102+0000 7f5029487700 1 -- 192.168.123.105:0/3285362172 wait complete. 
2026-03-10T09:06:13.104 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 24 2026-03-10T09:06:13.150 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 25 2026-03-10T09:06:13.304 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.581+0000 7fcca7645700 1 -- 192.168.123.105:0/3221419355 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 msgr2=0x7fcca00735a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.581+0000 7fcca7645700 1 --2- 192.168.123.105:0/3221419355 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca00735a0 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7fcc90009b00 tx=0x7fcc90009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.582+0000 7fcca7645700 1 -- 192.168.123.105:0/3221419355 shutdown_connections 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.582+0000 7fcca7645700 1 --2- 192.168.123.105:0/3221419355 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcca0073ae0 0x7fcca010d170 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.582+0000 7fcca7645700 1 --2- 192.168.123.105:0/3221419355 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca00735a0 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.582+0000 7fcca7645700 
1 -- 192.168.123.105:0/3221419355 >> 192.168.123.105:0/3221419355 conn(0x7fcca00fc920 msgr2=0x7fcca00fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:13.583 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.582+0000 7fcca7645700 1 -- 192.168.123.105:0/3221419355 shutdown_connections 2026-03-10T09:06:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.582+0000 7fcca7645700 1 -- 192.168.123.105:0/3221419355 wait complete. 2026-03-10T09:06:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.583+0000 7fcca7645700 1 Processor -- start 2026-03-10T09:06:13.584 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.583+0000 7fcca7645700 1 -- start start 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.583+0000 7fcca7645700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca0198d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.583+0000 7fcca53e1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca0198d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.583+0000 7fcca53e1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca0198d70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60718/0 (socket says 192.168.123.105:60718) 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.583+0000 7fcca7645700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcca0073ae0 0x7fcca01992b0 unknown :-1 s=NONE pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca7645700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcca0199990 con 0x7fcca00731c0 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca7645700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcca019d720 con 0x7fcca0073ae0 2026-03-10T09:06:13.585 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca53e1700 1 -- 192.168.123.105:0/844551969 learned_addr learned my addr 192.168.123.105:0/844551969 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca53e1700 1 -- 192.168.123.105:0/844551969 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcca0073ae0 msgr2=0x7fcca01992b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca53e1700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcca0073ae0 0x7fcca01992b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca53e1700 1 -- 192.168.123.105:0/844551969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcc900097e0 con 0x7fcca00731c0 2026-03-10T09:06:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcca53e1700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca0198d70 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fcc90006010 tx=0x7fcc9000bfd0 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.584+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc9001d070 con 0x7fcca00731c0 2026-03-10T09:06:13.586 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.585+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcc9000f460 con 0x7fcca00731c0 2026-03-10T09:06:13.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.585+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcca019d9a0 con 0x7fcca00731c0 2026-03-10T09:06:13.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.585+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcca019de90 con 0x7fcca00731c0 2026-03-10T09:06:13.588 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.586+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc90003bf0 con 0x7fcca00731c0 2026-03-10T09:06:13.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.587+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcc9002b440 con 0x7fcca00731c0 2026-03-10T09:06:13.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.587+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcca010a870 con 0x7fcca00731c0 2026-03-10T09:06:13.591 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.588+0000 7fcc967fc700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcc8c0778c0 0x7fcc8c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:13.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.588+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fcc9009af20 con 0x7fcca00731c0 2026-03-10T09:06:13.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.589+0000 7fcca4be0700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcc8c0778c0 0x7fcc8c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.590+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcc90063660 con 0x7fcca00731c0 2026-03-10T09:06:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.591+0000 7fcca4be0700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcc8c0778c0 0x7fcc8c079d80 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7fcc9c005950 tx=0x7fcc9c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:13.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.736+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 
-- 0x7fcca019a0d0 con 0x7fcca00731c0 2026-03-10T09:06:13.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.737+0000 7fcc967fc700 1 -- 192.168.123.105:0/844551969 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v30) v1 ==== 107+0+5125 (secure 0 0 0) 0x7fcc90062db0 con 0x7fcca00731c0 2026-03-10T09:06:13.738 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:13.739 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":25,"btime":"2026-03-10T09:03:35:929325+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":18},{"gid":34470,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904677772","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904677772},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904677772}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:28.259239+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":104,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm08.ssijow","rank":0,"incarnation":20,"state":"up:active","state_seq":161,"addr":"192.168.123.108:6827/573665424","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":573665424},{"type":"v1","addr":"192.168.123.108:6827","nonce":573665424}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24307,"qdb_cluster":[24307]},"id":1}]} 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcc8c0778c0 msgr2=0x7fcc8c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcc8c0778c0 0x7fcc8c079d80 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7fcc9c005950 tx=0x7fcc9c0058e0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 msgr2=0x7fcca0198d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca0198d70 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fcc90006010 tx=0x7fcc9000bfd0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 shutdown_connections 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fcc8c0778c0 0x7fcc8c079d80 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcca00731c0 0x7fcca0198d70 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 --2- 192.168.123.105:0/844551969 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcca0073ae0 0x7fcca01992b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.740+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 >> 192.168.123.105:0/844551969 conn(0x7fcca00fc920 msgr2=0x7fcca01079b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.741+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 shutdown_connections 2026-03-10T09:06:13.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:13.741+0000 7fcca7645700 1 -- 192.168.123.105:0/844551969 wait complete. 2026-03-10T09:06:13.743 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 25 2026-03-10T09:06:13.813 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 26 2026-03-10T09:06:13.964 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:13 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3285362172' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T09:06:13.973 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:14.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:13 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3285362172' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T09:06:14.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.235+0000 7fb809917700 1 -- 192.168.123.105:0/2707345945 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 msgr2=0x7fb8041059d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:14.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.235+0000 7fb809917700 1 --2- 192.168.123.105:0/2707345945 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb8041059d0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fb7f4009b50 tx=0x7fb7f4009e60 comp rx=0 tx=0).stop 2026-03-10T09:06:14.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.236+0000 7fb809917700 1 -- 192.168.123.105:0/2707345945 shutdown_connections 2026-03-10T09:06:14.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.236+0000 7fb809917700 1 --2- 192.168.123.105:0/2707345945 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb8041059d0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.236+0000 7fb809917700 1 --2- 192.168.123.105:0/2707345945 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb804068730 0x7fb804068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.238 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.236+0000 7fb809917700 1 -- 192.168.123.105:0/2707345945 >> 192.168.123.105:0/2707345945 conn(0x7fb804076950 msgr2=0x7fb804076d60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:14.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.236+0000 7fb809917700 1 -- 192.168.123.105:0/2707345945 shutdown_connections 2026-03-10T09:06:14.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.237+0000 7fb809917700 1 -- 192.168.123.105:0/2707345945 wait complete. 2026-03-10T09:06:14.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.237+0000 7fb809917700 1 Processor -- start 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.237+0000 7fb809917700 1 -- start start 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb809917700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb804068730 0x7fb80419c8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb809917700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb80419ce00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb809917700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb80419d490 con 0x7fb8040690e0 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb809917700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb804196940 con 0x7fb804068730 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb8027fc700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb80419ce00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb8027fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb80419ce00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60744/0 (socket says 192.168.123.105:60744) 2026-03-10T09:06:14.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb8027fc700 1 -- 192.168.123.105:0/3288892210 learned_addr learned my addr 192.168.123.105:0/3288892210 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:14.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb8027fc700 1 -- 192.168.123.105:0/3288892210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb804068730 msgr2=0x7fb80419c8c0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T09:06:14.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb8027fc700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb804068730 0x7fb80419c8c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.238+0000 7fb8027fc700 1 -- 192.168.123.105:0/3288892210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7f40097e0 con 0x7fb8040690e0 2026-03-10T09:06:14.240 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.239+0000 7fb8027fc700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 
0x7fb80419ce00 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fb7f4004cb0 tx=0x7fb7f4005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:14.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.239+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7f401d070 con 0x7fb8040690e0 2026-03-10T09:06:14.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.239+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb7f4022470 con 0x7fb8040690e0 2026-03-10T09:06:14.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.239+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7f400f650 con 0x7fb8040690e0 2026-03-10T09:06:14.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.239+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb804196bc0 con 0x7fb8040690e0 2026-03-10T09:06:14.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.239+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb8041970b0 con 0x7fb8040690e0 2026-03-10T09:06:14.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.240+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb8041092b0 con 0x7fb8040690e0 2026-03-10T09:06:14.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.245+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb7f4022a60 con 0x7fb8040690e0 2026-03-10T09:06:14.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.245+0000 7fb808915700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7f0077990 0x7fb7f0079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:14.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.245+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fb7f409bc00 con 0x7fb8040690e0 2026-03-10T09:06:14.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.245+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb7f40cba70 con 0x7fb8040690e0 2026-03-10T09:06:14.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.246+0000 7fb802ffd700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7f0077990 0x7fb7f0079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:14.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.246+0000 7fb802ffd700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7f0077990 0x7fb7f0079e50 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7fb7ec009d30 tx=0x7fb7ec009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:14.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.398+0000 7fb809917700 1 -- 
192.168.123.105:0/3288892210 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7fb80404f350 con 0x7fb8040690e0 2026-03-10T09:06:14.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.399+0000 7fb808915700 1 -- 192.168.123.105:0/3288892210 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v30) v1 ==== 107+0+4323 (secure 0 0 0) 0x7fb7f4064380 con 0x7fb8040690e0 2026-03-10T09:06:14.400 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:14.400 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":26,"btime":"2026-03-10T09:03:38:170949+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor 
log segments","feature_12":"quiesce subvolumes"}},"epoch":18},{"gid":34470,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904677772","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904677772},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904677772}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:38.170946+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":106,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:14.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.402+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7f0077990 msgr2=0x7fb7f0079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:14.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.402+0000 7fb809917700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7f0077990 0x7fb7f0079e50 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7fb7ec009d30 tx=0x7fb7ec009450 comp rx=0 tx=0).stop 2026-03-10T09:06:14.403 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.402+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 msgr2=0x7fb80419ce00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.402+0000 7fb809917700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb80419ce00 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fb7f4004cb0 tx=0x7fb7f4005dc0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 shutdown_connections 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fb7f0077990 0x7fb7f0079e50 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb804068730 0x7fb80419c8c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 --2- 192.168.123.105:0/3288892210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb8040690e0 0x7fb80419ce00 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 >> 192.168.123.105:0/3288892210 conn(0x7fb804076950 msgr2=0x7fb804105240 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 shutdown_connections 2026-03-10T09:06:14.404 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.403+0000 7fb809917700 1 -- 192.168.123.105:0/3288892210 wait complete. 2026-03-10T09:06:14.405 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 26 2026-03-10T09:06:14.471 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 27 2026-03-10T09:06:14.621 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:14.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.887+0000 7f97dbef1700 1 -- 192.168.123.105:0/1070677012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d4103340 msgr2=0x7f97d4103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:14.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.887+0000 7f97dbef1700 1 --2- 192.168.123.105:0/1070677012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d4103340 0x7f97d4103720 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f97c4009b00 tx=0x7f97c4009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.888+0000 7f97dbef1700 1 -- 192.168.123.105:0/1070677012 shutdown_connections 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.888+0000 7f97dbef1700 1 --2- 192.168.123.105:0/1070677012 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.890 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.888+0000 7f97dbef1700 1 --2- 192.168.123.105:0/1070677012 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d4103340 0x7f97d4103720 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.888+0000 7f97dbef1700 1 -- 192.168.123.105:0/1070677012 >> 192.168.123.105:0/1070677012 conn(0x7f97d40feb90 msgr2=0x7f97d4100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.888+0000 7f97dbef1700 1 -- 192.168.123.105:0/1070677012 shutdown_connections 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.889+0000 7f97dbef1700 1 -- 192.168.123.105:0/1070677012 wait complete. 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.889+0000 7f97dbef1700 1 Processor -- start 2026-03-10T09:06:14.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.889+0000 7f97dbef1700 1 -- start start 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.889+0000 7f97dbef1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4199070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97dbef1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d41995b0 0x7f97d419da20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97dbef1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97d4199bd0 con 0x7f97d41995b0 2026-03-10T09:06:14.891 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97dbef1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97d4199d40 con 0x7f97d4103cf0 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4199070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d948c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d41995b0 0x7f97d419da20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4199070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36984/0 (socket says 192.168.123.105:36984) 2026-03-10T09:06:14.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 -- 192.168.123.105:0/2282822877 learned_addr learned my addr 192.168.123.105:0/2282822877 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 -- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d41995b0 msgr2=0x7f97d419da20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 --2- 192.168.123.105:0/2282822877 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d41995b0 0x7f97d419da20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 -- 192.168.123.105:0/2282822877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f97c40097e0 con 0x7f97d4103cf0 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97d9c8d700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4199070 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f97c40052a0 tx=0x7f97c400ba00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f97c401d070 con 0x7f97d4103cf0 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f97c400f460 con 0x7f97d4103cf0 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.890+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f97c4021620 con 0x7f97d4103cf0 2026-03-10T09:06:14.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.891+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97d419dfc0 con 0x7f97d4103cf0 2026-03-10T09:06:14.892 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.891+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97d419e510 con 0x7f97d4103cf0 2026-03-10T09:06:14.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.891+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f97d410b6e0 con 0x7f97d4103cf0 2026-03-10T09:06:14.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.892+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f97c400f5d0 con 0x7f97d4103cf0 2026-03-10T09:06:14.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.892+0000 7f97caffd700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f97c00778c0 0x7f97c0079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:14.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.892+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f97c409a800 con 0x7f97d4103cf0 2026-03-10T09:06:14.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.893+0000 7f97d948c700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f97c00778c0 0x7f97c0079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:14.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.893+0000 7f97d948c700 1 --2- 192.168.123.105:0/2282822877 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f97c00778c0 0x7f97c0079d80 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f97d4074ec0 tx=0x7f97d0009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:14.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:14.895+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f97c4062f00 con 0x7f97d4103cf0 2026-03-10T09:06:14.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:14 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/844551969' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T09:06:14.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:14 vm05.local ceph-mon[111630]: pgmap v350: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:14.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:14 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3288892210' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T09:06:15.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.041+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f97d404ea90 con 0x7f97d4103cf0 2026-03-10T09:06:15.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.042+0000 7f97caffd700 1 -- 192.168.123.105:0/2282822877 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v30) v1 ==== 107+0+4402 (secure 0 0 0) 0x7f97c4062650 con 0x7f97d4103cf0 2026-03-10T09:06:15.044 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:15.044 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":27,"btime":"2026-03-10T09:03:38:177701+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34470,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904677772","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904677772},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904677772}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:38.177693+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":106,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag 
is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34444},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34444":{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":27,"state":"up:replay","state_seq":1,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:15.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f97c00778c0 msgr2=0x7f97c0079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:15.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f97c00778c0 0x7f97c0079d80 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f97d4074ec0 tx=0x7f97d0009450 comp rx=0 tx=0).stop 
2026-03-10T09:06:15.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 msgr2=0x7f97d4199070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:15.046 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4199070 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f97c40052a0 tx=0x7f97c400ba00 comp rx=0 tx=0).stop 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 shutdown_connections 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f97c00778c0 0x7f97c0079d80 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f97d4103cf0 0x7f97d4199070 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 --2- 192.168.123.105:0/2282822877 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f97d41995b0 0x7f97d419da20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.045+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 >> 192.168.123.105:0/2282822877 conn(0x7f97d40feb90 msgr2=0x7f97d41002c0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.046+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 shutdown_connections 2026-03-10T09:06:15.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.046+0000 7f97dbef1700 1 -- 192.168.123.105:0/2282822877 wait complete. 2026-03-10T09:06:15.048 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 27 2026-03-10T09:06:15.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:14 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/844551969' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T09:06:15.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:14 vm08.local ceph-mon[101330]: pgmap v350: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:15.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:14 vm08.local ceph-mon[101330]: from='client.? 
192.168.123.105:0/3288892210' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T09:06:15.114 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 28 2026-03-10T09:06:15.263 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.535+0000 7f2f0b453700 1 -- 192.168.123.105:0/2757557734 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 msgr2=0x7f2f04103190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.535+0000 7f2f0b453700 1 --2- 192.168.123.105:0/2757557734 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f04103190 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f2f00009b00 tx=0x7f2f00009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.535+0000 7f2f0b453700 1 -- 192.168.123.105:0/2757557734 shutdown_connections 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.535+0000 7f2f0b453700 1 --2- 192.168.123.105:0/2757557734 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.535+0000 7f2f0b453700 1 --2- 192.168.123.105:0/2757557734 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f04103190 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.537 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.536+0000 7f2f0b453700 1 -- 192.168.123.105:0/2757557734 >> 192.168.123.105:0/2757557734 conn(0x7f2f04076b70 msgr2=0x7f2f04076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.536+0000 7f2f0b453700 1 -- 192.168.123.105:0/2757557734 shutdown_connections 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.536+0000 7f2f0b453700 1 -- 192.168.123.105:0/2757557734 wait complete. 2026-03-10T09:06:15.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.536+0000 7f2f0b453700 1 Processor -- start 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.536+0000 7f2f0b453700 1 -- start start 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f0b453700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04196d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f0b453700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f041972b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f0b453700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f04197990 con 0x7f2f04102db0 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f0b453700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f0419b6d0 con 0x7f2f04069180 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04196d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f089ee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f041972b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04196d70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:36994/0 (socket says 192.168.123.105:36994) 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 -- 192.168.123.105:0/1725846207 learned_addr learned my addr 192.168.123.105:0/1725846207 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 -- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 msgr2=0x7f2f041972b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f041972b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 -- 192.168.123.105:0/1725846207 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f000097e0 con 0x7f2f04069180 2026-03-10T09:06:15.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f089ee700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f041972b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T09:06:15.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.537+0000 7f2f091ef700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04196d70 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f2f00004930 tx=0x7f2f00004960 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:15.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.538+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f0001d070 con 0x7f2f04069180 2026-03-10T09:06:15.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.538+0000 7f2f0b453700 1 -- 192.168.123.105:0/1725846207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2f0419b890 con 0x7f2f04069180 2026-03-10T09:06:15.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.538+0000 7f2f0b453700 1 -- 192.168.123.105:0/1725846207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2f0419bd80 con 0x7f2f04069180 2026-03-10T09:06:15.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.538+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2f0000bc50 con 0x7f2f04069180 2026-03-10T09:06:15.540 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.538+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f0000e660 con 0x7f2f04069180 2026-03-10T09:06:15.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.539+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2ee8005320 con 0x7f2f04069180 2026-03-10T09:06:15.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.540+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2f0000f460 con 0x7f2f04069180 2026-03-10T09:06:15.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.540+0000 7f2efa7fc700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2ef0077870 0x7f2ef0079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:15.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.540+0000 7f2f089ee700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2ef0077870 0x7f2ef0079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:15.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.540+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f2f0009b130 con 0x7f2f04069180 2026-03-10T09:06:15.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.540+0000 7f2f089ee700 1 --2- 192.168.123.105:0/1725846207 >> 
[v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2ef0077870 0x7f2ef0079d30 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f2f04198340 tx=0x7f2ef4009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:15.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.543+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2f000638e0 con 0x7f2f04069180 2026-03-10T09:06:15.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.680+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7f2ee8005190 con 0x7f2f04069180 2026-03-10T09:06:15.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.683+0000 7f2efa7fc700 1 -- 192.168.123.105:0/1725846207 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v30) v1 ==== 107+0+4405 (secure 0 0 0) 0x7f2f00063030 con 0x7f2f04069180 2026-03-10T09:06:15.684 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:15.684 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":28,"btime":"2026-03-10T09:03:42:352954+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34470,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904677772","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904677772},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904677772}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:41.962325+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":106,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34444},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34444":{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":27,"state":"up:reconnect","state_seq":8,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.685+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2ef0077870 msgr2=0x7f2ef0079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.685+0000 7f2eeffff700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2ef0077870 0x7f2ef0079d30 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f2f04198340 tx=0x7f2ef4009380 comp rx=0 tx=0).stop 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.685+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 msgr2=0x7f2f04196d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04196d70 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f2f00004930 tx=0x7f2f00004960 comp rx=0 tx=0).stop 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 shutdown_connections 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f2ef0077870 0x7f2ef0079d30 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f04069180 0x7f2f04196d70 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 --2- 192.168.123.105:0/1725846207 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f04102db0 0x7f2f041972b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 >> 192.168.123.105:0/1725846207 conn(0x7f2f04076b70 msgr2=0x7f2f040fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 shutdown_connections 2026-03-10T09:06:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:15.686+0000 7f2eeffff700 1 -- 192.168.123.105:0/1725846207 wait complete. 2026-03-10T09:06:15.688 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 28 2026-03-10T09:06:15.751 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph fs dump --format=json 29 2026-03-10T09:06:15.918 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:15.945 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:15 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/2282822877' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T09:06:15.945 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:15 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/1725846207' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T09:06:16.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:15 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/2282822877' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T09:06:16.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:15 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/1725846207' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T09:06:16.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.173+0000 7fe22c55d700 1 -- 192.168.123.105:0/4048443527 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 msgr2=0x7fe224105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:16.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.173+0000 7fe22c55d700 1 --2- 192.168.123.105:0/4048443527 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe224105b50 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7fe220009a60 tx=0x7fe220009d70 comp rx=0 tx=0).stop 2026-03-10T09:06:16.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.175+0000 7fe22c55d700 1 -- 192.168.123.105:0/4048443527 shutdown_connections 2026-03-10T09:06:16.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.175+0000 7fe22c55d700 1 --2- 192.168.123.105:0/4048443527 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe224105b50 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T09:06:16.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.175+0000 7fe22c55d700 1 --2- 192.168.123.105:0/4048443527 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 0x7fe224068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.175+0000 7fe22c55d700 1 -- 192.168.123.105:0/4048443527 >> 192.168.123.105:0/4048443527 conn(0x7fe224075960 msgr2=0x7fe224075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:16.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.175+0000 7fe22c55d700 1 -- 192.168.123.105:0/4048443527 shutdown_connections 2026-03-10T09:06:16.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.175+0000 7fe22c55d700 1 -- 192.168.123.105:0/4048443527 wait complete. 2026-03-10T09:06:16.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.176+0000 7fe22c55d700 1 Processor -- start 2026-03-10T09:06:16.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.176+0000 7fe22c55d700 1 -- start start 2026-03-10T09:06:16.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.176+0000 7fe22c55d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 0x7fe22419a7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:16.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.176+0000 7fe22c55d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe22419ad20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:16.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.176+0000 7fe22c55d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe22419b3b0 con 0x7fe2240690e0 2026-03-10T09:06:16.178 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.176+0000 7fe22c55d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe224194860 con 0x7fe224068730 2026-03-10T09:06:16.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe229af8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe22419ad20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:16.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe229af8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe22419ad20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60790/0 (socket says 192.168.123.105:60790) 2026-03-10T09:06:16.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe229af8700 1 -- 192.168.123.105:0/586589169 learned_addr learned my addr 192.168.123.105:0/586589169 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:16.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe229af8700 1 -- 192.168.123.105:0/586589169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 msgr2=0x7fe22419a7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:16.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe22a2f9700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 0x7fe22419a7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:16.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe229af8700 1 --2- 
192.168.123.105:0/586589169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 0x7fe22419a7e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.177+0000 7fe229af8700 1 -- 192.168.123.105:0/586589169 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe220009710 con 0x7fe2240690e0 2026-03-10T09:06:16.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.178+0000 7fe22a2f9700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 0x7fe22419a7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:06:16.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.178+0000 7fe229af8700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe22419ad20 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7fe220005950 tx=0x7fe22000f740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:16.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.178+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe22001d070 con 0x7fe2240690e0 2026-03-10T09:06:16.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.178+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe224194ae0 con 0x7fe2240690e0 2026-03-10T09:06:16.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.179+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fe224194fd0 con 0x7fe2240690e0 2026-03-10T09:06:16.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.179+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe22000fca0 con 0x7fe2240690e0 2026-03-10T09:06:16.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.179+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe220017690 con 0x7fe2240690e0 2026-03-10T09:06:16.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.181+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe224109470 con 0x7fe2240690e0 2026-03-10T09:06:16.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.181+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe2200177f0 con 0x7fe2240690e0 2026-03-10T09:06:16.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.182+0000 7fe2177fe700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe21007bd20 0x7fe21007e1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:16.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.182+0000 7fe22a2f9700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe21007bd20 0x7fe21007e1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:16.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.183+0000 7fe22a2f9700 1 --2- 
192.168.123.105:0/586589169 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe21007bd20 0x7fe21007e1e0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7fe218005fd0 tx=0x7fe218009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:16.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.185+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fe220067890 con 0x7fe2240690e0 2026-03-10T09:06:16.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.186+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe2200d49f0 con 0x7fe2240690e0 2026-03-10T09:06:16.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.334+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7fe22404ea90 con 0x7fe2240690e0 2026-03-10T09:06:16.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.335+0000 7fe2177fe700 1 -- 192.168.123.105:0/586589169 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v30) v1 ==== 107+0+4402 (secure 0 0 0) 0x7fe220026090 con 0x7fe2240690e0 2026-03-10T09:06:16.337 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:06:16.337 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":29,"btime":"2026-03-10T09:03:43:359159+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses 
versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34470,"name":"cephfs.vm08.xfzrbx","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3904677772","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3904677772},{"type":"v1","addr":"192.168.123.108:6825","nonce":3904677772}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44367,"name":"cephfs.vm05.slhztf","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/930707688","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":930707688},{"type":"v1","addr":"192.168.123.105:6829","nonce":930707688}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T08:52:52.346264+0000","modified":"2026-03-10T09:03:42.366011+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":106,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34444},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34444":{"gid":34444,"name":"cephfs.vm05.bxdvbu","rank":0,"incarnation":27,"state":"up:rejoin","state_seq":9,"addr":"192.168.123.105:6827/2948722085","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":2948722085},{"type":"v1","addr":"192.168.123.105:6827","nonce":2948722085}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T09:06:16.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe21007bd20 msgr2=0x7fe21007e1e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:16.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe21007bd20 0x7fe21007e1e0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7fe218005fd0 tx=0x7fe218009500 comp rx=0 tx=0).stop 2026-03-10T09:06:16.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 msgr2=0x7fe22419ad20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:16.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe22419ad20 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7fe220005950 tx=0x7fe22000f740 comp rx=0 tx=0).stop 2026-03-10T09:06:16.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 shutdown_connections 2026-03-10T09:06:16.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7fe21007bd20 0x7fe21007e1e0 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe224068730 0x7fe22419a7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.338+0000 7fe22c55d700 1 --2- 192.168.123.105:0/586589169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe2240690e0 0x7fe22419ad20 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.339+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 >> 192.168.123.105:0/586589169 conn(0x7fe224075960 msgr2=0x7fe2240feac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:16.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.339+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 shutdown_connections 2026-03-10T09:06:16.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.339+0000 7fe22c55d700 1 -- 192.168.123.105:0/586589169 wait complete. 2026-03-10T09:06:16.341 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 29 2026-03-10T09:06:16.409 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-10T09:06:16.413 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 
2026-03-10T09:06:16.413 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T09:06:16.413 DEBUG:teuthology.orchestra.run.vm05:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T09:06:16.429 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T09:06:16.429 DEBUG:teuthology.orchestra.run.vm05:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T09:06:16.488 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd blocklist ls 2026-03-10T09:06:16.687 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:16.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:16.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:16 vm05.local ceph-mon[111630]: pgmap v351: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:16.708 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:16 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/586589169' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T09:06:16.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.957+0000 7f70e564b700 1 -- 192.168.123.105:0/2664674978 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e0102a00 msgr2=0x7f70e010aef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:16.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.957+0000 7f70e564b700 1 --2- 192.168.123.105:0/2664674978 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e0102a00 0x7f70e010aef0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f70d0009b00 tx=0x7f70d0009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:16.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.959+0000 7f70e564b700 1 -- 192.168.123.105:0/2664674978 shutdown_connections 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.959+0000 7f70e564b700 1 --2- 192.168.123.105:0/2664674978 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e0102a00 0x7f70e010aef0 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.959+0000 7f70e564b700 1 --2- 192.168.123.105:0/2664674978 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e01020e0 0x7f70e01024c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.959+0000 7f70e564b700 1 -- 192.168.123.105:0/2664674978 >> 192.168.123.105:0/2664674978 conn(0x7f70e00fb830 msgr2=0x7f70e00fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.959+0000 7f70e564b700 1 -- 192.168.123.105:0/2664674978 shutdown_connections 
2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.959+0000 7f70e564b700 1 -- 192.168.123.105:0/2664674978 wait complete. 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70e564b700 1 Processor -- start 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70e564b700 1 -- start start 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70e564b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e01020e0 0x7f70e019c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70e564b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 0x7f70e019ceb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70e564b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f70e019d4b0 con 0x7f70e01020e0 2026-03-10T09:06:16.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70e564b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f70e01969f0 con 0x7f70e0102a00 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70de7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 0x7f70e019ceb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70de7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 
0x7f70e019ceb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:37030/0 (socket says 192.168.123.105:37030) 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.960+0000 7f70de7fc700 1 -- 192.168.123.105:0/3834380235 learned_addr learned my addr 192.168.123.105:0/3834380235 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70de7fc700 1 -- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e01020e0 msgr2=0x7f70e019c970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70deffd700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e01020e0 0x7f70e019c970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70de7fc700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e01020e0 0x7f70e019c970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70de7fc700 1 -- 192.168.123.105:0/3834380235 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f70d00097e0 con 0x7f70e0102a00 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70deffd700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e01020e0 0x7f70e019c970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70de7fc700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 0x7f70e019ceb0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f70d0004930 tx=0x7f70d0004a10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:16.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f70d001d070 con 0x7f70e0102a00 2026-03-10T09:06:16.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f70d000bc50 con 0x7f70e0102a00 2026-03-10T09:06:16.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f70d000f770 con 0x7f70e0102a00 2026-03-10T09:06:16.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f70e0196c70 con 0x7f70e0102a00 2026-03-10T09:06:16.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.961+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f70e0197160 con 0x7f70e0102a00 2026-03-10T09:06:16.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.962+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f70e01085f0 con 0x7f70e0102a00 2026-03-10T09:06:16.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.963+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f70d0022470 con 0x7f70e0102a00 2026-03-10T09:06:16.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.963+0000 7f70d7fff700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f70cc0778c0 0x7f70cc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:16.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.964+0000 7f70deffd700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f70cc0778c0 0x7f70cc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:16.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.964+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f70d009be20 con 0x7f70e0102a00 2026-03-10T09:06:16.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.964+0000 7f70deffd700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f70cc0778c0 0x7f70cc079d80 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f70c8005fd0 tx=0x7f70c8005e20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:16.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:16.965+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 
<== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f70d00645a0 con 0x7f70e0102a00 2026-03-10T09:06:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:16 vm08.local ceph-mon[101330]: pgmap v351: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:17.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:16 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/586589169' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T09:06:17.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.088+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f70e0197f00 con 0x7f70e0102a00 2026-03-10T09:06:17.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.089+0000 7f70d7fff700 1 -- 192.168.123.105:0/3834380235 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 39 entries v106) v1 ==== 81+0+2396 (secure 0 0 0) 0x7f70d0027970 con 0x7f70e0102a00 2026-03-10T09:06:17.092 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6827/573665424 2026-03-11T09:03:38.170722+0000 2026-03-10T09:06:17.092 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6826/573665424 2026-03-11T09:03:38.170722+0000 2026-03-10T09:06:17.092 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6829/2662194502 2026-03-11T09:03:21.674033+0000 2026-03-10T09:06:17.092 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6827/2466638752 
2026-03-11T09:03:08.618601+0000 2026-03-10T09:06:17.092 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1622646269 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.092 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6828/2662194502 2026-03-11T09:03:21.674033+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/361633007 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6828/865080403 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1334754793 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3175564349 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/1161047880 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3434295633 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2061557555 2026-03-11T08:51:15.172314+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3563409701 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/1872027891 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/2531075368 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2206982178 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/2 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/840671361 2026-03-11T08:51:15.172314+0000 2026-03-10T09:06:17.093 
INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6829/865080403 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/2453972605 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/2139984420 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3749819219 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1368648469 2026-03-11T08:50:37.635038+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/2 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3706700996 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2745800648 2026-03-11T08:51:15.172314+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3102676859 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6826/2466638752 2026-03-11T09:03:08.618601+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4149772464 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6824/1416297612 2026-03-11T08:52:58.407587+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6825/1416297612 2026-03-11T08:52:58.407587+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3674537463 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2034780563 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/302360737 
2026-03-11T08:50:37.635038+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2607327145 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3374278469 2026-03-11T08:50:37.635038+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/2453972605 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.093 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4152984295 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f70cc0778c0 msgr2=0x7f70cc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:17.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f70cc0778c0 0x7f70cc079d80 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f70c8005fd0 tx=0x7f70c8005e20 comp rx=0 tx=0).stop 2026-03-10T09:06:17.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 msgr2=0x7f70e019ceb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 0x7f70e019ceb0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f70d0004930 tx=0x7f70d0004a10 comp rx=0 tx=0).stop 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 
shutdown_connections 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f70cc0778c0 0x7f70cc079d80 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.093+0000 7f70e564b700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f70e01020e0 0x7f70e019c970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.094+0000 7f70e564b700 1 --2- 192.168.123.105:0/3834380235 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f70e0102a00 0x7f70e019ceb0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.094+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 >> 192.168.123.105:0/3834380235 conn(0x7f70e00fb830 msgr2=0x7f70e0105730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.094+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 shutdown_connections 2026-03-10T09:06:17.095 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.094+0000 7f70e564b700 1 -- 192.168.123.105:0/3834380235 wait complete. 
2026-03-10T09:06:17.096 INFO:teuthology.orchestra.run.vm05.stderr:listed 39 entries 2026-03-10T09:06:17.156 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T09:06:17.156 DEBUG:teuthology.orchestra.run.vm05:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T09:06:17.173 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph osd blocklist ls 2026-03-10T09:06:17.367 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.623+0000 7f8cd519d700 1 -- 192.168.123.105:0/1293959498 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103cf0 msgr2=0x7f8cd0107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.623+0000 7f8cd519d700 1 --2- 192.168.123.105:0/1293959498 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0107d40 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f8cc0009b00 tx=0x7f8cc0009e10 comp rx=0 tx=0).stop 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.624+0000 7f8cd519d700 1 -- 192.168.123.105:0/1293959498 shutdown_connections 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.624+0000 7f8cd519d700 1 --2- 192.168.123.105:0/1293959498 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0107d40 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.624+0000 7f8cd519d700 1 --2- 192.168.123.105:0/1293959498 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103340 0x7f8cd0103720 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.624+0000 7f8cd519d700 1 -- 192.168.123.105:0/1293959498 >> 192.168.123.105:0/1293959498 conn(0x7f8cd00feb90 msgr2=0x7f8cd0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.624+0000 7f8cd519d700 1 -- 192.168.123.105:0/1293959498 shutdown_connections 2026-03-10T09:06:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.624+0000 7f8cd519d700 1 -- 192.168.123.105:0/1293959498 wait complete. 2026-03-10T09:06:17.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.625+0000 7f8cd519d700 1 Processor -- start 2026-03-10T09:06:17.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.625+0000 7f8cd519d700 1 -- start start 2026-03-10T09:06:17.626 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.625+0000 7f8cd519d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 0x7f8cd0198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.625+0000 7f8cced9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 0x7f8cd0198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.625+0000 7f8cced9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 0x7f8cd0198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:60824/0 (socket says 192.168.123.105:60824) 2026-03-10T09:06:17.627 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.625+0000 7f8cd519d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cd519d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8cd0199a00 con 0x7f8cd0103340 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cd519d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8cd019d790 con 0x7f8cd0103cf0 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cced9d700 1 -- 192.168.123.105:0/3465944089 learned_addr learned my addr 192.168.123.105:0/3465944089 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cce59c700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0199320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cce59c700 1 -- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 msgr2=0x7f8cd0198de0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cce59c700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 0x7f8cd0198de0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.628 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cce59c700 1 -- 192.168.123.105:0/3465944089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8cc00097e0 con 0x7f8cd0103cf0 2026-03-10T09:06:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.626+0000 7f8cced9d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 0x7f8cd0198de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T09:06:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.627+0000 7f8cce59c700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0199320 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f8cc0000c00 tx=0x7f8cc00048c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.627+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8cc001d070 con 0x7f8cd0103cf0 2026-03-10T09:06:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.627+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8cc0022470 con 0x7f8cd0103cf0 2026-03-10T09:06:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.627+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8cd019da10 con 0x7f8cd0103cf0 2026-03-10T09:06:17.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.627+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8cc000f670 con 0x7f8cd0103cf0 2026-03-10T09:06:17.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.627+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8cd019df00 con 0x7f8cd0103cf0 2026-03-10T09:06:17.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.628+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8cd010b740 con 0x7f8cd0103cf0 2026-03-10T09:06:17.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.629+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 35) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8cc0022ac0 con 0x7f8cd0103cf0 2026-03-10T09:06:17.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.629+0000 7f8cc7fff700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cbc0778c0 0x7f8cbc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T09:06:17.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.630+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f8cc009b150 con 0x7f8cd0103cf0 2026-03-10T09:06:17.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.630+0000 7f8cced9d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cbc0778c0 0x7f8cbc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T09:06:17.631 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.630+0000 7f8cced9d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cbc0778c0 0x7f8cbc079d80 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f8cb8005950 tx=0x7f8cb800b500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T09:06:17.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.632+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8cc00637d0 con 0x7f8cd0103cf0 2026-03-10T09:06:17.769 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:17 vm05.local ceph-mon[111630]: from='client.? 192.168.123.105:0/3834380235' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T09:06:17.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.767+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f8cd004ea90 con 0x7f8cd0103cf0 2026-03-10T09:06:17.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.767+0000 7f8cc7fff700 1 -- 192.168.123.105:0/3465944089 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 39 entries v106) v1 ==== 81+0+2396 (secure 0 0 0) 0x7f8cc0062f20 con 0x7f8cd0103cf0 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6827/573665424 2026-03-11T09:03:38.170722+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6826/573665424 2026-03-11T09:03:38.170722+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6829/2662194502 2026-03-11T09:03:21.674033+0000 2026-03-10T09:06:17.772 
INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6827/2466638752 2026-03-11T09:03:08.618601+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1622646269 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6828/2662194502 2026-03-11T09:03:21.674033+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/361633007 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6828/865080403 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1334754793 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3175564349 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/1161047880 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3434295633 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2061557555 2026-03-11T08:51:15.172314+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3563409701 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/1872027891 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/2531075368 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2206982178 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/2 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/840671361 
2026-03-11T08:51:15.172314+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6829/865080403 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/2453972605 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/2139984420 2026-03-11T08:56:20.475310+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3749819219 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1368648469 2026-03-11T08:50:37.635038+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/2 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3706700996 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2745800648 2026-03-11T08:51:15.172314+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3102676859 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6826/2466638752 2026-03-11T09:03:08.618601+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4149772464 2026-03-11T08:55:47.276599+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6824/1416297612 2026-03-11T08:52:58.407587+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6825/1416297612 2026-03-11T08:52:58.407587+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3674537463 2026-03-11T08:50:23.844460+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2034780563 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 
INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/302360737 2026-03-11T08:50:37.635038+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2607327145 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3374278469 2026-03-11T08:50:37.635038+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/2453972605 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.772 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4152984295 2026-03-11T08:56:45.717925+0000 2026-03-10T09:06:17.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.772+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cbc0778c0 msgr2=0x7f8cbc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:17.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.772+0000 7f8cd519d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cbc0778c0 0x7f8cbc079d80 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f8cb8005950 tx=0x7f8cb800b500 comp rx=0 tx=0).stop 2026-03-10T09:06:17.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.772+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103cf0 msgr2=0x7f8cd0199320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T09:06:17.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.772+0000 7f8cd519d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0199320 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f8cc0000c00 tx=0x7f8cc00048c0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.774 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.772+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 shutdown_connections 2026-03-10T09:06:17.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.772+0000 7f8cd519d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:6800/567882508,v1:192.168.123.105:6801/567882508] conn(0x7f8cbc0778c0 0x7f8cbc079d80 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.773+0000 7f8cd519d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8cd0103340 0x7f8cd0198de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.773+0000 7f8cd519d700 1 --2- 192.168.123.105:0/3465944089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8cd0103cf0 0x7f8cd0199320 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T09:06:17.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.773+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 >> 192.168.123.105:0/3465944089 conn(0x7f8cd00feb90 msgr2=0x7f8cd0100f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T09:06:17.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.773+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 shutdown_connections 2026-03-10T09:06:17.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T09:06:17.773+0000 7f8cd519d700 1 -- 192.168.123.105:0/3465944089 wait complete. 2026-03-10T09:06:17.775 INFO:teuthology.orchestra.run.vm05.stderr:listed 39 entries 2026-03-10T09:06:17.837 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm05.local... 
2026-03-10T09:06:17.837 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T09:06:17.837 DEBUG:teuthology.orchestra.run.vm05:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-10T09:06:17.866 INFO:teuthology.orchestra.run:waiting for 300 2026-03-10T09:06:18.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:17 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3834380235' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T09:06:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:18 vm08.local ceph-mon[101330]: from='client.? 192.168.123.105:0/3465944089' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T09:06:19.052 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:18 vm08.local ceph-mon[101330]: pgmap v352: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:19.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:18 vm05.local ceph-mon[111630]: from='client.? 
192.168.123.105:0/3465944089' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T09:06:19.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:18 vm05.local ceph-mon[111630]: pgmap v352: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:21.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:20 vm05.local ceph-mon[111630]: pgmap v353: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:20 vm08.local ceph-mon[101330]: pgmap v353: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:23.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:22 vm05.local ceph-mon[111630]: pgmap v354: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:23.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:22 vm08.local ceph-mon[101330]: pgmap v354: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:25.575 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:25 vm05.local ceph-mon[111630]: pgmap v355: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:25.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:25 vm08.local ceph-mon[101330]: pgmap v355: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:26.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:26 vm05.local ceph-mon[111630]: pgmap v356: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:26 vm08.local ceph-mon[101330]: pgmap v356: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:28.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:27 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:06:28.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:27 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:06:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:28 vm08.local ceph-mon[101330]: pgmap v357: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:06:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:06:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:06:29.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:28 vm05.local ceph-mon[111630]: pgmap v357: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:29.365 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:06:29.365 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:06:29.365 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:06:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:30 vm08.local ceph-mon[101330]: pgmap v358: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:30 vm05.local ceph-mon[111630]: pgmap v358: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:33.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:32 vm08.local ceph-mon[101330]: pgmap v359: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:33.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:32 vm05.local ceph-mon[111630]: pgmap v359: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:34.963 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:34 vm05.local ceph-mon[111630]: pgmap v360: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:35.053 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:34 vm08.local ceph-mon[101330]: pgmap v360: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:37.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:36 vm08.local ceph-mon[101330]: pgmap v361: 65 pgs: 65 active+clean; 
215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:37.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:36 vm05.local ceph-mon[111630]: pgmap v361: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:39 vm08.local ceph-mon[101330]: pgmap v362: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:39 vm05.local ceph-mon[111630]: pgmap v362: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:41 vm08.local ceph-mon[101330]: pgmap v363: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:41.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:41 vm05.local ceph-mon[111630]: pgmap v363: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:43 vm08.local ceph-mon[101330]: pgmap v364: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:43.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:43 vm05.local ceph-mon[111630]: pgmap v364: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:45.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:45 vm05.local ceph-mon[111630]: pgmap v365: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:45.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:45 vm08.local ceph-mon[101330]: pgmap v365: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:46.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:46 vm05.local ceph-mon[111630]: from='mgr.34104 
192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:06:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:47 vm08.local ceph-mon[101330]: pgmap v366: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:47.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:47 vm05.local ceph-mon[111630]: pgmap v366: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:49 vm05.local ceph-mon[111630]: pgmap v367: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:49.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:49 vm08.local ceph-mon[101330]: pgmap v367: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:51.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:51 vm05.local ceph-mon[111630]: pgmap v368: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:51.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:51 vm08.local ceph-mon[101330]: pgmap v368: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:53 vm05.local ceph-mon[111630]: pgmap v369: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:53.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:53 vm08.local ceph-mon[101330]: pgmap v369: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:55.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:55 vm08.local ceph-mon[101330]: pgmap v370: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:55.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:55 vm05.local ceph-mon[111630]: pgmap v370: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:56.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:56 vm08.local ceph-mon[101330]: pgmap v371: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:56.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:56 vm05.local ceph-mon[111630]: pgmap v371: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:59.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:06:58 vm05.local ceph-mon[111630]: pgmap v372: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:06:59.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:06:58 vm08.local ceph-mon[101330]: pgmap v372: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:00 vm05.local ceph-mon[111630]: pgmap v373: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:01.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:00 vm08.local ceph-mon[101330]: pgmap v373: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:03.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:02 vm08.local ceph-mon[101330]: pgmap v374: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:02 vm05.local ceph-mon[111630]: pgmap v374: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:05.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:04 vm08.local ceph-mon[101330]: pgmap v375: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:05.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:04 vm05.local ceph-mon[111630]: pgmap v375: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:07.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:06 vm08.local ceph-mon[101330]: pgmap v376: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:07.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:06 vm05.local ceph-mon[111630]: pgmap v376: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:09.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:09 vm08.local ceph-mon[101330]: pgmap v377: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:09.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:09 vm05.local ceph-mon[111630]: pgmap v377: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:11.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:11 vm08.local ceph-mon[101330]: pgmap v378: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:11 vm05.local 
ceph-mon[111630]: pgmap v378: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:13 vm08.local ceph-mon[101330]: pgmap v379: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:13 vm05.local ceph-mon[111630]: pgmap v379: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:15.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:15 vm08.local ceph-mon[101330]: pgmap v380: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:15 vm05.local ceph-mon[111630]: pgmap v380: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:16.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:17.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:17 vm08.local ceph-mon[101330]: pgmap v381: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:17.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:17 vm05.local ceph-mon[111630]: pgmap v381: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:19 vm08.local ceph-mon[101330]: pgmap v382: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:07:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:19 vm05.local ceph-mon[111630]: pgmap v382: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:21 vm08.local ceph-mon[101330]: pgmap v383: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:21 vm05.local ceph-mon[111630]: pgmap v383: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:23.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:23 vm08.local ceph-mon[101330]: pgmap v384: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:23.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:23 vm05.local ceph-mon[111630]: pgmap v384: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:25.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:25 vm05.local ceph-mon[111630]: pgmap v385: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:25.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:25 vm08.local ceph-mon[101330]: pgmap v385: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:27.363 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:27 vm08.local ceph-mon[101330]: pgmap v386: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:27.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:27 vm05.local ceph-mon[111630]: pgmap v386: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: pgmap v387: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:07:29.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:07:29.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 vm08.local ceph-mon[101330]: pgmap v387: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:29.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:07:29.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 
vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T09:07:29.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T09:07:29.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:07:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:07:29.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:07:31.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:31 vm08.local ceph-mon[101330]: pgmap v388: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:31.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:31.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:31 vm05.local ceph-mon[111630]: pgmap v388: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:31.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist 
ls", "format": "json"}]: dispatch 2026-03-10T09:07:32.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:32 vm08.local ceph-mon[101330]: pgmap v389: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:32.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:32 vm05.local ceph-mon[111630]: pgmap v389: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:35.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:35 vm05.local ceph-mon[111630]: pgmap v390: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:35 vm08.local ceph-mon[101330]: pgmap v390: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:37.453 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:37 vm08.local ceph-mon[101330]: pgmap v391: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:37 vm05.local ceph-mon[111630]: pgmap v391: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:39 vm05.local ceph-mon[111630]: pgmap v392: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:39.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:39 vm08.local ceph-mon[101330]: pgmap v392: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:41.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:41 vm05.local ceph-mon[111630]: pgmap v393: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:41 vm08.local ceph-mon[101330]: pgmap v393: 65 pgs: 65 active+clean; 215 MiB data, 916 
MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:43.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:43 vm08.local ceph-mon[101330]: pgmap v394: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:43.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:43 vm05.local ceph-mon[111630]: pgmap v394: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:45.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:45 vm08.local ceph-mon[101330]: pgmap v395: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:45.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:45 vm05.local ceph-mon[111630]: pgmap v395: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:46 vm08.local ceph-mon[101330]: pgmap v396: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:07:46.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:46 vm05.local ceph-mon[111630]: pgmap v396: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:49.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:48 vm05.local ceph-mon[111630]: pgmap v397: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 
10 09:07:48 vm08.local ceph-mon[101330]: pgmap v397: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:50 vm08.local ceph-mon[101330]: pgmap v398: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:51.444 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:50 vm05.local ceph-mon[111630]: pgmap v398: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:53.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:52 vm08.local ceph-mon[101330]: pgmap v399: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:52 vm05.local ceph-mon[111630]: pgmap v399: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:55.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:54 vm08.local ceph-mon[101330]: pgmap v400: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:55.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:54 vm05.local ceph-mon[111630]: pgmap v400: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:56.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:56 vm05.local ceph-mon[111630]: pgmap v401: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:56.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:07:56 vm08.local ceph-mon[101330]: pgmap v401: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:59.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:07:58 vm05.local ceph-mon[111630]: pgmap v402: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:07:59.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 
10 09:07:58 vm08.local ceph-mon[101330]: pgmap v402: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:01 vm08.local ceph-mon[101330]: pgmap v403: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:01.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:01 vm05.local ceph-mon[111630]: pgmap v403: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:01.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:03.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:03 vm05.local ceph-mon[111630]: pgmap v404: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:03.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:03 vm08.local ceph-mon[101330]: pgmap v404: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:05.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:05 vm05.local ceph-mon[111630]: pgmap v405: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:05.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:05 vm08.local ceph-mon[101330]: pgmap v405: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:07.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:07 vm05.local ceph-mon[111630]: pgmap v406: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 
119 GiB / 120 GiB avail 2026-03-10T09:08:07.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:07 vm08.local ceph-mon[101330]: pgmap v406: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:09.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:09 vm05.local ceph-mon[111630]: pgmap v407: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:09 vm08.local ceph-mon[101330]: pgmap v407: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:11 vm05.local ceph-mon[111630]: pgmap v408: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:11.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:11 vm08.local ceph-mon[101330]: pgmap v408: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:13.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:13 vm05.local ceph-mon[111630]: pgmap v409: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:13.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:13 vm08.local ceph-mon[101330]: pgmap v409: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:15 vm05.local ceph-mon[111630]: pgmap v410: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:15.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:15 vm08.local ceph-mon[101330]: pgmap v410: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:16.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:17 vm05.local ceph-mon[111630]: pgmap v411: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:17.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:17 vm08.local ceph-mon[101330]: pgmap v411: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:19 vm05.local ceph-mon[111630]: pgmap v412: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:19.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:19 vm08.local ceph-mon[101330]: pgmap v412: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:21 vm05.local ceph-mon[111630]: pgmap v413: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:21.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:21 vm08.local ceph-mon[101330]: pgmap v413: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:23.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:23 vm08.local ceph-mon[101330]: pgmap v414: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:23.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:23 vm05.local ceph-mon[111630]: pgmap v414: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:25.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:25 vm08.local ceph-mon[101330]: pgmap v415: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:25.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:25 vm05.local ceph-mon[111630]: pgmap v415: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:26.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:26 vm05.local ceph-mon[111630]: pgmap v416: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:26.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:26 vm08.local ceph-mon[101330]: pgmap v416: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:29.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:28 vm05.local ceph-mon[111630]: pgmap v417: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:29.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:08:29.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:08:29.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:08:29.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:08:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:28 vm08.local ceph-mon[101330]: 
pgmap v417: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:08:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:08:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:08:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:08:31.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:30 vm08.local ceph-mon[101330]: pgmap v418: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:31.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:30 vm05.local ceph-mon[111630]: pgmap v418: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:31.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:33.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:32 vm08.local 
ceph-mon[101330]: pgmap v419: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:33.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:32 vm05.local ceph-mon[111630]: pgmap v419: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:35.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:34 vm08.local ceph-mon[101330]: pgmap v420: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:35.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:34 vm05.local ceph-mon[111630]: pgmap v420: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:37.212 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:36 vm05.local ceph-mon[111630]: pgmap v421: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:37.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:36 vm08.local ceph-mon[101330]: pgmap v421: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:39.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:38 vm08.local ceph-mon[101330]: pgmap v422: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:38 vm05.local ceph-mon[111630]: pgmap v422: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:41.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:40 vm08.local ceph-mon[101330]: pgmap v423: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:41.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:40 vm05.local ceph-mon[111630]: pgmap v423: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:43 vm08.local 
ceph-mon[101330]: pgmap v424: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:43.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:43 vm05.local ceph-mon[111630]: pgmap v424: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:45 vm08.local ceph-mon[101330]: pgmap v425: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:45.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:45 vm05.local ceph-mon[111630]: pgmap v425: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:46.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:08:47.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:47 vm08.local ceph-mon[101330]: pgmap v426: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:47.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:47 vm05.local ceph-mon[111630]: pgmap v426: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:49.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:49 vm08.local ceph-mon[101330]: pgmap v427: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:49.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:49 vm05.local ceph-mon[111630]: pgmap v427: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:08:51.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:51 vm08.local ceph-mon[101330]: pgmap v428: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:51.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:51 vm05.local ceph-mon[111630]: pgmap v428: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:53.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:53 vm08.local ceph-mon[101330]: pgmap v429: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:53 vm05.local ceph-mon[111630]: pgmap v429: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:55 vm08.local ceph-mon[101330]: pgmap v430: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:55.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:55 vm05.local ceph-mon[111630]: pgmap v430: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:57 vm08.local ceph-mon[101330]: pgmap v431: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:57 vm05.local ceph-mon[111630]: pgmap v431: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:59.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:08:59 vm05.local ceph-mon[111630]: pgmap v432: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:08:59.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:08:59 vm08.local ceph-mon[101330]: pgmap v432: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:09:01.355 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:01 vm08.local ceph-mon[101330]: pgmap v433: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:01.355 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:01 vm05.local ceph-mon[111630]: pgmap v433: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:03.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:03 vm05.local ceph-mon[111630]: pgmap v434: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:03 vm08.local ceph-mon[101330]: pgmap v434: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:05.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:05 vm05.local ceph-mon[111630]: pgmap v435: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:05 vm08.local ceph-mon[101330]: pgmap v435: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:07.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:07 vm05.local ceph-mon[111630]: pgmap v436: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:07.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:07 vm08.local 
ceph-mon[101330]: pgmap v436: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:09.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:09 vm05.local ceph-mon[111630]: pgmap v437: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:09.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:09 vm08.local ceph-mon[101330]: pgmap v437: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:11 vm05.local ceph-mon[111630]: pgmap v438: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:11.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:11 vm08.local ceph-mon[101330]: pgmap v438: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:13 vm05.local ceph-mon[111630]: pgmap v439: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:13.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:13 vm08.local ceph-mon[101330]: pgmap v439: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:15 vm05.local ceph-mon[111630]: pgmap v440: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:15.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:15 vm08.local ceph-mon[101330]: pgmap v440: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:16.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:17.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:17 vm05.local ceph-mon[111630]: pgmap v441: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:17.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:17 vm08.local ceph-mon[101330]: pgmap v441: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:19 vm05.local ceph-mon[111630]: pgmap v442: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:19.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:19 vm08.local ceph-mon[101330]: pgmap v442: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:21.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:21 vm08.local ceph-mon[101330]: pgmap v443: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:21.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:21 vm05.local ceph-mon[111630]: pgmap v443: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:22.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:22 vm08.local ceph-mon[101330]: pgmap v444: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:22.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:22 vm05.local ceph-mon[111630]: pgmap v444: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:25.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:24 vm08.local ceph-mon[101330]: pgmap v445: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 
119 GiB / 120 GiB avail 2026-03-10T09:09:25.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:24 vm05.local ceph-mon[111630]: pgmap v445: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:27.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:26 vm08.local ceph-mon[101330]: pgmap v446: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:27.440 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:26 vm05.local ceph-mon[111630]: pgmap v446: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:29.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:28 vm08.local ceph-mon[101330]: pgmap v447: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:28 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:09:29.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:28 vm05.local ceph-mon[111630]: pgmap v447: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:29.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:28 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:09:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:09:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-10T09:09:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:29 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:09:30.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:09:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:09:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:29 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:09:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:31 vm05.local ceph-mon[111630]: pgmap v448: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:31.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:31 vm08.local ceph-mon[101330]: pgmap v448: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:31.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:33.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:33 vm05.local ceph-mon[111630]: pgmap v449: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:33.492 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:33 vm08.local ceph-mon[101330]: pgmap v449: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:35.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:35 vm05.local ceph-mon[111630]: pgmap v450: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:35.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:35 vm08.local ceph-mon[101330]: pgmap v450: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:37.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:37 vm05.local ceph-mon[111630]: pgmap v451: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:37.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:37 vm08.local ceph-mon[101330]: pgmap v451: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:38.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:38 vm05.local ceph-mon[111630]: pgmap v452: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:38.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:38 vm08.local ceph-mon[101330]: pgmap v452: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:41 vm05.local ceph-mon[111630]: pgmap v453: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:41.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:41 vm08.local ceph-mon[101330]: pgmap v453: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:42.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:42 vm05.local ceph-mon[111630]: pgmap v454: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:43.052 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:42 vm08.local ceph-mon[101330]: pgmap v454: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:45.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:44 vm08.local ceph-mon[101330]: pgmap v455: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:45.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:44 vm05.local ceph-mon[111630]: pgmap v455: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:46.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:46.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:09:47.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:47 vm08.local ceph-mon[101330]: pgmap v456: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:47.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:47 vm05.local ceph-mon[111630]: pgmap v456: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:49.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:49 vm08.local ceph-mon[101330]: pgmap v457: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:49.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:49 vm05.local ceph-mon[111630]: pgmap v457: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:51.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:51 vm08.local ceph-mon[101330]: pgmap v458: 65 
pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:51.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:51 vm05.local ceph-mon[111630]: pgmap v458: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:53.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:53 vm08.local ceph-mon[101330]: pgmap v459: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:53 vm05.local ceph-mon[111630]: pgmap v459: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:55.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:55 vm08.local ceph-mon[101330]: pgmap v460: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:55.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:55 vm05.local ceph-mon[111630]: pgmap v460: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:57.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:57 vm08.local ceph-mon[101330]: pgmap v461: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:57 vm05.local ceph-mon[111630]: pgmap v461: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:59.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:09:59 vm08.local ceph-mon[101330]: pgmap v462: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:09:59.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:09:59 vm05.local ceph-mon[111630]: pgmap v462: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:00.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:00 vm05.local ceph-mon[111630]: overall 
HEALTH_OK 2026-03-10T09:10:00.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:00 vm08.local ceph-mon[101330]: overall HEALTH_OK 2026-03-10T09:10:01.453 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:01 vm08.local ceph-mon[101330]: pgmap v463: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:01.454 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:01 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:01 vm05.local ceph-mon[111630]: pgmap v463: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:01 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:03.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:03 vm05.local ceph-mon[111630]: pgmap v464: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:03.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:03 vm08.local ceph-mon[101330]: pgmap v464: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:05.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:05 vm05.local ceph-mon[111630]: pgmap v465: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:05.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:05 vm08.local ceph-mon[101330]: pgmap v465: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:07.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:07 vm05.local ceph-mon[111630]: pgmap v466: 65 pgs: 65 active+clean; 215 MiB data, 916 
MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:07.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:07 vm08.local ceph-mon[101330]: pgmap v466: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:08.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:08 vm05.local ceph-mon[111630]: pgmap v467: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:08.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:08 vm08.local ceph-mon[101330]: pgmap v467: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:11.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:10 vm08.local ceph-mon[101330]: pgmap v468: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:10 vm05.local ceph-mon[111630]: pgmap v468: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:13.213 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:12 vm05.local ceph-mon[111630]: pgmap v469: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:12 vm08.local ceph-mon[101330]: pgmap v469: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:15.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:14 vm08.local ceph-mon[101330]: pgmap v470: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:15.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:14 vm05.local ceph-mon[111630]: pgmap v470: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:16.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' 
entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:16.712 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:16 vm05.local ceph-mon[111630]: pgmap v471: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:16.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:16.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:16 vm08.local ceph-mon[101330]: pgmap v471: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:19.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:18 vm08.local ceph-mon[101330]: pgmap v472: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:19.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:18 vm05.local ceph-mon[111630]: pgmap v472: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:21.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:20 vm08.local ceph-mon[101330]: pgmap v473: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:21.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:20 vm05.local ceph-mon[111630]: pgmap v473: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:23.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:22 vm08.local ceph-mon[101330]: pgmap v474: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:23.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:22 vm05.local ceph-mon[111630]: pgmap v474: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:25.302 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:24 vm08.local ceph-mon[101330]: pgmap v475: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:25.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:24 vm05.local ceph-mon[111630]: pgmap v475: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:27.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:27 vm08.local ceph-mon[101330]: pgmap v476: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:27.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:27 vm05.local ceph-mon[111630]: pgmap v476: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:29.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:29 vm08.local ceph-mon[101330]: pgmap v477: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:29.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:29 vm05.local ceph-mon[111630]: pgmap v477: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:10:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:10:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:30 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:10:30.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:30 vm08.local 
ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:10:30.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T09:10:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T09:10:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T09:10:30.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:30 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' 2026-03-10T09:10:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:31 vm08.local ceph-mon[101330]: pgmap v478: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:31.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:31 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:31 vm05.local ceph-mon[111630]: pgmap v478: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:31.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:31 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:33.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:33 vm08.local 
ceph-mon[101330]: pgmap v479: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:33.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:33 vm05.local ceph-mon[111630]: pgmap v479: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:35.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:35 vm08.local ceph-mon[101330]: pgmap v480: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:35.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:35 vm05.local ceph-mon[111630]: pgmap v480: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:37.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:37 vm08.local ceph-mon[101330]: pgmap v481: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:37.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:37 vm05.local ceph-mon[111630]: pgmap v481: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:39.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:39 vm08.local ceph-mon[101330]: pgmap v482: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:39.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:39 vm05.local ceph-mon[111630]: pgmap v482: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:41.303 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:41 vm08.local ceph-mon[101330]: pgmap v483: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:41.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:41 vm05.local ceph-mon[111630]: pgmap v483: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:43.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:43 vm08.local 
ceph-mon[101330]: pgmap v484: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:43.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:43 vm05.local ceph-mon[111630]: pgmap v484: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:45.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:45 vm05.local ceph-mon[111630]: pgmap v485: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:45.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:45 vm08.local ceph-mon[101330]: pgmap v485: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:46.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:46 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:46.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:46 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:10:47.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:47 vm05.local ceph-mon[111630]: pgmap v486: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:47.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:47 vm08.local ceph-mon[101330]: pgmap v486: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:49.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:49 vm05.local ceph-mon[111630]: pgmap v487: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:49.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:49 vm08.local ceph-mon[101330]: pgmap v487: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:10:51.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:51 vm05.local ceph-mon[111630]: pgmap v488: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:51.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:51 vm08.local ceph-mon[101330]: pgmap v488: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:53.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:53 vm05.local ceph-mon[111630]: pgmap v489: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:53.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:53 vm08.local ceph-mon[101330]: pgmap v489: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:55.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:55 vm05.local ceph-mon[111630]: pgmap v490: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:55.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:55 vm08.local ceph-mon[101330]: pgmap v490: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:57.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:57 vm05.local ceph-mon[111630]: pgmap v491: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:57.552 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:57 vm08.local ceph-mon[101330]: pgmap v491: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:58.713 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:10:58 vm05.local ceph-mon[111630]: pgmap v492: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:10:58.802 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:10:58 vm08.local ceph-mon[101330]: pgmap v492: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T09:11:01.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:00 vm08.local ceph-mon[101330]: pgmap v493: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:01.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:00 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:11:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:00 vm05.local ceph-mon[111630]: pgmap v493: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:01.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:00 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:11:03.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:02 vm08.local ceph-mon[101330]: pgmap v494: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:03.399 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:02 vm05.local ceph-mon[111630]: pgmap v494: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:05.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:05 vm08.local ceph-mon[101330]: pgmap v495: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:05.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:05 vm05.local ceph-mon[111630]: pgmap v495: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:07.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:07 vm08.local ceph-mon[101330]: pgmap v496: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:07.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:07 vm05.local 
ceph-mon[111630]: pgmap v496: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:09.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:09 vm08.local ceph-mon[101330]: pgmap v497: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:09.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:09 vm05.local ceph-mon[111630]: pgmap v497: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:11.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:11 vm08.local ceph-mon[101330]: pgmap v498: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:11.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:11 vm05.local ceph-mon[111630]: pgmap v498: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:13.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:13 vm08.local ceph-mon[101330]: pgmap v499: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:13.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:13 vm05.local ceph-mon[111630]: pgmap v499: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:15.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:15 vm05.local ceph-mon[111630]: pgmap v500: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:15.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:15 vm08.local ceph-mon[101330]: pgmap v500: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:16.462 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:16 vm05.local ceph-mon[111630]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T09:11:16.552 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:16 vm08.local ceph-mon[101330]: from='mgr.34104 192.168.123.105:0/2774735574' entity='mgr.vm05.rxwgjc' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T09:11:16.919 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse.
2026-03-10T09:11:16.920 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T09:11:16.920 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-10T09:11:16.923 INFO:tasks.cephadm:Teardown begin
2026-03-10T09:11:16.923 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T09:11:16.924 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T09:11:16.952 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T09:11:16.981 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-10T09:11:16.981 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 -- ceph mgr module disable cephadm
2026-03-10T09:11:17.149 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/mon.vm05/config
2026-03-10T09:11:17.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local ceph-mon[111630]: pgmap v501: 65 pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail
2026-03-10T09:11:17.300 INFO:teuthology.orchestra.run.vm05.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-10T09:11:17.302 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:17 vm08.local ceph-mon[101330]: pgmap v501: 65
pgs: 65 active+clean; 215 MiB data, 916 MiB used, 119 GiB / 120 GiB avail 2026-03-10T09:11:17.316 DEBUG:teuthology.orchestra.run:got remote process result: 125 2026-03-10T09:11:17.316 INFO:tasks.cephadm:Cleaning up testdir ceph.* files... 2026-03-10T09:11:17.316 DEBUG:teuthology.orchestra.run.vm05:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub 2026-03-10T09:11:17.332 DEBUG:teuthology.orchestra.run.vm08:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub 2026-03-10T09:11:17.347 INFO:tasks.cephadm:Stopping all daemons... 2026-03-10T09:11:17.347 INFO:tasks.cephadm.mon.vm05:Stopping mon.vm05... 2026-03-10T09:11:17.347 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05 2026-03-10T09:11:17.463 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local systemd[1]: Stopping Ceph mon.vm05 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:17.737 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[111626]: 2026-03-10T09:11:17.486+0000 7ff6792b3640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm05 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:17.737 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05[111626]: 2026-03-10T09:11:17.486+0000 7ff6792b3640 -1 mon.vm05@0(leader) e3 *** Got Signal Terminated *** 2026-03-10T09:11:17.738 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local podman[167679]: 2026-03-10 09:11:17.684864261 +0000 UTC m=+0.212302383 container died cdc9176bec281ab9d1e08966187c6abbc4fba6e4bdaea6686cadac19f3f2f8b2 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:11:17.738 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local podman[167679]: 2026-03-10 09:11:17.702322127 +0000 UTC m=+0.229760249 container remove cdc9176bec281ab9d1e08966187c6abbc4fba6e4bdaea6686cadac19f3f2f8b2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:11:17.738 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 09:11:17 vm05.local bash[167679]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm05 
2026-03-10T09:11:17.775 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm05.service' 2026-03-10T09:11:17.812 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T09:11:17.813 INFO:tasks.cephadm.mon.vm05:Stopped mon.vm05 2026-03-10T09:11:17.813 INFO:tasks.cephadm.mon.vm08:Stopping mon.vm08... 2026-03-10T09:11:17.813 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm08 2026-03-10T09:11:18.100 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:17 vm08.local systemd[1]: Stopping Ceph mon.vm08 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:18.100 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:17 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08[101326]: 2026-03-10T09:11:17.919+0000 7f08f47ef640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm08 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:18.100 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:17 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08[101326]: 2026-03-10T09:11:17.919+0000 7f08f47ef640 -1 mon.vm08@1(peon) e3 *** Got Signal Terminated *** 2026-03-10T09:11:18.100 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:18 vm08.local podman[140055]: 2026-03-10 09:11:18.024063501 +0000 UTC m=+0.117249404 container died 34546aa1422bdf812a785754331901e7c3c8a5f6e641aef0bf3d305d15f0cce6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:11:18.100 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:18 vm08.local podman[140055]: 2026-03-10 09:11:18.043930154 +0000 UTC m=+0.137116057 container remove 34546aa1422bdf812a785754331901e7c3c8a5f6e641aef0bf3d305d15f0cce6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T09:11:18.100 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 09:11:18 vm08.local bash[140055]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-mon-vm08 2026-03-10T09:11:18.110 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@mon.vm08.service' 2026-03-10T09:11:18.146 DEBUG:teuthology.orchestra.run:got remote process result: None 
2026-03-10T09:11:18.146 INFO:tasks.cephadm.mon.vm08:Stopped mon.vm08 2026-03-10T09:11:18.146 INFO:tasks.cephadm.osd.0:Stopping osd.0... 2026-03-10T09:11:18.146 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0 2026-03-10T09:11:18.462 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:18 vm05.local systemd[1]: Stopping Ceph osd.0 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:18.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:18 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T09:11:18.247+0000 7f013e2f2640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:18.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:18 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T09:11:18.247+0000 7f013e2f2640 -1 osd.0 106 *** Got signal Terminated *** 2026-03-10T09:11:18.463 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:18 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0[119356]: 2026-03-10T09:11:18.247+0000 7f013e2f2640 -1 osd.0 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167856]: 2026-03-10 09:11:23.297557424 +0000 UTC m=+5.063474092 container died 4f1dac46f59bb0a72b86bab4a176031592a98ff9eb67b738e0a1aa6f743eaba7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, 
org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167856]: 2026-03-10 09:11:23.316533702 +0000 UTC m=+5.082450360 container remove 4f1dac46f59bb0a72b86bab4a176031592a98ff9eb67b738e0a1aa6f743eaba7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local bash[167856]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167920]: 2026-03-10 09:11:23.468851411 +0000 UTC m=+0.018526797 container create 33d7f0df68a4a4f242d579274d1da5c4291c6c1cb90ec65538ae9610edc76ad4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, 
org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167920]: 2026-03-10 09:11:23.496589729 +0000 UTC m=+0.046265125 container init 33d7f0df68a4a4f242d579274d1da5c4291c6c1cb90ec65538ae9610edc76ad4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167920]: 2026-03-10 09:11:23.502069485 +0000 UTC m=+0.051744871 container start 33d7f0df68a4a4f242d579274d1da5c4291c6c1cb90ec65538ae9610edc76ad4 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167920]: 2026-03-10 09:11:23.502961314 +0000 UTC m=+0.052636700 container attach 33d7f0df68a4a4f242d579274d1da5c4291c6c1cb90ec65538ae9610edc76ad4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:11:23.560 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 09:11:23 vm05.local podman[167920]: 2026-03-10 09:11:23.461300739 +0000 UTC 
m=+0.010976125 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:11:23.660 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.0.service' 2026-03-10T09:11:23.697 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T09:11:23.697 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-10T09:11:23.697 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-10T09:11:23.697 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.1 2026-03-10T09:11:23.840 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:23 vm05.local systemd[1]: Stopping Ceph osd.1 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:24.212 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:23 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:11:23.840+0000 7fe218f57640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:24.212 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:23 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:11:23.840+0000 7fe218f57640 -1 osd.1 106 *** Got signal Terminated *** 2026-03-10T09:11:24.212 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:23 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1[126562]: 2026-03-10T09:11:23.840+0000 7fe218f57640 -1 osd.1 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:11:28.898 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:28 vm05.local podman[168015]: 2026-03-10 09:11:28.874112937 +0000 UTC m=+5.047457111 container died 306e95bddd95de9d6fe6ebc844a29d0c146373ef3a7c7c9618df561806cacae1 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:28 vm05.local podman[168015]: 2026-03-10 09:11:28.897974449 +0000 UTC m=+5.071318633 container remove 306e95bddd95de9d6fe6ebc844a29d0c146373ef3a7c7c9618df561806cacae1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:28 vm05.local bash[168015]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1 
2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:29 vm05.local podman[168097]: 2026-03-10 09:11:29.045517111 +0000 UTC m=+0.020455918 container create fcf140af65e89a4c5f9a780f74443c46b298dda51ef4f6e6e244821077ecb9e6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True) 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:29 vm05.local podman[168097]: 2026-03-10 09:11:29.080010522 +0000 UTC m=+0.054949329 container init fcf140af65e89a4c5f9a780f74443c46b298dda51ef4f6e6e244821077ecb9e6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , 
org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3) 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:29 vm05.local podman[168097]: 2026-03-10 09:11:29.084253704 +0000 UTC m=+0.059192511 container start fcf140af65e89a4c5f9a780f74443c46b298dda51ef4f6e6e244821077ecb9e6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS) 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:29 vm05.local podman[168097]: 2026-03-10 09:11:29.085268583 +0000 UTC m=+0.060207391 container attach fcf140af65e89a4c5f9a780f74443c46b298dda51ef4f6e6e244821077ecb9e6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, 
OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:29 vm05.local podman[168097]: 2026-03-10 09:11:29.034993795 +0000 UTC m=+0.009932612 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:11:29.208 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 09:11:29 vm05.local podman[168097]: 2026-03-10 09:11:29.208320868 +0000 UTC m=+0.183259665 container died fcf140af65e89a4c5f9a780f74443c46b298dda51ef4f6e6e244821077ecb9e6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-1-deactivate, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True) 2026-03-10T09:11:29.253 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.1.service' 2026-03-10T09:11:29.287 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T09:11:29.287 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-10T09:11:29.287 INFO:tasks.cephadm.osd.2:Stopping osd.2... 
2026-03-10T09:11:29.287 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.2 2026-03-10T09:11:29.463 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:29 vm05.local systemd[1]: Stopping Ceph osd.2 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:29.463 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:29 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:11:29.432+0000 7f5bdab6c640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:29.463 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:29 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:11:29.432+0000 7f5bdab6c640 -1 osd.2 106 *** Got signal Terminated *** 2026-03-10T09:11:29.463 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:29 vm05.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2[132347]: 2026-03-10T09:11:29.432+0000 7f5bdab6c640 -1 osd.2 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local podman[168192]: 2026-03-10 09:11:34.462708838 +0000 UTC m=+5.045200139 container died a555d70ff4bddd0aafe9353118eb5090320b317142599c87543e6eb71039c8bb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 
Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local podman[168192]: 2026-03-10 09:11:34.485997708 +0000 UTC m=+5.068488988 container remove a555d70ff4bddd0aafe9353118eb5090320b317142599c87543e6eb71039c8bb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local bash[168192]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local podman[168259]: 2026-03-10 09:11:34.627849799 +0000 UTC m=+0.021094333 container create 89d11a7bf9542c788b2690b02adff836761b8c75d420105f01f2097b9e0af2e4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local podman[168259]: 2026-03-10 09:11:34.676981809 +0000 UTC m=+0.070226353 container init 89d11a7bf9542c788b2690b02adff836761b8c75d420105f01f2097b9e0af2e4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local podman[168259]: 2026-03-10 09:11:34.680429902 +0000 UTC m=+0.073674447 container start 89d11a7bf9542c788b2690b02adff836761b8c75d420105f01f2097b9e0af2e4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, 
io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:11:34.713 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 09:11:34 vm05.local podman[168259]: 2026-03-10 09:11:34.683862837 +0000 UTC m=+0.077107392 container attach 89d11a7bf9542c788b2690b02adff836761b8c75d420105f01f2097b9e0af2e4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-2-deactivate, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2) 2026-03-10T09:11:34.848 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.2.service' 2026-03-10T09:11:34.880 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T09:11:34.880 INFO:tasks.cephadm.osd.2:Stopped osd.2 
2026-03-10T09:11:34.880 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-10T09:11:34.880 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.3 2026-03-10T09:11:35.178 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:34 vm08.local systemd[1]: Stopping Ceph osd.3 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:35.178 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:34 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 2026-03-10T09:11:34.991+0000 7fed4b4cd640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:35.178 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:34 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 2026-03-10T09:11:34.991+0000 7fed4b4cd640 -1 osd.3 106 *** Got signal Terminated *** 2026-03-10T09:11:35.178 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:34 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3[111306]: 2026-03-10T09:11:34.991+0000 7fed4b4cd640 -1 osd.3 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140157]: 2026-03-10 09:11:40.019203321 +0000 UTC m=+5.049096128 container died b025f9a6ca2a74de41ae0c72e028f37ca9cc9a878231121bce43d419f8f864a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140157]: 2026-03-10 09:11:40.043051978 +0000 UTC m=+5.072944785 container remove b025f9a6ca2a74de41ae0c72e028f37ca9cc9a878231121bce43d419f8f864a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, ceph=True) 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local bash[140157]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140236]: 2026-03-10 09:11:40.169226455 +0000 UTC m=+0.014311548 container create 807131e0e3a4888b58fc1c5d6a4896cb112e17ab82954e3481fc98c79e724829 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , ceph=True, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140236]: 2026-03-10 09:11:40.203699778 +0000 UTC m=+0.048784880 container init 807131e0e3a4888b58fc1c5d6a4896cb112e17ab82954e3481fc98c79e724829 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140236]: 2026-03-10 09:11:40.207227621 +0000 UTC m=+0.052312723 container start 807131e0e3a4888b58fc1c5d6a4896cb112e17ab82954e3481fc98c79e724829 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140236]: 2026-03-10 09:11:40.211163848 +0000 UTC m=+0.056248950 container attach 807131e0e3a4888b58fc1c5d6a4896cb112e17ab82954e3481fc98c79e724829 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-3-deactivate, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True) 2026-03-10T09:11:40.303 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 09:11:40 vm08.local podman[140236]: 2026-03-10 09:11:40.163456756 +0000 UTC m=+0.008541868 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T09:11:40.373 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.3.service' 2026-03-10T09:11:40.405 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T09:11:40.405 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-10T09:11:40.405 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-10T09:11:40.405 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.4 2026-03-10T09:11:40.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:40 vm08.local systemd[1]: Stopping Ceph osd.4 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:40.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:40 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:11:40.533+0000 7fc5e1892640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:40.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:40 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:11:40.533+0000 7fc5e1892640 -1 osd.4 106 *** Got signal Terminated *** 2026-03-10T09:11:40.803 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:40 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:11:40.533+0000 7fc5e1892640 -1 osd.4 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:11:44.640 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:44 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:11:44.357+0000 7fc5dde9a640 -1 osd.4 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:20.167449+0000 front 
2026-03-10T09:11:20.167417+0000 (oldest deadline 2026-03-10T09:11:44.267072+0000) 2026-03-10T09:11:45.052 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:44 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:44.640+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:45.572 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4[116652]: 2026-03-10T09:11:45.351+0000 7fc5dde9a640 -1 osd.4 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:20.167449+0000 front 2026-03-10T09:11:20.167417+0000 (oldest deadline 2026-03-10T09:11:44.267072+0000) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local podman[140335]: 2026-03-10 09:11:45.572869128 +0000 UTC m=+5.050969925 container died 76fe84edd71692339bfdc3304aed26202046eee7a2b8e9877d8d3339b74fd0ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local 
podman[140335]: 2026-03-10 09:11:45.603711585 +0000 UTC m=+5.081812372 container remove 76fe84edd71692339bfdc3304aed26202046eee7a2b8e9877d8d3339b74fd0ae (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local bash[140335]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local podman[140401]: 2026-03-10 09:11:45.758048413 +0000 UTC m=+0.016298357 container create ff2989feea12c80c3751af034cfa59ecaa76317505ce9878407b534b67926f17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local podman[140401]: 2026-03-10 09:11:45.796334533 +0000 UTC m=+0.054584477 container init ff2989feea12c80c3751af034cfa59ecaa76317505ce9878407b534b67926f17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local podman[140401]: 2026-03-10 09:11:45.799001003 +0000 UTC m=+0.057250936 container start ff2989feea12c80c3751af034cfa59ecaa76317505ce9878407b534b67926f17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 09:11:45 vm08.local podman[140401]: 2026-03-10 09:11:45.801181304 +0000 UTC m=+0.059431248 container attach ff2989feea12c80c3751af034cfa59ecaa76317505ce9878407b534b67926f17 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-4-deactivate, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T09:11:45.850 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:45 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:45.613+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:45.952 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.4.service' 2026-03-10T09:11:45.986 DEBUG:teuthology.orchestra.run:got remote process result: 
None 2026-03-10T09:11:45.986 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-10T09:11:45.986 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-10T09:11:45.986 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.5 2026-03-10T09:11:46.123 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:46 vm08.local systemd[1]: Stopping Ceph osd.5 for 16587ed2-1c5e-11f1-90f6-35051361a039... 2026-03-10T09:11:46.553 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:46 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:46.122+0000 7f5435bc4640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T09:11:46.553 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:46 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:46.122+0000 7f5435bc4640 -1 osd.5 106 *** Got signal Terminated *** 2026-03-10T09:11:46.553 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:46 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:46.122+0000 7f5435bc4640 -1 osd.5 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T09:11:47.052 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:46 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:46.581+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:48.052 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:47 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:47.573+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 
2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:49.052 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:48 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:48.605+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:50.053 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:49 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:49.580+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:51.052 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:50 vm08.local ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5[122013]: 2026-03-10T09:11:50.569+0000 7f54321cc640 -1 osd.5 106 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T09:11:21.323033+0000 front 2026-03-10T09:11:21.323076+0000 (oldest deadline 2026-03-10T09:11:44.222584+0000) 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local podman[140497]: 2026-03-10 09:11:51.152440331 +0000 UTC m=+5.041781407 container died 41f6c3ce6ac4c158c72321b1533f1de2882722b7f1f673d11508b17d9f40a12c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local podman[140497]: 2026-03-10 09:11:51.186841789 +0000 UTC m=+5.076182845 container remove 41f6c3ce6ac4c158c72321b1533f1de2882722b7f1f673d11508b17d9f40a12c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local bash[140497]: ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local podman[140564]: 2026-03-10 09:11:51.331724973 +0000 UTC m=+0.016723773 container create 06e528a188cd19bc73f7891bb08334233f190c9c405f95818355e8fa24a66162 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local podman[140564]: 2026-03-10 09:11:51.374464999 +0000 UTC m=+0.059463799 container init 06e528a188cd19bc73f7891bb08334233f190c9c405f95818355e8fa24a66162 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local podman[140564]: 2026-03-10 09:11:51.377620656 +0000 UTC m=+0.062619456 container start 06e528a188cd19bc73f7891bb08334233f190c9c405f95818355e8fa24a66162 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3) 2026-03-10T09:11:51.424 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 09:11:51 vm08.local podman[140564]: 2026-03-10 09:11:51.378618793 +0000 UTC m=+0.063617593 container attach 06e528a188cd19bc73f7891bb08334233f190c9c405f95818355e8fa24a66162 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-16587ed2-1c5e-11f1-90f6-35051361a039-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T09:11:51.549 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-16587ed2-1c5e-11f1-90f6-35051361a039@osd.5.service' 2026-03-10T09:11:51.581 DEBUG:teuthology.orchestra.run:got remote process result: None 
2026-03-10T09:11:51.581 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-10T09:11:51.581 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 --force --keep-logs 2026-03-10T09:11:51.677 INFO:teuthology.orchestra.run.vm05.stdout:Deleting cluster with fsid: 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T09:11:52.988 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:ceph-fuse[96456]: fuse finished with error 0 and tester_r 0 2026-03-10T09:12:01.489 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 --force --keep-logs 2026-03-10T09:12:01.592 INFO:teuthology.orchestra.run.vm08.stdout:Deleting cluster with fsid: 16587ed2-1c5e-11f1-90f6-35051361a039 2026-03-10T09:12:06.912 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T09:12:06.938 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T09:12:06.963 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-10T09:12:06.964 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969/remote/vm05/crash 2026-03-10T09:12:06.964 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/crash -- . 
2026-03-10T09:12:07.001 INFO:teuthology.orchestra.run.vm05.stderr:tar: /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/crash: Cannot open: No such file or directory 2026-03-10T09:12:07.001 INFO:teuthology.orchestra.run.vm05.stderr:tar: Error is not recoverable: exiting now 2026-03-10T09:12:07.002 DEBUG:teuthology.misc:Transferring archived files from vm08:/var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969/remote/vm08/crash 2026-03-10T09:12:07.002 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/crash -- . 2026-03-10T09:12:07.025 INFO:teuthology.orchestra.run.vm08.stderr:tar: /var/lib/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/crash: Cannot open: No such file or directory 2026-03-10T09:12:07.025 INFO:teuthology.orchestra.run.vm08.stderr:tar: Error is not recoverable: exiting now 2026-03-10T09:12:07.026 INFO:tasks.cephadm:Checking cluster log for badness... 
2026-03-10T09:12:07.026 DEBUG:teuthology.orchestra.run.vm05:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-10T09:12:07.092 INFO:tasks.cephadm:Compressing logs... 
2026-03-10T09:12:07.092 DEBUG:teuthology.orchestra.run.vm05:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T09:12:07.094 DEBUG:teuthology.orchestra.run.vm08:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T09:12:07.116 INFO:teuthology.orchestra.run.vm05.stderr:find: gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T09:12:07.116 INFO:teuthology.orchestra.run.vm05.stderr:‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T09:12:07.117 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mon.vm05.log 2026-03-10T09:12:07.117 INFO:teuthology.orchestra.run.vm08.stderr:find: gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T09:12:07.118 INFO:teuthology.orchestra.run.vm08.stderr:‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T09:12:07.118 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log 2026-03-10T09:12:07.119 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-volume.log 2026-03-10T09:12:07.119 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mgr.vm05.rxwgjc.log 2026-03-10T09:12:07.119 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-client.ceph-exporter.vm08.log 2026-03-10T09:12:07.120 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mgr.vm08.rpongu.log 2026-03-10T09:12:07.121 
INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-client.ceph-exporter.vm08.log: 92.7% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T09:12:07.121 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log: 88.2% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log.gz 2026-03-10T09:12:07.121 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mon.vm08.log 2026-03-10T09:12:07.121 INFO:teuthology.orchestra.run.vm08.stderr: 94.1% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-client.ceph-exporter.vm08.log.gz 2026-03-10T09:12:07.128 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mgr.vm08.rpongu.log: 93.2% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-volume.log.gz 2026-03-10T09:12:07.128 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.audit.log 2026-03-10T09:12:07.129 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mon.vm08.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log 2026-03-10T09:12:07.129 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mon.vm05.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.audit.log 2026-03-10T09:12:07.138 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mgr.vm05.rxwgjc.log: 90.8% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T09:12:07.138 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.cephadm.log 2026-03-10T09:12:07.138 
INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.audit.log: 91.5% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.audit.log.gz 2026-03-10T09:12:07.140 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.cephadm.log 2026-03-10T09:12:07.141 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.audit.log: 91.3% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.audit.log.gz 2026-03-10T09:12:07.142 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log: 89.4% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mgr.vm08.rpongu.log.gz 2026-03-10T09:12:07.142 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.3.log 2026-03-10T09:12:07.143 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.cephadm.log: 85.4% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.cephadm.log.gz 2026-03-10T09:12:07.143 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-volume.log 2026-03-10T09:12:07.143 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.4.log 2026-03-10T09:12:07.144 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.cephadm.log: 85.5% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.cephadm.log.gz 2026-03-10T09:12:07.144 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.3.log: 88.3% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph.log.gz 2026-03-10T09:12:07.148 
INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.5.log
2026-03-10T09:12:07.149 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-client.ceph-exporter.vm05.log
2026-03-10T09:12:07.152 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm08.xfzrbx.log
2026-03-10T09:12:07.157 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.0.log
2026-03-10T09:12:07.159 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm08.ssijow.log
2026-03-10T09:12:07.163 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-client.ceph-exporter.vm05.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.1.log
2026-03-10T09:12:07.164 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.0.log: 94.1% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-client.ceph-exporter.vm05.log.gz
2026-03-10T09:12:07.164 INFO:teuthology.orchestra.run.vm05.stderr: 93.5% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-volume.log.gz
2026-03-10T09:12:07.169 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm08.xfzrbx.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log
2026-03-10T09:12:07.179 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm08.ssijow.log: /var/log/ceph/ceph-client.1.log: 92.2% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm08.xfzrbx.log.gz
2026-03-10T09:12:07.179 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.2.log
2026-03-10T09:12:07.184 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm05.bxdvbu.log
2026-03-10T09:12:07.192 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm05.slhztf.log
2026-03-10T09:12:07.199 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm05.bxdvbu.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log
2026-03-10T09:12:07.650 INFO:teuthology.orchestra.run.vm08.stderr: 92.3% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mon.vm08.log.gz
2026-03-10T09:12:07.758 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm05.slhztf.log: /var/log/ceph/ceph-client.0.log: 89.5% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mgr.vm05.rxwgjc.log.gz
2026-03-10T09:12:08.764 INFO:teuthology.orchestra.run.vm05.stderr: 90.4% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mon.vm05.log.gz
2026-03-10T09:12:14.697 INFO:teuthology.orchestra.run.vm08.stderr: 93.5% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.4.log.gz
2026-03-10T09:12:16.252 INFO:teuthology.orchestra.run.vm08.stderr: 93.7% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.5.log.gz
2026-03-10T09:12:16.388 INFO:teuthology.orchestra.run.vm08.stderr: 94.8% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm08.ssijow.log.gz
2026-03-10T09:12:16.633 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.0.log.gz
2026-03-10T09:12:17.109 INFO:teuthology.orchestra.run.vm08.stderr: 93.7% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.3.log.gz
2026-03-10T09:12:17.546 INFO:teuthology.orchestra.run.vm05.stderr: 93.6% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.2.log.gz
2026-03-10T09:12:18.669 INFO:teuthology.orchestra.run.vm05.stderr: 94.8% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm05.slhztf.log.gz
2026-03-10T09:12:18.770 INFO:teuthology.orchestra.run.vm05.stderr: 93.6% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-osd.1.log.gz
2026-03-10T09:12:21.053 INFO:teuthology.orchestra.run.vm08.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping
2026-03-10T09:12:21.053 INFO:teuthology.orchestra.run.vm08.stderr: 93.4% -- replaced with /var/log/ceph/ceph-client.1.log.gz
2026-03-10T09:12:21.055 INFO:teuthology.orchestra.run.vm08.stderr:
2026-03-10T09:12:21.055 INFO:teuthology.orchestra.run.vm08.stderr:real 0m13.946s
2026-03-10T09:12:21.055 INFO:teuthology.orchestra.run.vm08.stderr:user 0m22.798s
2026-03-10T09:12:21.055 INFO:teuthology.orchestra.run.vm08.stderr:sys 0m0.989s
2026-03-10T09:12:23.423 INFO:teuthology.orchestra.run.vm05.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping
2026-03-10T09:12:23.547 INFO:teuthology.orchestra.run.vm05.stderr: 93.4% -- replaced with /var/log/ceph/ceph-client.0.log.gz
2026-03-10T09:13:21.482 INFO:teuthology.orchestra.run.vm05.stderr: 93.0% -- replaced with /var/log/ceph/16587ed2-1c5e-11f1-90f6-35051361a039/ceph-mds.cephfs.vm05.bxdvbu.log.gz
2026-03-10T09:13:21.485 INFO:teuthology.orchestra.run.vm05.stderr:
2026-03-10T09:13:21.485 INFO:teuthology.orchestra.run.vm05.stderr:real 1m14.378s
2026-03-10T09:13:21.485 INFO:teuthology.orchestra.run.vm05.stderr:user 1m23.468s
2026-03-10T09:13:21.485 INFO:teuthology.orchestra.run.vm05.stderr:sys 0m5.682s
2026-03-10T09:13:21.485 INFO:tasks.cephadm:Archiving logs...
2026-03-10T09:13:21.485 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969/remote/vm05/log
2026-03-10T09:13:21.485 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-10T09:13:26.332 DEBUG:teuthology.misc:Transferring archived files from vm08:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969/remote/vm08/log
2026-03-10T09:13:26.332 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-10T09:13:27.534 INFO:tasks.cephadm:Removing cluster...
2026-03-10T09:13:27.535 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 --force
2026-03-10T09:13:27.708 INFO:teuthology.orchestra.run.vm05.stdout:Deleting cluster with fsid: 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T09:13:28.058 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 16587ed2-1c5e-11f1-90f6-35051361a039 --force
2026-03-10T09:13:28.168 INFO:teuthology.orchestra.run.vm08.stdout:Deleting cluster with fsid: 16587ed2-1c5e-11f1-90f6-35051361a039
2026-03-10T09:13:28.419 INFO:tasks.cephadm:Removing cephadm ...
2026-03-10T09:13:28.419 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-10T09:13:28.435 DEBUG:teuthology.orchestra.run.vm08:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-10T09:13:28.452 INFO:tasks.cephadm:Teardown complete
2026-03-10T09:13:28.452 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-10T09:13:28.456 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T09:13:28.456 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-10T09:13:28.456 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-10T09:13:28.478 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-10T09:13:28.528 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-10T09:13:28.528 DEBUG:teuthology.orchestra.run.vm05:>
2026-03-10T09:13:28.528 DEBUG:teuthology.orchestra.run.vm05:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-10T09:13:28.528 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y remove $d || true
2026-03-10T09:13:28.528 DEBUG:teuthology.orchestra.run.vm05:> done
2026-03-10T09:13:28.533 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-10T09:13:28.534 DEBUG:teuthology.orchestra.run.vm08:>
2026-03-10T09:13:28.534 DEBUG:teuthology.orchestra.run.vm08:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-10T09:13:28.534 DEBUG:teuthology.orchestra.run.vm08:> sudo yum -y remove $d || true
2026-03-10T09:13:28.534 DEBUG:teuthology.orchestra.run.vm08:> done
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 @ceph 31 M
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-10T09:13:28.859 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:28.860 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:28.860 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:28.860 INFO:teuthology.orchestra.run.vm08.stdout:Remove 2 Packages
2026-03-10T09:13:28.860 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:28.860 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 31 M
2026-03-10T09:13:28.860 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:28.864 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:28.864 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:28.879 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:28.880 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:28.894 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 @ceph 31 M
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 31 M
2026-03-10T09:13:28.895 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:28.900 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:28.900 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:28.912 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:28.915 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:28.915 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:28.941 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:28.941 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:28.941 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T09:13:28.941 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-10T09:13:28.941 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-10T09:13:28.941 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:28.942 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:28.948 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:28.953 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:28.967 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T09:13:28.974 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:28.974 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:28.974 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T09:13:28.974 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-10T09:13:28.974 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-10T09:13:28.974 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:28.975 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:28.986 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:29.000 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T09:13:29.047 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T09:13:29.047 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:29.076 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T09:13:29.076 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:29.095 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T09:13:29.095 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.095 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:29.095 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-10T09:13:29.095 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.095 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:29.131 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T09:13:29.132 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.132 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:29.132 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-10T09:13:29.132 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.132 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:29.319 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test x86_64 2:18.2.1-0.el9 @ceph 164 M
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:29.320 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:29.321 INFO:teuthology.orchestra.run.vm08.stdout:Remove 4 Packages
2026-03-10T09:13:29.321 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.321 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 166 M
2026-03-10T09:13:29.321 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:29.324 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:29.324 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:29.344 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:18.2.1-0.el9 @ceph 164 M
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:Remove 4 Packages
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 166 M
2026-03-10T09:13:29.345 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:29.348 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:29.348 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:29.348 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:29.349 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:29.372 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:29.373 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:29.401 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:29.408 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T09:13:29.410 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T09:13:29.414 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T09:13:29.423 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:29.429 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T09:13:29.430 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T09:13:29.433 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T09:13:29.437 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T09:13:29.452 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T09:13:29.503 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T09:13:29.503 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T09:13:29.503 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T09:13:29.503 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T09:13:29.519 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T09:13:29.519 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T09:13:29.519 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T09:13:29.519 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test-2:18.2.1-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.560 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:18.2.1-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.572 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:29.774 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: ceph x86_64 2:18.2.1-0.el9 @ceph 0
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds x86_64 2:18.2.1-0.el9 @ceph 6.5 M
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon x86_64 2:18.2.1-0.el9 @ceph 20 M
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd x86_64 2:18.2.1-0.el9 @ceph 61 M
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:Remove 8 Packages
2026-03-10T09:13:29.775 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.776 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 89 M
2026-03-10T09:13:29.776 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:29.778 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:29.779 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:29.782 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:18.2.1-0.el9 @ceph 0
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:18.2.1-0.el9 @ceph 6.5 M
2026-03-10T09:13:29.783 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:18.2.1-0.el9 @ceph 20 M
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:18.2.1-0.el9 @ceph 61 M
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:Remove 8 Packages
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 89 M
2026-03-10T09:13:29.784 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:29.787 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:29.787 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:29.802 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:29.802 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:29.810 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:29.811 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:29.843 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:29.845 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T09:13:29.849 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:29.851 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T09:13:29.866 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:29.867 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:29.867 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T09:13:29.867 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T09:13:29.867 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T09:13:29.867 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.869 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T09:13:29.878 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.881 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:29.891 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:29.893 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T09:13:29.893 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T09:13:29.893 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.894 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T09:13:29.906 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T09:13:29.907 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T09:13:29.907 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.908 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T09:13:29.916 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T09:13:29.920 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T09:13:29.922 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T09:13:29.924 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T09:13:29.930 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T09:13:29.933 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T09:13:29.936 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T09:13:29.938 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T09:13:29.947 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T09:13:29.948 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:29.948 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T09:13:29.948 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T09:13:29.948 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T09:13:29.948 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.948 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T09:13:29.956 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T09:13:29.959 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T09:13:29.959 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:29.959 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T09:13:29.959 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T09:13:29.959 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T09:13:29.959 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.960 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T09:13:29.970 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T09:13:29.979 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T09:13:29.979 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:29.979 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T09:13:29.979 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T09:13:29.979 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T09:13:29.979 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:29.980 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T09:13:29.993 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T09:13:29.993 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:29.993 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T09:13:29.993 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T09:13:29.993 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T09:13:29.993 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:29.994 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 3/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 4/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T09:13:30.078 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 3/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 4/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T09:13:30.091 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout: ceph-2:18.2.1-0.el9.x86_64 ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:30.141 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:18.2.1-0.el9.x86_64 ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:30.162 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:30.364 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base x86_64 2:18.2.1-0.el9 @ceph 22 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 @ceph 395 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 @ceph 4.5 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 678 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 @ceph-noarch 7.6 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 @ceph-noarch 66 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 @ceph-noarch 574 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common x86_64 2:18.2.1-0.el9 @ceph 70 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 @ceph-noarch 319 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 @ceph-noarch 1.4 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 @ceph-noarch 40 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 @ceph 138 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T09:13:30.369 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 @ceph 434 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 @ceph 1.5 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 @ceph 610 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T09:13:30.370 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:Remove 84 Packages
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 434 M
2026-03-10T09:13:30.371 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:30.378 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:18.2.1-0.el9 @ceph 22 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 @ceph 395 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 @ceph 4.5 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 678 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 @ceph-noarch 7.6 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 @ceph-noarch 66 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 @ceph-noarch 574 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:18.2.1-0.el9 @ceph 70 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 @ceph-noarch 319 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 @ceph-noarch 1.4 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 @ceph-noarch 40 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 @ceph 138 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 @ceph 434 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 @ceph 1.5 M
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T09:13:30.383 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 @ceph 610 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T09:13:30.384 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:Remove 84 Packages
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 434 M
2026-03-10T09:13:30.385 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:30.394 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:30.394 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:30.407 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:30.407 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:30.499 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:30.499 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:30.514 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:30.515 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:30.636 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:30.636 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T09:13:30.646 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T09:13:30.656 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:30.656 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T09:13:30.665 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T09:13:30.668 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T09:13:30.668 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:30.668 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T09:13:30.668 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T09:13:30.668 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T09:13:30.669 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:30.669 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:30.684 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T09:13:30.686 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T09:13:30.695 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 3/84
2026-03-10T09:13:30.695 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T09:13:30.699 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T09:13:30.710 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 3/84
2026-03-10T09:13:30.710 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T09:13:30.760 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T09:13:30.773 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T09:13:30.776 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T09:13:30.778 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T09:13:30.778 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T09:13:30.787 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T09:13:30.791 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T09:13:30.792 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T09:13:30.792 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T09:13:30.799 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T09:13:30.802 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T09:13:30.804 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T09:13:30.806 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T09:13:30.811 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T09:13:30.811 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T09:13:30.815 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T09:13:30.816 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T09:13:30.818 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T09:13:30.823 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T09:13:30.826 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T09:13:30.828 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T09:13:30.838 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T09:13:30.840 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T09:13:30.847 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T09:13:30.851 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T09:13:30.857 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T09:13:30.858 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T09:13:30.865 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T09:13:30.869 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T09:13:30.876 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T09:13:30.895 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T09:13:30.903 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T09:13:30.906 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T09:13:30.907 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T09:13:30.915 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T09:13:30.916 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T09:13:30.918 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T09:13:30.924 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T09:13:30.924 INFO:teuthology.orchestra.run.vm08.stdout: Erasing :
ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84 2026-03-10T09:13:30.927 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84 2026-03-10T09:13:30.934 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84 2026-03-10T09:13:30.936 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84 2026-03-10T09:13:30.936 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84 2026-03-10T09:13:30.944 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84 2026-03-10T09:13:31.030 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84 2026-03-10T09:13:31.042 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84 2026-03-10T09:13:31.060 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84 2026-03-10T09:13:31.065 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84 2026-03-10T09:13:31.072 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84 2026-03-10T09:13:31.073 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84 2026-03-10T09:13:31.077 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84 2026-03-10T09:13:31.079 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84 2026-03-10T09:13:31.081 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84 2026-03-10T09:13:31.084 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84 2026-03-10T09:13:31.087 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : 
python3-tempora-5.0.0-2.el9.noarch 31/84 2026-03-10T09:13:31.087 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84 2026-03-10T09:13:31.089 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84 2026-03-10T09:13:31.091 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84 2026-03-10T09:13:31.092 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84 2026-03-10T09:13:31.093 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84 2026-03-10T09:13:31.095 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84 2026-03-10T09:13:31.096 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84 2026-03-10T09:13:31.099 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84 2026-03-10T09:13:31.101 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84 2026-03-10T09:13:31.103 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84 2026-03-10T09:13:31.107 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84 2026-03-10T09:13:31.110 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84 2026-03-10T09:13:31.118 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T09:13:31.121 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84 2026-03-10T09:13:31.123 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84 2026-03-10T09:13:31.128 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T09:13:31.133 
INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84 2026-03-10T09:13:31.174 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84 2026-03-10T09:13:31.182 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84 2026-03-10T09:13:31.187 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84 2026-03-10T09:13:31.192 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84 2026-03-10T09:13:31.194 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84 2026-03-10T09:13:31.195 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84 2026-03-10T09:13:31.197 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84 2026-03-10T09:13:31.197 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84 2026-03-10T09:13:31.199 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84 2026-03-10T09:13:31.200 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84 2026-03-10T09:13:31.202 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84 2026-03-10T09:13:31.203 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 
2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target". 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target". 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:31.224 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-mirror-2:18.2.1-0.el9.x86_64 44/84 2026-03-10T09:13:31.225 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-mirror-2:18.2.1-0.el9.x86_64 44/84 2026-03-10T09:13:31.234 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84 2026-03-10T09:13:31.234 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84 2026-03-10T09:13:31.255 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84 2026-03-10T09:13:31.255 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T09:13:31.255 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T09:13:31.255 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:31.255 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84 2026-03-10T09:13:31.261 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84 2026-03-10T09:13:31.261 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T09:13:31.261 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T09:13:31.261 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T09:13:31.261 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84 2026-03-10T09:13:31.263 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84 2026-03-10T09:13:31.265 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84 2026-03-10T09:13:31.268 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84 2026-03-10T09:13:31.271 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84 2026-03-10T09:13:31.273 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84 2026-03-10T09:13:31.273 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84 2026-03-10T09:13:31.275 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84 2026-03-10T09:13:31.275 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 
50/84 2026-03-10T09:13:31.278 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84 2026-03-10T09:13:31.278 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84 2026-03-10T09:13:31.281 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84 2026-03-10T09:13:31.281 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84 2026-03-10T09:13:31.284 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84 2026-03-10T09:13:31.284 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84 2026-03-10T09:13:31.286 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84 2026-03-10T09:13:31.289 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84 2026-03-10T09:13:31.292 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84 2026-03-10T09:13:31.293 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84 2026-03-10T09:13:31.296 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84 2026-03-10T09:13:31.298 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84 2026-03-10T09:13:31.300 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84 2026-03-10T09:13:31.303 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84 2026-03-10T09:13:31.304 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84 2026-03-10T09:13:31.305 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84 2026-03-10T09:13:31.310 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : 
python3-devel-3.9.25-3.el9.x86_64 55/84 2026-03-10T09:13:31.311 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84 2026-03-10T09:13:31.312 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84 2026-03-10T09:13:31.315 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84 2026-03-10T09:13:31.315 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84 2026-03-10T09:13:31.318 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84 2026-03-10T09:13:31.321 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84 2026-03-10T09:13:31.324 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84 2026-03-10T09:13:31.325 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84 2026-03-10T09:13:31.328 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84 2026-03-10T09:13:31.331 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-10T09:13:31.334 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84 2026-03-10T09:13:31.335 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-10T09:13:31.338 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-10T09:13:31.339 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84 2026-03-10T09:13:31.341 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-10T09:13:31.345 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-10T09:13:31.349 
INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-10T09:13:31.349 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-10T09:13:31.352 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-10T09:13:31.356 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-10T09:13:31.356 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-10T09:13:31.359 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-10T09:13:31.362 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-10T09:13:31.363 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 71/84 2026-03-10T09:13:31.364 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-10T09:13:31.369 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 72/84 2026-03-10T09:13:31.370 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-10T09:13:31.374 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-10T09:13:31.374 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-10T09:13:31.377 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-10T09:13:31.379 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 71/84 2026-03-10T09:13:31.385 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 72/84 2026-03-10T09:13:31.389 
INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-10T09:13:31.395 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-10T09:13:31.395 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 2026-03-10T09:13:31.395 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:31.402 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-10T09:13:31.412 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-10T09:13:31.412 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 2026-03-10T09:13:31.412 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T09:13:31.418 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-10T09:13:31.422 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-10T09:13:31.422 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-10T09:13:31.438 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-10T09:13:31.438 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp 2026-03-10T09:13:37.578 
INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp 2026-03-10T09:13:37.578 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:37.644 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /sys 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /proc 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /mnt 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /var/tmp 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /home 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /root 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /tmp 2026-03-10T09:13:37.674 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T09:13:37.725 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-10T09:13:37.767 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-10T09:13:37.803 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-common-2:18.2.1-0.el9.x86_64 77/84 2026-03-10T09:13:37.805 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-10T09:13:37.807 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-10T09:13:37.807 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: 
ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-10T09:13:37.807 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-10T09:13:37.811 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ceph-common-2:18.2.1-0.el9.x86_64 77/84 2026-03-10T09:13:37.813 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-10T09:13:37.818 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-10T09:13:37.818 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-10T09:13:37.821 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-10T09:13:37.823 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-10T09:13:37.826 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-10T09:13:37.829 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-10T09:13:37.829 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T09:13:37.836 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-10T09:13:37.839 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-10T09:13:37.841 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-10T09:13:37.845 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-10T09:13:37.845 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T09:13:37.931 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 1/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 2/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 3/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 4/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 5/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 6/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 7/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 8/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 9/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 10/84 2026-03-10T09:13:37.931 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 11/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 12/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T09:13:37.932 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 17/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 21/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 30/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 
2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T09:13:37.932 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T09:13:37.933 
INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T09:13:37.933 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 1/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 2/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 3/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 4/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 5/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 6/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 7/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 8/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 9/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 10/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 11/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 12/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T09:13:37.950 
INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T09:13:37.950 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 17/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 21/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : 
python3-ceph-common-2:18.2.1-0.el9.x86_64 30/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : 
python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T09:13:37.952 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 
2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T09:13:37.953 
INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T09:13:37.953 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: 
ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.024 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 
2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt-2.4.0-1.el9.noarch 
2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T09:13:38.025 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T09:13:38.026 
INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:38.026 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-10T09:13:38.030 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T09:13:38.030 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T09:13:38.030 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-10T09:13:38.030 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T09:13:38.031 
INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T09:13:38.031 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-rsa-4.9-2.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T09:13:38.032 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T09:13:38.033 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T09:13:38.033 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T09:13:38.033 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T09:13:38.033 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-10T09:13:38.033 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T09:13:38.033 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T09:13:38.251 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 213 k 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 213 k 2026-03-10T09:13:38.252 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-10T09:13:38.254 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-10T09:13:38.254 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-10T09:13:38.255 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-10T09:13:38.255 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-10T09:13:38.262 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout: cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 213 k
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:Remove 1 Package
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 213 k
2026-03-10T09:13:38.263 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:38.265 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:38.265 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:38.266 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:38.266 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:38.271 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:38.272 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-10T09:13:38.283 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:38.283 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-10T09:13:38.393 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-10T09:13:38.396 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-10T09:13:38.434 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-10T09:13:38.434 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:38.434 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:38.434 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-10T09:13:38.434 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:38.434 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:38.441 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-10T09:13:38.441 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:38.441 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:38.441 INFO:teuthology.orchestra.run.vm08.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-10T09:13:38.441 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:38.442 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:38.620 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-immutable-object-cache
2026-03-10T09:13:38.620 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:38.623 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:38.624 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:38.624 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:38.625 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-immutable-object-cache
2026-03-10T09:13:38.625 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:38.628 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:38.629 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:38.629 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:38.800 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr
2026-03-10T09:13:38.800 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:38.801 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr
2026-03-10T09:13:38.801 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:38.803 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:38.804 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:38.804 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:38.805 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:38.805 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:38.805 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:38.976 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-dashboard
2026-03-10T09:13:38.976 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:38.979 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:38.980 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:38.980 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:38.981 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-dashboard
2026-03-10T09:13:38.981 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:38.984 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:38.985 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:38.985 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:39.149 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-10T09:13:39.149 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:39.152 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-10T09:13:39.152 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:39.152 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:39.153 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:39.153 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:39.155 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:39.156 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:39.156 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:39.328 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-rook
2026-03-10T09:13:39.328 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:39.329 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-rook
2026-03-10T09:13:39.329 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:39.331 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:39.332 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:39.332 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:39.332 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:39.333 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:39.333 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:39.507 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-cephadm
2026-03-10T09:13:39.507 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:39.509 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-cephadm
2026-03-10T09:13:39.509 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:39.510 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:39.511 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:39.511 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:39.512 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:39.513 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:39.513 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:39.695 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 @ceph 2.5 M
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:Remove 1 Package
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 2.5 M
2026-03-10T09:13:39.696 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:39.698 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:39.698 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:39.706 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 @ceph 2.5 M
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.5 M
2026-03-10T09:13:39.707 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:39.709 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:39.709 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:39.719 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:39.720 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:39.732 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:39.747 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:39.747 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1
2026-03-10T09:13:39.761 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1
2026-03-10T09:13:39.827 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-fuse-2:18.2.1-0.el9.x86_64 1/1
2026-03-10T09:13:39.830 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-fuse-2:18.2.1-0.el9.x86_64 1/1
2026-03-10T09:13:39.880 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1
2026-03-10T09:13:39.880 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:39.880 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:39.880 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:39.880 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:39.880 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:39.886 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1
2026-03-10T09:13:39.886 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:39.886 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:39.886 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:39.886 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:39.886 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:40.087 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel x86_64 2:18.2.1-0.el9 @ceph 456 k
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 @ceph 139 k
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:Remove 2 Packages
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 595 k
2026-03-10T09:13:40.088 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:40.090 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:40.090 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:40.101 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:40.101 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:40.125 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:18.2.1-0.el9 @ceph 456 k
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 @ceph 139 k
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 595 k
2026-03-10T09:13:40.126 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:40.128 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:40.128 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:40.128 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:40.130 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:40.138 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:40.138 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:40.144 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T09:13:40.165 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:40.167 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:40.180 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T09:13:40.216 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T09:13:40.216 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:40.259 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T09:13:40.259 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T09:13:40.262 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T09:13:40.263 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.263 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:40.263 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.263 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.263 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:40.313 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T09:13:40.313 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.313 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:40.313 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.313 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.313 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:40.484 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 @ceph 1.9 M
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 @ceph 505 k
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 @ceph 186 k
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Remove 3 Packages
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 2.5 M
2026-03-10T09:13:40.485 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:40.487 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:40.487 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:40.499 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:40.500 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:40.513 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 @ceph 1.9 M
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 @ceph 505 k
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 @ceph 186 k
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Remove 3 Packages
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.5 M
2026-03-10T09:13:40.514 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:40.516 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:40.516 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:40.528 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:40.528 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:40.528 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:40.530 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cephfs-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T09:13:40.532 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T09:13:40.532 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T09:13:40.554 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:40.556 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cephfs-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T09:13:40.558 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T09:13:40.558 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T09:13:40.598 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T09:13:40.598 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T09:13:40.598 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T09:13:40.624 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T09:13:40.624 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T09:13:40.624 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:40.639 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:40.665 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:40.832 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: libcephfs-devel
2026-03-10T09:13:40.832 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:40.836 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:40.836 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:40.836 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:40.858 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: libcephfs-devel
2026-03-10T09:13:40.859 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:40.862 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:40.863 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:40.863 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:41.036 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:41.037 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:41.037 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: librados2 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw x86_64 2:18.2.1-0.el9 @ceph 269 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 @ceph 226 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 @ceph 494 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: librbd1 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: librgw2 x86_64 2:18.2.1-0.el9 @ceph 15 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:Remove 21 Packages
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:41.038 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 74 M
2026-03-10T09:13:41.039 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T09:13:41.042 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T09:13:41.042 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T09:13:41.065 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T09:13:41.066 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T09:13:41.073 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:18.2.1-0.el9 @ceph 269 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 @ceph 226 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 @ceph 494 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T09:13:41.075 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:18.2.1-0.el9 @ceph 15 M
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:Remove 21 Packages
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 74 M
2026-03-10T09:13:41.076 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T09:13:41.080 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T09:13:41.080 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T09:13:41.103 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T09:13:41.103 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T09:13:41.109 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T09:13:41.118 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-nbd-2:18.2.1-0.el9.x86_64 1/21
2026-03-10T09:13:41.121 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-fuse-2:18.2.1-0.el9.x86_64 2/21
2026-03-10T09:13:41.124 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rgw-2:18.2.1-0.el9.x86_64 3/21
2026-03-10T09:13:41.124 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T09:13:41.139 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T09:13:41.142 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T09:13:41.144 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rbd-2:18.2.1-0.el9.x86_64 6/21
2026-03-10T09:13:41.146 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rados-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T09:13:41.149 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T09:13:41.149 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T09:13:41.158 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T09:13:41.161 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-nbd-2:18.2.1-0.el9.x86_64 1/21
2026-03-10T09:13:41.163 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T09:13:41.163 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T09:13:41.163 INFO:teuthology.orchestra.run.vm08.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T09:13:41.163 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:41.163 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-fuse-2:18.2.1-0.el9.x86_64 2/21
2026-03-10T09:13:41.166 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rgw-2:18.2.1-0.el9.x86_64 3/21
2026-03-10T09:13:41.166 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T09:13:41.177 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T09:13:41.179 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T09:13:41.179 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T09:13:41.182 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T09:13:41.182 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T09:13:41.184 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rbd-2:18.2.1-0.el9.x86_64 6/21
2026-03-10T09:13:41.184 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T09:13:41.185 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rados-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T09:13:41.187 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T09:13:41.188 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T09:13:41.188 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T09:13:41.191 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T09:13:41.195 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T09:13:41.198 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T09:13:41.201 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T09:13:41.203 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T09:13:41.203 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T09:13:41.203 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T09:13:41.203 INFO:teuthology.orchestra.run.vm05.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T09:13:41.203 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:41.205 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T09:13:41.217 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T09:13:41.219 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T09:13:41.219 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T09:13:41.222 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T09:13:41.224 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T09:13:41.227 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T09:13:41.230 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T09:13:41.233 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T09:13:41.236 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T09:13:41.238 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T09:13:41.240 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T09:13:41.242 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T09:13:41.256 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T09:13:41.285 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 8/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 14/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 15/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 16/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 18/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 19/21
2026-03-10T09:13:41.286 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T09:13:41.329 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T09:13:41.329 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T09:13:41.329 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T09:13:41.329 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 8/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: librados2-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: librbd1-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T09:13:41.330 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 14/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 15/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 16/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 18/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 19/21
2026-03-10T09:13:41.331 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: librbd1-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T09:13:41.387 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:41.559 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: librbd1
2026-03-10T09:13:41.559 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:41.562 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:41.563 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:41.563 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:41.602 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: librbd1
2026-03-10T09:13:41.602 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:41.606 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:41.607 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:41.607 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:41.751 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rados
2026-03-10T09:13:41.752 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:41.755 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:41.755 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:41.755 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:41.793 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rados
2026-03-10T09:13:41.793 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:41.796 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:41.797 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:41.797 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:41.932 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rgw
2026-03-10T09:13:41.932 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:41.935 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:41.936 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:41.936 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:41.975 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rgw
2026-03-10T09:13:41.975 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:41.980 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:41.981 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:41.981 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:42.117 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-cephfs
2026-03-10T09:13:42.117 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:42.120 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:42.121 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:42.121 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:42.157 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-cephfs
2026-03-10T09:13:42.158 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:42.161 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:42.162 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:42.162 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:42.305 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rbd
2026-03-10T09:13:42.305 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:42.308 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:42.309 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:42.309 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:42.345 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rbd
2026-03-10T09:13:42.345 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:42.348 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:42.348 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:42.348 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:42.493 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-fuse
2026-03-10T09:13:42.494 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:42.497 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:42.498 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:42.498 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:42.533 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-fuse
2026-03-10T09:13:42.533 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:42.536 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:42.537 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:42.537 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:42.693 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-mirror
2026-03-10T09:13:42.693 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:42.696 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:42.697 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:42.697 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:42.716 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-mirror
2026-03-10T09:13:42.716 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:42.719 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:42.719 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:42.719 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:42.883 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-nbd
2026-03-10T09:13:42.883 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T09:13:42.886 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T09:13:42.887 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T09:13:42.887 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T09:13:42.892 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-nbd
2026-03-10T09:13:42.893 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T09:13:42.900 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T09:13:42.901 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T09:13:42.901 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T09:13:42.916 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean all
2026-03-10T09:13:42.927 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all
2026-03-10T09:13:43.048 INFO:teuthology.orchestra.run.vm08.stdout:56 files removed
2026-03-10T09:13:43.059 INFO:teuthology.orchestra.run.vm05.stdout:56 files removed
2026-03-10T09:13:43.076 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T09:13:43.088 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T09:13:43.106 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean expire-cache
2026-03-10T09:13:43.115 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean expire-cache
2026-03-10T09:13:43.274 INFO:teuthology.orchestra.run.vm08.stdout:Cache was expired
2026-03-10T09:13:43.274 INFO:teuthology.orchestra.run.vm08.stdout:0 files removed
2026-03-10T09:13:43.286 INFO:teuthology.orchestra.run.vm05.stdout:Cache was expired
2026-03-10T09:13:43.286 INFO:teuthology.orchestra.run.vm05.stdout:0 files removed
2026-03-10T09:13:43.303 DEBUG:teuthology.parallel:result is None
2026-03-10T09:13:43.312 DEBUG:teuthology.parallel:result is None
2026-03-10T09:13:43.312 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm05.local
2026-03-10T09:13:43.312 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm08.local
2026-03-10T09:13:43.312 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T09:13:43.312 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T09:13:43.339 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T09:13:43.339 DEBUG:teuthology.orchestra.run.vm08:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T09:13:43.406 DEBUG:teuthology.parallel:result is None
2026-03-10T09:13:43.410 DEBUG:teuthology.parallel:result is None
2026-03-10T09:13:43.410 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-10T09:13:43.413 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-10T09:13:43.413 DEBUG:teuthology.orchestra.run.vm05:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T09:13:43.453 DEBUG:teuthology.orchestra.run.vm08:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T09:13:43.468 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found
2026-03-10T09:13:43.468 INFO:teuthology.orchestra.run.vm08.stderr:bash: line 1: ntpq: command not found
2026-03-10T09:13:43.473 INFO:teuthology.orchestra.run.vm08.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T09:13:43.473 INFO:teuthology.orchestra.run.vm08.stdout:===============================================================================
2026-03-10T09:13:43.473 INFO:teuthology.orchestra.run.vm08.stdout:^* vps-ber1.orleans.ddnss.de 2 7 377 20 -625us[ -622us] +/- 15ms
2026-03-10T09:13:43.473 INFO:teuthology.orchestra.run.vm08.stdout:^+ nc-root-nue.nicesrv.de 2 7 377 19 +2558us[+2558us] +/- 43ms
2026-03-10T09:13:43.473 INFO:teuthology.orchestra.run.vm08.stdout:^+ server1a.meinberg.de 2 6 377 22 -103us[ -100us] +/- 37ms
2026-03-10T09:13:43.473 INFO:teuthology.orchestra.run.vm08.stdout:^+ 47.ip-51-75-67.eu 4 6 377 20 -268us[ -265us] +/- 16ms
2026-03-10T09:13:43.476 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T09:13:43.476 INFO:teuthology.orchestra.run.vm05.stdout:===============================================================================
2026-03-10T09:13:43.476 INFO:teuthology.orchestra.run.vm05.stdout:^+ server1a.meinberg.de 2 7 377 23 -144us[ -144us] +/- 37ms
2026-03-10T09:13:43.476 INFO:teuthology.orchestra.run.vm05.stdout:^+ 47.ip-51-75-67.eu 4 6 377 22 -257us[ -257us] +/- 16ms
2026-03-10T09:13:43.476 INFO:teuthology.orchestra.run.vm05.stdout:^* vps-ber1.orleans.ddnss.de 2 7 377 24 -654us[ -649us] +/- 15ms
2026-03-10T09:13:43.476 INFO:teuthology.orchestra.run.vm05.stdout:^+ nc-root-nue.nicesrv.de 2 7 377 21 +2582us[+2582us] +/- 43ms
2026-03-10T09:13:43.478 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-10T09:13:43.481 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-10T09:13:43.482 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-10T09:13:43.484 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-10T09:13:43.487 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-10T09:13:43.490 INFO:teuthology.task.internal:Duration was 1637.069967 seconds
2026-03-10T09:13:43.490 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-10T09:13:43.493 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-10T09:13:43.493 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T09:13:43.520 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T09:13:43.562 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T09:13:43.566 INFO:teuthology.orchestra.run.vm08.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T09:13:43.835 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-10T09:13:43.835 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm05.local
2026-03-10T09:13:43.835 DEBUG:teuthology.orchestra.run.vm05:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T09:13:43.863 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm08.local
2026-03-10T09:13:43.863 DEBUG:teuthology.orchestra.run.vm08:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T09:13:43.896 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-10T09:13:43.897 DEBUG:teuthology.orchestra.run.vm05:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T09:13:43.905 DEBUG:teuthology.orchestra.run.vm08:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T09:13:44.659 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-10T09:13:44.660 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T09:13:44.661 DEBUG:teuthology.orchestra.run.vm08:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T09:13:44.687 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T09:13:44.687 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T09:13:44.687 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T09:13:44.688 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T09:13:44.688 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T09:13:44.688 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T09:13:44.688 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz/home/ubuntu/cephtest/archive/syslog/journalctl.log:
2026-03-10T09:13:44.688 INFO:teuthology.orchestra.run.vm05.stderr:gzip/home/ubuntu/cephtest/archive/syslog/kern.log: -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T09:13:44.688 INFO:teuthology.orchestra.run.vm05.stderr: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T09:13:44.689 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T09:13:44.825 INFO:teuthology.orchestra.run.vm08.stderr: 97.8% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T09:13:44.863 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 96.8% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T09:13:44.866 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-10T09:13:44.869 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-10T09:13:44.869 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T09:13:44.935 DEBUG:teuthology.orchestra.run.vm08:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T09:13:44.959 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-10T09:13:44.975 DEBUG:teuthology.orchestra.run.vm05:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T09:13:44.977 DEBUG:teuthology.orchestra.run.vm08:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T09:13:45.003 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = core
2026-03-10T09:13:45.026 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern = core
2026-03-10T09:13:45.043 DEBUG:teuthology.orchestra.run.vm05:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T09:13:45.074 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T09:13:45.075 DEBUG:teuthology.orchestra.run.vm08:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T09:13:45.098 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T09:13:45.098 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-10T09:13:45.102 INFO:teuthology.task.internal:Transferring archived files...
2026-03-10T09:13:45.102 DEBUG:teuthology.misc:Transferring archived files from vm05:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969/remote/vm05
2026-03-10T09:13:45.102 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T09:13:45.146 DEBUG:teuthology.misc:Transferring archived files from vm08:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/969/remote/vm08
2026-03-10T09:13:45.146 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T09:13:45.176 INFO:teuthology.task.internal:Removing archive directory...
2026-03-10T09:13:45.176 DEBUG:teuthology.orchestra.run.vm05:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T09:13:45.185 DEBUG:teuthology.orchestra.run.vm08:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T09:13:45.235 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-10T09:13:45.238 INFO:teuthology.task.internal:Not uploading archives.
2026-03-10T09:13:45.238 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-10T09:13:45.241 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-10T09:13:45.241 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T09:13:45.243 DEBUG:teuthology.orchestra.run.vm08:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T09:13:45.261 INFO:teuthology.orchestra.run.vm05.stdout:  8532138    0 drwxr-xr-x   3 ubuntu ubuntu  19 Mar 10 09:13 /home/ubuntu/cephtest
2026-03-10T09:13:45.261 INFO:teuthology.orchestra.run.vm05.stdout: 12949095    0 d---------   2 ubuntu ubuntu   6 Mar 10 08:53 /home/ubuntu/cephtest/mnt.0
2026-03-10T09:13:45.261 INFO:teuthology.orchestra.run.vm05.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-10T09:13:45.262 INFO:teuthology.orchestra.run.vm05.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-10T09:13:45.277 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T09:13:45.277 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm05 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-10T09:13:45.278 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-10T09:13:45.281 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T09:13:45.282 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.1} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1637.0699665546417
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false

2026-03-10T09:13:45.282 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T09:13:45.303 INFO:teuthology.run:FAIL